Everybody wants a language that best models their own mind, work habits, and mistake patterns. No language will ever satisfy everybody, and debates over which features are better will never be settled. Maybe someday there will be a language generator whereby you select the features you want and don't want, and it generates a language accordingly. (However, the hard part would be getting everyone else to use it.)
As a whimsical thought experiment, here is a hypothetical form with feature selections. You can get ideas for features from topics such as LetsDesignProgrammingLanguage, ItsTimeToDumpCeeSyntax, IdealProgrammingLanguage, SyntaxFollowsSemantics, RethinkingCompilerDesign, FutureOfProgrammingLanguages.
Features in each group are mutually exclusive if they have round brackets, but not if they have square brackets.
Maybe someday new technology or an overzealous programmer will actually build such a thing.
--top
(Square brackets are check-boxes and round ones are radio-buttons)
Compile Level
- (__) Compiled: the product of the language is targeted for direct execution by a machine (possibly a virtual machine) or operating system.
- (__) Interpreted: language is processed as data by a process called an interpreter, especially in some sort of ReadEvalPrintLoop. May support a hidden JustInTimeCompilation, but that should be considered an implementation detail rather than a language detail.
- (__) Both but select on Program: language allows programs to be written for scripting or written for compilation, but programs written for compilation might not work for interpretation (e.g. due to late-binding dependencies) and programs written for interpretation might not work for compilation (e.g. due to top-level commands). This provides a lot of flexibility. However, it also partitions the user community because sharing will be difficult.
- (__) Both for any Program: language ensures that every program is suitable for both compilation and interpretation. Supporting interpretation requires abandoning a model of external linking and requires ensuring the syntax is suitable for one-pass reading (i.e. preprocessors are a bad thing). Suitability for compilation implies abandoning (or at least distinguishing) imperative commands at the top level: those that are to be executed at compile-time vs. those that are to be executed at runtime.
Model for Linking and Modularity
- (__) No Linking: product is always fully declared from a single page or file. No linking occurs. There is no model for modularity. Many EsotericProgrammingLanguages have this model. Support for CompileTimeResolution may provide an interesting workaround.
- (__) Includes: product may 'include' other pages, which are linked in place. There might be a system to selectively avoid duplicate includes. All pages are effectively parsed with each compile or execution.
- (__) Single Page, Modular: language semantics allow 'importing' or 'loading' of other components. Allows useful optimizations such as separate compilation, or at least preprocessing and pre-parsing, of language components, because the semantics of the 'import' are held independent of the page into which the component is imported.
- (__) External Linking: a linker can combine a product by pulling multiple components together, but does so from outside the language. This model is incompatible with interpretation, and offers no real benefits over the 'Single Page, Modular' model except that it can be hacked into a language that doesn't have a concept of modularity (as was done for CeeLanguage).
Modularity and Linkage Features
- [__] Resolution of Circular Dependencies: the language or linker will correctly handle circular dependencies, ideally with minimal reliance on forward declarations.
- [__] Provisionals and Overrides: the language allows you to override global objects, values, or procedures that are declared in other modules in a manner such that those other modules use the overrides, too. The ideal form would be as though they had always used the override, but that ideal is somewhat difficult to reconcile with interpretation.
- [__] Inverted Constructs: one can automatically build or construct a shared value from scattered bits and pieces. Normal modularity works in the other direction, allowing other components to access a value or function provided by a module, but this opposite direction is useful for building registries and dynamic methods. Often performed procedurally.
- [__] Properly Ordered Startup and Shutdown Procedures: the language can determine a proper order in which to initialize and destroy global objects based on forming a dependency graph between them. This would ensure, for example, that a global mutex object isn't destroyed before it is no longer in use. (Note: The easiest means of achieving this is to reject the notion of startup and shutdown execution, but that may harm other features.)
- [__] Framework abstraction techniques: a semantic mechanism exists to modularly factor the decision-making process for issuing a behavior from the behavior itself. We call these things 'frameworks'. Example solutions include FunctorObject, function pointers, and HigherOrderFunctions.
- [__] Decentralized framework abstraction techniques: as per framework abstraction, but the decision on which behavior to select is itself constructed modularly instead of all in one place. The best known approach to achieve this is MultiMethods (as opposed to regular polymorphism); a minimal sketch follows this list.
- [__] Delimited Exports: programmer control over exactly what is exported from a module. This provides encapsulation and helps prevent implementation details from leaking or being coupled to in other people's projects, and thus makes shared modules easier to maintain. Coupling to implementation details can generally be considered a bad thing, as it raises backwards-compatibility issues.
- [__] Namespaces: support for namespaces (even just 'modules ARE namespaces') vastly reduces probability of naming collisions. Proper namespaces are more necessary with 'includes' based modularity.
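As a rough illustration of the 'framework abstraction' items above, here is a minimal sketch of MultiMethods-style decentralized dispatch (Python used purely as the vehicle; the registry and class names are hypothetical, not part of any proposal on this page):

  # Hypothetical multimethod registry: dispatch on the types of both arguments.
  _handlers = {}

  def multimethod(*types):
      def register(fn):
          _handlers[types] = fn   # any module may add entries here
          return fn
      return register

  def collide(a, b):
      return _handlers[(type(a), type(b))](a, b)

  class Asteroid: pass
  class Ship: pass

  # These registrations could live in separate, independently written modules.
  @multimethod(Asteroid, Ship)
  def _(a, s): return "ship destroyed"

  @multimethod(Asteroid, Asteroid)
  def _(a, b): return "asteroids merge"

  print(collide(Asteroid(), Ship()))   # -> ship destroyed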
Resource Management:
- (__) Fully Unmanaged: programmer is responsible for destruction/closing of all resources.
- (__) Partially Managed: some resources will be automatically cleaned up; the C++ stack operates in this manner. The programmer is expected to manage the rest (a sketch follows this list).
- (__) Fully Managed: programmer is expected to leak resources, which will generally be collected by the system at some point. Programmer might be able to release resources earlier.
- (__) Strictly Managed: same as fully managed, except that the programmer has no ability to clean up resources directly.
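For the 'Partially Managed' row, a minimal sketch (Python standard library only): memory is reclaimed by the collector, while the file handle is released at a point the programmer marks explicitly:

  data = [0] * 1000                          # managed: freed whenever the collector decides

  with open("/tmp/example.txt", "w") as f:   # the handle's lifetime is scoped by 'with'
      f.write("hello")
  # f is closed here, deterministically, even if an exception was raised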
Language-supported Models of Computation: (requires a convenient syntax, and support via the standard language or standard libraries)
- [__] ImperativeProgramming: describe program behavior in terms of actions to perform and contingencies based on immediately observable phenomena, usually via mutations on cells or variables and the sending/receiving of messages. Support for describing procedures and sub-procedures usually qualifies.
- [__] FunctionalProgramming: describe program behavior as an expression whose value needs to be calculated.
- [__] ConstraintLogicProgramming: automated searches in an ambiguous space for results meeting specific patterns. Inverted evaluation ordering from functional (e.g. can determine parameters to a function given its output). Backtracking implied.
- [__] ObjectOrientedProgramming: describe a program in terms of components living in the language runtime that are dynamically built and hooked together to form something of a 'virtual machine'. These components then interact by passing either messages (immutable values) or components between one another.
- [__] DataflowProgramming: declaratively hook together components between data sources and data sinks to produce workflows. Not quite object-oriented, in that the dataflow and the components themselves might not be dynamically constructed... but combines very nicely with asynchronous OOP.
- [__] ReactiveProgramming: a persisted, event-driven form of FlowBasedProgramming where updates or changes are streamed continuously.
- [__] LazyProgramming?: language support for delaying computations (and even communications) until need arises. Explicit (laziness requires extra syntax) or implicit (lazy by default). Note that this is semantic laziness, which is different from laziness or reordering as an optimization: semantic laziness allows infinite structures.
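A minimal sketch of the semantic laziness described in the last item (explicit laziness shown with Python generators; the 'infinite' structure is only computed as far as it is demanded):

  from itertools import islice, count

  squares = (n * n for n in count(0))   # conceptually infinite; nothing computed yet
  print(list(islice(squares, 5)))       # [0, 1, 4, 9, 16] -- only five elements forced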
Language Primary Decision Logic (the one integrated with common libraries and predicates)
- (__) Boolean
- (__) ThreeValuedLogic (a minimal sketch follows this list)
- (__) FuzzyLogic
- (__) Higher logic that tracks causes for unknowns (intuitionistic, modal, etc.)
- (__) Full Confidence, Belief, and Knowledge System (you've probably got a logic language...)
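To make the ThreeValuedLogic option concrete, here is a minimal Kleene-style sketch (Python, with None standing in for 'unknown'; that encoding is an assumption of the example, not of any language above):

  def and3(a, b):
      if a is False or b is False: return False
      if a is None or b is None:   return None
      return True

  def or3(a, b):
      if a is True or b is True:   return True
      if a is None or b is None:   return None
      return False

  print(and3(True, None))   # None -- still unknown
  print(or3(True, None))    # True -- known regardless of the unknown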
Language support for Collections: (assuming SymmetryOfLanguage: features should apply to all 'standard' collections).
- [__] Support for dynamically-sized collections, be they represented as values or objects.
- [__] Support for efficient iteration over collection values.
- [__] Support for high-level collection operations: union, intersect, difference, append, fold, sum, join, product, etc. (a sketch follows this list)
- [__] Support for efficient small deltas over collections; ability to share structure
- [__] Support for very large collections that shatter the memory limits of the language runtime.
- [__] Support for automatically parallelizing operations over very large collections (though these operations may need to have certain properties like associative ordering and merging.)
- [__] Adaptive and/or Declarative indexing of collection objects based on needs (e.g. 'index on ...'). Automatic maintenance of these indexes as collections are updated.
- [__] Support for fuzzy likeness and nearness indexing (e.g. spatial clustering)
- [__] Language Integrated Queries, automatically optimized based on available indexes.
- [__] Support for efficient views over collections, in particular without copying the larger collections.
- [__] Support for efficient updating of views over collections, e.g. ReactiveProgramming with DataDeltaIsolation.
- [__] Support for efficient versioning of collections... which really requires cheap deltas with structure sharing. Necessary for efficient SoftwareTransactionalMemory or FirstClassUndo.
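As an illustration of the 'high-level collection operations' item, a short sketch of the kind of vocabulary meant (Python standard library only; the employee/department data is made up):

  from functools import reduce

  a, b = {1, 2, 3}, {3, 4}
  print(a | b, a & b, a - b)                  # union, intersection, difference

  print(reduce(lambda acc, x: acc + x, [1, 2, 3, 4], 0))   # fold / sum -> 10

  # a tiny relational-style join on a shared key
  emps  = [{"name": "ann", "dept": 1}, {"name": "bob", "dept": 2}]
  depts = {1: "sales", 2: "ops"}
  print([(e["name"], depts[e["dept"]]) for e in emps])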
Language support for pattern-matching:
- [__] Case Equality with Dispatch. If-then statements and CeeLanguage 'switch' statements qualify.
- [__] Pattern Matching with Variables over arbitrary values, e.g. Haskell or OCaml style (a sketch follows this list).
- [__] Regular expressions: simple pattern matching over simple collections.
- [__] Full value-based EBNF parsers: arbitrary pattern matching and folding over arbitrary value collections.
- [__] Fuzzy matching and heuristic nearness (aka 'best match')
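A minimal sketch of 'pattern matching with variables over arbitrary values' (shown with Python 3.10's match statement purely for illustration; the Haskell/OCaml forms are analogous):

  def area(shape):
      match shape:
          case ("circle", r):      # 'r' is bound by the pattern
              return 3.14159 * r * r
          case ("rect", w, h):
              return w * h
          case _:
              return None

  print(area(("rect", 3, 4)))   # 12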
Language standard support for User Interface:
- [__] Console input and output (read, print; the console is 'special' and assumed to always be there)
- [__] Standard support for dialogs (a la Tk)
- [__] Standard support for formatted text displays (HTML, PS, TeX, SceneGraph based on DocumentObjectModel)
- [__] DirectManipulation or ObjectBrowser UI (viewed objects directly represent language runtime objects; SmallTalk)
- [__] Standard support for arbitrary 2D graphics (LogoLanguage, PS)
- [__] Standard support for arbitrary 3D graphics (PovRay, VRML)
- [__] Standard support for InteractiveSceneGraphs (pointing, clicking, zooming, performing actions according to rules)
- [__] My language doesn't have a standard User Interface, and I think that is a feature.
Language standard User Interface Features:
- [__] User Interface support subject to stylistic interpretation layers on output (e.g. CSS for dialogs or scenegraphs)
- [__] User Interface support subject to stylistic interpretive indirection on input (useful for handicapped accessibility)
- [__] Language supports versioning and FirstClassUndo over internal constructs, offering these by default
- [__] Language UI follows principles of CapabilityUserInterface
- [__] Default support for ProgressiveDisclosure, LevelOfDetail?, ZoomableUserInterface
- [__] ImmediateModeGui: the UI refers entirely to internal state and (emphatically) not the process by which it got there. This makes the UI amenable to persistence, regeneration, and other features.
- [__] User Interface subject to queries to determine the 'view' (especially neat with ObjectBrowser); requires ImmediateModeGui
Language support for Parallel Operation:
- [__] FirstClass threads or processes: language provides a standard, portable version of semantically asynchronous components.
- [__] SeparateIoFromCalculation: language-enforced SeparationOfConcerns between communication and calculation. Can be done by dividing a 'pure functional' component from a 'pure imperative' component. This division exposes a great deal of latent parallelism because referentially transparent functional evaluations can occur in arbitrary orders. Note that one might be able to take advantage of latent parallelism even if the language has no semantic asynchronous components.
- [__] Asynchronous Message Send: language provides features for asynchronous delivery of messages, including message queues.
- [__] Synchronous Message Send: language provides features for synchronous delivery of messages; queues can be synthesized via separate threads. Replies to earlier message sends are, themselves, also message sends.
- [__] Promise Pipelining: passing promises to get stuff done or return certain values back and forth so that one can do the more interesting stuff immediately (a rough sketch follows this list).
- [__] RealTime: language is designed to make and keep promises about when certain actions will occur even with a large amount of resource competition. Fairly cross-cutting.
- [__] Concurrent Logic and Data Programming: Language automatically exploits heavy parallelism when performing searches, joins, filters, and other operations over large collections.
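For the Promise Pipelining item, a rough approximation only (Python asyncio; this chains work onto a promised value rather than true E-style pipelining, and the fetch functions are hypothetical stand-ins for remote calls):

  import asyncio

  async def fetch_user(uid):
      await asyncio.sleep(0.1)          # stands in for a remote request
      return {"id": uid}

  async def fetch_quota(user):
      await asyncio.sleep(0.1)
      return 42

  async def main():
      user_p = asyncio.ensure_future(fetch_user(7))   # a 'promise' for the user
      print("doing other interesting work while the reply is in flight")
      print(await fetch_quota(await user_p))          # chain on the promised value

  asyncio.run(main())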
Language support for Safe Shared Memory access:
- (__) No Safety: No language support for safety. Not a problem if there is no parallel operation. Also includes cases where safety and parallel operation are non-standard.
- (__) Low Level (e.g. Mutexes): Simple exclusions, semaphores, etc. These can be used to achieve safety, but cannot be composed readily without high risk of deadlock. See LanguageInhibitsRefactoring.
- (__) SoftwareTransactionalMemory: transactions are provided over memory with at least Atomicity and Isolation semantics and whatever degree of durability is achieved by the memory involved. Using hierarchical transactions allows one to safely compose operations.
- (__) No Shared Memory: Memory is safe because it isn't shared. Communication occurs by passing of immutable message values. Any state exists internally to each process. Erlang chooses this model, but it isn't as good as one might initially believe for distribution (one still occasionally ends up emulating shared memory by sharing references to process handles, and must then emulate mutexes or transactions...). A sketch follows this list.
- (__) TransactionalActorModel: DistributedTransactions? between FirstClass processes, with No Shared Memory between them. Essentially, even the unshared memory is SoftwareTransactionalMemory, and even the process life-cycle is subject to transactions (i.e. if a new process is started as part of a transaction, that is undone if the transaction is aborted).
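A minimal sketch of the 'No Shared Memory' row: two processes that share nothing and interact only by passing message values (Python multiprocessing used purely as the vehicle):

  from multiprocessing import Process, Queue

  def worker(inbox, outbox):
      msg = inbox.get()                     # receive a message value
      outbox.put(("reply", msg["n"] * 2))

  if __name__ == "__main__":
      inbox, outbox = Queue(), Queue()
      p = Process(target=worker, args=(inbox, outbox))
      p.start()
      inbox.put({"n": 21})                  # the only way to influence the worker
      print(outbox.get())                   # ('reply', 42)
      p.join()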
Support for Distributed Operation:
- [__] Official representation of Process Object: A process object can be folded up, serialized, and transported to a remote machine, then continued.
- [__] Distribution of State: locality of state must be modeled and enforced, or else all state in the language is effectively global. A pointer to someone else's local variable is prevented.
- [__] Distributed Resource Management: state and process handles that become globally known are collected globally. Not entirely necessary, but the other option is to rely far more on heuristics and deal with broken links.
- [__] Persistence: support for persisting operations and objects in a delay-tolerant and disruption-tolerant manner; handles spurious connectivity.
- [__] Model for parallel operation: it is difficult to optimize a synchronous programming model in an inherently asynchronous medium.
- [__] Shun Shared Mutable State: while mutable state may be possible, disfavoring its use within the language standard libraries is better for distributed operation.
- [__] Stateless Services: support in the standard library for stateless, unsynchronized service objects, represented as immutable values that can be copied and contain references only to global state. These may 'distribute' via duplication rather than migration, such that copies are available any place they might be needed.
- [__] State Services: recognize and specialize state services as opposed to implementing them as regular process objects. This allows optimizations such as orthogonal security, distribution, caching, redundancy, change of ownership, collection, versioning, merging after offline use, etc.
- [__] Automatic Distribution: language has model of locality such that you can assert (in component configurations) that certain objects have certain locations, most likely specified relative to one another (object A is near object B in the configuration) instead of absolute locations. A single object-configuration could, when constructed, be split into two or more parts and distributed to different machines. Objects with unspecified location subject to decisions by optimizer.
- [__] Redundancy, Resilience, and Regeneration: language support for redundancy without explicit implementation by users... i.e. such that a distributed process-object graph can usually automatically recover after a partial failure, node loss, or a GarbageCollection heuristics failure (since DistributedProgramming cannot support perfect GarbageCollection).
- [__] Support for Security: programmers can toggle and transport communications channel encryption and other security things easily, at least when working distributed within the language. This can be optimized to avoid encryption when on a single computer. Distribution may be restricted based on secrecy requirements (see discussion in DigitalRightsManagement).
- [__] DisruptionTolerantNetworking: the language's internal network protocols are designed with minimal handshaking and minimal need for heartbeats or keepalives. GracefulDegradation and recovery during disruptions or delays.
- [__] Offline Support: designed to go 'offline' in a prepared way; the language runtime can track the things it needs to download to perform offline work, and the language itself is designed to avoid external dependencies. There is a big difference between recovering gracefully from an offline experience and preparing to continue work in anticipation of going offline.
Language Supported Forms of Implicit Context:
- [__] LocalVariable?: one can (via a 'let' statement or in a scope) declare local variables that are then accessible to code written within that scope.
- [__] GlobalVariable: variables accessible to all parts of a process.
- [__] DynamicScoping: DynamicScoping for all variables.
- [__] SpecialVariable: DynamicScoping for some variables.
- [__] Thread Local Storage: support for variables global to the process but local to the thread (a sketch follows this list).
- [__] Implicitly Threaded ContextObject: Haskell-style syntactic sugar for threading a large object (like a Monad) that represents the current context. (see also ExplicitManagementOfImplicitContext).
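A minimal sketch of two of the mechanisms above, thread-local storage and a SpecialVariable-like dynamically scoped value (Python's threading.local and contextvars are used as approximations, not as claims about any language in the list):

  import threading, contextvars

  tls = threading.local()
  tls.user = "ann"                       # visible only from this thread

  log_level = contextvars.ContextVar("log_level", default="INFO")

  def noisy():
      print(log_level.get())             # sees whatever binding the caller established

  token = log_level.set("DEBUG")         # rebind for a dynamic extent
  noisy()                                # DEBUG
  log_level.reset(token)                 # restore the outer binding
  noisy()                                # INFO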
Syntactic Abstraction
- (__) None
- (__) Keyword Based (most forms of macros)
- (__) Full Syntax Manipulation (new operators, new keywords, etc.)
- (__) Non-Monotonic Syntax Manipulation (as Full, but can also remove old operators, keywords, etc.)
Syntactic Abstraction Features:
- [__] Hygienic by default (avoids variable capture without explicit symbol generation)
- [__] Allows variable capture
- [__] Support for spatially limited syntax support (e.g. just for a block or namespace)
- [__] Support for modular syntax manipulation (syntax can be an export from a module)
- [__] TuringComplete; capable of fulfilling arbitrary requirements
- [__] Structure aware (as opposed to source text substitution)
- [__] Semantics aware (e.g. type-sensitive macro dispatch; generally rules out preprocessor-based designs)
- [__] Design capable of tracking original source for debugging (generally rules out preprocessor-based designs)
Semantics Manipulation
- (__) None (this is usual)
- (__) 'Implementation-based' Extension Support (e.g. C++ #pragma) - not good for portable code!
- (__) Manipulation or overrides of some runtime components
- (__) Manipulation of Postprocess pipeline (e.g. add extra stages of processing, AST transforms, etc.)
Semantics Manipulation Features
- [__] Homoiconic: an official AbstractSyntaxTree is available from within the programming environment.
- [__] FirstClassTypes: ability to construct new types functionally.
- [__] Reflection and Decomposition: operators exist to break down constructs. E.g. ability to ask objects for all their fields, or break down a function.
- [__] Evaluation/Compilation: ability to build a FirstClass construct (like a function) from its Homoiconic AST or a text description, this object having the same properties as one built at initial parse time (a sketch follows this list).
- [__] CompileTimeResolution: (e.g. ability to query a database to obtain a schema at compile-time)
- [__] Continuations: ability to construct one's own operation ordering structures; ability to return to arbitrary locations in program.
- [__] MetaObjectProtocol
- [__] Support for spatially limited manipulations (as opposed to global only)
- [__] Ability to add and process (in context) entirely new AST components (requires advanced syntax manipulation)
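To ground the Homoiconic and Evaluation/Compilation rows, a minimal sketch using Python's ast module (chosen only because it exposes an official AbstractSyntaxTree; other languages differ):

  import ast

  tree = ast.parse("def twice(x):\n    return x * 2\n")
  print(ast.dump(tree.body[0])[:60])     # the AST is an ordinary, inspectable value

  ns = {}
  exec(compile(tree, "<generated>", "exec"), ns)
  print(ns["twice"](21))                 # 42 -- same standing as hand-written code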
Language Versioning Support (prepare to fail to get it right the first time...)
- (__) None; compatibility is ad-hoc or just not a concern for language application.
- (__) Language version statement near top of page. Compiler/Interpreter keeps old versions.
- (__) Leverage non-monotonic syntax extension. Syntax is set from within the language.
Default Failure Handling: internal handling of errors (as observed in standard libraries)
- (__) Crash: standard libraries crash when misused.
- (__) Limp: no mechanism to detect or internally report errors
- (__) Die (crash... gracefully and explicitly)
- (__) Error codes; reserve part of the return range to indicate errors.
- (__) Exceptions
- (__) Exceptions with continuations (AbortRetryIgnore) so another part of the code can decide policy for returning to work.
- (__) Restart/Condition Mechanism (as in CommonLisp)
- (__) Transaction Failure (with rollback).
Default Failure Reporting: support for external indications of errors
- (__) No Support; errors will need to be reported via regular actions
- (__) Error Output Stream (e.g. 'cerr' or STDERR in C++/C respectively)
- (__) Alert message box
- (__) Exposure of error-reporting hooks (useful for frameworks or standard libraries) so users can capture and process errors.
- (__) Distributed Exceptions or Exceptions with Continuations (probably a PinkyAndTheBrainLanguage)
Failure Reporting Features: as observed in the standard mechanism
- [__] Verbosity Levels and Semantics (error cause) for filtering
- [__] Support for dual logging of error reports.
Syntax Design Principles: (perhaps order these in terms of priority?)
- [__] Familiarity with existing language. You'll accept ugly syntax if it's familiar.
- [__] Full Compatibility with existing language. Do realize that you'll be updating your language continuously if the other language is still versioning.
- [__] Error Resistance (ability to catch certain errors by static analysis)
- [__] Leverage/Avoid Ambiguity (see ambiguity handling, below)
- [__] Terseness (keep programs small)... but do keep in mind that terseness is more readily achieved by good semantics that ensure ideas need to be stated OnceAndOnlyOnce.
- [__] Readability: you think your program will be more readable because of the syntax you're choosing. Not that you've confirmed this.
- [__] Easy Computation: language aimed to be easy to parse, refactor, or perform syntax highlighting
- [__] GraphicalProgramming: hook code together with reduced use of explicit names.
Ambiguity Handling: ambiguity is not always an enemy; it can be used to simplify syntax and even leveraged (e.g. Polymorphism is designed to take advantage of ambiguity).
- (__) Ambiguity Prevented (prevented in syntax)
- (__) Ambiguity is Error (error reports: "action on line X is ambiguous; please clarify")
- (__) Ambiguity is Heuristic Best Bet (chooses any one meaning that makes sense in context, then applies it everywhere else)
- (__) Ambiguity is CompileTime Polymorphism (similar to C++ programming using templates for polymorphism; meaning varies at each place of application).
- (__) Ambiguity is Runtime Polymorphism (actual dispatch is deferred; meaning varies at each time of application).
Block Syntax
- (__) C-style braces
- (__) Pascal-style bracing (explicit BEGIN/END pairs)
- (__) Modula-2/Oberon-style bracing (FOR/END, IF/END, WHILE/END, etc.)
- (__) End-X style (see TheRightWayToDoWordyBlocks)
- (__) Significant whitespace (python/ruby)
- (__) Parenthesis (Lisp-style)
- (__) Square Brackets (TCL-style)
- (__) Compiler-detected based on context (e.g., as with Haskell, or AlternativesToCeeSyntax)
- Perhaps we should make the actual blocking symbol(s) a parameter rather than a radio button for each possible character.
Statement Delineation
- (__) Token separates statements without support for empty statements
- (__) Token separates statements with support for empty statements
- (__) Token terminates statements
Statement Delineation Token
- (__) Semicolon
- (__) Colon
- (__) Period
- (__) Natural line with [ ] (please specify) wrap marker (e.g., use _ for VB-style)
- (__) Parse-based determination
Type Indicator Syntax
- (__) Before variable
- (__) After variable (Pascal-style)
- (__) Unspecified (DynamicTyping or ImplicitTyping)
- (__) Unspecified with optional aspecting (optional type requirements or partial type requirements).
Native Null-Handling
- (__) No native nulls.
- (__) Null is a special 'invalid' reference.
- (__) Null is semantic overloading of a domain or range value, such as using the empty string or a simple string containing "<null>".
- (__) Null supported in tagged union (e.g. Haskell 'Maybe' monad).
- (__) NullObject pattern followed in standard libraries; using a NullObject for certain operations can result in exceptions, but it may be okay for others (e.g. the 'length' of a null string may be zero, but the 'isNull()' property may differ from that of an empty string). A sketch follows this list.
- (__) NullObject pattern is language-enforced; for each class, you can build a companion 'Null' class with its own method implementations, with a default 'Null' class being built automatically. Assigning 'null' to a pointer automatically results in a pointer to its null-class object.
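A minimal sketch of the NullObject pattern as a library convention (the class names here are hypothetical, and the chosen behaviors are just one of many reasonable conventions):

  class NullString:
      def length(self):  return 0
      def is_null(self): return True
      def upper(self):   return self         # operations on null stay null

  class Str:
      def __init__(self, s): self.s = s
      def length(self):  return len(self.s)
      def is_null(self): return False
      def upper(self):   return Str(self.s.upper())

  def shout_length(s):
      return s.upper().length()              # safe even when handed the null object

  print(shout_length(Str("hi")), shout_length(NullString()))   # 2 0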
Dynamic Type Enforcement
- (__) Dynamic catch-all "Object" or "Variant" variable types are not allowed
- (__) Variables must be explicitly declared as dynamically-typed
- (__) Dynamic variable type(s) are implied by lack of explicit type declaration
- (__) All variables are dynamically-typed (no explicit type declarations)
Type-Tag Sensitivity
- (__) Language primitives do not depend on and are not affected by a type tag (using only value-based "typing"). In other words, the language can be implemented without any type tags. Tags may be added internally for efficiency, but that wouldn't change outward behavior (output).
- (__) Applies only to scalar variables
- (__) Applies to all variables, including compound variables such as arrays. In other words, arrays can be treated as scalars and vice versa without an explicit declaration or "change" operation.
- (__) At least some language primitives are type-tag sensitive. Thus, their behavior may vary depending on the value of the type tag independent of the value of the variable. Operations exist that supply the value of or indicate the type tag. (Note that being "numeric" and being parse-able or cast-able as "numeric" are not necessarily the same thing.)
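To illustrate the distinction the last item draws, a small sketch (Python, which does carry observable type tags; the value-based alternative is approximated here by parsing):

  x = "123"
  print(type(x).__name__)        # 'str' -- the tag itself is observable

  try:
      print(x + 1)               # tag-sensitive '+': str + int is rejected
  except TypeError:
      print("tag-sensitive: refuses to add")

  print(int(x) + 1 if x.isdigit() else None)   # value-based view: it parses as numeric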
Etc...
Symbol allocation:
References (such as C's "&"): _____
Object path element separator (such as "." in Java) _______
String concatenation: ______
- If "+" is used, is it overloaded with math addition?: [_]
Name-space element separator/indicator: ____________
(Etc.)
I note how much of the above is about picking and choosing syntax. SyntaxMatters, but not that much.
- The original list was only meant as a starting point. You are welcome to add non-syntax features.
- Suggestion accepted.
[Indeed. Syntax is about as important as what kind of keyboard you use -- it might have an effect on individual productivity, but it's of no theoretical or conceptual significance.]
{Unfortunately, theory and reality are only theoretically related. SyntaxMatters a great deal. Get used to it.}
That "just a theory" fallacy is so damn tired and old. Gravity, and the whole notion that something will fall if you drop it, is also only a theory. And so is 'SyntaxMatters'.
Syntax does matter. It's a vehicle for semantics, and good syntax abstraction (e.g. RealMacros) can help where the semantic abstractions of the language are insufficient. But the people who think SyntaxMatters a "great deal" tend to be victims of Parkinson's law of triviality. People will argue for weeks on braces vs. whitespace because that's all they really know about language design.
{This is naturally contradictory. If syntax has impacts on productivity, as is claimed above, and productivity of a coder translates into income of a corporation, then clearly syntax must have an impact on a company's bottom-line. This is very, very, very, very, very, very, very, very far from being theoretical.}
I do not doubt that it does have an impact. But the nature of that impact is quite theoretical at the moment. At best, we can usefully predict how certain decisions in syntax will catch or be prone to particular errors, support or hinder particular patterns of syntactic abstraction, and simplify or complicate given use cases. Predicting overall productivity? Not a chance. Of course, the people who argue for weeks on braces vs. whitespace and such rarely even know about these predictable impacts, and their frivolous pursuit of syntax is more religion than sound engineering.
In any case, choice of syntax doesn't make nearly the amount of difference as does design and careful choice of semantics. Compared to semantics, syntax is a very shallow subject. People who believe syntax is all there is to language are people who haven't seriously studied language.
{Remember that a happy coder makes better code, just as happy cows make better cheese.}
I wouldn't be surprised if coder happiness and code quality are weakly correlated, but I doubt there is a causal relationship involved in the direction you imply.
I generally agree with the "happy cows" statement (not my statement). It's a variation of PsychologyMatters. A closer fit to a person's brain means better text-to-human communication, and better communication makes for fewer misunderstandings and perception mistakes. --top
Here is a glass-half-empty vs. glass-half-full question for you, top. Are you naturally inclined to believe that happy cows make better cheese because they're happy? or are you naturally inclined to reject such a conclusion as premature, thinking that cow happiness and quality of cow cheese could both have a positive causal relation with, say, cow health?
You've been peddling pseudoscientific beliefs like 'PsychologyMatters' on this wiki since you got here, despite neither making any useful or falsifiable predictions nor having any evidence to back you on it. If you're the type of person who jumps to premature conclusions, I can, perhaps, see how you come to insist on these beliefs as though they were more than speculation.
{ In support of the happy cows hypothesis, there appears to be a distinct correlation between adoption of XP practices, individual programmer happiness, and product quality. The practices, ultimately, are what enforces product quality, but nonetheless, coders seem happiest when practicing XP. Perhaps the next best example would be Scrum. Meanwhile, those which rely on BDUF-methodologies seem pretty consistently unhappy, and the quality of the product, with very rare exception, seems to be horrible.
However, there is a rarity -- that of the SpaceShuttle's software, which is the epitome of BDUF, and yet has stellar quality. And, if public record is to be believed, the coders on the project are happy to be there. They have a process which works for them, and they've expressed no desire to abandon it for greener pastures. Again, a correlation between happiness and product quality.
The Shuttle software team has one benefit that the rest of us in BDUF-land do not have - their specs are set in stone - no deviations are allowed unless a full impact study is done and all schedules adjusted in the proper direction. They are not trying to hit a moving target.
There are, in fact, many instances in history (not just software related) where the reverse occurs (unhappiness closely correlated with lack of product quality). The most extreme, of course, is when Wernher von Braun was working on the V2 program in Nazi Germany, where the slave labor urinated in the mechanisms and electronics, left screws untightened or over-tightened, and so on, all in an effort to deliberately sabotage the project. A bit less extreme an example might well be Microsoft Windows itself. }
[There may indeed be a correlation between developer happiness and product quality -- it's hard to say, without experimental results that clearly isolate happiness from other factors that may affect quality and vice versa -- but correlation obviously does not imply causality. The above suggests to me that successful projects may be causal in producing both developer happiness and high quality, but again, without experimental evidence this is mere speculation, as is the reasonable -- but unproven -- supposition that happy people (for whatever reason) are more likely to produce good quality than unhappy people. However, I'm not sure what this has to do with the significance of syntax. Anecdotally, I've observed that experienced developers with numerous languages under their belt are far less likely than junior programmers (typically familiar with only a few languages) to be concerned with trivialities of syntax, such as block and statement delimiters and delineation. Indeed, experienced developers are more likely to regard (for example) one statically-typed, manifest-typed, compiled, imperative, object-oriented, structured language as pretty much interchangeable with any other regardless of syntax (and therefore be equally happy or unhappy with any of them), while the junior developer may naively regard C# as completely different from Java purely on the basis of syntax, or ObjectPascal completely different from C++ (and therefore be happier with one over the other), with the distinctions between (say) C++ and Haskell -- which are conceptually significant -- barely comprehended.]
What's a "vague type"? (BTW, all languages can be implemented without type tags.)
I've changed it to "dynamic type", as I can only assume that is what was meant.
I'm afraid I don't recognise much in the "Type-Tag Sensitivity" section. A "type tag" is a specific implementation-level mechanism used in some languages. Even if we assume "type-tag" is intended to mean "type reference", it still doesn't make much sense.
CategoryLanguageDesign
SeptemberZeroEight