The use of words, grunts, and sounds to communicate ideas has been around longer than recorded history. Over time, we've built up enough different words that we couldn't teach them by experience alone... we had to use words to describe words. Bootstrapping that system poses some difficulty - one that anyone reading this passage likely began working through shortly after being born.
When we use words to describe other words in terms of their purpose, origin, and application, we call that 'explanation'. When we use words to describe other words in terms of their meaning (the properties required to qualify for use of the word, which also happen to be the same set of properties communicated by use of the word), we call that 'definition'.
Strictly, 'word' as used here can also be any word-phrase or idiom that is capable of receiving distinct definition that isn't wholly implied by the underlying words.
Some things to consider:
Natural vs. Formal Language Definitions: Loosely, in natural languages the use of a word comes before any definition of it, so those writing the definition must do their best to adapt the definition to the actual usage of the word (upon failure, the definition is incorrect). In formal languages the definition comes before any use of the word, so usage of the word must strictly adhere to the definition (upon failure, the user of the word is incorrect). In practice, there is no powerful barrier between these two: experts tend to formalize the meaning of natural language words as a matter of course, while laymen never really understand what the experts are saying, use the word incorrectly, and so give rise to the natural language definition. In a sense, experts and laymen end up using the same words to speak two distinct (but remarkably similar) languages, and, thus, much confusion ensues.
Trivially, a 'good' definition for a natural language word would need to be accurate to its usage; it wouldn't be very reasonable to call a wrong definition a 'GoodDefinition'. For a formal language, 'accuracy' is a complete non-issue: the definition comes before use of the word, so the definition is by nature 'accurate'. Instead, for a formal language a GoodDefinition ought to define a useful word.
Of course, when experts formalize existing natural language words, it is considered courteous to pay homage to the layman's use of the word, the word's etymological roots, and prior approaches to formalizing it. If the 'formal' definition is largely consistent (i.e. not contradictory in too many instances) with what a layman would tell you, then the layman won't become too confused at your use of the term, and probably won't mess it up too badly. If no existing word is consistent with an expert's concept, it is better to invent a new term than to reuse an old one, and to avoid creating words that will mislead the potential audience.
When a word or word-phrase becomes associated with marketing buzz, there are additional political and economic agendas that will cause even experts to duke it out among themselves as to whether they have the right to call their product 'ObjectOriented' or 'ParadigmFAD'. From my observation, this just leads to a great deal of confusion and dilution of an idea, since the groups staking claim to the word make it ever more ambiguous and vague and, ultimately, useless for the purpose of any real discourse.
What makes a useful 'word'? The fundamental purpose of a word, hearkening back to the days before writing, is to support the rapid communication of ideas, commands, requests, trades and barter, opportunities, strategies, dangers, risks, etc. After writing was invented, an additional purpose became the persistence of the same, allowing communication across vast times and distances. Then logic and math were eventually created, and the word, used properly, could additionally support the disciplined discovery of new ideas - new conclusions - from known premises, at least so long as certain well-proven rules are followed (those forming a 'logic'). And more recently, of course, the application of math and logic was automated, giving more power to words than ever before. But, still, underlying all of this is that a 'word' is fundamentally for communication - ideally, rapid and accurate communication. Therefore, a useful word must carry a meaning that (a) entities actually want to communicate, that (b) compresses more information into a smaller space, and that (c) does not lead to unnecessary miscommunications (including vagueness or ambiguity).
Obviously, being 'useful' is not a criterion for a good natural language definition. Useful or not, the word needs to be defined if one is going to attempt any comprehensive dictionary of the natural language. BuzzWords and EverythingIsa words and redundant words aren't particularly useful - perhaps even of negative utility if one measures the value of a word in its compression of useful communication - but they still require a definition (or at least an explanation).
However, being 'useful' is of special concern for formal language definitions. Since the definition comes before the formal use of the word, one must consider when crafting the definition whether the word will actually be useful. There is severely diminished utility in words that are redundant, ambiguous, vague, or unnecessarily subjective, that encompass too much (making no useful distinctions), that encompass too little (making them less applicable), or that are difficult to decide (e.g. determining whether the word applies is intractable or undecidable). And, regarding compression, any word that is longer than a complete description of its meaning would be completely pointless.
So, what is a GoodDefinition?
Well, for natural language, it simply means 'accurate'. The language can change, and the definition will need to change to match. The word doesn't need to be useful. If the word is truly redundant in practice (no 'subtle' differences in meaning or use), one can simply call it a 'synonym of' and be done with it.
For a formal language, it means 'useful'. A number of properties that make a word useful, and others that diminish its utility, are described above. There is no proper notion of 'accurate', since a formal language definition doesn't answer to any authority, though one might consider issues of 'courtesy'. (Since there is no proper notion of 'accurate', any natural language dictionary is irrelevant once the formal language definition has been constructed.)
I'd vote this page be called ExcellentDefinition?.. because good isn't GoodEnough. ;-) But there are some misunderstandings and incorrect statements that need to be addressed before this page title can be changed to ExcellentDefinition?. A dictionary such as Oxford is a formal reference of formal definitions. You've mixed up informal with formal. Saying that the dictionary is irrelevant once the formal definition has been constructed is lunacy.
The Oxford English Dictionary attempts to be comprehensive, but does not primarily contain 'formal definitions', and especially not 'formal language definitions' - English is not a formal language. The definition provided by an English dictionary is only relevant to a formal language insofar as courtesies are observed; the dictionary might later be updated to possess the word if it enters common English, but even then the dictionary does not determine or reflect the formal use of the term - it only reflects the term as laymen understand and use it. The definition of 'function' is a fine example - it is overloaded in a great many different formal languages to mean everything from 'procedure' to 'purpose' to its mathematical meaning, but it often only has one very precise meaning in any given formal language - e.g. the CeeLanguage definition of a 'function' is not affected, in any way, by what the dictionary says a 'function' should be. That literally makes the dictionary irrelevant to the CeeLanguage definition of 'function' even if you'd wish it to be otherwise.
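To make that concrete, here is a minimal C sketch (the names are invented for illustration): both of the following are 'functions' by the CeeLanguage definition, even though neither is a function in the dictionary's mathematical sense of a mapping from inputs to outputs - one returns nothing at all, and the other returns different results for identical arguments.

  #include <stdio.h>
  #include <stdlib.h>

  /* A C 'function' that maps nothing to nothing; it exists only for its side effect. */
  void greet(void)
  {
      printf("hello\n");
  }

  /* A C 'function' that returns different results for the same (empty) argument
     list - the opposite of the mathematical notion a dictionary describes. */
  int roll_die(void)
  {
      return rand() % 6 + 1;
  }

  int main(void)
  {
      greet();
      printf("%d %d\n", roll_die(), roll_die()); /* quite possibly two different values */
      return 0;
  }

Neither usage depends on what any English dictionary happens to say; it depends only on what the CeeLanguage specification defines a 'function' to be.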
The dictionary will and should be updated (fixed) according to the formal definition. Citing references from authorities and well-known institutions is also a good way to maintain form, authority, and sincerity when there is doubt that what one is saying is correct.
Now, if one is looking for a formal definition of a programming language (such as a Backus–Naur specification), one will not find it in the dictionary, since the dictionary is for words. The dictionary as a formal reference for words, however, remains extremely useful; as do encyclopedias, though wikipedia rather less so (although wikipedia is still useful). C2 wiki and wikipedia are examples of informal sources which sometimes have a small bit of form to them, but most times do not.
My informal thoughts for today: the high quality dictionaries are like policemen who stop an ignoramus from making up shit as he goes.
Perhaps, but only just. Dictionaries, no matter how respectable, are mere records of popular use of terminology, and generally that found in particular socio-economic groups. If one ignoramus "makes up shit", and enough ignoramuses (ignoramii?) use his or her terminology and definitions such that they are considered popular usage, then that is what dictionaries will reflect.
By the way, above, you appear to have confused "formal" with "semi-official". I'm not aware of any popular dictionary that contains formal definitions in any rigorous sense of the word "formal".
Re: "The definition provided by an English dictionary is only relevant to a formal language insofar as courtesies are observed; the dictionary might later be updated to possess the word if it enters common English, but even then the dictionary does not determine or reflect the formal use of the term - it only reflects the term as laymen understand and use it."
Laymen typically do not look up phrases in the dictionary and rather rely on gut instinct, or what they've read somewhere in a business brochure, or on some informal C2 pattern wiki. Laymen also argue that opinions may differ or that the dictionary is irrelevant. Rigorous, critical people cite references from dictionaries, encyclopedias, authorities, formal specifications, etc. (none of which include C2 pattern pages written by many crackpots).
The English dictionary will ultimately come to reflect the 'gut instinct' and 'business brochure' definitions rather than the formal definition.
Re: "That literally makes the dictionary irrelevant to the CeeLanguage definition of 'function' even if you'd wish it to be otherwise."
A function in Cee language is actually a subroutine or subprogram that functions according to the vaguer, less mathematical definition of function in the dictionary (Cee language designers take 'function' to mean something that does something, and this vaguer definition of function is indeed in the dictionary). I rest my case. They did not stray too far from the dictionary. If they had strayed far from the definition in the dictionary, they would have called a function a screwdriver. And if culture absorbed 'screwdriver' to mean function, then the dictionary should be updated, as culture does use computers.
A CeeLanguage function does not 'function' according to the definition found in ANY English dictionary. It doesn't depend on a dictionary, and if the dictionary changes the CeeLanguage won't automatically change with it. That really isn't a point upon which you can rest your case. The most you can say is that the people who chose to call subroutines 'functions' instead of 'screwdrivers' were observing common courtesies to avoid confusion; they could have, indeed, called them 'screwdrivers' if they so wished.
The dictionary, a formal reference, will be updated with time and is not fixed - and computers have changed the dictionary over time. If you compare an older dictionary to a newer one, you'll see that the formal dictionary has been updated to accord with newer computer phrases introduced by inventors. Yes, this means that even Lisp's car could end up in a formal dictionary - although formalists might not take Lisp seriously enough to consider adding that alternate definition of car, especially if not enough of the English-speaking culture uses Lisp. For the same reason, if not many people use BrainFuck, then terminology from that language won't be placed into a formal reference like the dictionary either.
An English dictionary is a formal reference to a natural language. That does not mean it possesses formal language definitions. The fact that the dictionary needs to be 'fixed' rather than the users is sufficient proof that the dictionary contains what I described above as 'natural language definitions'. And the dictionary remains 'irrelevant' to the formal language because the dictionary does not affect the formal language. You argue here that the reverse is true: the formal language can affect the dictionary. That's fine. It's just irrelevant.
Re: "The Oxford English Dictionary attempts to be comprehensive, but does not primarily contain 'formal definitions', and especially not 'formal language definitions' - English is not a formal language."
Then neither is Cee language a formal language, especially since it was designed by hackers working for a telephone company who thought up crackpot ideas such as using == to informally mean equality.
Not relevant. They could have formally defined the equality test as being '#=^_^=#' and it would still be formal. It is formal because they gave it definition before using it. In fact, all computer languages (or at least every one to which I've had exposure) are formal languages - pretty much by necessity. For every single one of them, a definition of any 'word', be it a function or type or variable or keyword or operator, along with a specification of the grammar, must be provided before the language and word can be utilized in any meaningful way.
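As a small illustration (a hedged sketch, not a citation of any standard; the identifiers are invented): what makes the language formal is that every name receives its definition before it may be used - the translator simply rejects a 'word' it was never given a definition for.

  #include <stdio.h>

  int main(void)
  {
      int defined_thing = 42;          /* definition comes first...         */
      printf("%d\n", defined_thing);   /* ...and only then is use permitted */

      /* printf("%d\n", undefined_thing); */
      /* Uncommenting the line above is a compile-time error: 'undefined_thing'
         was never declared or defined before use. The formal language refuses
         to guess what the word means. */
      return 0;
  }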
Over time, the equality operator has been updated to mean whatever people want it to mean - in the informal languages such as PHP, Cee, Ruby, etc. Your argument that English is not a formal language is a straw man - since just about every language can be declared not formal (therefore - nothing is formal, because I said so, especially Cee and English). In fact, most languages were pulled out of hats by creative people. They do, however, have formal guides and references - despite your rejection of formalities (which is hypocritical, since you seem to be in favor of defining form). Languages change and do have form - English has form.
Having 'form' doesn't make a language 'formal', at least according to how I used the terms above. The 'form' associated with English was only ever described after-the-fact and is capable of change, much as the meanings of words are. That isn't to say spoken languages can't be formal, but most in common use certainly are not.
Some are more formal than others. Programming languages are similar. Having undefined, ambiguous features and behavior (even allowing it in a standard) is not so formal. Note how GoodDefinition is related to undefined behavior, since 'undefined' and 'definition' share the same root word.
That would be an erroneous conclusion. A formal language doesn't need to be a complete, general purpose language. A formal language can leave most things undefined so long as those things aren't relevant to its domain. It is true that 'undefined' things lack 'GoodDefinition', but that does not affect the formality of the language. If use of the formal language depends upon the definition of undefined things, and the formal language does not also formally specify an abstraction mechanism by which those things can be defined later (technically creating an 'abstract' language), that would be problematic. But that is not the case being discussed here. Proper CeeLanguage programs don't depend upon any undefined behaviors. If division by zero is undefined in mathematics, the proper answer is to never try dividing by zero. If adding "poohbear" to a matrix is undefined (potentially resulting in "neopoohbear" and potentially resulting in a sudden visit from Agent Smith), then you shouldn't try adding "poohbear" to the matrix.
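For example (a minimal sketch, not tied to any particular compiler): integer division by zero is undefined behavior in CeeLanguage, and the proper response is simply never to perform it - the language stays formal, and the program stays within what the language defines.

  #include <stdio.h>

  /* Integer division by zero is undefined behavior in C; a proper program
     checks the divisor and never performs the undefined operation. */
  int safe_div(int numerator, int divisor, int *ok)
  {
      if (divisor == 0) {
          *ok = 0;          /* refuse to step into undefined territory */
          return 0;
      }
      *ok = 1;
      return numerator / divisor;
  }

  int main(void)
  {
      int ok;
      int q = safe_div(10, 0, &ok);
      if (ok)
          printf("quotient: %d\n", q);
      else
          printf("refused: division by zero is undefined\n");
      return 0;
  }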
Programming languages don't need to make every possible operation a 'valid' operation in order to be 'formal'. If they are to be called 'safe', they do need to guarantee that no invalid operations ever occur - often achieved via static or dynamic TypeSafety - but that is an entirely different subject than whether the language is formal or whether a word possesses a GoodDefinition.
Computer languages can be more formal, however, than spoken languages - since they are computable and more factual via 1's and 0's.
The next interesting question would be: does a formal language allow one to redefine it with a preprocessor? That allows a lot of debasement, ambiguity, and other forms of misuse - going against form.
There isn't any problem with a formal language defining the mechanisms for its own redefinition; it isn't much different from saying: "For the remainder of this dialog, I shall say 'foobar' and you shall interpret it to mean 'spotted leaping scorpion'." And regarding your exact question: if you're using a preprocessor, you're (generally) using two different formal languages: the preprocessor language, which takes text as input and is intended to produce program text as output, and the text language being passed to the preprocessor... which often looks similar to the target program language (though that isn't required).
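Here is a tiny sketch of that idea in CeeLanguage (the names echo the dialog above and are purely illustrative): the preprocessor directive is a sentence in one formal language, and the rewritten text it produces is handed to the C compiler proper as a sentence in another.

  #include <stdio.h>

  /* Preprocessor language: "for the remainder of this translation unit,
     'foobar' shall be interpreted to mean the string literal below."
     The preprocessor rewrites the program text before the C compiler
     proper ever sees it. */
  #define foobar "spotted leaping scorpion"

  int main(void)
  {
      puts(foobar);   /* the compiler actually sees: puts("spotted leaping scorpion"); */
      return 0;
  }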
Does having undefined behavior, even acknowledging it in a standard, cause us to question whether many programming languages really are formal?
Sure, it can cause you to question. But a little digging will reveal that the answer is: having undefined behavior does not affect the formality of a programming language.
I think the world is more complex than black and white: formal vs. informal. It goes deeper. One can choose a specific subset of English to remain more formal, and one can choose to avoid certain features in Cee to remain more formal. Does pure formality exist? Is formality itself slightly informal? Why is it that formal specifications are so often written with English splattered in, and not in another, purely formal language without any English whatsoever in it?
Why do formal specifications have scattered English? Partially because people, not being machines, need English explanations to support our comprehension when confronted with mathematical proofs, pure UML, programs without any comments, etc. And partially because we don't have formal languages for specifying everything we need in every possible domain. Anyhow, I don't really concern myself with "pure formal"; the above discusses "formal language", which is a word-phrase with a rather strict meaning that is really opposed to "natural language" (not 'informal' language). There are, indeed, pure "formal languages" - most assemblies and programming languages included.
My conclusion is that informal vs formal is like a black vs white argument. There are grays. However, when one is choosing to write more formal English, he'd best choose a dictionary as a useful reference - and take note of the labels in the dictionary which may define certain words as slang or less formal.
It is worth noting that this page is about what a good definition is... and it's quite obvious that a good definition of a word will involve the dictionary at some point.
See Also: DefinitionSmells, EverythingIsa, WhatIs