Degeneralization of words occurs when a term with a general meaning is given a specific meaning within a certain context.
Some Examples:
Cons: The reader's previous understanding of the term is called to mind whenever they encounter it.
When the reader sees a new name, they are forced to acknowledge that there is a new idea. When you reuse an existing name, you run the risk that the reader will confuse old and new ideas.
On the other hand, it can be a "pro" to springboard off the reader's previous knowledge, usually metaphorically. Metaphors are great, but metaphors break down. Why not use the metaphor to introduce the term, instead of letting the metaphor *be* the term?
See:
The alternative, creating new terms, also has problems. One of the accusations in TheCurseOfXanadu was that TedNelson insisted on using new terms for new ideas. When an old idea turned out to have newly understood wrinkles, it needed a new name. This wreaked havoc with the code and contributed to AnalysisParalysis.
Maybe, though, the problem was the obscurity of the terms. Quoting from the article:
We share this naming problem with physics, another discipline that likes to take "normal" words and give them different meanings: spin, color, etc. No matter what ordinary word you choose for a new concept, the word's old meaning is going to be at least partly wrong. Your other choice is to coin entirely new words, as do biology and the other descriptive sciences. But it's often better to let the hearer have a partly wrong idea of what you're talking about, based upon the conventional meaning of the word, than to leave the hearer entirely in the dark by using a totally new word.
The main difference is in the level of precision used in giving these things meaning. There is a shedload of group theory behind the notion of colour charge. No-one seriously argues with a physicist over what colour charge is all about without trying to understand QCD. Computing science, however, hangs its jargon from the lowest branch, and every part of it that lacks a mathematical grounding is disputed by anyone who can reach it. And the little maths that is left is dismissed as irrelevant to real-world experience... despite the continuing widespread use of such irrelevances as compilers (irony alert). How have we come to the point where people think that CS can be completely understood by analogy?
Keep each concept in its proper context and this shouldn't be a problem. Just make sure that the context is clear... -- EdwardKiser
Why force future users of the word to surround it with context, when you can embed the context in the word itself?
Context is by definition that which surrounds a thing. If you want a word to stand for something particular regardless of context, you must create a new word and define it for the desired context; you are not then "embedding" context but avoiding meaning in other contexts.
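
To make the trade-off concrete, here is a minimal sketch in C++ (the namespaces and type names are invented purely for illustration, not taken from the discussion above): a namespace is one way to keep a general word in its proper context, while baking the qualifier into the identifier is the "embed the context in the word itself" approach.

```cpp
#include <iostream>

// Approach 1: keep the general word "Color" and let the surrounding
// context (a namespace) disambiguate it.
namespace graphics {
    struct Color { int r, g, b; };          // "color" as in pixels
}
namespace qcd {
    enum class Color { Red, Green, Blue };  // "color" as in quark charge
}

// Approach 2: embed the context in the word itself, so the name alone
// carries the specific meaning wherever it travels.
struct PixelColor { int r, g, b; };
enum class ColorCharge { Red, Green, Blue };

int main() {
    graphics::Color pixel{255, 0, 0};        // reader must notice the qualifier
    qcd::Color      quark  = qcd::Color::Red;

    PixelColor  pixel2{0, 255, 0};           // the name is unambiguous on its own
    ColorCharge quark2 = ColorCharge::Green;

    std::cout << pixel.r  << ' ' << static_cast<int>(quark)  << '\n';
    std::cout << pixel2.g << ' ' << static_cast<int>(quark2) << '\n';
}
```

The first approach reads cleanly as long as the qualifier stays in view; the second survives being quoted out of context, which is roughly the tension argued on both sides above.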