- Vision: This just don't look right to me ...
- Hearing: Listen to what Smalltalk is telling you ...
- Touch: I've got a bad feeling about this ...
- Smell: This code stinks ... (see CodeSmell)
- Taste: Yucch!
- ESP: My resources say this isn't the way
If code sensing is about apprehending information in the code structure and converting it into knowledge of what to do, then we need to consider at least these two aspects of the topic:
- Different senses support very different levels of information. In humans, there is at least one order of magnitude less information available through hearing than through sight, and touch, smell, and taste are at least another order down. While you can send a message with perfume, candy, or a caress, you can only send a couple of bits.
- Different people use different sensory modes. This may cause some metaphors to be significantly more successful than others.
Much modern communications advice tells us to express our ideas in the listener's primary sensory modality: "I don't see what you are getting at"; "I hear you saying"; "Can't touch this". Is this the best approach for CodeSensing, or is it better to pick a sense that is not primary, to get the student's attention?
Maybe there should be a page for AutomatedCodeSensing?. Is it possible to develop a program that analyses code in a given language and says "hey, there's a CodeSmell here and there", or even "you should have implemented this stack using a linked list rather than an array", or "this loop may not terminate if (such and such condition)"? That would mean a program with a true codic sensory modality.
How cool that would be! This is of course a very big project that might need advanced AI...
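The simplest cases are within reach of ordinary static analysis: walk the parse tree and flag structural hints of trouble. Here is a minimal sketch, assuming Python and its standard ast module; the sniff helper, the smell names, and the thresholds are invented for illustration, and the loop check is a crude heuristic, not real termination analysis.

    import ast

    # Thresholds are arbitrary, purely for illustration.
    MAX_FUNCTION_LINES = 25
    MAX_ARGUMENTS = 5

    def sniff(source, filename="<string>"):
        """Return a list of crude CodeSmell warnings for a piece of Python source."""
        warnings = []
        tree = ast.parse(source, filename)
        for node in ast.walk(tree):
            if isinstance(node, ast.FunctionDef):
                # LongMethod: the function body spans too many lines.
                length = (node.end_lineno or node.lineno) - node.lineno + 1
                if length > MAX_FUNCTION_LINES:
                    warnings.append("%s:%d %s is %d lines long"
                                    % (filename, node.lineno, node.name, length))
                # LongParameterList: too many arguments to keep in your head.
                if len(node.args.args) > MAX_ARGUMENTS:
                    warnings.append("%s:%d %s takes %d arguments"
                                    % (filename, node.lineno, node.name,
                                       len(node.args.args)))
            elif isinstance(node, ast.While):
                # A 'while True' loop with no break anywhere inside it is a
                # candidate non-terminating loop (crude: a break belonging to a
                # nested loop would wrongly satisfy this check).
                if (isinstance(node.test, ast.Constant) and node.test.value is True
                        and not any(isinstance(n, ast.Break) for n in ast.walk(node))):
                    warnings.append("%s:%d 'while True' loop with no break"
                                    % (filename, node.lineno))
        return warnings

    if __name__ == "__main__":
        import sys
        for path in sys.argv[1:]:
            with open(path) as f:
                for warning in sniff(f.read(), path):
                    print(warning)

Existing lint and metrics tools already do this kind of surface-level sensing; recognising the deeper smells (the "should have been a linked list" kind) is the part that would need the advanced AI mentioned above.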
Touch: I've got a bad feeling about this ...
To me, "feeling" in this example means an inner anxiety or intuition rather than a physical feeling. "Rough around the edges" or "This feels brittle" are some alternatives I can think of. MikeWeller.