Its All Just Switches

An abstraction meant to de-mystify computers for students.

My computer language classes nearly always began with my assertion that "ItsAllJustSwitches", wherein I evolved programming from simple light switches, to arrangements where you can turn the light on at one door and off at another, to still more complex arrangements (switchboards, photocell feedback circuits), to relays, to the transistors that replaced them, and so on, until we have millions of electronic switches that are only ever on or off, and which can be arranged to control other arrays of switches (...) arriving at the modern computer.
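A rough sketch of that progression in Python rather than wire -- my own illustration, not material from the original classes: the light you can flip from either door is just an exclusive-or of its two switches, and the same trick of combining switches, repeated, already gives you arithmetic.

 # Hypothetical illustration (not from the original class): the hallway light
 # you can flip from either door is an exclusive-or of its two switches.
 def hallway_light(door_a_up, door_b_up):
     return door_a_up != door_b_up

 # Arrange switches so they control other switches and arithmetic appears:
 # a half adder built from the same on/off primitives.
 def half_adder(a, b):
     total = a != b        # XOR gives the ones digit
     carry = a and b       # AND gives the carry
     return total, carry

 print(hallway_light(True, False))   # True: the light is on
 print(half_adder(True, True))       # (False, True): 1 + 1 = binary 10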

For some of the students, this was enlightening. Others were bored by it. In the end, the great majority of them were able to set aside this abstraction and learn programming anyway. They really never had a problem with differentiating code and data. It was enough of a challenge to grasp that "adjective" and "adverb" were now replaced by "parameter" without reaching for the "oneness of it all."

I disagree with that assertion. The implementation may just all be switches in practice, but that is an execution detail. Maybe smart hamsters or specially-bred bacteria could execute Smalltalk code instead of an Intel chip. There is no law that says only chips and only switches can execute code instructions. The future may use different techniques. I fear you are lost in the present :-)

Teaching should always mention at least hypothetical implementation details like this for the sake of concreteness. Students who learn only the abstract, and never the concrete, never quite get the abstractions completely right. This is the classic SymbolGroundingProblem?.

[might be related to YouCantLearnSomethingUntilYouAlreadyAlmostKnowIt]

Yes. I found that I had an advantage that most of my students didn't. I had a background in tube and transistor electronics, radio, telegraphy, and had earned my own TelexOperator? spurs -- I could read PunchedTape? the way others read type on paper. The idea of storing values as binary codes in sequential memory addresses and treating some of them as commands and others as data was quite painless for me -- I had lived with the physical models. But I had in my classes students with "more advanced" training who yet lacked the paleolithic foundations that I took for granted. So we spent a couple of weeks learning stone axes.
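A minimal sketch of that "paleolithic" picture in Python (my own illustration, with made-up opcodes -- not the machine or notation used in those classes): memory is one flat list of numbers, and only the program counter decides which of them get treated as commands and which as data.

 # Toy stored-program machine (hypothetical): one flat memory of numbers,
 # some fetched as instructions, some as data -- the distinction is only
 # in how the program counter reaches them.
 LOAD, ADD, STORE, HALT = 1, 2, 3, 0

 def run(memory):
     pc, acc = 0, 0
     while True:
         op = memory[pc]
         if op == HALT:
             return memory
         arg = memory[pc + 1]
         if op == LOAD:
             acc = memory[arg]
         elif op == ADD:
             acc += memory[arg]
         elif op == STORE:
             memory[arg] = acc
         pc += 2

 # Addresses 0-6 hold the "commands", 8-10 hold the "data" -- same kind of numbers.
 mem = [LOAD, 8, ADD, 9, STORE, 10, HALT, 0, 2, 3, 0]
 print(run(mem)[10])   # 5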

Is this debate about how to teach or the actual nature of software?

Debate? I must have missed something. This was something I did along the lines of LetsBeginWithLevelFlight? (a JonathanLivingstonSeagull quote) to ensure my students had some common grounding. I never proposed it as a "teaching method" for general use.

However, it does provide a slightly different look at the effort to abstract all code to be data and vice versa, a commentary on the usefulness (or not) of trying to generalize the codeness of data or the dataness of code. You might consider it an anti-philosophical dig.

However, now I've hung my entire self-respect and self-image from this paradigm, and if it's not immediately and broadly accepted within the wiki, then I'm DOOMED (doomed ...).

You were doomed already and you know it. :-)

ItsAllJustSwitches is a deep statement about reductionism, and in a reductionist sense it is quite true, not merely a matter of implementation ("switch" is a very general abstraction, and cannot be avoided by changing implementations).

It's also true that reductionism misses out on EmergentPhenomenon?, so ItsAllJustSwitches is only one of the true and insightful ways to view the subject. -- DougMerritt


It's true but useless except to the complete neophyte who has to start with the lowest-level details. It is akin to introducing psychology as "it's just DNA and molecules bumping around", and completely ignoring mental processes. If someone wants to think programming is about switches instead of design patterns and expressiveness, then fine - the job market is tough and the less real competition the better for me!

When we say "students" here are we talking about WhatSortOfComputationWouldInterestJuniorSchoolChildren or those trying to learn FreshmansFirstLanguage?

Good question - I'd answer with "whichever students they are talking about when they say people should learn the low-level fundamentals first - then the high level."

FreshmansFirstLanguage?

These were college students. Some worked for a living and had to learn this stuff for their new responsibilities. Some were raw out of high school. Some had degrees in related subjects and were treating this as a post-grad supplement. Some took the courses to enable their new hobby. All of my classes had at least one CIS pre-requisite. Somehow, even though ComputerLiteracy and BasicLogic? and sometimes another language were supposedly already "learned", the common phenomenon was that the most basic, fundamental understandings of what a computer is were somehow bypassed. I was unwilling to paint over this unprimed surface and just hand out grades.


ItsAllJustSwitches, while useful in some ways, ignores the EmergentBehavior inherent in computers. Computer systems are interesting in that they exhibit significant, complex, and powerful EmergentBehavior, yet remain fully understandable to humans. Not coincidentally, computers are the only system I can think of with such EmergentBehavior that is the creation of humanity rather than nature.

But imagine if all knowledge of topics such as elementary Boolean algebra, semiconductor device physics, microprocessor architecture, numerous branches of computer science, electromagnetic fields, and numerous other fields were wiped out, and one discovered this thing called a "computer" and wanted to discover how it worked. (Further assume that a supply of identical computer systems was available, and that destructive investigations were permitted.) The models such a person--or even a team of highly intelligent and trained scientists--would likely come up with would be as woefully and pitifully inadequate as our current models of chemical and biological processes (to say nothing of less deterministic sciences such as economics or psychology).


'The implementation may just all be switches in practice, but that is an execution detail.'

Hmm, I'm not sure if "switch" was meant literally. I think it was used metaphorically to refer to an element that can be switched by another switch, so to speak. This may be a transistor, but also a relay or a mechanical 'switching' element, like in KonradZuse's Z1. Or tamed hamsters, if they behave that way, for that matter. All you need for a 'computer' are a switchable switch, a delay element and a power source, all appropriately connected (IO is just external connections). -- GunnarZarncke

For that matter, the delay is sometimes provided free by physics, and the power source might be skipped if it were externally operated by e.g. a human (as with various mechanical computers, like teaching-toys). So all you really need are switches and an external operator. -- DougMerritt

No. Without the 'external operator' I'd say it is no computer any longer - it becomes static. Removing the operator (or power source) removes too much - only the plan of a computer remains. Otherwise we could remove the switches too and argue that the operator could simply think the operations instead. -- .gz
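A minimal sketch of the "switchable switch plus delay element" claim above (my own illustration, not anyone's proposal here): let a NAND gate stand in for the switchable switch, and let the state carried from one clock tick to the next stand in for the delay element. Together they already make a small sequential machine, here a two-bit counter.

 # A NAND gate as the "switchable switch": one switch gating another,
 # the output is off only when both inputs are on.
 def nand(a, b):
     return not (a and b)

 # Other gates built from nothing but NAND.
 def not_(a):    return nand(a, a)
 def xor_(a, b):
     m = nand(a, b)
     return nand(nand(a, m), nand(b, m))

 def tick(low, high):
     """One clock step: the old (delayed) state flows through the gates
     to become the new state -- a two-bit binary counter."""
     return not_(low), xor_(high, low)

 low, high = False, False
 for t in range(5):
     low, high = tick(low, high)
     print(t, int(high), int(low))   # counts 01, 10, 11, 00, 01 ...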


CategoryDiscussion CategoryHardware (in a way)

