Computation As Signal Processing

All physically realizable computing machines are signal processors, which Turing called C-Machines. None of the TuringMachine limits apply to C-Machines; they apply to A-Machines, which are not physically realizable.



So what computability limits are there on C-Machines?

There are none known beyond the information-theoretic limits of physical scale and energetics. Combinatorial complexity represents the dominant scale limitation. The suggestion on GeneralHaltingProblemProblemProblem is that two exformal operations are necessary to dispense with combinatorial complexity: first, the bits represented by your computer must represent the most significant empirical distinctions in the signal under interpretation; second, the remaining state required for your computation must be represented by the source of the signal itself, not by your computer. If you take that suggestion seriously, then there may be a way to go about such operations. SaulAmarel's CannibalsAndMissionaries provides an example, but no general approach to such reformulation has been published.
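
As a rough illustration of the first operation (letting the bits represent only the most significant empirical distinctions), here is a small sketch in Python of the CannibalsAndMissionaries state space. It is not SaulAmarel's own formulation, just one way to see how little state remains once only the significant distinctions are kept:

  # A sketch (not Amarel's formulation): the whole problem state fits in three
  # small numbers -- missionaries on the left bank, cannibals on the left bank,
  # and which bank the boat is on.
  from collections import deque

  def safe(m, c):
      # Missionaries are never outnumbered on either bank (or are absent there).
      return (m == 0 or m >= c) and (3 - m == 0 or 3 - m >= 3 - c)

  def solve():
      start, goal = (3, 3, 1), (0, 0, 0)            # (m_left, c_left, boat_on_left)
      moves = [(1, 0), (2, 0), (0, 1), (0, 2), (1, 1)]
      seen, queue = {start: None}, deque([start])
      while queue:
          m, c, b = state = queue.popleft()
          if state == goal:
              path = []
              while state:                           # walk back through predecessors
                  path.append(state)
                  state = seen[state]
              return path[::-1]
          for dm, dc in moves:
              sign = -1 if b else 1                  # the boat carries people across
              nxt = (m + sign * dm, c + sign * dc, 1 - b)
              if 0 <= nxt[0] <= 3 and 0 <= nxt[1] <= 3 and safe(nxt[0], nxt[1]) and nxt not in seen:
                  seen[nxt] = state
                  queue.append(nxt)

  print(solve())   # the familiar eleven-crossing solution; the reachable state space is tiny

The remaining state (who these particular missionaries and cannibals are, the width of the river, and so on) stays out in the world; the program never represents it.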

For the record, Pete wrote the above, not me; the way the page is laid out currently, it looks as if I'd written the whole thing, but no. -- DougMerritt

I did do some cutting of words out and pasting them into sentences. Please feel free to ReFactor, Doug!

You miss the point. The primary paragraphs above were authored by you, the primary paragraphs below were authored by me, that's all, and I wouldn't want someone to think that your claims were made by me, since I disagree with them. Thus my disclaimer.

I'm interested in working our dialog into a consensus.

What are the physical limits?

There are a lot of fairly old results in general relativity in this direction; the information content in a boundary is what is considered to be responsible for Unruh-Hawking radiation of accelerated objects, a special case of which is Hawking's celebrated result about radiation from the "surface" of black holes. A related old result is that black holes contain information only in proportion to their surface area.
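
For reference, the proportionality in that last result is the Bekenstein-Hawking entropy; in its standard form (LaTeX notation, nothing beyond the textbook formula):

  S_{BH} = \frac{k_B c^3}{4 G \hbar} A = \frac{k_B}{4 \ell_P^2} A

i.e. roughly a quarter of a nat of entropy per Planck area of horizon.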

Even those old results are not absolutely proven.

Newer topics in this direction: 't Hooft-Susskind holographic hypothesis from circa 1991, and Bousso's improvement on it circa 1999. There was a Scientific American article that talked about this, but I don't remember when.

My buddy John Baez discusses the topic briefly in http://math.ucr.edu/home/baez/week57.html

The bottom line is that many people find that Statistical Mechanics makes it, let's say, extremely reasonable to assume that the total maximum information in a volume of space is proportional to the surface area of that volume, but that it's only reasonable and workable thus far, not absolutely proven.

BTW when reading about this, note that entropy and information are the same thing; two sides of the same coin. Formally and literally, not metaphorically. -- DougMerritt
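
To spell that identity out (standard definitions only): Shannon's entropy and the Gibbs entropy of statistical mechanics have the same form,

  H = -\sum_i p_i \log_2 p_i \quad \text{(bits)}, \qquad S = -k_B \sum_i p_i \ln p_i,

and they differ only by the base of the logarithm and the constant k_B \ln 2 that converts bits into thermodynamic units.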

Oops! Forgot the important part: on the other hand, it is true that the amount of information concerning a volume of space available to an external observer is limited by and proportional to the surface area of that volume. Let's say hypothetically that a volume of space actually contains an infinite amount of information. An external observer will never be able to ascertain that; they will always observe only a finite amount of information transmitted through the boundary. This follows from an equivalence relation: any given isoclinic boundary is consistent with an infinite set of possible contents, all of which are identical under that equivalence relation as seen from outside.

So the result that I stated before may or may not be literally true, but it ends up being true for all pragmatic intents and purposes. -- DougMerritt

Can't fractal boundaries encode as much information as you like? There may be energetic impracticalities in sensing it ...

Good question, that goes right to the heart of it. The question is how much fractal detail can be seen from a distance, in essence. If observation is done via analytic functions (static fields, including electrostatic), then you can draw isoclines around the object at greater and greater distances, and the isoclines get smoother and smoother the further away you get, losing more and more detail -- and already have some smoothness (no discontinuities) at any finite distance at all. Well, that could be nitpicked. Let's say at most a finite number of discontinuities, which still implies finite information.
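
A quick numerical way to see that smoothing (a toy sketch in arbitrary units, not a derivation): sample the static potential of a messy positive charge distribution on spheres of increasing radius, and watch the angular variation (the wiggliness of the isoclines) die off roughly as 1/r.

  # Toy demonstration: the potential of 20 random positive charges, sampled on
  # spheres of increasing radius; the relative angular variation shrinks ~ 1/r.
  import numpy as np

  rng = np.random.default_rng(0)
  charges = rng.uniform(0.1, 1.0, size=20)           # positive, so the monopole term dominates
  positions = rng.uniform(-1.0, 1.0, size=(20, 3))   # scattered inside a small box

  def potential(point):
      # Coulomb potential (arbitrary units) summed over all charges.
      return np.sum(charges / np.linalg.norm(point - positions, axis=1))

  for r in (3, 10, 30, 100):
      dirs = rng.normal(size=(2000, 3))
      dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # random directions on the sphere
      v = np.array([potential(r * d) for d in dirs])
      print(f"r = {r:3d}   relative angular variation ~ {(v.max() - v.min()) / v.mean():.4f}")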

Any given isocline boundary, of any shape whatsoever, can be created by any one of an infinite number of different objects within the boundary. You can visualize this just by staring at the classic isoclines around the MandelbrotSet, or more prosaically, by visualizing wrapping a highly detailed sculpture in thicker and thicker layers of foam rubber.

A similar problem comes up if observation is done via non-analytic functions, such as dynamic fields with compact support (the ability to have discontinuities over time and space), e.g. physically realistic light as opposed to Newtonian geometric optics. Here similar results follow from things like the fact that a finite observing aperture can only capture finite information -- a basic theorem of Fourier optics. -- DougMerritt
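
That last point can also be checked numerically in a crude one-dimensional, scalar model (a sketch of the idea, not real Fourier optics): band-limit a detailed "object" to the spatial frequencies a finite aperture can pass and count what survives.

  # Toy model: a finite aperture acts as a low-pass filter, so only about
  # 2*B*L + 1 independent numbers make it to the observer, however detailed
  # the original object was.
  import numpy as np

  n, L = 4096, 1.0                        # samples and field width (arbitrary units)
  rng = np.random.default_rng(1)
  obj = rng.normal(size=n)                # an "object" with detail at every scale

  B = 40.0                                # aperture-limited spatial-frequency cutoff
  f = np.fft.fftfreq(n, d=L / n)
  spectrum = np.fft.fft(obj)
  spectrum[np.abs(f) > B] = 0.0           # frequencies the aperture cannot pass are lost
  image = np.fft.ifft(spectrum).real      # the smoothed field the observer actually receives

  kept = np.count_nonzero(np.abs(f) <= B)
  print(f"independent numbers surviving the aperture: {kept} of {n}")   # 81 of 4096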


You know, and feel free to move/delete, this idea that entropy and information are equivalent seems bogus. As hinted in the above point, just because there is a combinatorial limit to the number of states in a closed system doesn't necessarily mean there is a limit to the amount of information that may be interpreted from those states. That must depend on the observer, i.e. on some external structure. If it's an open system, then the information content may be resolution constrained, but not semantically constrained. Hence, I think, the problem people have reconciling order with 'low information' and randomness with 'high information'. Just a thought :).

Ah yes, the orthodoxy, my sworn enemies. The channel capacity relates to the number of distinct states, which Shannon called Entropy (see http://www.lecb.ncifcrf.gov/~toms/information.is.not.uncertainty.html).

But information is not necessarily equivalent to data, since information presumably has a semantic interpretive context; a single bit can supply an enormous amount of information. In an open system this is highly significant.

Equally, my mumblings here may be highly entropic but carry very little information. -- RichardHenderson (for it is he).

Okay. Words are bendy things. I can live with that, though "data" may be a more accurate term. More orthodox too in some circles. Perhaps even more meaningful. I think the difficulties of semantics are to do with cultural barriers and implicit assumptions. It is possible to be quite simple about these things if you do not ascribe ego to the interpreter but simply identity. But I digress... as ever... The key point is one of separation. Since state exists outside the computational boundary but is accessible to it, the channel capacity becomes essentially unlimited over a sufficient period. If the channel includes 'program', then the machine itself becomes unconstrained up to the limit of its 'power', i.e. the degree to which it can combine elements simultaneously... which wanders back to your discussion above, in a way.

Hmm. This seems like the long way round (maybe the long way round has to be exhibited for the short way round to become visible to those of us less smart). So, what about SignalProcessingAsComputation?


Regarding information as entropy, here are some concrete examples from computer programming:
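
(The examples apparently never made it onto the page; as a stand-in, here is one small illustration in Python, not whatever was originally intended: the measured Shannon entropy of a few byte streams, which is exactly the "order looks like low information, randomness looks like high information" effect discussed above.)

  # Estimate the Shannon entropy (bits per byte) of a few byte strings,
  # and compare with how well a general-purpose compressor does on them.
  import math, os, zlib
  from collections import Counter

  def entropy_bits_per_byte(data):
      n = len(data)
      return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

  samples = {
      "all zeros   ": bytes(100_000),
      "english-ish ": b"the quick brown fox jumps over the lazy dog " * 2500,
      "random bytes": os.urandom(100_000),
  }
  for name, data in samples.items():
      h = entropy_bits_per_byte(data)
      z = len(zlib.compress(data)) / len(data)
      print(f"{name}  entropy ~ {h:4.2f} bits/byte   compressed to {z:6.1%} of original")

The repeated English phrase scores a few bits per byte by symbol counting yet compresses to almost nothing, which echoes RichardHenderson's data-versus-information point above: how much "information" you measure depends on what structure the observer is prepared to model.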

