There Is No Stasis

In the natural world (see WhatIsNatural), only ProCesses exist; mere existence is the process of transitioning from having become to being no more. Metaphysically, some entity might exist that is not in the process of interacting with the cosmos, but such an entity would be unknown to the cosmos and, in a very real and literal sense, would not actually exist (though it might "re-exist" if it began to interact with the cosmos). Similarly, in human systems of all sorts, including computer systems, the system is always in operation, never at rest. Of course, systems may seem to be "doing nothing", and that may be "correct" or "incorrect" functioning, but if the system still exists, it is actively involved in the process of continuing to exist. This does not mean that "existence" is a process by definition, for that would be a circular argument; it means that a system requires ongoing interaction with its context if it is to continue to exist.

Therefore: modelling the real world in terms of states and objects, rather than processes, is an unnatural simplification that is liable to result in defective implementation.

Oh, come off it. If you're a functional languages fan, just say so. Your conclusion doesn't follow from your arguments above, though.

I'm not sure whether I'm a fan, since I haven't tried any yet. I prefer to define all functions in an equivalent fashion: typically InputProcessOutput? (or take this, do your stuff, and put the results there), but this is just a habit I picked up from Mrs Beeton (AnalogiesFromCookery). For software, as an antidote to HIPO, I conceived of a style called ProduceUsing. This seems to work very well for large systems and low-level operations -- wherever the rationale is dominant (you're very clear about what you're trying to do). It rates output (what you produce) above input (what you use to produce the output), and black-boxes the process (how you produce what you produce using what you use) as an implementation detail.
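(A sketch of what a ProduceUsing signature might look like, in Python; the report-producing function here is invented for illustration, not taken from any published ProduceUsing material. The name states what is produced, the parameters state what is used, and the how stays boxed inside.)

 def monthly_report(transactions, template):
     # Produce: a formatted monthly report -- the output leads.
     # Using: raw transactions and a display template.
     # How: an implementation detail, hidden inside this function.
     total = sum(t['amount'] for t in transactions)
     return template.format(count=len(transactions), total=total)

 print(monthly_report([{'amount': 5}, {'amount': 7}],
                      '{count} transactions totalling {total}'))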


"This does not mean that "existence" is a process by definition". Actually, it does; every language in the world has a mode of expressing existence as a verb. Certainly in English "be" is a verb.

[Verb does not necessarily imply process; "verb" is merely a syntactic classification that correlates strongly with process.]

[Actually I'm asserting that there is no split between process and state because there are no states (ThereIsNoStasis). In linguistics, I would argue that there is only a formal distinction between nouns and verbs, which is why "action" and "process" are nouns rather than verbs. I object to the suggestion that the existence of a verb implies that the referent is a process by definition. In short: existence is not a process, existence requires a process.]

In the philosophy of science, also, it has often been claimed that existence depends on the rest of the universe; Mach and Einstein, for instance, each discussed some famous aspects of this, and "everything is a process" is a common way of describing phenomena in subatomic physics.

[Agreed]

However, what's the point of all this? In software we model the salient aspects of a domain. It is frequently salient to model at-rest states, and arguments from philosophy, linguistics, and physics have nothing whatsoever to do with the effectiveness of that approach. Software is not about the search for absolute truth, it is about making things work.

[As long as we agree that the state/transition dichotomy is a convenient fiction, we simply need to remember that in the real world we require a process to implement the pseudo at-rest "state". In other words, we can defer considering the process of maintaining the state as easily as we can defer thinking about any other sub-process; we cannot simply forget it, however. I make no argument from philosophy, linguistics, or physics, however; I simply observe that computer systems are not different in this regard from any other kind of system.]

[That would be: '...we simply need to remember that in the real world we require a process to implement the pseudo at-rest "state"...we cannot simply forget it...']

I also note that some people claim philosophy is the search for absolute truth; to the extent that that is true, philosophy largely fails at those goals, while much-maligned software largely succeeds at its respective goals.

[I certainly would not agree with people who make such a claim about philosophy. Understanding why things work and why, sometimes, they do not is a legitimate concern of software developers, however. And relating software theory to wider systems theory could be a fruitful pursuit.]

The 'everything is a process' view of the natural world is intimately tied to the illusion of flowing time. If you look at the universe as a single indivisible block, you lose the notion of process, and you gain the same sort of interrelationships between objects that you find between mathematical objects within a mathematical theory. The important point of 'everything is process' is that everything is interrelated, but this is only a narrow conception of a much broader vision.

Agreed.


Again: what's the point of this page?

[Like any Wiki page, this page is a process. Perhaps the results of this process will become more apparent as the process interacts with the context within which it operates.]

That's the kind of answer that tends to get pages deleted, FYI. It pretends to be deep, but isn't, and by convention, pages are supposed to be of value here at the time of creation.

[The same applies to comments :) The thing is, if I understood what conclusions should be drawn from these observations, I would happily draw them. Perhaps it doesn't matter if software's foundations are fiction rather than reality, but it feels to me that it should. Now I could rampage over Wiki sowing doubts here and there wherever the problems discussed seem as though they might relate to this fundamental mismatch, but I disapprove of such behavior. Instead, I prefer to express the objection OnceAndOnceOnly, casting a pebble into the pond and watching the ripples, as the consequences become apparent (or not).]


I seem to draw the opposite conclusion to the proposal at the top of the page. Focus on processes seems to reinforce the concept of objects, not diminish it. An object de-emphasizes the data, the entity, and exposes the processes that are performed. It is in procedural decomposition that I must worry about defining data structures and passing them around independently of the processes that may act upon them.

Perhaps I am missing the intended point, but I see an object as consisting of a set of processes unified by the conceptual data element the processes operate upon.

Maybe. Maybe the point is, that ThereIsNoObject (in the sense of ThereIsNoSpoon), only processes giving the impression of there being an object.


But conceptually, in a computer, ThereIsStasis?. Sure, from the physical perspective, a CPU that executes an instruction is in the process of moving to the next one. But from the programmer's view, which is arguably what matters in programming, there is exactly one point in time at which the CPU executes exactly one particular instruction, keeping exactly one particular set of values in its registers. The same applies to memory and all the other hardware you might access. So in the context of programming, this whole discussion is a bit weird. (Everything becomes a bit more complicated with multiprocessing, of course, but the point isn't invalidated by it, either.) -- MatthiasBenkard?

These things are simply artifacts of the VonNeumannArchitecture. Take ComputationAsSignalProcessing seriously and there appear to be viable alternatives. ProCess and distinction (WhatsaDistinction) lead to a dynamic worldview and technology view in which system behaviours can be thought of as fractals. And before you ask, no, I can't point you at an implementation of same. BraitenbergVehicles, perhaps, point the way. --Pete.


I see people raising questions like 'yes, but what does this *mean*?'

On a site like this, such lack of understanding disappoints me, because I think this has some very far-reaching implications for computer science, implications we have barely begun to explore that could help tame the complexity and interoperability problems in ways where ObjectOrientedProgramming (with its focus on 'thingness' and 'state') fails.

I think one of the most important implications of 'everything is a process' is that we should seriously look at modelling data structures literally *as processes* in the computing sense - as something like concurrently executing functions, a la the ActorModel. We need to bake parallelism into our programming *and* our 'data representation' in a way which current ObjectOrientedProgramming languages don't allow. We need to get rid of thoughtless serialism, and allow every data structure to be its own thread or process.
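(To make that concrete: a minimal sketch in Python of a data structure that is literally its own process, in the spirit of the ActorModel; the cell function and its message vocabulary are invented for this example, and a language like Erlang would express it more directly. The 'state' never sits at rest anywhere a client can touch; it exists only inside a running loop and is observed by exchanging messages with it.)

 import queue
 import threading

 def cell(initial):
     # A mutable 'value' realised as a process: a thread that loops
     # forever, holding its state in a local variable and answering
     # get/set messages from its inbox.
     inbox = queue.Queue()

     def run(value):
         while True:
             op, payload, reply = inbox.get()
             if op == 'set':
                 value = payload
             elif op == 'get':
                 reply.put(value)
             elif op == 'stop':
                 return

     threading.Thread(target=run, args=(initial,), daemon=True).start()
     return inbox

 # Usage: even reading the value is an interaction with a live process.
 c = cell(0)
 c.put(('set', 42, None))
 answer = queue.Queue()
 c.put(('get', None, answer))
 print(answer.get())  # 42
 c.put(('stop', None, None))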

If the idea of 'everything is a process, even data' boggles your mind, good.

We need to understand that there is no difference between 'programming' (state transition) and 'data representation', and take seriously languages (such as Lisp) which make it easy to intermix the two. Languages such as C or Java which enforce a rigid distinction between 'code' and 'data' make this very difficult. We need to grow beyond them.
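(A small sketch of the code/data blend, in Python rather than Lisp, since Lisp's homoiconicity makes the point almost too quickly; everything here is invented for illustration. Behaviour lives in the same structure as the values it operates on, and can be inspected, invoked, or replaced at run time like any other datum.)

 # Behaviour and representation in one structure: the 'program' is data.
 account = {
     'balance': 100,
     'deposit': lambda state, n: {**state, 'balance': state['balance'] + n},
 }

 account = account['deposit'](account, 50)   # run a piece of the data
 account['withdraw'] = lambda state, n: {**state, 'balance': state['balance'] - n}
 account = account['withdraw'](account, 30)  # new behaviour, added at run time
 print(account['balance'])                   # 120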

Another way of saying this is to say that, in a computer system seen from a suitable perspective (such as the Internet, a whole system), there is never any such thing as 'constant' data. There is only data *which is in the process of being modified*. You might think that 'program code' or 'firmware' is constant, but it is not - it changes over time, just as program data does, as patches are applied or as new features are added. Constancy is relative.

Remember that no computer system exists alone - it floats in a sea of input/output and in a sea of software specs and code changes, all of which are cybernetic systems themselves. Current programming practices still enforce a model in which the 'program' - even if it's composed of objects - is seen as a standalone, finished thing. It isn't. It's a tiny component in a wider system, and it's just as much in flux as any of its component variables or processes.

Do our programming languages and toolchains currently reflect the self-similarity we see in natural, organic systems between large and small? Given that very visible software architecture movements currently draw such artificial distinctions as 'programming in the large' vs 'programming in the small', and claim that we should use very different languages and techniques for each one, no, I don't think they do. I think we've very much failed to learn a basic lesson of how the universe is organised, and we won't advance until we do.

To me, these are all very significant findings from ThereIsNoStasis -- NateCull?


For processes see http:wiki?search=process

For definitions see NoProcess, ProcessAndaThing

See also YouCantUnderstandaProcessByStoppingIt, TimeSynchronousProcessing

For wiki related see WikiNow, WikiTime, FunnyWikiProcesses

CategoryTime

