Mark Janssen

Python programmer designing self-generating, self-organizing systems of information. I am an authorized messenger for the system made by Mar|cos while being one with same. Relationship of Wiki to Cosmology? Grokked. Hit me up. I have mastery of human health and am a HolisticDoctor?. As the page said, a good HD can diagnose and cure nearly anything without invasive procedures.

Member and leader of LambdaLambdaLambda? (or "tri-Lams"). FreeSoftware advocate. Graduated with ComputerEngineering from Pacific Lutheran University. Email dreamingforward(at)gmail(dot)com

Without the female I can only be dying or hate. It's a yin/yang thing.

My curated content follows. Follow the original links for discussion.




Noting the irresolvable debates (ThreadMess) and HolyWars revolving around Objects and terminology AND noticing that the profession of ComputerScience has been fractured (since the 60s) within itself and away from the programmer community, I declare that there's been a half-century of deferred maintenance. The whole field needs to be re-centered.

After much analysis, I identify two major groups, heretofore un- and under-differentiated. They were both working under the common grouping of "Computer Science" but they work in two separate domains. Up until now, they have been unconsciously sharing a common lexicon, but this has been a major source of the confusion and "everyone has their own opinion" is not a sufficient excuse for an academic milieu. Language needs to be clarified so that this boundary can be clearer and the field stronger. Once this is accomplished, it will resolve all the confusion between "Objects", "Types", "Classes", "Languages", etc. Can we agree that that is a good thing?

There are two major camps in Computer Science, separated by very different ModelsOfComputation. The idea that one can be expressed by the other (along the lines of the ChurchTuringThesis) actually misinforms the intuition. In fact, this analysis probably renders the *thesis* itself invalid.

In the first camp, which I'll call the OldSchoolComputerScience?, we have what is nominally called "mathematics", but which historically originates *strictly* from philosophy, specifically SymbolicLogic. Here we find LambdaCalculus and the desire for functional compaction. As such, a defining feature here is recursion. Arithmetic is not a defining feature, nor is there any metaphor of physical computation (the laws of physics do not apply, only the "logikos").

The second major camp, I'll call, for the moment, just ComputerScience. It originates from the computation that evolved mostly out of 20th-century military history and is rooted in binary (or BooleanLogic) and TuringMachines (generally the VonNeumannArchitecture). This computation is rooted in friggin *physical reality*, not predicate calculus; i.e. the computational hardware has to respect the laws of physics. It even has a simile (a physical tape) rooted in physical reality. I argue that it is here, in the TuringMachine, that the field needs to be re-centered. Throw out symbolic logic completely, except as a cross-disciplinary interest and historical curiosity. Punt it back to Philosophy. Not because it's not useful, but because it confuses the thinking, like forgetting the i in a complex equation.

The first camp worries about provable code, the second camp does not -- it has I/O to see whether programs work or not. (The former camp hardly ever works with I/O except at the very edge of its computations.)


I'm proposing the concept of the Kolmogorov Quotient as a calculable number that measures how much a high-level language reduces complexity; that is, the expressivity of a programming language. This idea of "simplification" is a factor of text-wise reduction (fewer characters needed to express a complex concept, à la AlgorithmicInformationTheory) and some other, less easily quantified, concept of maintainability. Fleshing out this latter concept, it clearly has to do with how easily one can establish programmer consensus for the given task.

It is a Quotient so that higher Kolmogorov numbers for a given language denote a reduction in the complexity of solving the problem in the given language.

Once the basic premise/methodology above is agreed to, any specific implementation differs only by a constant factor. (As long as the implementation is the same across all measurements, the numbers should be valid and comparable.) It could go something like this: pick a language "close to the machine", like C or Assembly, and measure the number of bytes of machine code it takes to implement a standard "suite of common programming tasks"*. Then code the exact same functionality in the language you want to measure (without using external libraries) and count the number of bytes of source code.

KQuotient = base_language_count / test_language_count.

I'd guess Python or Ruby has the highest Kolmogorov Quotient.
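A minimal sketch of the measurement in Python, assuming the byte counts for the task suite have already been gathered (the figures below are made up purely for illustration):

```python
def kquotient(base_language_bytes, test_language_bytes):
    """Kolmogorov Quotient: bytes needed in the base ("close to the
    machine") language divided by bytes needed in the test language
    for the same suite of tasks. Higher values mean the test language
    expresses the same functionality more compactly."""
    return base_language_bytes / test_language_bytes

# Hypothetical figures: 12,000 bytes of machine code for the task
# suite versus 3,000 bytes of Python source for the same behavior.
print(kquotient(12_000, 3_000))  # -> 4.0
```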


This is a technique for internet listings with voting (like Digg) to prevent positive feedback loops that raise popular items forever above the rest. It's based on evolutionary theory. The idea: for every contribution, tally a count of all votes; each contribution is then weighted against this total, and a weighted throw is made to pick a given item.


The probability of being shown first becomes, respectively:

for a total of: 1.0

Bingo! Problem solved! No-one gets left out, and no positive reinforcement loops!
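A minimal sketch of the weighted throw in Python (the item names and vote tallies here are invented for illustration):

```python
import random

def weighted_pick(tallies):
    """tallies: dict mapping each contribution to its vote count.
    Every item's chance of being shown is its share of the total
    votes, so popular items surface more often but nothing is ever
    shut out entirely -- no runaway positive feedback loop."""
    total = sum(tallies.values())
    items = list(tallies)
    weights = [tallies[item] / total for item in items]
    return random.choices(items, weights=weights, k=1)[0]

votes = {"top_story": 50, "runner_up": 30, "long_tail": 20}
# top_story wins roughly half the throws; long_tail still shows ~20%.
print(weighted_pick(votes))
```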


For the wiki to grow and stay healthy, it may be interesting to note that all four elements from Alchemy must be present.

When all four are present, the wiki can turn into a "TreeOfLife".


Still in process....


Welcome to Wiki, Mark!

Wow! Thanks user from Class A IP user number 4!

Note taken from PangaiaProject:

Hi Mark, regarding ObjectOrientedRefactored: I HaveThisPattern too. I also think that the promise of ObjectOriented was never fulfilled. And I also think that is due to some missing ingredient. Fields and methods in classes are not enough. Some layer(ing) is missing. Some say that this can be remedied by libraries, but I disagree. Those will never make an object reusable. But I also think that BlowUpTheWorld? with a grand new thing will work. Your vision may be like mine, but if the vision is shared too late or too little your explanation will depend too heavily on your own PrivateLanguage of it (I can see that from your terminology use, e.g. 'fraktal'). One part that I think is missing is the ability to let one object have multiple classes. And I don't mean implement multiple classes but that a) the class of an object can change without the identity (!) of the object to change and b) these classes can be added later and independently (for composition and extension). See details in ClassIsomorphisms. What do you think? -- GunnarZarncke

I like it. Yes, fields and methods are insufficient and make for a fenced forest of objects that become monolithic monuments to their Creator. Your latter point alludes to a way to define composition, but doing so non-arbitrarily requires something else: something that defines how the individual relates to "the group". Once that is set, you're golden.

The other issue is that Objects ultimately exist alongside other objects. This implies a (need for a) message-passing "ecosystem", but there's no unified way yet for objects to pass data amongst themselves. That's my analysis. Objects engage in computation and in the process should have a need/desire to communicate with other objects. A common syntax, like C++'s ">>" and "<<", could pass simple (i.e. atomic, or a series of atomic) data in and out, making data exchange universal among all objects. In addition, a simple way to query state, "?myObject", would return some state datum. Building these into the language would encourage an organic growth of loosely-coupled, modular components, much as Unix pipes did for simple OS commands, and could reinvent whole language libraries.
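One way such a universal convention might look, sketched in Python; the mixin name and the query method are invented for illustration, not any existing library:

```python
class Streamable:
    """Hypothetical mixin giving every object the same in/out syntax."""
    def __init__(self):
        self._data = []

    def __lshift__(self, datum):      # obj << datum: push atomic data in
        self._data.append(datum)
        return self                   # returning self allows chaining

    def __rshift__(self, other):      # obj >> other: pass data along
        for datum in self._data:
            other << datum
        self._data.clear()
        return other

    def query(self):                  # stand-in for a "?myObject" syntax
        return {"pending": len(self._data)}

a, b = Streamable(), Streamable()
a << 1 << 2                           # chained input, like C++ streams
a >> b                                # a's data flows into b, pipe-style
print(b.query())  # -> {'pending': 2}
```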

That you shun the mathematical and theoretical branches of ComputerScience and consider yourself a pure pragmatist (oxymoron intentional) is fine, but please don't crap on pages like LambdaCalculus.

I didn't cr*p on that page, sir. I pointed something out that confuses many who come from the TuringMachine ModelOfComputation, creating unnecessary ThreadMess and HolyWars. To let it go unnoticed and unmentioned would be borderline unethical. Further, you show a bias by implying that only LambdaCalculus is theoretical ComputerScience. I simply come from a different camp, founded on the abstract theoretical realm of digital logic; it is no less important, and perhaps no more important, but the field has been conflating the two.

Your edits had a clearly deprecatory tone, which is entirely unnecessarily divisive and does nothing to bridge any gap between the "two camps" of ComputerScience -- assuming they exist, and I'm not convinced they do.

Please do not confuse friendly debate over minutiae here, or the early misunderstandings of beginning programmers, with a genuine divide. I'm guessing you are convinced that object oriented approaches are the best way to a DataEcosystem and functional approaches present an obstacle to that, but they do not. Functional approaches are equally valid. In particular, approaches based on the RelationalModel probably show more promise than OO for creating a data exchange utopia.

While I appreciate your words and knowledge, I think you haven't realized that I'm typing these words on a computer, a piece of physical hardware. And, like nearly every piece of hardware in this world, it does not have an architecture that does recursion efficiently. I will always consider LambdaTheUltimate, but I must remain in the world of TuringMachines.

Recursion can often be optimised away into pure iteration (see TailCallOptimization), in many cases its efficiency is not an issue (it's probable that the OperatingSystem you're using, regardless what it is, employs some recursive algorithms), and FunctionalProgramming is not synonymous with recursion. By the way, the "world of TuringMachines" defines an infinite tape; it's no more "real" than LambdaCalculus and is computationally equivalent. Real machines inevitably impose limits on theoretical possibilities. In real architectures, for example, the overhead for inheritance-based polymorphism is isomorphic to the overhead for recursion, but that's not (in and of itself) a reason to deprecate ObjectOrientedProgramming.
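To illustrate the point about TailCallOptimization, here is a tail-recursive sum alongside its mechanical rewrite as a loop (a sketch only; note that the rewrite is done by hand here, since CPython itself does not perform this optimization):

```python
def sum_rec(xs, acc=0):
    # Tail-recursive: the recursive call is the final action,
    # so nothing is left pending on the call stack.
    if not xs:
        return acc
    return sum_rec(xs[1:], acc + xs[0])

def sum_iter(xs, acc=0):
    # The same computation with the tail call turned into iteration,
    # which is what TailCallOptimization does in languages that have it.
    while xs:
        acc, xs = acc + xs[0], xs[1:]
    return acc

print(sum_rec([1, 2, 3]), sum_iter([1, 2, 3]))  # -> 6 6
```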

I'm familiar with TailCallOptimization. I don't think it's all that probable that Linux, for example, uses "recursive algorithms". FunctionalProgramming isn't strictly synonymous with recursion, so I was a bit sloppy, yes. Well, when I used the term "the world" I was referring to a physical one which, as you're pointing out, can only approximate an infinite tape (but yet it does, doesn't it?). What you haven't understood is the different ModelsOfComputation, which make them necessarily distinct.

The Linux kernel uses recursion. The different ModelsOfComputation must be approximated on real hardware, but in practice this is not a problem, regardless of computational model. There is nothing inherent in the choice of underlying model that makes a C++ superior to a Haskell, or vice versa. The merits of programming languages -- influenced or not by particular computational models -- lie in how they let us express solutions to problems, not in how they relate to the underlying machine. Optimization techniques are sufficiently sophisticated that for most purposes, the compiler-generated code (or interpreter run-time) that runs on the machine simply isn't a concern.

By the way, isn't it a bit disingenuous to sanitise "cr*p", but not "shit" in "[a]re you just making shit up?", on KolmogorovQuotient?

It must have been a WikiGnome -- I didn't sanitise "crap".

Probably GrammarVandal did -- I note he's done some editing under your UserName.

Yeah! What's up with that?

It's the only way he gets to edit these days. Sad, really.


EditText of this page (last edited November 10, 2014) or FindPage with title or text search