What's the slowest implementation of a Wiki? Years ago, at a company I once worked for, we had a competition for the slowest implementation of the AckermannFunction. I'd like to suggest a similar contest for a wiki. Note: this is not a BogoWikiContest? in that you're not permitted to make it run slowly on purpose; it should be a reasonably "natural" implementation for the language/environment chosen. So this is a call to implement a Wiki in some unlikely technologies...
(If I recall correctly, the slowest Ackermann was in a local equivalent of 'troff' - working out how to do recursion was the hardest bit...)
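For anyone who needs a reminder of why the AckermannFunction is such a natural fit for a slowness contest, the usual two-argument Ackermann-Peter definition is sketched below (Python here purely for illustration; the troff version is left to the imagination):

 def ackermann(m, n):
     # Ackermann-Peter function: it grows faster than any primitive recursive
     # function, so even tiny inputs take an enormous number of calls.
     if m == 0:
         return n + 1
     if n == 0:
         return ackermann(m - 1, 1)
     return ackermann(m - 1, ackermann(m - 1, n - 1))

 # ackermann(3, 3) is only 61, but ackermann(4, 2) already has
 # 19,729 decimal digits and is hopeless to compute this way.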
It seems to me that the slowest wiki, under the definitions given above, would be the one with the largest number of pages and links, or the greatest amount of "feature creep", or the one being hit by the most users at the same or nearly the same time.
Maybe. Assume (apart from feature creep, maybe) that these are the same for all implementations.
I'm working on one that would qualify, called ejwiki (for "experimental Java wiki"). It's a pet project to learn some new technologies. These include:
Implement a Wiki on a TuringMachine and let First Year ComputerScience students execute it using PenAndPaper. A fresh stack of DIN A4 paper, bought from the nearest stationery shop, can serve as an external database. Since students are sometimes considered cheap or unpaid labour for the university, I also submit this entry to the CheapestWikiContest. ;-) -- ChristianRenz
Hmmm... Here, only second-year students know what a Turing machine is. But your idea sounds promising. Anyone dare to write a TuringMachineCompiler??
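Not quite a TuringMachineCompiler, but an interpreter is only a dozen lines, and the students could step through the same transition table with PenAndPaper. A sketch (Python; the example machine below is made up and merely flips bits, the actual wiki rules table is left as an exercise):

 # Minimal Turing machine interpreter: a state, a sparse tape, and a
 # transition table mapping (state, symbol) -> (new state, write, move).
 def run(tape, rules, state="start", blank="_"):
     cells = dict(enumerate(tape))
     pos = 0
     while state != "halt":
         symbol = cells.get(pos, blank)
         state, write, move = rules[(state, symbol)]
         cells[pos] = write
         pos += 1 if move == "R" else -1
     return "".join(cells[i] for i in sorted(cells))

 # Toy example: flip every bit, then halt at the first blank cell.
 rules = {
     ("start", "0"): ("start", "1", "R"),
     ("start", "1"): ("start", "0", "R"),
     ("start", "_"): ("halt",  "_", "R"),
 }
 print(run("0110", rules))   # -> 1001_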
I think this one is approaching being a BogoWiki.
Well, this would mean running a wiki on a real computer instead of a digital one :)
Slowest wiki? Well, that would have to be one that uses Morse code. Here is how it would work (radio jargon in parens) ...
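For a feel of the bandwidth, here is the transmitting half sketched in Python (the Morse table is the standard one; the timing comment assumes a textbook 20-words-per-minute fist, and the operator who keys in the edit conflicts is an assumption too):

 # Encode a wiki page as Morse code. At 20 words per minute a dot lasts
 # about 60 ms, so even a short page takes a good fraction of an hour to "load".
 MORSE = {
     "A": ".-",   "B": "-...", "C": "-.-.", "D": "-..",  "E": ".",
     "F": "..-.", "G": "--.",  "H": "....", "I": "..",   "J": ".---",
     "K": "-.-",  "L": ".-..", "M": "--",   "N": "-.",   "O": "---",
     "P": ".--.", "Q": "--.-", "R": ".-.",  "S": "...",  "T": "-",
     "U": "..-",  "V": "...-", "W": ".--",  "X": "-..-", "Y": "-.--",
     "Z": "--..", "0": "-----", "1": ".----", "2": "..---", "3": "...--",
     "4": "....-", "5": ".....", "6": "-....", "7": "--...", "8": "---..",
     "9": "----.",
 }

 def to_morse(page_text):
     # Characters without a Morse equivalent (markup, punctuation) are dropped.
     words = page_text.upper().split()
     return " / ".join(" ".join(MORSE[c] for c in w if c in MORSE) for w in words)

 print(to_morse("Edit this page"))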
Oh, we can beat that. "BioWiki". Conserved sequences are the pages. Editing is great fun but very slow. -- GenesShmenes
Well, WikiPedia has been awfully slow for a while now (April '02). That's because the number of users has suddenly jumped dramatically.
-- jtnelson
Apologies to UseModWiki, but it's 3,000 lines of Perl 4-style Perl, and on my 180 MHz PPC, it took a good 10 seconds to load a page before I replaced it with something consisting of 75 lines. -- ScottWalters
A good old DOS .bat implementation should do. Anybody for using FOR %%1 loops and DIR commands to get pages? Hmmm, it could actually be turned into a functional CGI, I suppose...
Come to think of it, it would probably not be that slow ;) -- SvenNeumann
Seriously? I would love to use something like that on my 486. That's actually a possibility. It could use copy con and type. Maybe I'll program it... :-) -- JamesGecko?
----
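In the same DIR-and-TYPE spirit, a hedged sketch of about the smallest file-backed wiki CGI I can picture (Python instead of a .bat file; the pages/ directory, the .txt naming, and the complete absence of an edit form are all assumptions for illustration):

 #!/usr/bin/env python3
 # Read-only wiki CGI: every page is a plain text file in ./pages,
 # and the front page is just a directory listing, DIR-style.
 import html, os, urllib.parse

 PAGE_DIR = "pages"

 def main():
     query = urllib.parse.parse_qs(os.environ.get("QUERY_STRING", ""))
     page = query.get("page", [""])[0]
     print("Content-Type: text/html\n")            # CGI header plus blank line
     path = os.path.join(PAGE_DIR, page + ".txt")
     if page.isalnum() and os.path.isfile(path):   # crude guard against ../ tricks
         with open(path) as f:
             print("<h1>%s</h1><pre>%s</pre>" % (page, html.escape(f.read())))
     else:
         names = sorted(n[:-4] for n in os.listdir(PAGE_DIR) if n.endswith(".txt"))
         links = "".join("<li><a href='?page=%s'>%s</a></li>" % (n, n) for n in names)
         print("<h1>Index</h1><ul>%s</ul>" % links)

 if __name__ == "__main__":
     main()

It would probably disqualify itself from the contest by being too fast.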
Anyone going to implement one using smoke signals?
How about extending the Unlambda language to allow multiple streams and writing a miniature HTTP server in it?
GraffitiWiki: Each link displays a visible instruction as to where to find the target "page", which could be anywhere in the world. Of course it's not very reliable, due to constant architectural changes, overwriting by taggers, and erasure by anti-graffiti efforts...