If wiki was created as the UltimateTestForJointOwnership, then we need to know how it's doing.
Some of the more basic metrics for success/failure might be:
1. the number of people who have looked at a wiki page once
2. the number who came back
3. the number that have authored
4. the average number of authors per page
5. the maximum number of authors per page
6. the rate of edits/page creations attributed to RichardDrake :-) (IP addresses would do)
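As a thought experiment, metrics 3-5 could be computed from an edit log of (page, author) records. The log format and sample entries below are hypothetical; a real wiki would have to derive them from its change history or server logs, with IP addresses standing in for authors as suggested above.

```python
from collections import defaultdict

# Hypothetical edit log: (page, author) pairs. In practice these might
# come from RecentChanges history or server logs; IP addresses would do.
edits = [
    ("FrontPage", "1.2.3.4"),
    ("FrontPage", "5.6.7.8"),
    ("FrontPage", "1.2.3.4"),
    ("WikiWikiWeb", "5.6.7.8"),
]

# Distinct authors seen on each page.
authors_per_page = defaultdict(set)
for page, author in edits:
    authors_per_page[page].add(author)

# Metric 3: number of distinct people who have authored anything.
num_authors = len({author for _, author in edits})

# Metric 4: average number of authors per page.
avg_authors = sum(len(s) for s in authors_per_page.values()) / len(authors_per_page)

# Metric 5: maximum number of authors on any one page.
max_authors = max(len(s) for s in authors_per_page.values())

print(num_authors, avg_authors, max_authors)  # → 2 1.5 2
```

Even this crude sketch illustrates the point below: measuring badly beats not measuring at all.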
There are surely better metrics than these - but TomGilb taught me long ago that measuring badly is far better than not measuring at all.
We would also want to list possible outcomes of the measurements and list the conclusions that we would make from each before we perform the test.
Or you could consider wiki to be a work of art. As such it would be enough to have visitors question their assumptions about ownership after having experienced the site.
Now here's something else great about Wiki. It has taken me a year to think about these last two responses and now I feel that I have something to say back that may be valuable. (Whether it's the year's gap or the value now that you think is most important will no doubt depend on your point of view...)
I take these to be two responses from two different people (and that's also cool ... maybe one broad-minded AnonymousHero has given me the freedom to deal with his two points this way).
Now what was it?
The first point is that of the well trained (not to say pedantic) statistician. You must know what you are measuring for and the conclusions you would draw from this or that or the other before you choose your tests and those you must not change. Vital for real statistics and indeed simulation of trading strategies for financial markets, as I know to my cost. But quite different from my purpose in suggesting the metrics above last February ... which was quite simply to learn interesting stuff about joint authorship (or anything else interesting about Wiki come to that).
This touches deeply on how NickSimons and I combine TomGilb's pioneering emphasis on EvolutionaryDelivery with the (still much less appreciated, even on Wiki) GilbMeasurabilityPrinciple. Even if you and the customers don't know the right targets for metrics or the conclusions you will draw from the numbers you find, just start to measure stuff as soon as possible (initially just as a thought experiment, but as soon as possible using real life and the early versions of the real system). Then the benefits of the evolutionary process take over, with real users hanging around the development and saying "hey, that's interesting, I would never have predicted that" at some result or other, or "well, if that's the number, we now know from experience that's not good enough ...".
So now let's think about the second response:
Or you could consider wiki to be a work of art. As such it would be enough to have visitors question their assumptions about ownership after having experienced the site.
But you could still measure this through sampling. It's worth thinking about. Numbers are still art.