Volume Limited Edits

I propose VolumeLimitedEdits as a simple - though partial - solution to WikiSpam and EditWars.

If each change to a page is limited to a certain size, EditWars and WikiSpam could be reduced: spam typically arrives as large blocks of links and text, and the wholesale overwrites and reverts typical of an EditWar are large changes too.

The size of a change should be measured as the size of the diff between the old and the new version of the page. This policy naturally encourages many small changes instead of large ones, which I think is a GoodThing. Of course, if a large page is to be refactored, that refactoring has to be done in small steps too. And to make really large changes, at least two coordinated WikiZens are needed.

I propose a diff-size limit of 2000 bytes (which is rather generous compared to most edits). Another possibility would be to use a fraction of the page being modified, e.g. 20%. Or to use the higher of the two values.

-- GunnarZarncke
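
To make the proposal concrete, here is a minimal sketch in Python of how such a check could run before a page is saved. The function names, the use of difflib, and measuring in characters rather than bytes are my assumptions; only the 2000 and 20% figures come from the proposal above.

  import difflib

  ABSOLUTE_LIMIT = 2000   # flat per-edit limit, from the proposal above
  FRACTION_LIMIT = 0.20   # alternative limit: 20% of the current page size

  def diff_size(old: str, new: str) -> int:
      """Measure a change as the amount of text removed plus text added."""
      matcher = difflib.SequenceMatcher(None, old, new)
      changed = 0
      for op, i1, i2, j1, j2 in matcher.get_opcodes():
          if op != "equal":
              changed += (i2 - i1) + (j2 - j1)  # removed span + added span
      return changed

  def edit_allowed(old: str, new: str) -> bool:
      """Accept an edit only if its diff stays under the higher of the two limits."""
      limit = max(ABSOLUTE_LIMIT, FRACTION_LIMIT * len(old))
      return diff_size(old, new) <= limit

Taking the higher of the two values keeps small pages editable (the flat limit dominates there) while still allowing proportionally larger edits to large pages.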

I agree with limits on adding, but not on removing. The wiki is getting out of shape. And if somebody wants to add more information, they can always add a hyperlink to an external resource. Alas, that is precisely what the spammers are doing. A single hyperlink would not hurt anybody, especially since it is easier to delete one than to add one. But spammers come with tons of massive links-plus-text blocks.
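
If only additions are to be limited, as suggested above, the measurement changes slightly: count only inserted text and let deletions through freely. A hedged variation on diff_size from the sketch above:

  import difflib

  def added_size(old: str, new: str) -> int:
      """Count only inserted text; deletions and cleanups pass unrestricted."""
      matcher = difflib.SequenceMatcher(None, old, new)
      return sum(j2 - j1
                 for op, i1, i2, j1, j2 in matcher.get_opcodes()
                 if op in ("insert", "replace"))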

The whole idea would need more thought. There would be little effect on EditWars, which are typically over quite small changes. [counterpoints merged below]

This proposal will not solve all problems, but it has almost no disadvantages. The disadvantages that remain, concerning huge refactorings, are quite in the spirit of this wiki: I think large refactorings should be done in many small steps (IncrementalDevelopment, IterativeDevelopment), each possibly confirmed by a fellow WikiZen (PairProgramming). -- GunnarZarncke

Counterpoints:

It's simpler and more logical to use a more specific approach, such as WikiSpamBlocker.

I see some misunderstanding of my proposal here.

It seems that WikiSpamBlocker is susceptible to the same arguments as given above: more pages with fewer links each, multiple IPs each adding one link (or a few), spammers simply taking much more time. But those attacks simply don't seem to occur (yet?). On the other hand, I see no reason why the two approaches could not be combined. Both are quite simple.
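
One way the two checks could sit side by side, reusing edit_allowed from the first sketch. The link-counting reading of WikiSpamBlocker and the threshold of three new links are my assumptions, not a description of how WikiSpamBlocker actually works:

  import re

  URL_PATTERN = re.compile(r"https?://\S+")
  MAX_NEW_LINKS = 3  # illustrative threshold, not taken from WikiSpamBlocker

  def new_link_count(old: str, new: str) -> int:
      """Count external links that appear in the new text but not in the old."""
      return len(set(URL_PATTERN.findall(new)) - set(URL_PATTERN.findall(old)))

  def edit_accepted(old: str, new: str) -> bool:
      """Run the volume check and the link check together; both are cheap."""
      return edit_allowed(old, new) and new_link_count(old, new) <= MAX_NEW_LINKS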

Zombie-machine spamming is a tough nut to crack, because there is no practical way to tie such machines to a single person or organization. We may have to think about solutions that don't involve tracing activity back to any one user or spammer. In other words, assume that the spam will come from random sources. --top


See WikiWikiSuggestionsMedium, EditsRequireKarma


CategoryWikiMaintenance CategoryWikiEditing

