Why Try To Delete Wiki

Why would anybody try to delete a wiki?


One reason might be the content of a site:

What would happen if someone put up a wiki site that attracted vituperative dissent calling for strong censorship of the views/information on the site? The anyone-can-edit model puts the tools of that censorship into their hands (to some extent at least).

Or a journalist might write a review of wiki, comment on its self-healing properties, and then give the URL to everyone who might take that ["... as a challenge."?]. And then all but one of the would-be attackers find that it has already happened. Or so they would, if they (and the first one) hadn't all seen that coming beforehand, and thus not attacked it in the first place.

People who don't enjoy software discussion don't usually want to prevent it - they just want it to happen away from them ;) - so this site is immune from attack from that sort of motive. -- RiVer

DisagreeByDeleting existed long before wiki. There will always be those who attempt to "win" an argument (in their own minds, at least) by preventing the other side from being expressed, especially when the other side is in fact correct.


However, you do see random page deletions on various topics (GreatLinuxFeatures is one example). Deleting a whole wiki is much more difficult. A WikiMaster keeps a copy of the database, and can roll back massive deletions.

If you were operating a contentious wiki, you might allow only logged-in members to contribute. This limits the number of viewpoints, but prevents the UnwashedMasses from messing with your wiki.

Of course, you can hide in plain sight and keep your fingers crossed, like WardsWiki does.

Don't expect that anyone is going to save this wiki from a massive deletion. That will never happen. You're just fooling yourselves. All you can do is avoid encouraging anyone to do it, by being civil, friendly, and open-minded.

If Wiki is killed, it will simply grow anew. The tendency for DanglingLinks to be clicked, new discussions to be started, and more ReFactoring to occur seems likely to outlast any attack mounted without scripts, or with only manually-run scripts. And if scripts were used, there are certainly a hundred lines of WikiSource any one of which, when changed, would stop a script attack but would let 'normal' Wiki use continue (require a space somewhere in the URL, say).
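A minimal sketch of such a tripwire, in Python rather than the actual (Perl) WikiSource, with made-up field names: the edit form quietly carries a marker (here, one containing a space) that browsers send back automatically but that a script written against the old form would omit.

 TRIPWIRE_FIELD = "edit_token"   # hypothetical hidden form field
 TRIPWIRE_VALUE = "wiki edit"    # note the space, per the suggestion above

 def edit_form_html(page_name):
     """Render the edit form with the hidden tripwire field included."""
     return ('<form method="post" action="/edit/%s">' % page_name
             + '<input type="hidden" name="%s" value="%s">' % (TRIPWIRE_FIELD, TRIPWIRE_VALUE)
             + '<textarea name="text"></textarea>'
             + '<input type="submit" value="Save"></form>')

 def accept_save(form):
     """Reject any save whose form data lacks the current tripwire value."""
     return form.get(TRIPWIRE_FIELD) == TRIPWIRE_VALUE

Changing TRIPWIRE_VALUE (or any of a hundred similar details) breaks existing scripts, while ordinary browser users never notice.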

If you strike Wiki down, it will become more powerful than you can possibly imagine.

OK, if so... let's strike it down. (What do you mean by "more powerful"?) It's a line from an obscure movie.


I strongly disagree with the idea of being able to stop script users by making subtle changes to Wiki. The problem is that a script has an owner who can adapt to the changes.

In the wiki of an IRC channel I frequent, for example, someone does not want a certain word to appear in the wiki. Whenever it appears on a page, the name of the page is added to a list in his script, and the page gets deleted from that moment on. It does not matter whether the page is reconstructed without the word or not; it will get deleted again and again.

We cannot prevent the attacks, because we cannot tell whether a change is made by a human or not. If the wiki were to check this, it would require AI, since anything the wiki can check for can apparently also be implemented in a script. -- HenrikPaulini?

There is at least one way (that I know of) to distinguish between a human user and a script: to allow an edit, the wiki could generate a small picture containing a text password, distorted by ImageMagick. The password from the picture must be typed in to enable editing. It is still easily readable by humans, but scripts are stopped dead. Yahoo email uses this trick to prevent script users from creating loads of free email accounts. -- PeterMasiar?
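A minimal sketch of that idea, assuming only that ImageMagick's convert command is installed and on the PATH; the function name and distortion settings are invented for illustration.

 import random, string, subprocess

 def make_captcha(path="captcha.png"):
     """Render a distorted text password with ImageMagick's 'convert'.
     The wiki shows the image and only accepts the edit once the user
     types the returned password back in."""
     password = "".join(random.choice(string.ascii_uppercase) for _ in range(5))
     subprocess.run(
         ["convert", "-background", "white", "-fill", "black",
          "-pointsize", "36", "label:" + password,
          "-swirl", "20",      # mild distortion: awkward for naive OCR,
          "-wave", "3x60",     # still easy for a human to read
          path],
         check=True)
     return password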

That doesn't distinguish between a human and a script; it distinguishes between sighted humans and blind ones. It doesn't work for sight-impaired individuals, nor does it work for text-only web browsing.

That is a good trick, but, like anything else, has a limited life span. Eventually the nasties will figure out a way to incorporate automated OCR scanning of graphics and defeat that mechanism, too. <sigh>

That's how things move forward, though. Many technologies and their useful spinoffs wouldn't exist today if it weren't for people overcoming their predecessors.


If you really want to prevent this sort of thing in the future, you need multi-pronged protection against it:

a) Revision history, so that when a wiki is deleted or significantly altered, it can be fixed by anyone.

b) Flagging of large changes. When a wiki is altered, it's rare that it's altered substantially, and it's certainly not often erased completely. Look at the way the wiki normally changes, figure out what the average change is, and flag anything substantially larger than that - by the simple size in bytes of the change, or some other measure. Then have the wiki program build a page listing the large changes, so that wiki users can police them easily themselves. Kiddies erasing things will be found more quickly than if only a few people are policing the system.

c) Time limits and other road blocks to scripted changes. Have a simple time limit, such that no single IP can change more than one page within, say, one minute. This doesn't stop scripting, but it slows it down and reduces the damage done before a "policeman" catches it and gets a better ban on the offender into place. There are other ways to throw up road blocks to scripts that are not hazards to real users.
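A small sketch of prongs (b) and (c), assuming a hypothetical review_edit hook in the wiki's save path; the thresholds are invented and would need tuning against what normal edits actually look like.

 import time

 LARGE_CHANGE_BYTES = 2000        # hypothetical: flag bigger deltas than this
 MIN_SECONDS_BETWEEN_EDITS = 60   # hypothetical per-IP throttle

 recent_edit_times = {}           # ip -> time of that ip's last accepted edit
 flagged_changes = []             # (page, ip, delta) for the "large changes" page

 def review_edit(page, old_text, new_text, ip, now=None):
     """Return True if the edit is accepted; record it for review if large."""
     now = time.time() if now is None else now

     # (c) per-IP rate limit: slows scripts without bothering people much
     if now - recent_edit_times.get(ip, 0) < MIN_SECONDS_BETWEEN_EDITS:
         return False
     recent_edit_times[ip] = now

     # (b) flag unusually large changes (or outright blanking) for human review
     delta = abs(len(new_text) - len(old_text))
     if delta > LARGE_CHANGE_BYTES or not new_text.strip():
         flagged_changes.append((page, ip, delta))
     return True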

Of course, these have ways around them as well, and the whole concept behind wiki pretty much rules out completely shutting down attacks. But you can make them prohibitively difficult to perform, and have user self-correction in place to repair the damage rapidly.


See WhyNobodyDeletesWiki


CategoryWiki

