Shark Bot

SharkBot is a page-restoration program written by DaveVoorhis and designed to counteract page edits by GrammarVandal ("Nomad") - see AnonIsStillBanned, CeaseAndDesist, ZeroTolerance, PissedOffAndExtremelyAngry for context.

Its edits can be identified as originating from the machine shark.armchair.mb.ca and/or computing-technology.derby.ac.uk.

To avoid being caught by the Shark:

Note: A succession of attempts by GrammarVandal to delete this page has corrupted the display of editing history in NewRecentChanges.


SharkBot is developed, monitored, accessed and controlled by me and me alone, for the express and sole purpose of enforcing HardBans issued by the stewards -- in particular that on GrammarVandal. I am not, nor have I ever been, a steward of WardsWiki. The stewards do not have, and never have had, access to, control over, or anything to do with the SharkBot. SharkBot is multifaceted; some components operate continuously and autonomously, some operate periodically and autonomously, and some involve my participation or intervention. -- DaveVoorhis


Dave, there's one thing I don't understand. When the shark zaps the GV edits - hooray - why doesn't it also edit RecentChanges to remove its own spam? -- Pete

That would probably be because I haven't written anything to do that. -- DaveVoorhis

Well, it didn't occur to me till today either. I confess to feeling rather stupid, but then ... I am rather stupid, so that's really not so bad, I guess. I've really enjoyed the last few days of GV-less-ness and it would be very cool if RC was like that all the time ...


Yikes, call off your shark! I can't edit NetworkExtensibleWindowSystem! -- IanOsgood

I recommend not editing with your UserName cookie set. That will significantly reduce the likelihood of this occurring in the future, as GrammarVandal has been known to spoof UserNames. -- DaveVoorhis

Surely, your script is UserName agnostic? I'm not going to start editing anonymously. -- IanOsgood

SharkBot is UserName agnostic. GrammarVandal is not. He has, it would appear, spoofed your UserName on more than one occasion to make your edit and his edit appear to be the same overall edit in RecentChanges. Hence, I would encourage you to not edit with your UserName cookie set, because the "IanOsgood" that appears in RecentChanges isn't necessarily (entirely) you. -- DaveVoorhis

By the way, there are obvious flaws to using this as a SharkBot bug report channel (quick, Anon, report every page), so I would like to ask that all further reports be sent to me via email as specified on my HomePage. Anonymous and/or unverifiable reports will be given a brief but courteous reply, but will otherwise be ignored. Presentations of opinion or suggestions will be duly read and considered, but quibbles and arguments will not be entertained or sustained. -- DaveVoorhis

In my opinion, your shark should only revert an edit if it consists of nothing but nitpicks. Dave, I don't envy you the position you are taking on yourself: arbiter of all gnoming. -- IanOsgood

SharkBot is intended to revert edits by banned individuals, thus enforcing the ban. Whether a banned individual makes picayune edits or writes a solid proof that P = NP, the effect will be the same. -- DaveVoorhis


Sorry if I have missed something more important, but is spamming in RecentChanges the most significant reason you are reverting the careful work of an individual by bot? To me it seems an unbalanced emotional response that does not help to grow the wiki.

Among the reasons listed toward the end of the GrammarVandal page, none is more significant than another, though the last one is certainly offensive. However, the reason for developing the 'bot is more fundamental: GrammarVandal is subject to a HardBan. Comments were made that there is no way to enforce a HardBan; now there is. Internet fora do not survive without a means to block significantly disruptive individuals.

It seems to me, by the way, that Wiki grows from contributions, not punctuation changes. Does changing '--' to '-', or "useable" to "usable", help grow Wiki? Does obstructing active discussions with pointless changes help grow Wiki? Does badgering an active participant until he leaves -- which GrammarVandal did -- help grow Wiki? -- DaveVoorhis

I recognize WikiWiki as fair game; if you wish to fight, do so. As you have a better command of English than I do, I assume you are not against improving grammar here, but are enforcing social norms. For various reasons I have not frequented the wiki in recent years, but overly strict grammar was never one of them. I would still suggest looking at each edit, to verify whether the text got better because of it. In many cases it did. -- GirtsKalnins (delete thread when read)

I am certainly not against the improvement of grammar or anything else. In this case, however, the improvement (or not) of text is not the issue. For reasons listed on the GrammarVandal page, he is HardBanned. The ban is enforced. The content, therefore, is irrelevant. Had GrammarVandal not obviously acted in an anti-community manner, he would not have been banned, and he would be free to improve Wiki. Now, the improvements will be done by others. -- DaveVoorhis

April 2007: How sad that after all these months GrammarVandal is still banging his head against the sharkwall. -- EarleMartin

August 2007: Folks, this page is active. I have no idea whether that is good or sad news. -- GirtsKalnins


Attempts to engage GrammarVandal in PositiveDialogue have been made numerous times. The most recent was my request that he email me. He did not do so, and instead posted an email address for me to email him. That was contrary to the conditions of the request, as I indicated at the time. The page was then deleted. GrammarVandal has clearly indicated his unwillingness to compromise or alter his behaviour. PositiveDialogue works when the participants have a common goal. GrammarVandal's goals are clearly something other. Furthermore, GrammarVandal has been issued a HardBan by the stewards. This clearly indicates the time for dialogue is over. -- DaveVoorhis


Notice: old cruft on this page may be arbitrarily removed and discussion refactored, as is usual procedure on this site. -- EarleMartin


Any chance of a shark-vetted spin-off of RecentChanges - either a black-list-filtered one that shows all edits that haven't been made or reversed by SharkBot, or a white-list one that shows the edits that have been passed by SharkBot, again omitting edits by the SharkBot itself? If the current situation has come down to a contest of patience between Anon and the rest of Wiki, then anything that makes the wait less unpleasant for the rest of us is conducive to victory as well as comfort.

The 'bot is a collection of heuristics. As such, it can and sometimes does trigger a false positive, in almost precisely the same manner that virus scanners and the like sometimes trigger a false positive. I will tune the heuristics to hopefully reduce the likelihood of this occurring in the future.


If anyone is interested, I've created a GreaseMonkey (MozillaFirefox plugin) script that hides the shark entries from the RecentChanges page. It's very nice.

http://userscripts.org/scripts/show/15058

-- AndrewNelis?

Sweet! -- DaveVoorhis


Returning after an absence, I crossed paths with shark.armchair.mb.ca zealously reverting changes in a way that I found troublesome. When I questioned the SharkBot in WikiVandals, I was called ignorant or a troll. WardsWiki used to be a place where PositiveDialogue and AssumeGoodFaith were the norm. HarshWords? would be avoided or removed by a WikiGnome. I searched for shark.armchair.mb.ca and found no matches. After reading about the SharkBot, I'm not convinced the cure is better than the disease. WikiGnome activity has stopped; the cure might be killing the patient. -- MartinSpamer.

Since making this comment I've received an enlightening email that has convinced me that SharkBot is necessary. I'm still disappointed that the first response to my initial comment was an attack, though I'm starting to appreciate the damage that has been done to this community to make that possible. This wouldn't have been the case in the past, and I'd like to appeal to everybody to try harder to restore WardsWiki to a PositiveDialogueCommunity. -- MartinSpamer.

How do the rest of us know that that last comment was really you, though? My email address is easily found; feel free to email me for confirmation. -- MartinSpamer.

Additional comments moved from WikiVandals.

I forgot where I read it, I think it might have been personal e-mail, but I think this wiki will eventually be retrofitted to use some kind of public, open identification system to track legitimate users, and block known offenders. I wonder what the status of that is. -- SamuelFalvo?

The "shark" thing is an (attempted) response to the GrammarVandal. I don't think a sign-in system will prevent GrammarVandal-like attacks, for the same reason that spam is such a difficult problem, namely spoofing and zombies. GrammarVandal is a determined person. I would guess the only way to stop him/her/it would be to trace the zombies back to the source and bust him in the courts for zombying. Many are just hoping he will tire and go away.


Need the Shark over at TheAdjunct! Is it possible to get some help over there? We've got a spammer that just won't quit. -- BrucePennington


Dave, you have let the vandal mess up the version of the sandbox so that the ascii-art is now fouled when you, or your bot, restores it. Please do not reverse my changes. -- Dial1.Seattle1.Level3.net

That was me. Sorry. I thought I was smarter than my 'bot, but my 'bot is smarter than me. I should have left well enough alone and let it do its job... -- DV

Thanks! -- Dial1.Seattle1.Level3.net


DV, why are ChangesInWeek? and RecentChanges such a mess? Is anyone going to fix this? Because so far I see your bot is insisting on reverting whatever it is that someone is trying to do. Should I just jump in and fix the mess, or are you going to do that? -- Dial1.Seattle1.Level3.net

GrammarVandal has been doing ChangesInWeek?<x>. I allowed it for a while. Now, no more. I'll fix it when GrammarVandal is gone. -- DV

JeffGrigg did ChangesIn pages for a few years until GV took over that job. -- Eliz

A confirmation: Yes, I, JeffGrigg came up with the ChangesIn<Week> pages, to replace the old ChangesIn<Month> pages when the monthly pages started to get crazy big. But I stopped maintaining them several years ago, and was happy to see that others (even if it was the GrammarVandal) took it up. -- JeffGrigg

Someone or something has broken ChangesInWeekNineteen?. The entries are not there, and neither are they on any of the entries at /wiki/history/ChangesInWeekNineteen?. If this is SharkBot, then I suggest that the cure is now more harmful than the disease.

I don't have it set to auto-revert ChangesInWeekNineteen?, and I don't recall using the SharkBot to manually revert any edit to ChangesInWeekNineteen?, though it's possible (but rather unlikely) that I did so in a fit of absent-mindedness. The SharkBot has no special access to /wiki/history, and uses precisely the same mechanisms that a human editor uses. If there's a failure in /wiki/history, it must lie elsewhere.

This does make me wonder if anyone actually uses ChangesInWeek?<xxx>. It would be relatively trivial for me to use some of the SharkBot infrastructure to automate ChangesInWeek?<xxx>, but there's not much point if ChangesInWeek?<xxx> are being updated without purpose. I've never seen the point in it myself, but maybe that's just me. -- DaveVoorhis

The top edit removed almost everything, and is identified as DonaldNoyes, though given that he still self-identifies it could actually be an honest error by him. Alternatively, it could have been overlooked by someone doing the ChangesInWeek? process. All of which leaves me confuddled as to what the numbers on NewRecentChanges are supposed to mean, since in my moving some lines up the page and adding one word, NRC has recorded those two as "-141" and "0 del" respectively.

GrammarVandal has deleted and re-created ChangesInWeek? a number of times, perhaps in a futile attempt to confuse the SharkBot. Repeatedly deleting and re-creating the same page often mungs NewRecentChanges. -- DV

You're breaking it again. Regardless of causation, the actions of SharkBot are your responsibility. I am not going to ask your permission to edit. -- AnonymousCoward

Why not? If you're not GrammarVandal, I can whitelist your IP and you'll be free to edit all you like. -- DaveVoorhis

{You can't usefully whitelist a dynamic ip address.}

I can whitelist a range. -- DaveVoorhis

{That includes potentially anyone using the same service-provider.}

Possibly, but not necessarily, and ranges can be expanded or contracted as needed. IP-based filtering is an imperfect mechanism. -- DV
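The range-based whitelisting discussed above can be illustrated with a short sketch. This is purely hypothetical (SharkBot's actual implementation, addresses, and rules were never published); it only shows why a CIDR range, unlike a single address, can cover a dynamic-IP user who gets a different address from the same provider pool each session:

```python
import ipaddress

# Hypothetical whitelist mixing a fixed host and a provider pool.
# These networks are documentation addresses, purely illustrative.
WHITELIST = [
    ipaddress.ip_network("203.0.113.42/32"),  # one fixed, known-good editor
    ipaddress.ip_network("198.51.100.0/24"),  # a dial-up pool: covers dynamic IPs
]

def is_whitelisted(addr: str) -> bool:
    """True if the editing host falls inside any whitelisted range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in WHITELIST)

# A dynamic dial-up user changes address each session, but stays
# inside the provider's /24, so the range still matches...
print(is_whitelisted("198.51.100.77"))  # True
# ...which is also the imperfection noted above: anyone else on that
# same provider pool matches too.
print(is_whitelisted("192.0.2.10"))     # False
```

As noted in the thread, this trades precision for convenience: widening the range admits everyone sharing the provider, and narrowing it locks out the legitimate user on their next dial-up session.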

[Braces added to distinguish a different AnonymousCoward] I believe this still amounts to requiring your permission, not to mention defining me by my IP address(es). I don't believe you have any business requiring anyone else to prove to you that they are not the GrammarVandal; rather, it is for you to prove that they are (which it would seem you've been able to do previously). "Tell me who you are, or the Shark will eat you!" Unless the entity known as DaveVoorhis is merely an alternate facet of WardCunningham, of course, in which case it would be entitled to do so :-)

I am not an alternate facet of WardCunningham, unless the voices in my head know something that they haven't been telling me...

I can appreciate your point that I shouldn't represent some gateway through which editing or identity must be proved, nor should I in general require participants prove they are not the GrammarVandal. However, this is a special case, I think you may be making a mountain out of a molehill, and furthermore I would argue that the value of keeping GrammarVandal at bay is worth the minor inconvenience of being whitelisted. The less he/she/it is allowed to edit at all, the more likely he/she/it will (eventually) recognise the utter futility of even attempting to edit anything here, and the sooner he/she/it will find other ways to amuse him-/her-/itself. At that point the SharkBot can be turned off, and all this ceases to be an issue. -- DaveVoorhis

Are you sure you're not overreacting here, and are being (at least in part) the architect of your own frustration? It is, after all, trivial for me to whitelist you. -- DaveVoorhis

Neither the SharkBot nor myself are yet convinced that you aren't the GrammarVandal. I note that you demonstrate a somewhat more, er, chatty style than that previously exhibited by GrammarVandal, but I wouldn't put it past you, er, the GrammarVandal to try new tactics. The SharkBot, of course, has no opinion on that. It's a machine, and has other reasons to believe you might be the GrammarVandal. If you're not the GrammarVandal, then I apologise for the inconvenience and irritation. If you are the GrammarVandal, then, well, nice try. For the time being, myself and the SharkBot shall continue to gather data. If it turns out you're one of the good guys, then your continued tolerance would be most appreciated, and I will endeavour to re-instate your edits as appropriate. -- DV

The bot seems to rely on the nature of edits, and assumes that most articles will not be refactored or substantially changed very often. At times, it appears to be attempting to detect dictators by using vegetarianism, university education, and a love of the sound of their own voice as heuristics. A better solution would be to whitelist the ChangesInWeek? pages, which - by their nature, and as a set - change radically and frequently. If you must insist on undoing edits to those pages, at least leave them out of the detection mechanism, since on pages where grammar and presentation play little part you have no reliable means to identify the GV from those pages alone. With other pages you might get false positives, but if you include CIW then it appears that false positives are guaranteed, and the bot will attack anyone it doesn't "know". It's generally clear from the diffs that substituting last week's edits for those from 53 weeks ago is certainly not a StupidLittleEdit?, and "GrammarVandal edited these" as a heuristic for detecting the GV is anywhere from not-particularly-good to BrainDamaged. Locking out legitimate users is not a desirable side-effect, the general consensus is that sacrificing good users for the sake of one annoyance is a BadThing, and any time now I'm sure TopMind will present us with ObjectiveEvidence? of such.

I agree that locking out legitimate users is not a desirable effect, but I would argue that allowing GrammarVandal (as determined by a gaggle of heuristics, which are -- like any heuristics -- prone to false positives and negatives) any opportunity to edit is worse than preventing edits to ChangesInWeek?<xxx>. I have offered to automate updates to ChangesInWeek?<xxx>, pending some confirmation that it's actually of use to someone. I've not seen any such confirmation, and strongly suspect that ChangesInWeek?<xxx> are being maintained purely because it seemed like a good idea at the time, not because the good (?) idea is actually of value to anyone. As such, I have little problem regarding ChangesInWeek?<xxx> as nowt but maintenance for maintenance's sake. Therefore, inadvertently preventing legitimate users from maintaining an apparently pointless index is of considerably lower concern to me than (a) allowing GrammarVandal any edit opportunities, or (b) inadvertently locking out legitimate users from editing meaningful content.

You've made some good points. In the interest of moving forward on this, I will turn off automated reversion of ChangesInWeek?<xxx> edits (at least under the conditions that have caught you) if others are in agreement that doing so is the right course of action. I think it is appropriate, at this juncture, to solicit comments on this matter from other participants.

By the way, why the pot-shot at TopMind?

-- DaveVoorhis

Perhaps a more pressing problem is the fact that there is a gateway not requiring the magic code on simplewebs.com.

Can you confirm that simplewebs.com bypasses the secret code word, as opposed to the public code word? -- DV

All I can see is that an edit can be pushed here through there (preserving the source IP address) without having to enter anything. Without access to any logs or the internals, I can't say which code word it's passing on. I edited through it earlier, and it seems to have added a smiley at the top. However, I'm not going to remove it for fear that the shark will simply undo the entirety of my edit.

A simple investigation suggests that simplewebs always provides 567 as the code word, as it's hardcoded. I would think your reply above practically guarantees GV will continue, whilst all other users, unless "recognized", will not be able to, say, correct spelling alone, since that would be assumed to be a GV edit. Similarly, your logic implies that a whitelisted IP that reverted your bot's edits more than once would almost certainly cease to be whitelisted unless you knew for certain it wasn't GV, despite your assurances that users other than GV would be allowed to correct spelling, etc.

Other users are free to make spelling corrections, etc. Mere spelling corrections are generally not sufficient to trigger the SharkBot; other indicators are used. However, ChangesInWeek?<xxx>, due to their obvious lack of textual content per se, are handled differently. -- DV

By definition, a simple reversion of a SharkBot edit would be sufficient, but in many cases would be the simplest way to correct spelling. Your remark above about not being convinced implies that even as a human, you can't identify edits accurately.

Identification of GrammarVandal is based on a determination of probability, which in many cases -- but not all -- is 100%. That applies to both human and 'bot. -- DV

The wording "but not all" implies "not 100%", but if you rate yourself on a par with your bot, you must concede you have no real basis on which you can spot mistakes by the bot; by definition, you must almost always agree, even when in error.

Over time, the 'bot has proven better at some things than I am. I am better at some things -- e.g., softer "human" heuristics -- than the 'bot. In some cases, like this one, the 'bot's confidence in detecting GrammarVandal is high in areas that have proven reliable in the past, but mine is somewhat (but only somewhat) lower due to certain "human" factors mentioned above. Therefore, I will defer overriding the 'bot until I have more data. -- DV
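The "gaggle of heuristics" producing a confidence score, as described in this thread, might be sketched as follows. To be clear, everything here is an assumption for illustration: SharkBot's real signals, weights, and threshold were never published, and the names `KNOWN_VANDAL_IPS`, `vandal_confidence`, and `should_revert` are invented:

```python
# Illustrative only: SharkBot's actual heuristics are not public.
# The idea sketched: several independent, individually weak signals are
# given weights and summed into a confidence score; the bot reverts
# only when the combined score clears a threshold.

KNOWN_VANDAL_IPS = {"192.0.2.1"}  # hypothetical blacklist

HEURISTICS = [
    # (weight, predicate over an edit record)
    (0.5, lambda e: e["ip"] in KNOWN_VANDAL_IPS),
    (0.3, lambda e: e["chars_changed"] < 5),             # tiny "nitpick" edit
    (0.2, lambda e: "--" in e.get("removed_text", "")),  # punctuation fiddling
]

THRESHOLD = 0.6  # revert only when combined confidence is high

def vandal_confidence(edit: dict) -> float:
    """Sum the weights of every heuristic that fires on this edit."""
    return sum(weight for weight, predicate in HEURISTICS if predicate(edit))

def should_revert(edit: dict) -> bool:
    return vandal_confidence(edit) >= THRESHOLD
```

A scheme like this also explains the false positives discussed above: a legitimate gnoming edit that happens to be tiny and fiddles with punctuation can fire enough low-weight signals to cross the threshold.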


20080627T1604Z - the top edit on AvivEyal appears to be a null edit:

vdiff doesn't show any change either.

The SharkBot reverted a change to whitespace. -- DaveVoorhis

The last change to EditPage was a null edit by the bot, not even whitespace changed.

It was a reversion to a prior version of that page, distinct from the subsequent edits. -- DV

There are no subsequent edits to that page.

I meant those edits subsequent to the prior version. -- DV

It was distinct by definition, but it was a null edit, so it did precisely nothing. Why make a null edit?

The SharkBot determined that one of the edits, subsequent to the version to which it reverted, was made by GrammarVandal. -- DV

Then it should have reverted that edit, not made a null edit. If someone else reverted it first, that should have been detected and no action taken.

Why bother? The SharkBot was designed to DoTheSimplestThingThatCouldPossiblyWork, much like this wiki. As such, it oozes imperfection but gets the job done. GrammarVandal: Go edit WikiPedia, and it becomes a non-issue. -- DV

That is untrue, since taking no action is necessarily simpler than making a null edit.

Given a set of conditions C which may trigger a reversion R, and another set of conditions Q which may desirably -- but not necessarily -- prevent a reversion R, it is simpler to consider only C in performing R, than to consider both C and Q in performing R. In other words, DoTheSimplestThingThatCouldPossiblyWork means the minimum code needed to trigger necessary reversions, even if that has a side-effect of very rarely generating a so-called "null edit". More code would be needed to prevent the occasional "null edit", so it would no longer be the simplest thing that could possibly work. -- DV
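The C-versus-Q argument above can be made concrete with a small sketch. This is not SharkBot's code (which was never published); it is a minimal model of a bot that checks only its trigger conditions C and restores the last known-good version, with no extra condition Q asking "would this restore actually change anything?" - which is exactly how a "null edit" arises when a human has already reverted the vandal by hand:

```python
# Sketch of "consider only C" reversion logic (illustrative, not SharkBot's
# actual code). history is a list of (author, text) pairs, oldest first.

def revert_if_triggered(history, is_vandal_edit):
    """Restore the last version prior to any vandal edit, or None if C unmet."""
    # Condition C: did a banned author edit this page at all?
    if not any(is_vandal_edit(author) for author, _ in history):
        return None  # C not met; do nothing
    # Walk forward to the last version before the first vandal edit.
    good_text = None
    for author, text in history:
        if is_vandal_edit(author):
            break
        good_text = text
    # No condition Q here: we never check whether good_text differs from
    # the current text, so this can re-save identical content (a "null edit").
    return good_text

history = [
    ("alice", "original text"),
    ("vandal", "vandalised text"),
    ("bob", "original text"),  # bob already reverted the vandal by hand
]
restored = revert_if_triggered(history, lambda author: author == "vandal")
print(restored == history[-1][1])  # True: the bot re-saves identical text
```

Adding the Q check (`if good_text != history[-1][1]`) would suppress the null edit, but it is strictly more code than triggering on C alone, which is the trade-off DV describes.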

What you called "one of the edits" must have been the last edit, so reverting the previous edit as well by restoring an earlier version (which happened to be the same as the current version) cannot possibly have been a simpler choice corresponding to a simpler condition C.

...Except that "one of the edits" doesn't have to be the last edit. Is there some point to this, or are you just quibbling? For the last time, let me put this very simply: Like a dozen or so similar quirks, the "null edit" would take more code to "fix". Hence, the simplest thing that could possibly work is to leave such issues rather than "fix" them. Now go edit WikiPedia. -- DV

It did have to be the last edit in the particular circumstances, where the previous edit occurred a long time earlier and had nothing in common with a GV edit and everything in common with a newbie test edit. That edit was (correctly) left unchanged by the bot... until a second edit occurred. Your explanation is absurd, as the action taken plainly was not the simplest way to act that would "work". The bot could not have known that its restoration would be a null edit, as discovering that needs a far from simple process. It's also absurd to suggest that pursuing the simplest course involves a dozen or so similar quirks. The simplest course is to revert (or not revert) the last edit, and that cannot result in numerous quirks capable of being noticed, only a very occasional false positive.

Pursuing the simplest course of development action almost inevitably results in quirks and oddities, especially given the GoodEnough-is-more-than-enough development process I used to construct the 'bot in the first place. Note the number of quirks and oddities in this Wiki, which was presumably developed in a similar manner. Also, what might appear to be "the simplest course" from an observer's point of view is not necessarily the simplest course from the developer's point of view. Furthermore, I happily admit the 'bot is imperfect, and that there's a list of bugs as long as your arm. However, it does the job for which it was designed and the net negative effect of the bugs (most of which no one sees but me) is minimal. This "null edit" issue, for example, I consider so negligible as to be unworthy of further consideration. If it really upsets you, then it's a feature, not a bug. -- DV

The fact remains that the bot failed to revert the preceding edit that it deemed to be by GV, carrying out a null edit instead.

Could be. It happens. -- DV

A FalsePositive occurred recently when a valid change (not by GV) to AbileneParadox was reverted.

Could be. It happens. -- DV

So it's down to you to rectify the situation, not just shrug your shoulders.

It's on my "to do" list, but with one so-called "null edit" occurring per several thousand reversions, and given that a "null edit" (or a whitespace alteration) has no effect on actual content, it's not a high priority. Even better would be for GrammarVandal to finally realise that his/her editing here is a futile waste of energy, and find a place where his/her efforts will be appreciated, like WikiPedia. -- DV

Nomad will never understand priorities because it is a sociopath, and cares only for what it wants. I wouldn't bother devoting too much time to replying to it if I were you. -- EarleMartin

Point taken. Sometimes I forget that attempting to reason with GrammarVandal is also a futile waste of energy. -- DV

There are other people that disagree with you, you know.

If it walks like a duck and quacks like a duck... -- DV


Dave, check the bot's protection of WikiAsciiMathSymbols [and BoostLibraryDiscussion [fighting over the letter a?]]. -- Seattle1


I think there was a false positive today reverting some changes to WikiWikiWeb. Someone else has reverted back. -- JohnFletcher

Yup. I think I've fixed it. -- DV


All excuses aside, this bot causes me to cuss a lot. After all of this time, my IP address is hardly a secret, yet the bot causes me to still heartily cuss at its author whenever we cross paths. -- Seattle1

It's a machine, dude. It's got all the intelligence of a toaster (or, more accurately but less dramatically, all the intelligence of a spam filter), and it can't tell your IP address from a hole in the wall until I tell it your IP is "special". If it makes you feel better to cuss at me, that's cool, but it won't help. What might help is to remember what Wiki was like when GrammarVandal was unrestricted. -- DV

It was much better then, as changes by Seattle1 could stand without your permission being sought in advance.

Of course you'd think so -- you're GrammarVandal. Bye bye! -- DV

Why (a) require permission to be sought in advance, and (b) have no provision for those who don't have a fixed ip? You seem to have no reason for either.

I don't require (a), and (b) would require ESP or magic. Now go edit WikiPedia and leave us alone. -- DV

You do require (a), as you said so above in reply to Seattle1 after his edit was reverted for no reason, and (b) is no problem for other sites.

I do not require (a), however in some cases whitelisting the IP is helpful. On other sites, there are login-based authentication mechanisms that reduce the need for IP-based identification, but still typically employ it as part of a blacklist capability. Now go edit WikiPedia and leave us alone. Kthxbye. -- DV


[August 16 2010] First I played the game with the cookie name, before seeing the cookie was pirated. So I removed the cookie, and then my categorizing actions were reverted by the SharkBot (I did quite a bunch, on hundreds of pages - sorry for trying to help). That's really stupid!

SharkBot is decategorizing pages I just categorized to have an automatic index! Guys, this is really, really counterproductive. Those pages will be hard to find again, because I am not going to play the WikiGnome again on those pages - and believe me, some of them are very interesting, but a newcomer will have no chance to find them. So what! What's the philosophy behind this? Having content that people won't be able to find? If not, the SharkBot will have to be recoded to be smarter: for instance, not reverting small changes that contain the word "Category" in the text! (terrific enhancement!)

Frankly, coming from other wikis, I must say that reverting good actions (including "small things" such as categorization notes) is the best way to have this wiki abandoned despite its great content, and to legitimize the lack of structure in the content. At the least, you can be sure that it is a way to prevent occasional users from diving into the wiki to discover its marvelous content. That seems totally silly to me; I can hardly believe it!

Proposition of a (smarter) solution against spam: why don't you name official WikiGnomes (with some rights), for a while, who commit to doing cleanup and categorizing in a set of pages for, let's say, 1 or 2 months? Guyz, with the number of people that have a page here!

I think people probably got used to that stupid bot with time, so they probably limit their actions to adding content + category (when they think of it). I was caught; I understood the message. What is pissing me off is not really understanding the full rules, or discovering them long after the changes are done - THANKS GUYZ, I TOTALLY WASTED MY TIME HERE (several hours). We're at the pinnacle of the AntiPatterns, applied in real life, before my eyes, live and in action! Code that is worse than the syndrome it is fighting against! That's not serious.

And for newcomers like me, full of desire to make the content findable (and structured - guyz, we're in IT and we like structuring things, don't we?), OK, that's a really BadThing. For me, I'm done, I'm over it. I accept having disagreements with people and fighting about editing pages (it has happened to me twice in more than 10 years), but doing a nice job and seeing it destroyed by a stupid bot, that's really silly.

Well, I guess the priority of c2.com is the content, and the priority is not finding it. But remember Borges and his Library of Babel: the users of c2.com don't have a lifetime to find "the" page they are looking for - or to conduct some kind of investigation. Better to pirate those pages and put them in another wiki where the anti-spam will be less silly. I am very upset by that, but well, I have to think about helping somewhere else. You killed my motivation, really. On the other hand, no one invited me here, so I suppose this is all my fault for trying to help on subjects I like a lot.

Best regards SharkBot, WikiGnome killer. You had me.

-- OlivierRey

P.S. I left a page full of things to do. I am so bummed out!

The pages that you categorised, which were reverted, were reverted because GrammarVandal edited them after you did. You used a UserName cookie and GrammarVandal spoofed it. Unfortunately, due to the way WardsWiki works, edits with the same UserName cookie are amalgamated. When GrammarVandal edited the same pages you edited, his edits were reverted, which obviously reverted yours as well since they're now the same edit.

When I see this sort of thing happening "live", I generally override the SharkBot in order to preserve the legitimate edits. In this case, I didn't happen to see it happening, so the SharkBot's reversions were retained. Since you now edit without a UserName cookie, it won't happen to you again.

In the long term, deleting the pages referencing the UserName cookie would probably be a good idea. It will prevent future WikiGnomes from being caught by this sort of thing.

Sorry for any inconvenience. -- DaveVoorhis

Yeah, uh, Olivier, about that -- not every page of this Wiki need be categorized. Sometimes pages just need to be. It might be a Good Thing to wait a while before jumping in as you did. You created a huge pile of new "categories" that were minor branches of already existing categories. We don't need quite that level of detail in organizing this Wiki's content. PleasePleaseDontCategorizeEveryPageOnWiki.

And as for newbies and casual visitors: if you found it why can't they? This is what FindPage is all about, is it not? Yes, the search functionality is a bit weak, but it will suffice to find discussions on topics of interest. Trust the other visitors to this Wiki to be as smart and as persistent as you are.

-- MartySchrader


Really?!! Regarding WabiSabi, you want the bot to choose "embrassed" over "embraced"? This is the problem, no? Really now...

The 'bot can't spell. It's really good at other things, though. -- DaveVoorhis


See: GrammarVandal, AnonIsStillBanned, CeaseAndDesist, ZeroTolerance, PissedOffAndExtremelyAngry, SharkBotConsideredHarmful


CategoryWikiMaintenance


EditText of this page (last edited December 6, 2014) or FindPage with title or text search