Lets Insulate Ourselves

From CrazyThingsThatMightSaveWiki

To dissuade spammers, modify RobotsDotTxt to tell Googlebot and other search-engine bots to ignore the site completely. (Or at least tell them not to follow any of the links, so GoogleLovesWiki would no longer hold.) Of course, people's HomePage will no longer have a high PageRank (or whatever they call it these days), but the incentive for spammers to come here will be much reduced.
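
A minimal RobotsDotTxt sketch for the "ignore the site completely" option is below. The softer option (let Google index pages but pass no link credit) can't be expressed in RobotsDotTxt itself; it would need a robots meta tag such as <meta name="robots" content="nofollow"> on each page.

 # robots.txt at the wiki root: tells all well-behaved crawlers to stay out
 User-agent: *
 Disallow: /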

Apparently, this is possible already. The PythonCommunityServer wiki states, when editing: "Important note: All external links from these pages go through Google's redirector to remove PageRank. Don't bother spamming here to increase your PageRank; it won't work." ...although I first found out about this when I was editing its front page to remove WikiSpam. Sigh.
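
Here is a minimal sketch, in Python, of how a wiki renderer might do that link rewriting; the redirector URL shown is an assumption, not taken from the PythonCommunityServer code.

 import urllib.parse
 
 REDIRECTOR = "http://www.google.com/url?q="  # assumed redirector form
 
 def wrap_external_link(url):
     # Route outbound links through the redirector so the target page
     # gets no PageRank credit from links placed on this wiki.
     return REDIRECTOR + urllib.parse.quote(url, safe="")

A renderer would call this only for links pointing outside the wiki, leaving internal WikiWord links untouched.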

Even if we don't have logins, how about IP-banning known HTTP relays, anonymizers, and other services that might be used for distributed and anonymous attacks against Wiki?
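
A minimal sketch of such a check, assuming the wiki engine can see the client address before accepting an edit; the networks listed are placeholders, not a real list of open relays or anonymizers.

 import ipaddress
 
 # Placeholder ranges only; a real deployment would load a maintained
 # list of open proxies / anonymizers, e.g. from a DNS blocklist.
 BANNED_NETWORKS = [
     ipaddress.ip_network("192.0.2.0/24"),
     ipaddress.ip_network("198.51.100.0/24"),
 ]
 
 def is_banned(remote_addr):
     addr = ipaddress.ip_address(remote_addr)
     return any(addr in net for net in BANNED_NETWORKS)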

Let's all lay off WardsWiki for a month and see whether that makes a difference. Or we could allow certain non-contentious supervision like DeletedButWelcomeToWiki.

As for spam, using RobotsDotTxt to repel Google should be good enough.

Have to disagree on this one for multiple reasons:

It would be nice if there were some way to tell Google (and other search-engine bots) "this collection of nodes is a highly connected graph; adjust your PageRank algorithm accordingly". Agreed about the advantages of having Google index Wiki.
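
For reference, a toy power-iteration PageRank in Python (d is the usual damping factor). It illustrates why dense internal linking matters: each page's rank is split evenly across all of its outgoing links, so in a highly connected wiki any single link carries only a small share.

 def pagerank(links, d=0.85, iters=50):
     # links: dict mapping each page to the list of pages it links to.
     pages = set(links) | {p for outs in links.values() for p in outs}
     n = len(pages)
     rank = {p: 1.0 / n for p in pages}
     for _ in range(iters):
         new = {p: (1 - d) / n for p in pages}
         for q, outs in links.items():
             if not outs:
                 continue  # dangling pages are ignored in this toy version
             share = d * rank[q] / len(outs)  # rank split across outlinks
             for p in outs:
                 new[p] += share
         rank = new
     return rank

For example, pagerank({"FrontPage": ["RecentChanges"], "RecentChanges": ["FrontPage"]}) gives both pages equal rank.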


CategoryWikiProgress

