Wiki Snagger

I'm looking for a way to take the HTML generated by any wiki and copy it into a local folder. This would be helpful when using WikiServer to create HTML which can then be distributed to others, or uploaded to a web folder.

Have you tried any of these? Do you know a better way?...

A product called "Site Snagger" is featured in a tutorial by PC Magazine - http://www.pcmag.com/article2/0,4149,35556,00.asp

A product called "Page Sucker" is profiled here - http://www.downloadfreetrial.com/internet/inte12308.html

A product called "Web Copier" is featured in a tutorial at download.com - http://download.com.com/1200-2001-5086518.html


There appears to be a review of utilities in this category here - http://www.webattack.com/shareware/downloader/swoffline.html


Plucker is an excellent open-source choice, particularly for PDAs.


The canonical utility to do this (and a lot more) on Linux systems is 'wget'; it's both gratis and libre and available at http://www.gnu.org/software/wget/. There are also ports to other systems, including MicrosoftWindows.
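For example, a minimal invocation that mirrors a wiki's generated HTML into a local folder might look like the following sketch (the URL is a placeholder, and the options usually need tuning for the particular wiki engine):

 wget --mirror --convert-links --page-requisites --no-parent http://example.com/wiki/

Here --mirror crawls recursively with timestamping, --convert-links rewrites links so the copy can be browsed offline, --page-requisites fetches the images and stylesheets the pages reference, and --no-parent keeps the crawl inside the wiki's path. On most wikis the edit, history, and search links are just more URLs, so --reject or --exclude-directories rules are often needed to keep those out of the mirror.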


Typing Ctrl-S in your browser is adequate in most cases. The products above tend to assume you need local copies of referenced pictures, etc., which you probably wouldn't in the case of wiki pages.

No, no... that won't do at all. We're talking about whole wikis, not just individual pages within a wiki.


Here's a silly idea I had. It's a completely different way of "using WikiServer to create HTML which can then be distributed to others".

Possible implementation routes:


