Does anyone know of a way to save an entire 90-page forum post, without literally clicking on each page?
It's on http://forums.virtualfestivals.com/myfests/
I met my girlfriend on it over 5 years ago, so I would like to save it in case it vanishes!
Cheers 🙂
Can't you just save the web page?
Copy/cut and paste it into Word.
Yes, but that only saves the one page. The thread is 90 pages long!
You need a recursive scraper. "wget --mirror --convert-links sitename.com/page" if you've got Linux available. [url=http://www.devarticles.com/c/a/Web-Services/Website-Mirroring-With-wget/1/]more here[/url]
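To spell that out a bit — a minimal sketch of the wget invocation, assuming the thread's 90 pages all live under the /myfests/ path and link to each other (if they don't, wget's recursion won't find them):

```shell
# Recursively mirror the thread for offline reading.
# --mirror        : recurse, keep timestamps, re-fetch only changed files
# --convert-links : rewrite links in the saved pages so they work locally
# --no-parent     : don't wander up out of the thread's directory
wget --mirror --convert-links --no-parent http://forums.virtualfestivals.com/myfests/
```

The saved copy lands in a folder named after the site (forums.virtualfestivals.com/) and should be browsable by opening the local HTML files.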
There must be a Windows clicky-clicky equivalent, though.
Can't you contact the agency and see if they have any back issues of the catalogue?
Retro83 – Thanks very much, I will check that out 🙂
CharlieMungus – LOL!
There was a program called HTTP Weazel I used in the past; it might be of use:
http://www.metaexception.com/en/products/HTW/
maxray – perfect, many thanks!