[Closed] IT geek question: Backing up a long forum post for offline reading

Posts: 531
Free Member
Topic starter
 

Does anyone know of a way to save an entire 90-page forum post, without literally clicking on each page?

It's on http://forums.virtualfestivals.com/myfests/

I met my girlfriend on it over 5 years ago, so I would like to save it in case it vanishes!

Cheers 🙂


 
Posted : 28/06/2011 9:43 pm
Posts: 0
Free Member
 

Can't you just save the web page?


 
Posted : 28/06/2011 9:47 pm
Posts: 0
Free Member
 

Copy/cut and paste it into Word.


 
Posted : 28/06/2011 9:50 pm
Posts: 531
Free Member
Topic starter
 

Yes, but it just saves the one page. The post is 90 pages long!


 
Posted : 28/06/2011 9:51 pm
Posts: 621
Free Member
 

You need a recursive scraper. "wget --mirror --convert-links sitename.com/page" will do it if you've got Linux available. [url=http://www.devarticles.com/c/a/Web-Services/Website-Mirroring-With-wget/1/]More here[/url].

There must be a Windows clicky-clicky equivalent.
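If the thread's pages follow a predictable URL pattern, you can skip full recursive mirroring and just fetch the pages directly. A minimal sketch, assuming (hypothetically) a `?page=N` query parameter and a known page count of 90 — check the real pagination scheme in your browser's address bar first:

```shell
# Base thread URL and page count are assumptions; adjust to the real forum.
BASE="http://forums.virtualfestivals.com/myfests/"
PAGES=90

# Build a list of all page URLs, one per line.
for n in $(seq 1 "$PAGES"); do
  echo "${BASE}?page=${n}"
done > thread-pages.txt

wc -l < thread-pages.txt   # should print 90

# Then hand the list to wget: --convert-links rewrites links for offline
# reading, --page-requisites grabs images/CSS so pages render locally,
# and --wait=1 is polite to the server.
# wget --convert-links --page-requisites --wait=1 -i thread-pages.txt
```

The final wget call is left commented out so the sketch runs without network access; uncomment it once the URL pattern is confirmed.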


 
Posted : 28/06/2011 10:09 pm
Posts: 0
Free Member
 

Can't you contact the agency and see if they've got any back issues of the catalogue?


 
Posted : 29/06/2011 12:05 am
Posts: 531
Free Member
Topic starter
 

Retro83 – Thanks very much, I will check that out 🙂

CharlieMungus – LOL!


 
Posted : 29/06/2011 7:41 am
Posts: 5
Free Member
 

There was a program called HTTP Weazel I used in the past; it might be of use:
http://www.metaexception.com/en/products/HTW/


 
Posted : 29/06/2011 7:45 am
Posts: 531
Free Member
Topic starter
 

maxray – perfect, many thanks!


 
Posted : 29/06/2011 8:00 am