Problems backing up website

IPXP

New Member
Hi, I'm trying to back up a website but nothing will back up past the first page! The first page has a web address, but the other parts of the site open as pop-up pages with no web address at the top of the page.

I've tried SuperBot website downloader, SurfOffline, Website Extractor and Site Shelter. Does anyone have any idea what program I can use to back up all the content, including the pages that don't have web addresses?

Any help would be appreciated.

Cheerz
 
Well, you can FTP to the website and copy the entire directory, or use an FTP client like FlashFXP (or a download manager like FlashGet).
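
If you do have FTP access to the hosting account, the "copy the entire directory" idea can also be scripted. Here's a rough sketch in Python using only the standard-library ftplib; the hostname, login details and the /public_html path are placeholders you would swap for your own account's details, not anything from this thread.

```python
# Minimal FTP mirroring sketch, assuming you have FTP credentials for the site.
# Host, username, password and remote path below are placeholders.
import os
from ftplib import FTP, error_perm

def mirror(ftp, remote_dir, local_dir):
    """Recursively copy remote_dir from the FTP server into local_dir."""
    os.makedirs(local_dir, exist_ok=True)
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        if name in ('.', '..'):
            continue
        try:
            # If cwd succeeds, 'name' is a directory: recurse into it.
            ftp.cwd(name)
            ftp.cwd('..')
            mirror(ftp, remote_dir + '/' + name, os.path.join(local_dir, name))
            ftp.cwd(remote_dir)
        except error_perm:
            # Otherwise it's a file: download it as-is.
            with open(os.path.join(local_dir, name), 'wb') as f:
                ftp.retrbinary('RETR ' + name, f.write)

ftp = FTP('ftp.example.com')        # placeholder host
ftp.login('username', 'password')   # placeholder credentials
mirror(ftp, '/public_html', 'site_backup')
ftp.quit()
```

This only works if you actually control the site (or have its FTP login), which isn't the case for someone else's course server.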
 
Erm... it possibly depends on what kind of account you're on about. What's the address or URL of the server with the info you need?

dragon
 
Well the first one is "http://cisco.netacad.net/cnams/dispatch" and the second "http://cisco.gateshead.ac.uk/Student/cisco.htm"

And I know it can be backed up, as I have downloaded an older version of the course. I just don't know what was used or how it was done, and that's frustrating.
 
Yes.

Go to those websites, then click View, scroll down and click Source.

Now save that document as whateveryouwant.htm or whateveryouwant.html (make sure you've got the .htm or .html bit).

Also save the images on the page.
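
If there are a lot of pages, the same "save the source plus the images" routine can be scripted rather than done by hand. Here's a rough Python sketch using only the standard library; the page URL is just one of the addresses mentioned above, and the output filenames are examples, not anything the course requires.

```python
# Sketch of "save the page source and its images" for one page.
import os
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class ImageCollector(HTMLParser):
    """Collect the src attribute of every <img> tag on the page."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == 'img':
            for key, value in attrs:
                if key == 'src' and value:
                    self.images.append(value)

page_url = 'http://cisco.gateshead.ac.uk/Student/cisco.htm'  # example page from this thread
html = urlopen(page_url).read().decode('utf-8', errors='replace')

# Save the source as whateveryouwant.htm, as described above.
with open('whateveryouwant.htm', 'w', encoding='utf-8') as f:
    f.write(html)

# Save each image referenced by the page next to it.
parser = ImageCollector()
parser.feed(html)
os.makedirs('images', exist_ok=True)
for src in parser.images:
    img_url = urljoin(page_url, src)  # resolve relative paths against the page URL
    filename = os.path.join('images', os.path.basename(src.split('?')[0]))
    with open(filename, 'wb') as f:
        f.write(urlopen(img_url).read())
```

Bear in mind this only grabs whatever the page serves to an anonymous visitor; pages behind a login still need you to be signed in.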

Hope this helps


 