Lifehacker had a review of a free Windows tool for downloading a Web site for offline use. This is a great way to preserve a site whose information you want to keep intact: information that might disappear if a legal action is started, for example, or that you want to make sure doesn't change.
There are many offline Web site tools, but Fresh WebSuction has got to be one of the more memorably named. The Lifehacker post also mentions the venerable Wget, which runs on Windows and Linux. Other products you might look at include MaximumSoft's WebCopier, which has versions for Windows, Mac, and Linux. I like HTTrack as well, a free option for Windows and Linux users.
You put in the Web site address (URL) of the site you want to download, and the offline tool starts to crawl the site, downloading the files you have indicated you want. For example, you can exclude picture files, or focus on just one directory or folder of the site. The downloaded files are stored on your computer as individual files, in the same arrangement as they are found on the site. You can then open your Web browser and view the files on your computer. Links between the files are rewritten to be relative to each other, so when you open a file and click a link to another file that came from the Web site, you stay on your own computer. In this way, you can browse the entire site without actually going to the remote Web site.
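The link-rewriting step these tools perform can be sketched in a few lines of Python. This is only an illustration of the idea, not the code any of these products actually use; the function name and the folder layout are made up for the example:

```python
import os

def relativize(link_target: str, current_page: str, site_root: str) -> str:
    """Turn a site-absolute link (e.g. "/images/logo.png") into a path
    relative to the page that references it, so the saved copy can be
    browsed offline without contacting the remote server."""
    # Map the site-absolute link to its location in the local mirror folder.
    local_target = os.path.join(site_root, link_target.lstrip("/"))
    # Compute the path from the referencing page's folder to that target.
    return os.path.relpath(local_target, os.path.dirname(current_page))

# A page saved at mirror/docs/index.html that links to /images/logo.png
# now points one folder up and into images/ instead of back to the site.
print(relativize("/images/logo.png", "mirror/docs/index.html", "mirror"))
# -> ../images/logo.png
```

Because every rewritten link resolves to a file on your own disk, clicking through the saved pages never leaves your computer, which is exactly the offline-browsing behavior described above.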