Scraping websites with wget and httrack
Scrapes can be useful for taking static backups of websites or for cataloguing a site before a rebuild. If you do online courses, it can also be handy to have as much of the course material as possible stored locally. Another use is downloading HTML-only ebooks for offline reading. There are two ways I generally do this - one on the command line with wget and another through the GUI with httrack.
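As a rough sketch of the wget approach (the URL here is just a placeholder, not a real target), a typical mirroring run looks something like this:

    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent --wait=1 https://example.com/course/

The --mirror flag turns on recursion with timestamping, --convert-links and --adjust-extension make the saved pages browsable offline, --page-requisites pulls in images and stylesheets, --no-parent keeps the crawl from wandering above the starting directory, and --wait adds a polite pause between requests.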