Scraping websites with wget
There are many tools out there to download/scrape websites, e.g. curl, httrack, sitesucker, deepvacuum (which is actually a GUI wrapper for wget) and probably more. I find wget to be one of the most usable tools for grabbing an entire website. Make sure to use the option --convert-links,…
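As a rough sketch of what such an invocation can look like (the exact flag combination and the example.com URL are my own assumptions, not a prescription; only --convert-links is mentioned above):

    # mirror a site for offline browsing (hypothetical example URL)
    wget --mirror --convert-links --page-requisites --adjust-extension --no-parent https://example.com/

Here --mirror recurses through the site, --page-requisites pulls in images/CSS/JS needed to render pages, --adjust-extension saves files with sensible extensions, --no-parent keeps wget from wandering above the starting directory, and --convert-links rewrites links so the local copy works offline.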