windows - Downloading entire web pages within domain
2014-07
This question already has an answer here: How can I download an entire website?
Free Download Manager's HTML Spider feature might be what you're looking for. It even supports user login and password.
I was wondering if there is an argument that lets me use wget to "call" a page without downloading it. The problem is that when you call wget on a page, it downloads the page to the folder where wget.exe resides, but I don't want to download a file every time I use it.
Use the --spider option.
--spider
When invoked with this option, Wget will behave as a Web spider, which means that it will not download the pages, just check that they are there. You can use it to check your bookmarks, e.g. with:

wget --spider --force-html -i bookmarks.html
GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP. If you are not actually interested in retrieving files, you may be better off using a tool like cURL, which may have more fine-tuned support for what you want to do. (Using the --spider switch in Wget, for example, is an experimental hack according to the documentation.)