Batch download from HTTP directory listing

Want to download a number of files that are exposed through directory listings over HTTP? Doing it manually can be really time consuming, especially when subfolders are involved. Thankfully, Wget comes to the rescue. These are the steps to follow:

Download and extract GNU Wget:
http://www.gnu.org/software/wget/

Launch a command line prompt (i.e. run "cmd") and enter the following command, replacing "http://domain/folder/" with the URL you want to download from:

wget --wait=1 --recursive --no-parent --reject "index.html*" "http://domain/folder/"

...or the same command with the parameters in shorthand:

wget -w1 -r -np -R "index.html*" "http://domain/folder/"
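
If you only want certain file types from the listing, wget also has an -A (--accept) parameter that takes a comma-separated list of file extensions or patterns. A sketch of how that could look, where the "pdf,zip" list is just an example, swap in whatever extensions you're after:

wget -w1 -r -np -R "index.html*" -A "pdf,zip" "http://domain/folder/"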

Explanation of parameters

-w1, --wait=1 adds a 1 second wait between requests, out of courtesy to the server
-r, --recursive makes the request recursive (follows links)
-np, --no-parent never ascends to the parent directory, so only the given folder and its subfolders are fetched
-R "index.html*", --reject "index.html*" skips saving the generated directory listing pages, keeping just the files
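
One more thing worth knowing: by default wget mirrors the remote structure locally, so you end up with a "domain/folder/" directory tree on disk. If you'd rather have the files land straight in the current directory, the -nH (--no-host-directories) and --cut-dirs parameters flatten the layout. A sketch, assuming the example URL has exactly one path component to cut:

wget -w1 -r -np -nH --cut-dirs=1 -R "index.html*" "http://domain/folder/"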
