Occasionally you come across a page of content and you just don't feel like clicking on each image or video individually. The other day I came across a site of hot rods with over a thousand images. That's a lot of clicks! Who has time for that? Here's what you do.
wget to the rescue!
You may have used wget to download stuff from the internet. A simple file grab with wget would look like this:
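For instance, a basic grab might look like this (the URL here is a made-up placeholder; substitute any direct download link):

```shell
# Download a single file to the current directory.
# example.com and the file path are placeholders, not a real download.
wget http://www.example.com/files/photo.jpg
```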
If you've never used wget before to download a file, search the internet for a file to download, open up a terminal, type wget and paste the download link after it like in the example above.
I don't know why, but when I download something from the command line using wget it seems to go so much faster than downloading it in Firefox or another browser. So I often copy and paste download links from the internet and download them with wget from the command line. Alright, I digress! Assuming you understand the very basics of wget, here's how we would use it to grab images from a webpage.
Note: Use man wget to learn more about this command. This is just the basics, and something I've been experimenting with.
wget -r --level=2 -v -A jpeg,jpg --wait=2 http://www.targetDomain.com/webpage.htm
The above is all one line of code. So what do we have here and why does it work?
wget (command line utility used to download files)
-r (this is recursive, and will continue scanning directories to find the images or videos)
--level=2 (this limits the recursion to 2 levels of directories; the higher the number, the deeper wget will descend and the more it will download.)
(if you wanted to download an entire website of files you could omit --level=2. NOT RECOMMENDED, but you could if you want.)
-v (this is verbose, and will show you what's happening as it downloads; again, this is optional)
-A jpeg,jpg (this creates the accept list, in this example jpeg and jpg. You could just as easily change jpg to gif, flv, mp3, mp4, etc., or mix it up a bit and download jpg and mp4, which would grab both images and video. You can add as many file types as you want here, separated by commas. You get the point!)
--wait=2 (this is really important: it makes wget pause 2 seconds before each downloaded file. wget will download files so fast that you really want to add this to help decrease server load. If you were to download an entire site by leaving out --level, you should probably increase this number to around 5 to 10 seconds. You don't want to DDoS the server.)
http://www.targetDomain.com/webpage.htm (the web page or website you want to download. Again, I would refrain from downloading entire websites, as this can really strain the server you're downloading from.)
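Putting a few of those options together, a variant that scoops up both images and video might look like this (the target URL is the same placeholder as above; the longer wait is my suggestion since this matches more files):

```shell
# Recursively grab jpg/jpeg/gif images plus mp4 video, two directory
# levels deep, pausing 5 seconds between files to go easy on the server.
# The URL is a placeholder; point it at the page you actually want.
wget -r --level=2 -v -A jpg,jpeg,gif,mp4 --wait=5 http://www.targetDomain.com/webpage.htm
```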
So there you have it. This is the very basics, and you could really get more detailed in creating a Super Duper Image and Video Scooper command. As always, type man wget to learn all the options of this powerful tool/command! Once you create a really great command, create a script for it, and then you can just run the script and add the web address for instant downloading fun.
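Such a script could be as simple as the sketch below. The name scoop and its usage message are my own invention, not anything wget ships with:

```shell
#!/bin/sh
# scoop: hypothetical wrapper around the image-grabbing command above.
# Pass it a web address and it runs the recursive wget for you.
scoop() {
    if [ -z "$1" ]; then
        # No address given: print a hint and bail out.
        echo "usage: scoop <url>" >&2
        return 1
    fi
    wget -r --level=2 -v -A jpeg,jpg --wait=2 "$1"
}
```

Save it as, say, scoop.sh with a final line of scoop "$@", make it executable with chmod +x scoop.sh, and run ./scoop.sh http://www.targetDomain.com/webpage.htm whenever the mood strikes.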
For full details, see the wget man page.