I need to download all the images from a website to my computer. The images are not all listed on a single page. Instead, there is an index page that links to a set of pages (page1.html, page2.html, …), each with many images inside. Firefox's DownThemAll only downloads the images from a single page. There is no wget on my machine, but even with wget, I understand it would only download the images from one page. In a sense, I need the images at distance 2: linked from a page that is itself linked from the index.
Answer
You can install wget on OS X through Homebrew or MacPorts.*
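For example, assuming you already have one of those package managers set up, it is one of:

brew install wget        # Homebrew
sudo port install wget   # MacPorts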
Then, it’s as simple as:
wget -nd -A jpg,gif -r http://example.com
- The -nd option does not create a directory for each path segment.
- -A sets the file types (extensions) allowed.
- -r crawls recursively.
- You can add -l 2 (or similar) to restrict the crawl to two levels of recursion; see the combined example below.
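Putting those options together for the case in the question (a sketch; the URL and the extension list are placeholders you will want to adapt):

wget -r -l 2 -nd -A jpg,gif http://example.com/index.html

With -l 2, wget follows links at most two hops from the starting page: the linked pageN.html files are one hop away, and the images they embed are two, which matches the "images at distance 2" requirement.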
* If this is too complicated for you, there is alternatively a 2008 blog post with a prebuilt version, but it might not work in all cases.
Attribution
Source: Link, Question Author: Pietro Speroni, Answer Author: slhck