Tool which can download the contents of a website recursively [duplicate]

QUESTION :

I need a utility capable of downloading the contents of a website recursively. For example, I have a website URL whose page contains 10 hyperlinks; using the utility, I should be able to download the contents of those 10 linked pages to my local system.

Please let me know if you are aware of any such utility.

ANSWER :

I would suggest looking at wget.

Reference: http://www.linuxjournal.com/content/downloading-entire-web-site-wget
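A minimal sketch of such an invocation is below. The flags are standard wget options; `http://example.com/` is a placeholder URL and `site` an arbitrary output directory, so substitute your own values. This fetches pages over the network, so run it against a site you are permitted to mirror.

```shell
# Mirror one level deep: grab the start page plus every page it links to.
#   -r  (--recursive)       follow hyperlinks
#   -l 1                    recurse only one level deep (the page's direct links)
#   -p  (--page-requisites) also fetch images/CSS needed to render each page
#   -k  (--convert-links)   rewrite links so the local copy works offline
#   -np (--no-parent)       never ascend above the starting directory
#   -P site                 save everything under ./site/
# http://example.com/ is a placeholder; replace it with the real site URL.
wget -r -l 1 -p -k -np -P site http://example.com/
```

Dropping `-l 1` (or raising it) makes wget follow links deeper than the first level, which can download far more than the 10 pages you intend, so keep the depth limit unless you want a full mirror.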
