Some 'wget' tricks and examples...

  • Download all PDF files linked from an index page
    wget -r -l1 -nH --no-parent --cut-dirs=number -A ".pdf" URL
  • Copy a complete website
    wget --mirror -p --html-extension -nH --convert-links --cut-dirs=number -P outputdir URL

    --mirror: Mirrors the site. Wget will recursively follow all links on the site and download all necessary files. It also downloads only files that have changed since the last mirror, which is handy in that it saves download time.

    -w: Tells wget to 'wait' or pause between requests for the given number of seconds, e.g. -w 2. This is not necessary, but is the considerate thing to do: it reduces the frequency of requests to the server, thus keeping the load down. If you are in a hurry to get the mirror done, you may eliminate this option.
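As a sketch, a polite mirror invocation with a 2-second wait might look like the following. URL and outputdir are placeholder values, so the command is composed and printed here rather than executed:

```shell
# Sketch of a polite mirror command; URL and outputdir are
# placeholders, so we build and print the command instead of
# running it against a real server.
URL="http://example.com/"        # placeholder: site to mirror
outputdir="mirror-copy"          # placeholder: local output folder
cmd="wget --mirror -p --html-extension --convert-links -nH -w 2 -P $outputdir $URL"
printf '%s\n' "$cmd"
```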

    -p: Causes wget to get all required elements for the page to load correctly. Apparently, the mirror option does not always guarantee that all images and peripheral files will be downloaded, so I add this for good measure.

    --html-extension: Downloaded HTML pages whose filenames lack an HTML extension get .html appended. This gives CGI-, ASP- or PHP-generated pages an HTML extension for consistency, so they open correctly in a local browser.
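The renaming rule can be sketched in shell. Here adjust_extension is a hypothetical helper that imitates the behavior; it is not part of wget:

```shell
# Hypothetical helper mimicking --html-extension: append .html to
# a saved filename unless it already ends in .html or .htm.
adjust_extension() {
  case $1 in
    *.html|*.htm) printf '%s\n' "$1" ;;
    *)            printf '%s\n' "$1.html" ;;
  esac
}

adjust_extension index.php    # prints index.php.html
adjust_extension index.html   # prints index.html
```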

    --convert-links: All links are converted so they will work when you browse locally. Otherwise, relative (or absolute) links would not necessarily load the right pages, and style sheets could break as well.

    -P (prefix folder): The resulting tree will be placed in this folder. This is handy for keeping different copies of the same site, or keeping a "browsable" copy separate from a mirrored copy.

    -nH: Disables generation of host-prefixed directories. By default, invoking
    Wget with -r http://fly.srk.fer.hr/ will create a structure of directories
    beginning with fly.srk.fer.hr/. This option disables such behavior.

    --cut-dirs=number: Ignores number directory components. This is useful for
    getting fine-grained control over the directory where the recursive
    retrieval will be saved.

    Take, for example, the directory at ftp://ftp.xemacs.org/pub/xemacs/. If you
    retrieve it with -r, it will be saved locally under ftp.xemacs.org/pub/xemacs/.
    While the -nH option can remove the ftp.xemacs.org/ part, you are still stuck
    with pub/xemacs. This is where --cut-dirs comes in handy; it makes Wget ignore
    number remote directory components. Here are several examples of how the
    --cut-dirs option works.

    No options -> ftp.xemacs.org/pub/xemacs/
    -nH -> pub/xemacs/
    -nH --cut-dirs=1 -> xemacs/
    -nH --cut-dirs=2 -> .

    --cut-dirs=1 -> ftp.xemacs.org/xemacs/

    If you just want to get rid of the directory structure, this option is similar
    to a combination of -nd and -P. However, unlike -nd, --cut-dirs does not lose
    subdirectories: for instance, with -nH --cut-dirs=1, a beta/ subdirectory
    will be placed in xemacs/beta, as one would expect.
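    The path-stripping behavior above can be imitated in pure shell. The cut_dirs function below is a hypothetical illustration of the rule, not a wget feature:

```shell
# Hypothetical illustration of what --cut-dirs=N does to a remote
# path: strip up to N leading directory components.
cut_dirs() {
  n=$1; path=$2; i=0
  # Remove one leading component per iteration, stopping early if
  # no slash remains.
  while [ "$i" -lt "$n" ] && [ "${path#*/}" != "$path" ]; do
    path=${path#*/}
    i=$((i + 1))
  done
  printf '%s\n' "$path"
}

cut_dirs 1 pub/xemacs/file.el   # prints xemacs/file.el
cut_dirs 2 pub/xemacs/file.el   # prints file.el
```

    Note that subdirectories below the cut point survive intact, matching the -nH --cut-dirs=1 example above.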