The wget command is a widely used command-line utility for downloading files from the web, and it is included by default on most Linux distributions. It supports downloading over the HTTP, HTTPS, and FTP protocols, and it is a great tool for automating file downloads.
The most common use of the command is simply to download a specific file. Here is the basic syntax and an example.
// Syntax
wget [OPTION]... [URL]...
// Example
wget https://example.com
However, downloading a single file is not all it can do; below are some other wget examples that I find useful.
These commands are useful if you want to download an entire site.
wget --recursive https://example.com
This will retrieve files recursively. The default maximum depth is 5, but that can be changed with the --level flag:
wget --recursive --level 2 https://example.com
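When recursing over someone else's site, it can be considerate to pause between requests and to stay below the starting directory. wget supports this with the --wait and --no-parent flags; the combination below is just one reasonable setup:
// Recurse two levels deep, wait 1 second between requests,
// and never ascend above the starting URL's directory
wget --recursive --level 2 --wait 1 --no-parent https://example.com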
There is also a flag for retrieving a mirror of a website, --mirror, which creates a local copy of the site including its JavaScript and CSS files.
wget --mirror https://example.com
// It is currently equivalent to
wget -r -N -l inf --no-remove-listing https://example.com
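If the goal is a copy you can browse offline, --mirror is often combined with --convert-links and --page-requisites, so links point at the local files and images and stylesheets come along too. A common combination looks like this:
// Mirror the site for local, offline browsing:
// --convert-links rewrites links to point at the downloaded files,
// --page-requisites fetches images, CSS, and other assets each page needs
wget --mirror --convert-links --page-requisites https://example.com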
To download multiple files, possibly from different URLs, you can use a text file as input to the wget command. For example, input-file.txt:
https://example.com
https://google.com
And then run the command:
wget -i input-file.txt
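If you want all the downloads gathered in one place, this combines well with the --directory-prefix flag (short form -P), which sets the directory to save into. The downloads/ directory name here is just an example:
// Save every file from the list into a downloads/ directory
wget -i input-file.txt -P downloads/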
It is possible to run the command in the background by adding the --background flag:
wget --background https://example.com
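When backgrounded, wget writes its progress to a file named wget-log in the current directory (unless you redirect it elsewhere with -o), so you can keep an eye on the download like this:
// Follow the progress of a backgrounded download
tail -f wget-log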
When requesting data from a server, wget normally identifies itself as wget/version, where version is the current version number of wget. However, it is possible to change that with the --user-agent=agent-string flag:
wget --user-agent="My agent name is wget" https://example.com
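One practical use: some servers refuse requests from the default wget identifier, and sending a browser-like string can get around that. The exact string below is only an illustration:
// Identify as a generic browser instead of wget
wget --user-agent="Mozilla/5.0 (X11; Linux x86_64)" https://example.com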
As you can see, there are plenty of flags to explore with wget. Here is one more neat thing you can do: use the --max-redirect flag to figure out where shortened URLs point.
// A bit.ly address that points to example.com
wget --max-redirect 0 https://bit.ly/2yDyS4T
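The output will mention the address the shortened URL redirects to. To see the full response headers, including the Location header that carries the redirect target, you can add the --server-response flag (short form -S):
// Print the server's headers; the Location header shows the redirect target
wget --max-redirect 0 --server-response https://bit.ly/2yDyS4T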
To learn more, check the man page: man wget.
Happy downloading!