Wget is a powerful, non-interactive command-line utility used to download files from the web in Linux and other Unix-based systems. It supports downloading via the HTTP, HTTPS, and FTP protocols, making it a versatile tool for transferring files. Whether you’re downloading a single file, entire directories, or even whole websites, wget has plenty of useful features.
Below are some practical examples of wget commands for downloading files in Linux.
Prerequisites
- Linux-based operating system: Ensure that you’re using a Linux distribution (such as Ubuntu, CentOS, or Debian).
- Wget installed: Wget is often pre-installed on many Linux systems. To verify or install it, run:
sudo apt install wget # For Debian/Ubuntu systems
sudo yum install wget # For CentOS/RHEL systems
sudo dnf install wget # For Fedora systems
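- To confirm that the installation worked, you can check the installed version:
wget --version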
Wget Command Examples
In all the examples below, remember to replace the URLs with the actual URLs of the files you want to download.
Also, the file being downloaded should have permissions on the server (such as 644) that allow it to be read and downloaded.
1. Download a Single File to the Current Directory
- You can run the command below to download one file to the current working directory.
wget https://truehost.com/download.zip
2. Download a File and Save it at a Specific Location
- Use the -P option if you want to save the file to a location other than the current working directory.
wget -P /path/to/directory https://truehost.com/yourfile.zip
3. Download a File and Rename It
- To download a file and save it with a different name, use the -O option (capital “O”) placed before the URL.
- The command below saves file.txt as new_name.txt.
wget -O new_name.txt http://truehost.com/file.txt
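- As a side note, passing - as the output name writes the download to standard output instead of a file, which is handy for piping; the URL here is just a placeholder:
wget -q -O - http://truehost.com/file.txt | head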
4. Resume Interrupted Downloads
- If a download is interrupted, you can resume it by using the -c option.
- This command resumes the download from where it left off, rather than starting from scratch.
wget -c http://truehost.com/yourfile.zip
5. Download Multiple Files
- You can download multiple files by specifying URLs in a text file and using the -i option.
- In the command below, file_list.txt contains the URLs of the files to be downloaded, with each URL on a new line (a sample file is shown after the command).
wget -i file_list.txt
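- For illustration, file_list.txt might look like this (placeholder URLs):
http://truehost.com/file1.zip
http://truehost.com/file2.zip
http://truehost.com/file3.zip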
6. Download a Website (Mirror)
- You can download an entire website recursively (mirroring) using the -r option.
- This command downloads all files from the website and preserves the directory structure.
wget -r http://example.com/
- To ensure that all files are saved for offline viewing (with relative links), you can use:
wget -r -k http://example.com/
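- For a more complete offline copy, you can also fetch page requisites (images, CSS) and adjust file extensions; one possible combination, assuming you want the mirror stored under ./example-mirror, is:
wget --mirror -p -k -E -P ./example-mirror http://example.com/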
7. Limit Download Speed
- If you have bandwidth limitations or want to control the download speed, use the --limit-rate option.
- This limits the download speed to 200KB/s.
wget --limit-rate=200k http://truehost.com/yourFile.zip
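- The rate value accepts k and m suffixes, so a 1 MB/s cap would look like this:
wget --limit-rate=1m http://truehost.com/yourFile.zip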
8. Download Files in Background
- To download a file in the background, use the -b option:
wget -b http://truehost.com/yourFile.zip
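- When running in the background, wget writes its progress to wget-log in the current directory by default, so you can follow the download with:
tail -f wget-log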
9. Download Files with Authentication
- To download a file from a password-protected URL, use the --user and --password options.
- Replace username and password with the actual credentials.
wget --user=username --password=password http://example.com/yourFile.zip
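- To keep the password out of your shell history, you can instead have wget prompt for it interactively:
wget --user=username --ask-password http://example.com/yourFile.zip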
10. Download Files with Cookies
- Some websites require cookies for downloads. You can pass cookies to wget using the --load-cookies option.
- First, save the cookies to cookies.txt and then use it with this command.
wget --load-cookies cookies.txt http://truehost.com/file.zip
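- If you need to create cookies.txt yourself, wget can save cookies returned by a login request; the login URL and form fields below are placeholders that depend on the site:
wget --save-cookies cookies.txt --keep-session-cookies --post-data='user=username&pass=password' http://truehost.com/login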
11. Download Using FTP
- You can download files from an FTP server using wget:
wget ftp://ftp.truehost.com/file.zip
- If the FTP server requires authentication:
wget --ftp-user=username --ftp-password=password ftp://ftp.truehost.com/file.zip
12. Set a Number of Retries for Failed Downloads
- By default, wget retries failed downloads a limited number of times. To set a custom retry limit, use the --tries option.
- This command will retry up to 10 times if the download fails.
wget --tries=10 http://truehost.com/file.zip
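- You can also tell wget to retry even when the connection is refused and to wait (up to a chosen number of seconds) between attempts:
wget --tries=10 --retry-connrefused --waitretry=30 http://truehost.com/file.zip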
13. Check File Size Before Downloading
- If you want to check the file size before downloading it, use the --spider option:
wget --spider http://example.com/file.zip
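- To see the full response headers, including the Content-Length reported by the server, you can combine --spider with -S (--server-response):
wget --spider -S http://example.com/file.zip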
14. Download Over an Untrusted SSL Connection
- If the SSL certificate of a website is invalid or untrusted, you can bypass the certificate check with the --no-check-certificate option.
- Note: Use this option cautiously, as it can expose you to security risks.
wget --no-check-certificate https://example.com/file.zip
These examples are just a few of the many possibilities with wget. For more details on its usage, you can always refer to the manual by typing:
man wget