The wget command is a widely used non-interactive command-line utility for downloading files from the internet. It supports various protocols, including HTTP, HTTPS, and FTP, and is designed to work in the background without requiring user interaction. This makes it particularly useful for downloading large files, retrieving content from unreliable connections, mirroring websites, and scheduling downloads via scripts or cron jobs. This guide consolidates all key concepts, syntax, and examples to help you master wget.
The basic syntax is:

```
wget [options] [URL]
```

- `options`: command-line switches that control wget's behavior.
- `URL`: the web address of the file or website you want to download.
```
wget https://example.com/file.zip
```

Downloads `file.zip` to the current directory.
```
wget -O newname.zip https://example.com/file.zip
```

Saves the downloaded file as `newname.zip`.
```
wget -c https://example.com/file.zip
```

Continues downloading a partially downloaded file.
```
wget -b https://example.com/largefile.iso
```

Runs the download in the background and logs output to `wget-log`.
Check progress with:

```
tail -f wget-log
```

```
wget --limit-rate=200k https://example.com/file.zip
```

Limits the download speed to 200 KB/s.
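The background-download workflow above can be sketched without touching the network: the loop below stands in for a real `wget -b` transfer and writes to the same `wget-log` file that `tail` then inspects.

```shell
# Simulated background "download" writing to wget-log (stands in for wget -b;
# a real transfer would keep appending while tail -f follows it)
( for i in 1 2 3; do echo "chunk $i saved"; done ) > wget-log 2>&1 &
wait                 # wait for the background job to finish
tail -n 1 wget-log   # with a live download you would use: tail -f wget-log
```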
```
wget -P /home/user/Downloads https://example.com/file.zip
```

Saves the file in the specified directory.
Create a file `urls.txt`:

```
https://example.com/file1.zip
https://example.com/file2.zip
```

Download all of them:

```
wget -i urls.txt
```

```
wget -r https://example.com/
```

Downloads the entire site recursively.
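The URL-list step above is easy to script. This sketch generates `urls.txt` for a run of numbered files; `example.com` is a placeholder host, so the actual `wget -i` call is left commented out.

```shell
# Build urls.txt for file1.zip .. file5.zip (hypothetical host)
for i in $(seq 1 5); do
  echo "https://example.com/file${i}.zip"
done > urls.txt

# wget -i urls.txt   # commented out: example.com is a placeholder
cat urls.txt
```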
Limit the recursion depth:

```
wget -r -l 2 https://example.com/
```

```
wget -r -p --convert-links https://example.com/
```

Downloads the website, including page assets, and converts links for local viewing.
```
wget --mirror -p --convert-links -P ./local-dir https://example.com/
```

Creates a full local mirror of a website with all necessary files and links.
```
wget -r -A pdf https://example.com/
```

Recursively downloads all `.pdf` files from the website.
```
wget --user-agent="Mozilla/5.0" https://example.com/file.zip
```

Spoofs a browser user-agent to avoid download blocking.
With credentials:

```
wget --user=username --password=password https://example.com/protected.zip
```

Prompt for the password instead:

```
wget --user=username --ask-password https://example.com/protected.zip
```

```
wget --spider https://example.com/
```

Validates the URL or checks site availability without downloading files.
```
wget --tries=10 https://example.com/file.zip
```

Sets the number of retries (the default is 20).
```
wget -w 10 https://example.com/file.zip
```

Waits 10 seconds between each download to reduce server load.
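When you need per-URL logic (logging, conditional skips), `-w` has a simple scripted equivalent. A minimal sketch with placeholder URLs, so the fetch itself is commented out:

```shell
# Loop over URLs with a pause between fetches (scripted equivalent of -w)
for u in https://example.com/a.zip https://example.com/b.zip; do
  echo "would fetch: $u"
  # wget -q "$u" && sleep 10   # real use: fetch, then wait 10 seconds
done
```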
```
wget https://example.com/file.txt -o download-log.txt
```

Writes wget's status messages to `download-log.txt` instead of the terminal.
Append instead of overwriting:

```
wget -a download-log.txt https://example.com/file.txt
```

```
wget -V
```

Displays the installed version (note the capital `-V`; lowercase `-v` enables verbose output).

```
wget -h
```

Displays the help message and available options.
| Option | Description |
|---|---|
| `-V` | Display version of wget |
| `-h` | Show help message |
| `-o logfile` | Write log messages to the specified logfile |
| `-a logfile` | Append log messages to logfile |
| `-b` | Run in background |
| `-i file` | Read URLs from a file |
| `-c` | Resume a partially downloaded file |
| `-r` | Enable recursive download |
| `-l depth` | Set recursion depth |
| `-p` | Download all required page assets |
| `--convert-links` | Make links suitable for offline viewing |
| `--mirror` | Enable mirror mode |
| `--user-agent` | Set a custom user-agent string |
| `--user, --password` | Use HTTP/FTP authentication |
| `--limit-rate` | Limit download speed |
| `--spider` | Check if the URL exists without downloading |
| `--tries=number` | Set the number of retry attempts |
| `-w time` | Wait time between downloads |
| `-P dir` | Save files to a specified directory |
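The flags above combine freely. A hedged sketch of a resumable, rate-limited, retrying, logged download into a dedicated directory; the URL is a placeholder, so the command itself is commented out:

```shell
# Prepare a download directory, then combine -c, --limit-rate, --tries, -w,
# -a and -P in one invocation (placeholder URL, left commented out)
mkdir -p downloads
# wget -c --limit-rate=500k --tries=5 -w 2 -a download-log.txt \
#      -P downloads https://example.com/largefile.iso
echo "download dir ready: downloads"
```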
The wget command is a versatile and robust tool for downloading files and entire websites directly from the terminal. Its non-interactive nature allows background operations, script integration, and support for slow or unstable connections. With numerous options for customization—such as limiting speed, resuming downloads, mirroring, and handling authentication—wget is essential for efficient web content retrieval in Linux and Unix-based systems.
cURL is a command-line tool for transferring data to or from a server, supporting various protocols including HTTP, HTTPS, FTP, SCP, SFTP, and more. It is lightweight, scriptable, and widely used for tasks like downloading files, testing APIs, and debugging web applications. cURL is pre-installed on most Linux distributions (e.g., Ubuntu, Debian, CentOS) and is accessible directly from the terminal.
- Versatility: Supports numerous protocols (e.g., HTTP, HTTPS, FTP, SMTP, DICT).
- Scriptable: Ideal for automation and cron jobs.
- Lightweight: fast and efficient, with negligible startup time.
- Common Uses:
- Test API functionality.
- Display HTTP headers.
- Send HTTP requests (GET, POST, PUT, DELETE).
- Transfer data to/from servers.
```
curl [options] [URL]
```

- `[options]`: command-line flags that modify behavior (e.g., `-o`, `-X`, `-u`).
- `[URL]`: the target location for the data transfer.
- DICT, FILE, FTP, FTPS, GOPHER, GOPHERS, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, MQTT, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET, TFTP, WS, WSS.
Retrieve and display content from a URL:

```
curl https://example.com
curl https://www.geeksforgeeks.org
```

- Multiple URLs:

```
curl http://site.{one,two,three}.com
```

- Numeric sequence:

```
curl ftp://ftp.example.com/file[1-20].jpeg
```

The progress meter shows the transfer rate, data transferred, and time remaining.

- Default meter:

```
curl -O ftp://ftp.example.com/file.zip
```

- Progress bar:

```
curl -# -O ftp://ftp.example.com/file.zip
```

- Silent mode (no meter):

```
curl --silent ftp://ftp.example.com/file.zip
```

Send custom HTTP requests with various methods.
- GET request:

```
curl -X GET https://api.sampleapis.com/coffee/hot
```

- POST request:

```
curl -X POST -d "key1=value1&key2=value2" https://api.sampleapis.com/coffee/hot
```

Download files from a URL.
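The GET examples above can be exercised without internet access by pointing curl at a throwaway local server. This sketch assumes `python3` is available and that port 8031 is free (both arbitrary choices):

```shell
# Serve the current directory locally, fetch a file with curl, then clean up
printf 'hello from server\n' > index.html
python3 -m http.server 8031 --bind 127.0.0.1 >/dev/null 2>&1 &
srv=$!
sleep 1                                   # give the server a moment to start
curl -s -o fetched.txt http://127.0.0.1:8031/index.html
kill "$srv" 2>/dev/null
cat fetched.txt
```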
- Save with a custom name (`-o`):

```
curl -o hello.zip ftp://speedtest.tele2.net/1MB.zip
```

- Save with the original name (`-O`):

```
curl -O ftp://speedtest.tele2.net/1MB.zip
```

Upload files to a server (e.g., via FTP).
```
curl -T uploadfile.txt ftp://example.com/upload/
```

Access protected resources using credentials.

```
curl -u username:password https://example.com/api
```

Resume a partially downloaded file.

```
curl -C - -O ftp://speedtest.tele2.net/1MB.zip
```

Restrict the transfer rate (in bytes per second; `K`, `M`, and `G` suffixes are accepted).

```
curl --limit-rate 1000K -O ftp://speedtest.tele2.net/1MB.zip
```

Download files from a user-authenticated FTP server.

```
curl -u demo:password -O ftp://test.rebex.net/readme.txt
```

Upload a file to an FTP server.

```
curl -u username:password -T filename ftp://example.com/
```

- Append to an existing file:

```
curl -u username:password -T filename --append ftp://example.com/
```

Output C code that uses libcurl to reproduce the specified command.

```
curl https://www.geeksforgeeks.org > log.html --libcurl code.c
```

Send emails using SMTP.

```
curl --url [SMTP_URL] --mail-from [sender_mail] --mail-rcpt [receiver_mail] -n --ssl-reqd -u {email}:{password} -T [Mail_text_file]
```

Retrieve word definitions via the DICT protocol.

```
curl dict://dict.org/d:overclock
```

- Show help:
```
curl --help
curl --help all    # full list of options
```

- Verbose output (`-v` or `--verbose`):

```
curl -v https://example.com
```

- Version info (`-V` or `--version`):

```
curl -V
```

- Follow redirects (`-L` or `--location`):

```
curl -L https://example.com
```

- Show HTTP headers only (`-I` or `--head`):

```
curl -I https://example.com
```

- Show headers and content (`-i` or `--include`):

```
curl -i https://example.com
```

- Set the user agent (`-A` or `--user-agent`):

```
curl -A "Mozilla/5.0" https://example.com
```

Check if cURL is installed by running it with no arguments:

```
curl
```

- Expected output:

```
curl: try 'curl --help' for more information
```
- If not installed, refer to the cURL documentation for installation instructions specific to your OS.
- Case sensitivity: options are case-sensitive (e.g., `-v` ≠ `-V`).
- Default HTTP method: GET is used unless another method is specified (e.g., `-d` implies POST).
- Error handling: for password-protected pages requested without credentials, expect a 401 error.
- Redirects: use `-L` to follow redirects automatically.
- FTP usage: requires valid credentials and correct file paths.
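One note worth adding to the list above: `-u user:password` exposes the password to other local users via `ps`. curl can read credentials from a netrc-style file instead (`--netrc-file`). The host and credentials below are placeholders, so the request itself is commented out.

```shell
# Store credentials in a netrc file (hypothetical host and password)
cat > my.netrc <<'EOF'
machine example.com
login username
password secret
EOF
chmod 600 my.netrc                 # keep the file private
# curl --netrc-file my.netrc https://example.com/api   # placeholder host
echo "netrc entries: $(grep -c '^machine' my.netrc)"
```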
Developed by Daniel Stenberg, cURL is a powerful and flexible tool for Linux users, enabling efficient data transfer across multiple protocols. Its extensive options and scriptability make it indispensable for automation, debugging, and server communication. For further details, consult the cURL manual or project website.