# wget Command Guide

wget is a powerful non-interactive downloader for files from the web. Learn how to download files, mirror websites, and handle complex downloads.

6 min read · Last updated: 2024

Dai Aoki
CEO at init, Inc. / CTO at US & JP startups / Creator of WebTerm
## Quick Reference

**Basic**

| wget URL | Download file |
| wget -O name URL | Save as name |
| wget -P dir URL | Save to directory |

**Options**

| -c | Continue download |
| -b | Background download |
| -q | Quiet mode |

**Recursive**

| -r | Recursive download |
| -m | Mirror website |
| -l 2 | Depth limit 2 |

**Advanced**

| --limit-rate=1M | Limit speed |
| -i file.txt | URLs from file |
| -A "*.pdf" | Accept pattern |
## Basic Usage

wget downloads files from the web. By default, it saves files to the current directory with their original names.

```bash
wget https://example.com/file.zip
```

## Common Options
### Frequently Used Options
| -O file | Save with specific filename |
| -P dir | Save to specific directory |
| -c | Continue interrupted download |
| -b | Run in background |
| -q | Quiet mode (no output) |
| -r | Recursive download |
| -np | No parent directories |
| -N | Only download newer files |
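A quick way to see some of these flags working together, without touching a real site, is to fetch from a throwaway local server. This is only a sketch: the port 8123 and the /tmp paths are arbitrary, and it assumes python3 and wget are installed.

```bash
#!/bin/sh
# Serve a scratch directory locally, then fetch from it with wget.
mkdir -p /tmp/wget-demo/out
cd /tmp/wget-demo
echo "hello from the server" > file.txt

python3 -m http.server 8123 >/dev/null 2>&1 &   # throwaway server
SERVER_PID=$!
sleep 1                                          # give it a moment to bind

# -q: quiet, -P: save into out/ instead of the current directory
wget -q -P out http://127.0.0.1:8123/file.txt

kill "$SERVER_PID"
cat out/file.txt
```

The same pattern works for trying out -c, -N, or recursive flags safely before pointing wget at a server you don't own.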
## Downloading Files

### Save with a different filename

```bash
wget -O newname.zip https://example.com/file.zip
```

### Save to a specific directory

```bash
wget -P /path/to/downloads/ https://example.com/file.zip
```

### Resume an interrupted download

```bash
wget -c https://example.com/largefile.iso
```

### Download in the background

```bash
wget -b https://example.com/largefile.zip
# Check progress with: tail -f wget-log
```

### Download multiple files

```bash
# From the command line
wget URL1 URL2 URL3

# From a file containing URLs
wget -i urls.txt
```

**Tip:** Use -c with large downloads so you can resume if the connection is interrupted.

## Recursive Downloads
### Download an entire website

```bash
wget -r https://example.com/docs/
```

### Mirror a website

```bash
wget --mirror -p --convert-links -P ./local-site https://example.com
```

### Limit recursion depth

```bash
wget -r -l 2 https://example.com/
# -l 2 means go 2 levels deep
```

### Stay within the starting directory

wget already stays on the same host by default (unless -H/--span-hosts is given); -np additionally keeps it from climbing above the directory you started in.

```bash
wget -r -np https://example.com/docs/
# -np prevents ascending to parent directories
```

**Warning:** Be respectful when downloading websites. Use rate limiting and don't overwhelm servers with requests.
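The politeness knobs covered in this guide (rate limit, fixed wait, random wait) can be collected into one small wrapper. This is an illustrative sketch, not part of wget itself; the script name and defaults are made up, and it prints the command instead of fetching so you can inspect it first.

```bash
#!/bin/sh
# polite-get: wrap wget with rate limiting and delays so recursive
# downloads don't hammer the server. Defaults here are illustrative.
RATE="${RATE:-200k}"   # bandwidth cap per download
WAIT="${WAIT:-2}"      # base delay between requests, in seconds

polite_get() {
    # Dry run: print the wget invocation instead of executing it.
    # --random-wait varies the delay between 0.5x and 1.5x of --wait.
    echo wget -r -np \
        --limit-rate="$RATE" \
        --wait="$WAIT" \
        --random-wait \
        "$1"
}

polite_get https://example.com/docs/
# → wget -r -np --limit-rate=200k --wait=2 --random-wait https://example.com/docs/
```

Dropping the `echo` turns the dry run into a real download.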
## Filtering Downloads

```bash
# Download only specific file types
wget -r -A "*.pdf,*.doc" https://example.com/

# Exclude certain file types
wget -r -R "*.jpg,*.gif,*.png" https://example.com/

# Download only from specific directories
wget -r --include-directories=/docs/,/manual/ https://example.com/
```

## Authentication
```bash
# HTTP authentication (note: the password is visible in the process list)
wget --user=username --password=password https://example.com/protected/

# Using a .netrc file (wget reads ~/.netrc; keep it chmod 600)
# Typical ~/.netrc entry:
#   machine example.com
#   login username
#   password secret
wget --netrc https://example.com/protected/

# FTP login
wget --ftp-user=user --ftp-password=pass ftp://ftp.example.com/file.zip
```

## Speed and Rate Limiting
```bash
# Limit download speed
wget --limit-rate=200k https://example.com/file.zip

# Wait between downloads (for recursive)
wget -r -w 2 https://example.com/
# -w 2 waits 2 seconds between requests

# Random wait (to be polite)
wget -r -w 2 --random-wait https://example.com/
```

## Practical Examples
### Download all PDFs from a page

```bash
wget -r -l 1 -A pdf https://example.com/documents/
```

### Mirror a website for offline viewing

```bash
wget --mirror \
     --convert-links \
     --adjust-extension \
     --page-requisites \
     --no-parent \
     https://example.com/docs/
```

### Download with a custom user agent

```bash
wget --user-agent="Mozilla/5.0" https://example.com/
```

### Retry failed downloads

```bash
wget --tries=10 --retry-connrefused https://example.com/file.zip
```

### Download only if newer

```bash
wget -N https://example.com/file.zip
```

### Use a proxy

```bash
wget -e http_proxy=http://proxy:8080 https://example.com/
```

## wget vs curl
Both tools can download files, but they have different strengths:
Comparison
| wget | Better for recursive downloads, mirroring, background downloads |
| curl | Better for API calls, supports more protocols, more flexible |
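To make the overlap concrete, here is a small dry-run helper that prints roughly equivalent command lines for the two tools. The fetch_with function is hypothetical, written only for illustration.

```bash
#!/bin/sh
# Print roughly equivalent download commands for wget and curl.
# curl's "-C -" means "resume from wherever the partial file left off";
# "-L" makes curl follow redirects, which wget does by default.
fetch_with() {
    tool="$1"; url="$2"; out="$3"
    case "$tool" in
        wget) echo "wget -c -O $out $url" ;;
        curl) echo "curl -C - -L -o $out $url" ;;
    esac
}

fetch_with wget https://example.com/file.zip file.zip
# → wget -c -O file.zip https://example.com/file.zip
fetch_with curl https://example.com/file.zip file.zip
# → curl -C - -L -o file.zip https://example.com/file.zip
```

For plain one-off downloads either tool works; wget pulls ahead once recursion or mirroring enters the picture.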
## Summary

wget is perfect for downloading files and mirroring websites. Key takeaways:

- Use -O to specify the output filename
- Use -c to resume interrupted downloads
- Use -r for recursive downloads
- Use --mirror for website mirroring
- Use -A/-R to filter file types
- Use --limit-rate to be polite to servers