
wget Command Guide

wget is a powerful non-interactive tool for downloading files from the web. Learn how to download files, mirror websites, and handle complex downloads.

6 min read · Last updated: 2024
Dai Aoki

CEO at init, Inc. / CTO at US & JP startups / Creator of WebTerm

Quick Reference

Basic

wget URL – Download file
wget -O name URL – Save as name
wget -P dir URL – Save to directory

Options

-c – Continue download
-b – Background download
-q – Quiet mode

Recursive

-r – Recursive download
-m – Mirror website
-l 2 – Depth limit 2

Advanced

--limit-rate=1M – Limit speed
-i file.txt – URLs from file
-A "*.pdf" – Accept pattern


Basic Usage

wget downloads files from the web. By default, it saves files to the current directory with their original names.

bash
wget https://example.com/file.zip

Common Options

Frequently Used Options

-O file – Save with specific filename
-P dir – Save to specific directory
-c – Continue interrupted download
-b – Run in background
-q – Quiet mode (no output)
-r – Recursive download
-np – No parent directories
-N – Only download newer files

Downloading Files

Save with different filename

bash
wget -O newname.zip https://example.com/file.zip

Save to specific directory

bash
wget -P /path/to/downloads/ https://example.com/file.zip

Resume interrupted download

bash
wget -c https://example.com/largefile.iso

Download in background

bash
wget -b https://example.com/largefile.zip
# Check progress with: tail -f wget-log

Download multiple files

bash
# From command line
wget URL1 URL2 URL3

# From file containing URLs
wget -i urls.txt
Tip
Use -c with large downloads so you can resume if the connection is interrupted.
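The multiple-URL workflow above can be sketched end to end: build a urls.txt (the URLs below are hypothetical placeholders) and hand it to wget -i. Combining -i with -c lets an interrupted batch be re-run safely.

```bash
# Write a URL list, one per line (placeholder URLs)
printf '%s\n' \
  'https://example.com/a.zip' \
  'https://example.com/b.zip' > urls.txt

cat urls.txt

# Fetch everything in the list, resuming partial files on re-run
# (commented out here so the sketch runs without network access):
# wget -c -i urls.txt
```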

Recursive Downloads

Download entire website

bash
wget -r https://example.com/docs/

Mirror a website

bash
wget --mirror -p --convert-links -P ./local-site https://example.com

Limit recursion depth

bash
wget -r -l 2 https://example.com/
# -l 2 means go 2 levels deep

Stay within a directory tree

bash
wget -r -np https://example.com/docs/
# -np prevents ascending to parent directories
# (wget already stays on the same host by default; -H would allow spanning hosts)
Warning
Be respectful when downloading websites. Use rate limiting and don't overwhelm servers with requests.

Filtering Downloads

bash
# Download only specific file types
wget -r -A "*.pdf,*.doc" https://example.com/

# Exclude certain file types
wget -r -R "*.jpg,*.gif,*.png" https://example.com/

# Download only from specific directories
wget -r --include-directories=/docs/,/manual/ https://example.com/

Authentication

bash
# HTTP authentication
wget --user=username --password=password https://example.com/protected/
# (prefer --ask-password to avoid leaving the password in shell history)

# Using .netrc file
wget --netrc https://example.com/protected/

# FTP login
wget --ftp-user=user --ftp-password=pass ftp://ftp.example.com/file.zip

Speed and Rate Limiting

bash
# Limit download speed (suffixes: k = KB/s, m = MB/s)
wget --limit-rate=200k https://example.com/file.zip

# Wait between downloads (for recursive)
wget -r -w 2 https://example.com/
# -w 2 waits 2 seconds between requests

# Random wait (to be polite)
wget -r -w 2 --random-wait https://example.com/

Practical Examples

Download all PDFs from a page

bash
wget -r -l 1 -A pdf https://example.com/documents/

Mirror website for offline viewing

bash
wget --mirror \
  --convert-links \
  --adjust-extension \
  --page-requisites \
  --no-parent \
  https://example.com/docs/

Download with custom user agent

bash
wget --user-agent="Mozilla/5.0" https://example.com/

Retry failed downloads

bash
wget --tries=10 --retry-connrefused https://example.com/file.zip

Download only if newer

bash
wget -N https://example.com/file.zip

Use proxy

bash
wget -e use_proxy=yes -e http_proxy=http://proxy:8080 https://example.com/

wget vs curl

Both tools can download files, but they have different strengths:

Comparison

wget – Better for recursive downloads, mirroring, background downloads
curl – Better for API calls, supports more protocols, more flexible
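For common one-off downloads the two tools line up almost flag for flag; a quick crosswalk (placeholder URLs, commands shown for reference rather than run):

```bash
# Save under the remote filename
#   wget https://example.com/file.zip
#   curl -O https://example.com/file.zip

# Save under a chosen filename
#   wget -O out.zip https://example.com/file.zip
#   curl -o out.zip https://example.com/file.zip

# Resume an interrupted download
#   wget -c https://example.com/file.zip
#   curl -C - -O https://example.com/file.zip
```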

Summary

wget is perfect for downloading files and mirroring websites. Key takeaways:

  • Use -O to specify output filename
  • Use -c to resume interrupted downloads
  • Use -r for recursive downloads
  • Use --mirror for website mirroring
  • Use -A/-R to filter file types
  • Use --limit-rate to be polite to servers
