What is wget?
Last updated: April 1, 2026
Key Facts
- wget stands for 'web get' and is a non-interactive network downloader developed by the GNU Project
- It supports recursive downloading, allowing users to download entire websites while preserving directory structures
- wget can resume interrupted downloads, making it reliable for large files and unstable connections
- The tool operates entirely from the command line and requires no graphical interface
- wget is free, open-source software distributed under the GNU General Public License (GPLv3) and available on most Linux distributions
Overview
wget is a powerful command-line tool that enables users to download files from the internet without interaction after the initial command is issued. The name 'wget' is derived from 'web get', and the tool has been a staple of Unix-like systems for decades. It is particularly valued for its ability to handle large-scale downloads and complex downloading scenarios that would be tedious through a web browser.
Key Features
wget offers numerous features that make it ideal for various downloading tasks:
- Recursive downloading of entire websites
- Resume capability for interrupted downloads
- Support for HTTP, HTTPS, and FTP protocols
- Bandwidth throttling to avoid overwhelming servers
- Cookie and authentication support
- Background downloading capability
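Each of the features above maps to a command-line flag. A quick sketch, using placeholder URLs:

```shell
wget -r https://example.com/docs/                    # recursive download of a site section
wget -c https://example.com/big.iso                  # resume a partially downloaded file
wget --limit-rate=500k https://example.com/big.iso   # throttle bandwidth to 500 KB/s
wget -b https://example.com/big.iso                  # download in the background, logging to wget-log
```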
Common Use Cases
wget is used in many scenarios including automated backup of websites, downloading large software packages, creating offline copies of documentation, and system administration tasks. System administrators frequently use wget in scripts to automate repetitive download operations. Web developers use it to test website availability and structure.
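As a sketch of the scripting use case, a minimal backup script run from cron might look like this (the site URL and backup directory are hypothetical):

```shell
#!/bin/sh
# Nightly mirror of a documentation site (placeholder URL and path).
SITE="https://docs.example.com/"
DEST="/var/backups/docs-mirror"

mkdir -p "$DEST"
# --mirror enables recursion with timestamping, so unchanged files
# are not re-downloaded on subsequent runs.
wget --mirror --no-verbose --directory-prefix="$DEST" "$SITE"
```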
Basic Usage
Simple wget usage involves typing 'wget' followed by a URL at the command prompt. For example, 'wget https://example.com/file.zip' downloads a file to the current directory. Advanced usage includes parameters for specifying output filenames, limiting download speed, setting timeouts, and configuring authentication credentials.
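The advanced parameters mentioned above correspond to flags like the following (URLs and credentials are placeholders):

```shell
wget -O installer.zip https://example.com/file.zip   # save under a chosen output filename
wget --limit-rate=200k https://example.com/file.zip  # cap download speed at 200 KB/s
wget -T 30 https://example.com/file.zip              # set a 30-second network timeout
wget --user=alice --password=secret https://example.com/private.zip  # HTTP authentication
```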
Advantages and Limitations
wget excels at batch operations and works efficiently on low-bandwidth connections. However, it lacks a graphical interface and requires command-line familiarity. For simple downloads, web browsers are more convenient. wget is particularly valuable on headless servers or when automation is required.
Related Questions
What is the difference between wget and curl?
wget is designed primarily for downloading files and can recursively mirror entire websites, while curl is a more general data-transfer tool supporting a wider range of protocols. curl is better suited to complex HTTP operations such as API requests, custom headers, and uploads.
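For a concrete comparison, here is the same download with each tool, plus the kind of API call curl is typically used for (placeholder URLs):

```shell
wget https://example.com/file.zip       # saves file.zip by default
curl -O https://example.com/file.zip    # curl writes to stdout unless -O is given
curl -X POST -d 'key=value' https://api.example.com/endpoint  # API-style request
```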
How do I use wget to download an entire website?
Use the command 'wget -r https://example.com' to recursively download an entire website. Add '--page-requisites' to include CSS, images, and other resources needed for proper rendering.
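A fuller mirroring command combines those flags with two other commonly used options: '--convert-links' rewrites links so the copy browses correctly offline, and '--no-parent' keeps the crawl from ascending above the starting directory (placeholder URL):

```shell
wget -r --page-requisites --convert-links --no-parent https://example.com/
```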
Can wget resume partial downloads?
Yes, wget can resume interrupted downloads using the '-c' flag. This is useful for large files over unreliable connections, as wget will continue from where the download stopped.
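For example, if a large download is interrupted, re-running the same command with '-c' picks up from the existing partial file instead of starting over (placeholder URL):

```shell
wget https://example.com/big.iso      # connection drops partway through
wget -c https://example.com/big.iso   # continues from the bytes already saved
```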