What is wget?

Last updated: April 1, 2026

Quick Answer: wget is a free command-line utility that downloads files from the internet using HTTP, HTTPS, and FTP protocols. It runs on Unix, Linux, macOS, and Windows systems and is widely used for automated downloads, web scraping, and website mirroring.

Overview

wget is a command-line tool that downloads files from the internet without further interaction once the initial command is issued, which makes it well suited to scripts and unattended jobs. The name combines 'World Wide Web' and 'get', and the tool has been a staple of Unix-like systems since the mid-1990s. It is particularly valued for large-scale and complex download scenarios that would be tedious through a web browser.

Key Features

wget offers numerous features that make it well suited to a range of downloading tasks: recursive retrieval for mirroring entire sites, resuming of interrupted downloads, bandwidth limiting, configurable retries and timeouts, proxy and HTTP authentication support, timestamping so repeat runs fetch only changed files, and fully non-interactive operation.

Common Use Cases

wget is used in many scenarios including automated backup of websites, downloading large software packages, creating offline copies of documentation, and system administration tasks. System administrators frequently use wget in scripts to automate repetitive download operations. Web developers use it to test website availability and structure.
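As a sketch of the scripted use described above, a minimal nightly-backup script might look like the following. The URL and destination directory are placeholders, and the `|| true` only exists so the sketch runs cleanly even without network access:

```shell
#!/bin/sh
# Hypothetical mirror script; SITE and DEST are placeholder values.
SITE="https://docs.example.com/"
DEST="$HOME/backups/docs"

mkdir -p "$DEST"

# --mirror turns on recursion with timestamping, so files unchanged since
# the last run are skipped; --no-parent keeps the crawl below the start URL;
# --directory-prefix chooses where the copy is written.
wget --mirror --no-parent --directory-prefix="$DEST" "$SITE" || true
```

Dropped into cron, a script like this keeps an offline copy of documentation current with no manual steps.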

Basic Usage

Simple wget usage involves typing 'wget' followed by a URL at the command prompt. For example, 'wget https://example.com/file.zip' downloads a file to the current directory. Advanced usage includes parameters for specifying output filenames, limiting download speed, setting timeouts, and configuring authentication credentials.
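The advanced parameters mentioned above correspond to standard GNU wget flags. A hedged sketch, with placeholder URLs and credentials (the `|| true` just lets each line run even without network access):

```shell
# -O writes to a chosen filename instead of the name taken from the URL.
wget -O archive.zip https://example.com/file.zip || true

# --limit-rate caps download speed; --timeout bounds how long each attempt waits.
wget --limit-rate=200k --timeout=30 https://example.com/file.zip || true

# --user and --password supply HTTP authentication credentials.
wget --user=alice --password=secret https://example.com/protected/file.zip || true
```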

Advantages and Limitations

wget excels at batch operations and works efficiently on low-bandwidth connections. However, it lacks a graphical interface and requires command-line familiarity. For simple downloads, web browsers are more convenient. wget is particularly valuable on headless servers or when automation is required.

Related Questions

What is the difference between wget and curl?

wget is designed around recursive downloads and can mirror entire websites out of the box, while curl is a more general data-transfer tool better suited to API requests, custom headers, and other complex HTTP operations. curl also supports a wider range of protocols and ships by default on more platforms.
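For a plain file download the two tools are nearly interchangeable; the difference shows in their defaults. A sketch with a placeholder URL (`|| true` only keeps the lines running offline):

```shell
# wget saves to disk by default and follows HTTP redirects on its own.
wget https://example.com/file.zip || true

# curl streams to stdout by default: -O saves under the remote filename,
# and -L is needed before curl will follow redirects.
curl -L -O https://example.com/file.zip || true
```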

How do I use wget to download an entire website?

Use the command 'wget -r https://example.com' to recursively download an entire website. Add '--page-requisites' to include CSS, images, and other resources needed for proper rendering.
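Putting the flags from the answer together (the URL is a placeholder; `--convert-links` and `--no-parent` are common companions, and `|| true` just lets the sketch run offline):

```shell
# -r recurses into links; --page-requisites also fetches the CSS, images,
# and scripts each page needs; --convert-links rewrites links so the copy
# browses correctly offline; --no-parent keeps the crawl below the start URL.
wget -r --page-requisites --convert-links --no-parent https://example.com/ || true
```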

Can wget resume partial downloads?

Yes, wget can resume interrupted downloads using the '-c' flag. This is useful for large files over unreliable connections, as wget will continue from where the download stopped.
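A resumable-download sketch with a placeholder URL; `--tries` and `--timeout` are standard options often paired with `-c` on flaky connections (`|| true` only keeps the line running offline):

```shell
# -c resumes from the bytes already on disk; --tries bounds the number of
# attempts and --timeout bounds how long each attempt waits.
wget -c --tries=2 --timeout=10 https://example.com/large.iso || true
```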

Sources

  1. GNU wget - Official Project Page (GPL-3.0)
  2. Wikipedia - wget (CC-BY-SA-4.0)