Linux wget: Your Ultimate Command Line Downloader

It is a common practice to manage UNIX/Linux/BSD servers remotely over an ssh session. You may need to download software or other files for installation. There are a few powerful graphical download managers for Linux and UNIX-like operating systems:

  • d4x: Downloader for X is a user-friendly Linux/Unix program with a nice X interface to download files from the Internet. It supports both FTP and HTTP protocols, and supports resuming
  • kget: KGet is a versatile and user-friendly download manager for the KDE desktop
  • gwget / gwget2: Gwget is a download manager for the GNOME desktop

However, when it comes to the command line (shell prompt), wget, the non-interactive downloader, rules. It supports the HTTP, FTP, and HTTPS protocols along with authentication and tons of other options. Here are some tips to get the most out of it:

Download a Single File Using wget

Type the following command:
$ wget http://www.cyberciti.biz/here/lsst.tar.gz
$ wget ftp://ftp.freebsd.org/pub/sys.tar.gz
$ wget -O output.file http://nixcraft.com/some/path/file.name.tar.gz
$ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2

Sample outputs:

Fig.01: wget example

How Do I Download Multiple Files Using wget?

Use the following syntax:
$ wget http://www.cyberciti.biz/download/lsst.tar.gz ftp://ftp.freebsd.org/pub/sys.tar.gz ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
You can create a shell variable that holds all URLs and use a BASH 'for loop' to download all files:

URLS="http://www.cyberciti.biz/download/lsst.tar.gz \
ftp://ftp.freebsd.org/pub/sys.tar.gz \
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm \
http://xyz.com/abc.iso"
for u in $URLS
do
 wget "$u"
done
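
A slightly more robust variant uses a bash array instead of a whitespace-separated string; this is just a sketch, reusing the same example URLs:

# store the example URLs in a bash array
URLS=(
  "http://www.cyberciti.biz/download/lsst.tar.gz"
  "ftp://ftp.freebsd.org/pub/sys.tar.gz"
  "ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm"
  "http://xyz.com/abc.iso"
)
for u in "${URLS[@]}"
do
  # warn on stderr if a download fails, but keep going
  wget "$u" || echo "download failed: $u" >&2
done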

How Do I Read URLs From a File?

You can put all URLs in a text file and use the -i option of wget to download all files. First, create a text file:
$ vi /tmp/download.txt
Append a list of URLs:

http://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz.com/abc.iso

Type the wget command as follows:
$ wget -i /tmp/download.txt
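
If you'd rather not open an editor, the same file can be created non-interactively with a here-document (same example URLs as above):

$ cat > /tmp/download.txt <<'EOF'
http://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz.com/abc.iso
EOF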

Resume Downloads

You can also tell wget to continue getting a partially-downloaded file. This is useful when you want to finish a download started by a previous instance of wget, or by another program:
$ wget -c http://www.cyberciti.biz/download/lsst.tar.gz
$ wget -c -i /tmp/download.txt

Please note that the -c option only works with FTP / HTTP servers that support the "Range" header.
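
To check whether a given HTTP server advertises resume support, you can ask wget to print the response headers without downloading anything; a quick sketch, using the example URL from above:

$ wget -S --spider http://www.cyberciti.biz/download/lsst.tar.gz 2>&1 | grep -i 'accept-ranges'

A server that supports resuming will typically reply with "Accept-Ranges: bytes".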

Force wget To Download All Files In Background

The -b option forces wget to go into the background immediately after startup. If no log file is specified via the -o option, output is redirected to the wget-log file:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt
OR
$ nohup wget -c -o /tmp/download.log -i /tmp/download.txt &
nohup runs the given COMMAND (in this example wget) with hangup signals ignored, so that the command can continue running in the background after you log out.
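
Either way, you can watch the progress of the background download by tailing the log file:
$ tail -f /tmp/download.log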

How Do I Limit the Download Speed?

You can limit the download speed to a given number of bytes per second. The amount may be expressed in bytes, kilobytes with the k suffix, or megabytes with the m suffix. For example, --limit-rate=100k will limit the retrieval rate to 100KB/s. This is useful when, for whatever reason, you don't want wget to consume the entire available bandwidth, such as when downloading a large file like an ISO image:
$ wget -c -o /tmp/susedvd.log --limit-rate=50k ftp://ftp.novell.com/pub/suse/dvd1.iso
Use the m suffix for megabytes (--limit-rate=1m). The above command will limit the retrieval rate to 50KB/s. It is also possible to specify a disk quota for automatic retrievals to avoid a disk DoS attack. The following download will be aborted when the 100MB quota is exceeded:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt --quota=100m
From the wget man page:

Please note that Wget implements the limiting by sleeping the appropriate amount of time after a network read that took less time than specified by the rate. Eventually this strategy causes the TCP transfer to slow down to approximately the specified rate. However, it may take some time for this balance to be achieved, so don't be surprised if limiting the rate doesn't work well with very small files.

Use wget With Password Protected Sites

You can supply the HTTP username/password for the server as follows:
$ wget --http-user=vivek --http-password=Secrete http://cyberciti.biz/vivek/csits.tar.gz
Another way to specify username and password is in the URL itself.
$ wget 'http://username:[email protected]/file.tar.gz'
Either method reveals your password to anyone who bothers to run the ps command:
$ ps aux
Sample outputs:

vivek     27370  2.3  0.4 216156 51100 ?        S    05:34   0:06 /usr/bin/php-cgi
vivek     27744  0.1  0.0  97444  1588 pts/2    T    05:38   0:00 wget http://test:[email protected]/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2
vivek     27746  0.5  0.0  97420  1240 ?        Ss   05:38   0:00 wget -b http://test:[email protected]/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2

To prevent the passwords from being seen, store them in ~/.wgetrc or ~/.netrc, and make sure to protect those files from other users with the chmod command.

If the passwords are really important, do not leave them lying in those files either: edit the files and delete them after wget has started the download.
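
For example, a ~/.netrc entry for the example host above might look like this (the host and credentials are just the placeholders used earlier):

machine cyberciti.biz
login vivek
password Secrete

Then lock the file down so only you can read it:
$ chmod 600 ~/.netrc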
Download All MP3 or PDF Files From a Remote FTP Server

Generally, you can use shell special characters, aka wildcards, such as *, ?, and [] to specify selection criteria for files. The same can be used with FTP servers while downloading files:
$ wget ftp://somedom.com/pub/downloads/*.pdf
OR
$ wget -g on ftp://somedom.com/pub/downloads/*.pdf
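
Plain HTTP servers do not understand shell globbing, but you can approximate it with recursive retrieval and an accept list; a sketch, reusing the hypothetical host from above:

$ wget -r -l1 --no-parent -A '*.pdf' http://somedom.com/pub/downloads/

Here -r -l1 recurses one level deep, --no-parent stops wget from climbing up the directory tree, and -A keeps only files matching the given pattern.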
Use aget When You Need Multithreaded HTTP Downloads

aget fetches HTTP URLs in a manner similar to wget, but segments the retrieval into multiple parts to increase download speed. It can be many times as fast as wget in some circumstances (it is just like FlashGet under MS Windows, but with a CLI):
$ aget -n=5 http://download.soft.com/soft1.tar.gz
The above command will download soft1.tar.gz in 5 segments.

Please note that the wget command is available on Linux and UNIX/BSD-like operating systems.
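
To confirm that wget is installed and check which version you have, type:
$ wget --version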

See the wget(1) man page for more advanced options.
