
Howto: Use wget to Recursively Download All FTP Directories

SkyHi @ Saturday, November 13, 2010
I would like to copy all of my files and directories from a UNIX server to a Linux workstation. How do I use the wget command to recursively download whole FTP directories stored at /home/tom/ on ftp.example.com to a local directory called /home/tom/backup?

GNU Wget is a free Linux / UNIX utility for non-interactive download of files from the Web and FTP servers, as well as retrieval through HTTP proxies. GNU Wget has been designed for robustness over slow dial-up internet or unstable network connections. If a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports resuming (regetting), it will instruct the server to continue the download from where it left off.
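
For example, if a large transfer keeps getting interrupted, something like the following should resume and retry until it completes (the host and file path are placeholders; -c resumes a partial file and -t 0 sets unlimited retries):
wget -c -t 0 ftp://ftp.example.com/pub/largefile.iso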

wget Recursive Example

You can use the -r (recursive retrieval) option as follows. You can also pass your FTP username and password to the wget command. First, make a backup directory in your $HOME directory:
mkdir ~/backup/
cd ~/backup/
Now, use the wget command as follows:
 
wget -r ftp://username:password@ftp.example.com/
wget -r ftp://tom:myPassword@ftp.example.com/home/tom/
wget -r ftp://tom:myPassword@ftp.example.com/var/www/
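
If you do not want to cd into the backup directory first, a similar sketch should drop everything straight into /home/tom/backup (-P sets the download prefix and -l limits the recursion depth; the depth of 10 here is just an example):
wget -r -l 10 -P /home/tom/backup ftp://tom:myPassword@ftp.example.com/home/tom/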

wget Recursive FTP With Mirroring Option

The -m option turns on mirroring, i.e., it turns on recursion and time-stamping, sets infinite recursion depth, and keeps FTP directory listings:
wget -m ftp://username:Password@ftp.example.com/
wget -m ftp://username:Password@ftp.example.com/var/www/html
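
By default the mirror is saved under a directory named after the host (for example ftp.example.com/var/www/html/). If you would rather have the files land directly in the current directory, a rough sketch for the second command above is (-nH drops the host directory and --cut-dirs=2 strips the leading var/www components):
wget -m -nH --cut-dirs=2 ftp://username:Password@ftp.example.com/var/www/html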
 
REFERENCES
http://www.cyberciti.biz/faq/wget-recursive-download-command/