Downloading very large files over an unreliable connection can be a tiresome task, whether the failures are caused by network problems (e.g. on a wireless link) or by those annoying "24/7" dialup accounts that automatically kick you off every two hours.
The FTP client portions of many modern browsers (including the excellent Opera browser) support the resume operation, but this still means you have to be present to restart the FTP session manually each time it fails. The bash script below presents a simple way to wrap the ncftpget command-line FTP client in a loop that retries a download automatically, continuing until the download completes.
safeget server-name remote-file
safeget ftp.foohost.com pub/foo/bar/bigfile.iso
#!/usr/bin/bash
# this script wraps a call to ncftpget in a sleep-retry loop, in an
# attempt to make it download a large file over an intermittent
# connection.
#
# Copyright (C) 2003 W.Finlay McWalter and the Free Software Foundation.
# Licence: GPL v2.
#
# v1 31st March 2003 Initial Version
#
# configuration settings
SLEEPTIME=10 # seconds between retries
NCFTPGET=ncftpget # name and path of ncftpget executable
NCFTPOPTIONS="-F -z" # command line options (passive, resume)
FTP_LOCALROOT=. # local directory to which files should be retrieved
# check parameters
if [ $# -ne 2 ] ; then
    echo "usage:"
    echo "  safeget server-name remote-file"
    exit 20
fi
# TODO: add support for username and password
until false
do
    # attempt (or resume) the transfer
    $NCFTPGET $NCFTPOPTIONS "$1" "$FTP_LOCALROOT" "$2"
    RESULT=$?
    echo `date` ncftpget returns $RESULT

    # classify ncftpget's exit status
    case $RESULT in
        0)  echo success
            exit 0
            ;;
        7|8|9|10|11)
            echo nonrecoverable error \($RESULT\)
            exit $RESULT
            ;;
        1|2|3|4|5|6)
            echo recoverable error \($RESULT\)
            sleep $SLEEPTIME
            ;;
        *)  echo unknown error code \($RESULT\)
            exit $RESULT
            ;;
    esac
done
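To try it out, save the script as safeget (the name used in the examples above), mark it executable, and invoke it from the directory where you want the file to land:

chmod +x safeget
./safeget ftp.foohost.com pub/foo/bar/bigfile.iso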
Alternatively, you can use an off-the-shelf client that supports retrying:
wget -t0 ftp://server-name/remote-file
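If you have curl installed, something along these lines should behave similarly (the retry count and delay are arbitrary choices; -C - asks curl to resume a partial download on each attempt):

curl -C - --retry 100 --retry-delay 10 -O ftp://server-name/remote-file

Note that curl's --retry only covers errors it considers transient, so it is not quite a drop-in replacement for the loop above.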