I needed to automate copying files for a website that I was building. Since this site was hosted on an inexpensive shared hosting plan, I didn't have the luxury of shell or rsync access to automate copying files from my local development environment to the host. The only option was FTP, and after wasting too much time manually tracking down which files I needed to update, I knew I needed an automated solution. Some googling led me to
lftp, a command-line and scriptable FTP client. It should be available via your distribution's repository. Once installed, you can use a script like the one below to automatically copy files.
#!/bin/sh

# paths
HERE=`pwd`
SRC="$HERE/web"
DEST="~/www"

# login credentials
HOST="ftp.commodity-hosting.com"
USER="mysiteusername"
PASS="supersecretpassword"

# FTP files to remote host
lftp -c "open $HOST
user $USER $PASS
mirror -X img/* -X .htaccess -X error_log --only-newer --reverse --delete --verbose $SRC $DEST
"
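As an aside, if you'd rather not hard-code the password in the script itself, lftp also honors the classic ~/.netrc file that other FTP clients use. An entry like the one below (using the placeholder hostname and credentials from the script above) would let you drop the user line from the lftp batch; be sure to chmod 600 ~/.netrc so only you can read it.

```
machine ftp.commodity-hosting.com
login mysiteusername
password supersecretpassword
```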
The script does the following:
- Copies files from the local ./web directory to the remote ~/www directory.
- Uses $HOST, $USER, and $PASS to log in. Since the password is stored in plain text, make sure your script is readable, writeable, and executable only by you and trusted users.
- The lftp command connects and copies the files. The -c switch specifies the commands to issue, one per line. The magic happens with the mirror command: --reverse makes it upload (local to remote) instead of its default download, and --only-newer transfers only files that have changed.
- You could be a little safer by removing the --delete switch, which deletes files from the destination that are not on your local machine.
- The -X switch takes glob patterns to exclude. In this case, the script won't touch the img/ directory, the .htaccess file, or the server's error_log.
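Before pointing the script at the real host, it can be worth previewing what mirror would do. lftp's mirror command supports a --dry-run flag that prints the planned transfers without touching anything. A minimal sketch, reusing the placeholder host, credentials, and paths from the script above; here the command batch is printed for review, and the lftp call itself is left commented out:

```shell
#!/bin/sh
# Sketch: preview the transfer before committing to it.
# --dry-run makes mirror report what it *would* do without transferring anything.
HOST="ftp.commodity-hosting.com"   # placeholders from the script above
USER="mysiteusername"
PASS="supersecretpassword"
SRC="$(pwd)/web"
DEST="~/www"

CMD="open $HOST
user $USER $PASS
mirror --dry-run --only-newer --reverse --verbose $SRC $DEST"

# Show the batch of commands that would be passed to lftp -c:
printf '%s\n' "$CMD"
# When it looks right, run the preview for real:
# lftp -c "$CMD"
```

Once the dry run lists only the files you expect, drop --dry-run (and add back any -X excludes and --delete) for the actual sync.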
If you're still moving files over FTP manually, even with a good GUI, it'll be worth your time to automate it and make it a quicker, less error-prone process.