
FTP Backup Service/Feral Backup Service

In the absence of an easy-to-adopt, easy-to-tune mechanism for capturing those lumps of data considered important by people developing prototype tools in support of research activity, we have created a basic system suited to Linux and Mac OS X environments.

The pattern used is very simple: create snapshots of the data, and don't get involved in incremental/differential algorithms -everything (targeted) is backed up each time the cycle runs. Management of these snapshots is done elsewhere.

Snapshots contain the data and configuration at a point in time. So if something goes wrong while people are fiddling with, um, developing a system, it can be recovered to a known point.

In the ideal world, the snapshots would be moved into HSM-capable (hierarchical storage management) storage, where appropriate aging policies could be used to ensure the live file system is not too encumbered.
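In the meantime, a crude aging policy can be approximated wherever the snapshots land with a scheduled prune. A minimal sketch follows; the directory and the 30-day retention are assumptions, adjust to suit.

```shell
#!/bin/sh
#	Delete snapshot archives older than 30 days.
#	SNAPDIR and the retention period are examples only.
SNAPDIR=/var/backup/out
find "$SNAPDIR" -name '*.tgz' -mtime +30 -print -delete
```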

The bits and pieces

The following describes a specific implementation to illustrate the general idea.

The subject system is a LAMP stack -a platform made up of Linux, Apache, MySQL, and PHP. The server hosting the snapshots is a Windows file server configured to allow FTP upload.

As the root user, I created

/var/backup

(not to be confused with /var/backups -something looked after by the Ubuntu system I am using) and, within it,

/var/backup/data
/var/backup/out
data is the map -a folder of links pointing to the areas you want to snapshot, and a place to put files generated as part of the backup process
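Populating the map might look like the following; the source paths linked here (/var/www, /etc/apache2) are illustrative only -point the links at whatever you want captured.

```shell
#	Build the data map: links to the areas to be snapshotted.
#	The source paths are examples, not prescriptions.
mkdir -p /var/backup/data /var/backup/out
ln -s /var/www /var/backup/data/www
ln -s /etc/apache2 /var/backup/data/apache2
```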

out is where the archived snapshot is held until it is uploaded to the FTP host. The backup script looks a bit like the following:


#!/bin/sh
#	Paths follow the layout described above; the FTP settings are
#	site-specific placeholders -change them to suit.
BACKUPDATA=/var/backup/data
BACKUPOUT=/var/backup/out
BACKUPDATABASES="somedb someotherdb"
FTPHOST=ftp.example.org
FTPPATH=backups
MYEMAIL=me@example.org

DATETIME=`date "+%Y%m%d"`

echo "$DATETIME"

#	is BACKUPOUT empty?
FILECOUNT=`ls -1 $BACKUPOUT/ |grep $DATETIME |wc -l`

if [ "$FILECOUNT" = 0 ]; then
	#	let's do it

	####	MYSQL
	#	delete the old sql backup(s)
	echo "Deleting the MySQL backup file"
	rm -f $BACKUPDATA/*.sql.z

	#	create the new
	echo "Creating the new MySQL backup file"
	mysqldump -u backupuser --databases $BACKUPDATABASES |gzip >$BACKUPDATA/$DATETIME.sql.z

	####	Remember what we backed up?
	#	delete the old file(s)
	echo "Deleting the old file and folder listing"
	rm -f $BACKUPDATA/*.txt

	#	add a listing of the data folder
	echo "Creating the directory listing file"
	ls -lR $BACKUPDATA/ >$BACKUPDATA/$DATETIME.txt

	####	TAR and ZIP
	#	now tar it all up (-h follows the links in the data map)
	echo "Creating the new backup archive"
	tar -chzf $BACKUPOUT/$DATETIME.tgz -C $BACKUPDATA .
else
	#	something odd here?
	echo "Today's backup is still in the folder"
fi

echo "Attempting to upload the backup to $FTPHOST"

#	do we have an archive?
if [ -e "$BACKUPOUT/$DATETIME.tgz" ]; then
	ncftpput -u anonymous -p $MYEMAIL $FTPHOST $FTPPATH $BACKUPOUT/$DATETIME.tgz

	if [ $? -ne 0 ]; then
		echo "Upload did not work :-("
		exit 1
	else
		echo "Upload successful!"
		echo "Removing the archive!"
		rm -f $BACKUPOUT/$DATETIME.tgz
	fi
else
	echo "No new archive to upload?"
	exit 1
fi

echo "Finished!"
exit 0
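A quick way to confirm a run actually captured something is to list the archive's contents before it is uploaded and removed; the date-stamped filename below is illustrative.

```shell
#	Inspect a snapshot archive (the filename is illustrative).
tar -tzf /var/backup/out/20240101.tgz | head
```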

The script is run by root. This is done by adding a call to the script in the crontab for the root user:

sudo crontab -l
0 6 * * * /var/backup/ >/var/backup/backup.out 2>&1

The script requires a number of things to work:

  • ncftpput (part of ncftp: sudo apt-get install ncftp)
    Lets you push a file to an FTP server from a single command line
  • A backup user with appropriate access
    I used something like GRANT SELECT, LOCK TABLES, SHOW VIEW, TRIGGER ON database.* TO 'backupuser'@'localhost';
  • An FTP server
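A hedged pre-flight sketch for checking those requirements: it reports any of the external tools the script calls (names taken from the script above) that are not on the path.

```shell
#!/bin/sh
#	Report any of the backup script's external dependencies that are missing.
for tool in mysqldump gzip tar ncftpput; do
	command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```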

