So, a while ago I made a script that logs into a server via FTP, downloads the folders, and then packs all the files into one archive. The client that downloads the backup is a Linux client with gftp-text as the FTP client.
The code pretty much explains itself.
First, gftp-text (the FTP client) logs into the server, changes the local working directory on the client, and then downloads everything from the FTP server.
The download is then packed into a single tar.gz archive and written to another folder (the folder called BACKUP-FOLDER).
Remember to change the paths so they suit your system.
#!/bin/bash
# apt-get install gftp-text

# Log in and download everything from the FTP server to a local folder
gftp <<**
open ftp://USERNAME:PASSWORD@IP-ADDRESS
lcd /LOCAL/PATH/ON/SYSTEM
mget *
close
quit
**

# What to back up
backup_files="/LOCAL/PATH/ON/SYSTEM"

# Where to back up to
dest="/LOCAL/PATH/ON/SYSTEM/BACKUP-FOLDER"

# Create archive filename
day=$(date +%F)
#hostname=$(Set a hostname)
archive_file="$day.tgz"

# Print start status message
echo "Backing up $backup_files to $dest/$archive_file"
date
echo

# Back up the files using tar
tar czf "$dest/$archive_file" "$backup_files"

# Print end status message
echo
echo
echo "Backup finished"
date

# List the files in $dest to check file sizes
echo "The Destination Path:"
ls -lah "$dest"
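Once the script has run, it's worth sanity-checking the archive before you rely on it. Here is a minimal sketch, assuming the same $dest and date-based $archive_file naming used above (adjust the paths to your system, just like in the main script):

```shell
#!/bin/bash
# Hypothetical paths -- match them to the values used in the backup script.
dest="/LOCAL/PATH/ON/SYSTEM/BACKUP-FOLDER"
archive_file="$(date +%F).tgz"

# gzip -t checks the archive's integrity without extracting anything
gzip -t "$dest/$archive_file" && echo "Archive is intact"

# List the contents so you can confirm the expected files made it in
tar tzf "$dest/$archive_file"

# To restore into a scratch directory instead of the original location:
# mkdir -p /tmp/restore && tar xzf "$dest/$archive_file" -C /tmp/restore
```

The gzip -t check is cheap, so it can be appended to the end of the backup script itself if you want the run to fail loudly on a corrupt archive.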