Posted: 2016-09-02 07:40
I'm trying to implement automated website backups over FTP.
So far, my script successfully downloads the files and puts them in my local directory:
option batch on
option confirm off
option transfer binary
get /* "path\%TIMESTAMP#(dd.mm.yyyy)%\*"
synchronize local "local\path\%TIMESTAMP#(dd.mm.yyyy)%" /
But I also want to have a zipped file instead of a folder as the result.
I am not a programmer, so it's hard for me to figure out how to integrate the "Automatically compress files before download" snippet from this website into my script.
I've been trying to play around with it but, unsurprisingly, it doesn't work:
call tar -czf /tmp/archive.tar.gz /
get /tmp/archive.tar.gz "local\path\%TIMESTAMP#(dd.mm.yyyy)%\*"
Could someone please point me in the right direction? Any help would be much appreciated.
Your script looks correct in general.
But it will work with the SFTP or SCP protocol only. If you are really using FTP, you generally cannot execute commands on the server, so the "call tar" step has nothing to run against.
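Assuming you can connect over SFTP (with shell access) and tar is available on the server, a sketch along these lines should produce a single archive instead of a folder. The remote path /path/to/site and the local path are placeholders for your own paths:

```
option batch on
option confirm off
# Create the archive on the server first.
# Requires SFTP/SCP with shell access; tar must exist on the server.
call tar -czf /tmp/archive.tar.gz /path/to/site
# Download the single archive file, naming it with the timestamp.
get /tmp/archive.tar.gz "local\path\%TIMESTAMP#(dd.mm.yyyy)%.tar.gz"
# Optionally remove the temporary archive from the server.
call rm /tmp/archive.tar.gz
```

Note that the get destination is a file name, not a folder with a trailing \*, since you are downloading one file rather than a directory tree.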