Automatic compression before downloading


jon8800

Automatic compression before downloading

Hey all,

I'm trying to implement automated website backups over FTP.
So far, my script successfully downloads the files into a local directory:

option batch on
option confirm off
open session
option transfer binary
get /* "path\%TIMESTAMP#(dd.mm.yyyy)%\*"
synchronize local "local\path\%TIMESTAMP#(dd.mm.yyyy)%" /
close

But I would also like to end up with a zipped file instead of a folder.
I am not a programmer, so it's hard for me to figure out how to integrate the "Automatically compress files before download" snippet from this website into my script.

I've been trying to play around with it but, unsurprisingly, it doesn't work:

open session
call tar -czf /tmp/archive.tar.gz /
get /tmp/archive.tar.gz "local\path\%TIMESTAMP#(dd.mm.yyyy)%\*"
exit

Could someone please point me in the right direction? Any help would be much appreciated.
Thanks!



martin
Site Admin

Re: Automatic compression before downloading

Your script looks correct in general.

But it will work with the SFTP or SCP protocol only. If you are really using FTP, you generally cannot execute commands on the server.
See https://winscp.net/eng/docs/remote_command
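
With SFTP, a minimal sketch of your script could look like the one below. The stored site name "session", the remote path /path/to/website, and the archive name are placeholders you would replace with your own values, and the call command only works if the server also gives you shell access:

option batch abort
option confirm off
# assumes "session" is a stored site that connects over SFTP
open session
# pack the remote directory into an archive on the server
call tar -czf /tmp/archive.tar.gz /path/to/website
# download the archive into a timestamped local folder
get /tmp/archive.tar.gz "local\path\%TIMESTAMP#dd.mm.yyyy%\archive.tar.gz"
# optionally clean up the archive on the server
call rm /tmp/archive.tar.gz
exit

With plain FTP, where remote commands are not available, the compression would have to happen locally after the download instead.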


