Topic review


Re: Files loss during SFTP

There are no limits.
There's no easy explanation for your observations.
Unless you have a log file for the transfer of the file that differs in size, there's not really much we can advise you on here.
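
To produce such a log file, session logging can be enabled when running the script. A minimal sketch, assuming winscp.com is available and using hypothetical paths (the host key fingerprint must be filled in with your server's actual value):

```
winscp.com /log="C:\logs\transfer.log" /loglevel=1 /script="sync.txt"
```

where sync.txt might contain:

```
open sftp://user@example.com/ -hostkey="ssh-rsa 2048 ..."
synchronize remote "C:\local\folder" "/remote/folder"
exit
```

With the log in hand, you can search it for the file that ended up with a different size and see what happened during its transfer.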

Files loss during SFTP


I was using the put command to automate transferring a folder with as many as 30 thousand files. I noticed some differences in file size and file count after multiple runs.
So I recently changed my script to use the synchronize command. Since then, only 1 out of 4 tries has shown a difference in file size and number of files.

My questions are:
Is there any limit on the size and number of files transferred using the put command?
Or are there other factors, like timeouts?
Does synchronize have the same limits?
How do I make the transfer more stable, so that everything is copied from the source?