Download Folder only once

Jan1111
Posts: 6
Location: Germany

Download Folder only once

Hi Folks,

We automatically download folders and the files in them from an SFTP server with this simple script:

open "sftp://blablalba" -hostkey="*"
get "/remotefolder/*.*" D:\localfolder\
exit
I would like to find a solution that downloads every folder only once.
I'm not allowed to delete the folders on the SFTP source, but I do have rights to rename them. Maybe that helps.

Any ideas how to do this?

Thanks a lot!

Jan1111

Re: Download Folder only once

Hi Martin,
thanks for your reply. Yes, that's exactly what I mean. The timestamp approach sounds interesting, but there's a little problem: if we run the script multiple times a day (which we would like to do, to get the files faster), it will download the folders/files multiple times that day.
On the other hand, if we run the script only once a day, that single run has to work for sure, or we lose data.

As I said, I'm allowed to rename the folders. Maybe it's possible to rename every folder that has been downloaded with some prefix, and then skip all folders with that prefix the next time the script runs?
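The rename idea could be sketched with the WinSCP .NET assembly from PowerShell (the same API as the library examples on winscp.net). This is only a sketch under assumptions: the host name, credentials, paths, and the DONE_ prefix are placeholders, not your real values, and host key checking is disabled here just as in the original script.

```powershell
# Sketch: download each remote subfolder once, then rename it with a
# prefix so the next run skips it. Assumes WinSCPnet.dll is available.
Add-Type -Path "WinSCPnet.dll"

$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "example.com"          # placeholder
    UserName = "user"                 # placeholder
    Password = "password"             # placeholder
    GiveUpSecurityAndAcceptAnySshHostKey = $true  # like -hostkey="*"
}

$session = New-Object WinSCP.Session
try {
    $session.Open($sessionOptions)

    $prefix = "DONE_"
    foreach ($item in $session.ListDirectory("/remotefolder").Files) {
        # Skip plain files, the "." / ".." entries, and folders already marked.
        if (-not $item.IsDirectory) { continue }
        if ($item.IsThisDirectory -or $item.IsParentDirectory) { continue }
        if ($item.Name.StartsWith($prefix)) { continue }

        $remotePath = "/remotefolder/$($item.Name)"
        # Download the folder's contents; Check() throws on any failure,
        # so a failed download is NOT marked as done.
        $session.GetFiles("$remotePath/*", "D:\localfolder\$($item.Name)\").Check()

        # Mark the folder so the next run skips it.
        $session.MoveFile($remotePath, "/remotefolder/$prefix$($item.Name)")
    }
}
finally {
    $session.Dispose()
}
```

Because the rename happens only after Check() succeeds, running the script several times a day should be safe: already-marked folders are skipped, and a failed download leaves the folder unmarked for the next run.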

Thanks a lot!

martin
Site Admin

Re: Download Folder only once

I do not understand the "problem" with "running the script multiple times a day". Why would that cause the files to be downloaded multiple times? If their timestamps do not change, the files won't be downloaded again.
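For reference, WinSCP's synchronize command does exactly this timestamp comparison out of the box; a minimal sketch based on the script from the first post (the host URL is a placeholder):

```
open "sftp://user@example.com/" -hostkey="*"
synchronize local D:\localfolder\ /remotefolder/
exit
```

synchronize local downloads only files that are missing locally or have a newer remote timestamp, so repeated runs on the same day transfer nothing new.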

Jan1111

Re: Download Folder only once

My bad, I only read the parts about file masks, sorry. OK, thanks, but that won't work either, since the files on my local share are being processed and then removed.

martin

Re: Download Folder only once

And did you read the article to the end now? It seems that you did not.
There are sections "Remembering the last timestamp" and "Remembering already transferred files". Both should work for you.

Jan1111

Re: Download Folder only once

Well, it seems like I did not. This one looked like the perfect solution: https://winscp.net/eng/docs/library_example_remember_downloaded_files

I set it up and made it work, and then I got this:
Connecting...
Looking for new files...
Found new file /Path-Remote/. with size 0, downloading...
Error: Exception when calling "Check" with Argument:  "'...Local-Path\' is not file!"
The files are not stored directly in the remote path; every new set of files is stored in a new subfolder. So to make this work, the script would have to process one level of subfolders. That's too bad. Thanks a lot.

martin

Re: Download Folder only once

The script should be able to work recursively. If you have problems, post a session log file.

Jan1111

Re: Download Folder only once

. 2022-06-23 11:33:47.956 File: '/RemotePath/.' [2022-04-26T14:02:18.000Z] [67]
. 2022-06-23 11:33:47.956 Copying "/RemotePath/." to local directory started.
. 2022-06-23 11:33:47.956 Binary transfer mode selected.
* 2022-06-23 11:33:47.956 (ExtException) 
. 2022-06-23 11:33:47.956 Asking user:
. 2022-06-23 11:33:47.956 'C:\LocalPath\' is not file! ()
< 2022-06-23 11:33:47.956 Script: 'C:\LocalPath\' is not file!
. 2022-06-23 11:33:47.956 Answer: Abort
* 2022-06-23 11:33:47.956 (ESkipFile) 'C:\LocalPath\' is not file!
. 2022-06-23 11:33:47.956 Copying finished: Transferred: 0, Elapsed: 0:00:00, CPS: 0/s
. 2022-06-23 11:33:47.956 Script: Failed
> 2022-06-23 11:33:47.971 Script: exit
. 2022-06-23 11:33:47.971 Script: Exit code: 1
. 2022-06-23 11:33:47.971 Closing connection.
. 2022-06-23 11:33:47.971 Sending special code: 1
. 2022-06-23 11:33:47.987 Session sent command exit status 0
. 2022-06-23 11:33:47.987 Main session channel closed
. 2022-06-23 11:33:47.987 All channels closed
