Delays when synchronizing large directory


Bobcat00

Delays when synchronizing large directory

I'm using a WinSCP.com command-line script for FTP, with /ini=nul and synchronize commands, to update my local directories to match the remote server. For example:

option batch continue
synchronize -transfer=binary -criteria=time -delete local plotworld plotworld

Now I understand WinSCP has to go through the entire directory tree, comparing files to decide what needs to be done. Some of the subdirectories have ~2000 files in them, so that takes time. But once the transfers of the modified files start and WinSCP gets to one of these large directories, it just sits there for about 10 seconds before downloading the first file. I suspect it's re-reading the directory contents before starting the transfer. Is that really necessary? Can I turn off the second reading of the directories?
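For reference, the whole thing runs roughly like this (the session URL and file names here are placeholders, not my real setup):

winscp.com /ini=nul /script=sync.txt

where sync.txt looks something like:

open ftp://user:password@example.com/
option batch continue
synchronize -transfer=binary -criteria=time -delete local plotworld plotworld
exit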



martin
Site Admin

Re: Delays when synchronizing large directory

Please attach a full session log file showing the problem (using the latest version of WinSCP).

To generate the session log file, use the /log=path_to_log_file command-line argument. Submit the log with your post as an attachment. Note that passwords and passphrases are not stored in the log. You may still want to remove other data you consider sensitive, such as host names, IP addresses, account names or file names (unless they are relevant to the problem). If you do not want to post the log publicly, you can mark the attachment as private.
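For example (the paths are placeholders, adjust them to your setup):

winscp.com /ini=nul /log=C:\writable\path\winscp.log /script=sync.txt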


Bobcat00

Log file attached. Examples of large directories (1400+ files) include:
LanaPug/playerdata
LanaPug/stats
plugins/Essentials/userdata
plugins/GriefPreventionData/PlayerData
Attachment: backuplog.zip (1.16 MB, private file)
Description: log file


martin
Site Admin

OK, indeed with the FTP protocol the listing is retrieved once again, to obtain up-to-date timestamps.

It could be optimized, but only if you were willing to accept the possibility of downloading updated contents that do not match a possibly obsolete timestamp.
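In FTP terms, the sequence for one directory is roughly this (a simplified sketch, not taken from your log); the pause you see is the second LIST:

CWD /plotworld
LIST                (listing read during the comparison phase)
...                 (other directories compared, transfers begin)
CWD /plotworld
LIST                (listing read again just before transferring, for fresh timestamps)
RETR some-changed-file
RETR another-changed-file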


Bobcat00

Even with the current implementation, obsolete timestamps are possible. Retrieving the directory listing and downloading the file(s) is not a single operation, so a file could be updated after the listing was retrieved and before it was downloaded anyway. Yeah, it's only a 10-second window, but it could happen.

Rather than asking for code changes, I was hoping there were already some configuration options that could help. I've read the Directory Caching documentation, but I don't really understand whether it would help in my situation. I also see the options AutoReadDirectoryAfterOp and CacheDirectoryChangesMaxSize. Could they help when running my script?

EDIT: I would NOT want a permanent cache, as the server files are changing between runs of the script.
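For the record, if I were to experiment, I'd try setting one of those raw options for a single run with the /rawconfig command-line parameter, something like this (the Interface\ key path is my guess and would need checking against the raw configuration docs):

winscp.com /ini=nul /rawconfig Interface\CacheDirectoryChangesMaxSize=1000 /script=sync.txt

Since the script already runs with /ini=nul, nothing set this way should persist between runs anyway.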



