Quite unusable if folders have more than 100 files - not to mention 10000 files


RR3DRad
Posts: 7

Quite unusable if folders have more than 100 files - not to mention 10000 files

Hello
I have run into this problem with my own programs before.
Reading a folder via FTP takes "hours" if it contains more than 100 or 200 files, not to mention several thousand.

I solved it by uploading a scandir script to the folder, creating the directory list on the server (which takes less than a second), fetching that list and then deleting the script and the list on the server.
Instead of waiting 4 or 5 minutes, I have the directory list available in 2-3 seconds.
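
Something along these lines works for me (a minimal sketch, assuming PHP is available on the server; the file name listdir.php and the plain-text output are just my example, nothing WinSCP provides):

<?php
// listdir.php - hypothetical helper uploaded next to the big folder.
// It lists the directory server-side and prints one entry name per line.
$dir = __DIR__;                       // list the folder this script sits in
$entries = scandir($dir);             // scandir() reads the whole directory at once
if ($entries === false) {
    http_response_code(500);
    exit("cannot read directory\n");
}
header('Content-Type: text/plain; charset=utf-8');
foreach ($entries as $name) {
    if ($name === '.' || $name === '..') {
        continue;                     // skip the dot entries
    }
    echo $name, "\n";
}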

Maybe you can add this useful feature to WinSCP?
It would only take a 2-10 line script, a checkbox in WinSCP to allow using that feature, and a change in the way WinSCP works.
At present WinSCP checks for changes every few seconds, which takes 4-5 minutes each time on certain folders. Not useful.
With the checkbox set, the check would take only seconds.
The same happens when you close the program in a big folder: starting the next session takes minutes until the folder is read and you can start to work. All of this would be solved with the script feature.

Cheers



martin
Site Admin
Posts: 41,518
Location: Prague, Czechia

Re: Quite unusable if folders have more than 100 files - not to mention 10000 files

Thanks for your suggestion. Though this looks like a server-side problem that should be fixed server-side, not client-side.

If you want us to investigate, please post a session log file.

Btw, WinSCP does not look for changes on its own by default. Maybe you have enabled that in preferences. Turn it off if you do not want it:
https://winscp.net/eng/docs/ui_pref_panels_remote

You can also configure WinSCP not to remember the last folder visited.
https://winscp.net/eng/docs/ui_login_directories


RR3DRad
Posts: 7

Not a server problem

Hi
Thank you for your answer.
No - it's not a server problem - it's an FTP problem.
Any client - FileZilla or another - needs minutes to create a file list when the list is 1,000, 2,000 or even 10,000 files long. So does WinSCP.

With FTP you step through the directory file by file. That means if your server is in the USA and you work in Europe, the transfer time alone is 600-800 milliseconds per file; for 10,000 files that is 6,000,000-8,000,000 milliseconds (roughly 100-130 minutes)!

Actually, every client is useless in such a case unless it uses some tricks the user allows.
And as I wrote before, the quickest solution I found is to place a PHP script on the server, call it over HTTP, and have it create a file list with scandir; the list is echoed back within 1 or 2 seconds.
If your client needs 300 or 360 seconds after every operation to give you the updated file list and is blocked for that long, then you can do about 10 operations per hour - if it takes 1 or 2 seconds, that is roughly 200-300 times more.
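
Fetching the list the script produces is then a single HTTP request, for example (a rough sketch; the URL is only a placeholder and assumes allow_url_fopen is enabled):

<?php
// Hypothetical client-side call: fetch the listing generated by listdir.php.
$listing = file_get_contents('https://example.com/bigfolder/listdir.php');
if ($listing === false) {
    exit("could not fetch listing\n");
}
$files = array_filter(explode("\n", $listing));   // one entry name per line
printf("%d entries fetched\n", count($files));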

No other way.


RR3DRad

Btw - changing the update rate in the settings to 600 seconds does not work for me

Btw - changing the update rate in the settings to 600 seconds does not work for me.
It still refreshes after a few seconds and is then blocked for 5 minutes until it comes back for a few seconds, when the displayed folder has 9,800 files. I cannot step through the list without being interrupted. And if I want to change the directory, I have to wait until it is available again - meaning minutes of waiting. As I said - FTP is unusable this way.


martin
Site Admin
Posts: 41,518
Location: Prague, Czechia

Re: Not a server problem

RR3DRad wrote:

No - it's not a server problem - it's an FTP problem.
OK, more specifically, it's an FTP server problem.

With FTP you step through the directory file by file.
No. FTP listing for a directory is downloaded as one large plain text stream. If your solution is to generate a text file with the directory listing and download that, you are basically doing the very same operation that FTP does internally. If your solution works quickly, but the listing download with FTP is slow, there's something wrong with the FTP server. (Or there's something on the network that specifically slows down the FTP listing transfer – I find that unlikely). No point arguing any further, unless you post a log file proving your assumptions.
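
To illustrate (not WinSCP code, just a generic sketch using PHP's FTP functions with placeholder host and credentials), a single LIST request returns the whole listing over one data connection:

<?php
// One LIST command returns the complete directory listing in a single
// data-connection transfer; host and credentials are placeholders.
$conn = ftp_connect('ftp.example.com');
if ($conn === false) {
    exit("cannot connect\n");
}
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);                        // passive mode
$lines = ftp_rawlist($conn, '/bigfolder');    // one LIST, all entries at once
if ($lines === false) {
    exit("listing failed\n");
}
echo count($lines), " entries received from a single listing\n";
ftp_close($conn);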

Btw - changing the update rate in the settings to 600 seconds does not work for me.
It still refreshes after a few seconds and is then blocked for 5 minutes until it comes back for a few seconds, when the displayed folder has 9,800 files
You might have keepalives enabled:
https://winscp.net/eng/docs/ui_login_connection
Turn them off if you do.



RR3DRad
Posts: 7

martin wrote:

No. FTP listing for a directory is downloaded as one large plain text stream. If your solution is to generate a text file with the directory listing and download that, you are basically doing the very same operation that FTP does internally. If your solution works quickly, but the listing download with FTP is slow, there's something wrong with the FTP server.
Well ... then why do I get the file within 2 seconds from the same server, with the complete list shown in Firefox including generated buttons with links for delete, view and download?
And WinSCP or FileZilla needs 5 minutes just to generate a list that then waits for action?

It seems I will have to keep programming my own solution. It saves a lot of time in the end ;-)

Thank you anyway for everything!


RR3DRad
Posts: 7

Congratulations

Hello
Great - what have you done?
I just used the newest version and couldn't believe my eyes - even with 10,000 files it is very usable now!
Great work. Thanks.


