
Topic review

martin

Re: Could not retrieve directory listing with keepuptodate

The key is this:
> 2013-07-19 09:24:05.916 CWD /photos/1307601/

< 2013-07-19 09:24:05.932 550 Failed to change directory.


Are you sure the path is /photos/1307601/? Maybe you wanted to use /var/www/dashboard/uploads/photos/1307601/?
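If the session's initial remote directory is really /var/www/dashboard/uploads/photos, the script would need the absolute remote path rather than /photos. A minimal sketch of the relevant script lines, assuming the host, account and paths quoted in this topic:

```
option batch abort
option confirm off
open ftp://dashimage@192.168.XX.XX/
# Use the absolute remote path so keepuptodate does not depend
# on the session's current remote directory.
keepuptodate "d:\ftp" "/var/www/dashboard/uploads/photos"
```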
MichaelFanning

Could not retrieve directory listing with keepuptodate

Hi Martin

Thanks for your update. I have added three zip files for your perusal (for some reason it will only allow me to attach three). I have attached the development script that was run and the log file it produced. This works fine and the files on both the local machine and the FTP server are kept up to date. Again, I cleared both directories when I commenced this test so everything was clean.

The second attachment has the script and log produced in the production environment. This now has over 7000 directories (and is growing quite fast). The script runs OK to kick it all off. However, once it detects a change in a directory, the script fails. The log indicates that it could not retrieve the directory listing.

Based on that, I ran a new script (third attachment) that used the same production connection parameters but just did a simple directory listing, changed directories and listed again to ensure that I was in the correct directories at the start. This works OK too, so I am not sure why the production script fails. Again, I am assuming that the number of directories it needs to keep up to date is far too large.

My other line of thinking was to run the /synchronize command, but I need to modify the directory structure before I do this.
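A one-off synchronize run could be scripted instead of the continuous keepuptodate watch. A hedged sketch only, assuming the connection details from the production log and the absolute remote path suggested earlier in this topic:

```
option batch abort
open ftp://dashimage@192.168.XX.XX/
# "synchronize remote" pushes local changes up to the server;
# "synchronize both" would do a two-way sync instead.
synchronize remote "d:\ftp" "/var/www/dashboard/uploads/photos"
exit
```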

Any ideas?

Kindest Regards

Michael Fanning
Perisher
martin

Re: Could not retrieve directory listing with keepuptodate

Please attach a full log file showing the problem (using the latest version of WinSCP). Ideally also log from your development environment, where it works ok.

To generate a log file, use the /log=path_to_log_file command-line argument. Submit the log with your post as an attachment. Note that passwords and passphrases are not stored in the log. You may want to remove other data you consider sensitive, such as host names, IP addresses, account names or file names (unless they are relevant to the problem). If you do not want to post the log publicly, you may email it to me. You will find my address (if you log in) in my forum profile. Please include a link back to this topic in your email, and also note in this topic that you have emailed the log.
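For example, an invocation with logging enabled might look like this (the script name and log path below are placeholders, not taken from this topic):

```
winscp.com /script=keepuptodate_script.txt /log=c:\logs\winscp.log
```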
MichaelFanning

Could not retrieve directory listing with keepuptodate

Dear Sirs / Madams

I have been testing WinSCP (version 5.1.5, build 3261) to see if it was the ultimate solution for keeping two directories in sync. Testing was successful, but when I moved this to our production environment I ran into trouble. It is more than likely due to the number of directories being watched (over 6500 in both the local and remote directories). The purpose of the sync: we photograph a vast number of resort guests each day and, using their RFID media, apply this to the EXIF data of the image. When we process the photos, each subject gets their photos in their own directory, which then needs to be FTPed up to the web server where the guests can view their photos. The process works excellently with the exception of the automated bit where we need to copy the files from the server processing the images to the web server.

As mentioned, in testing WinSCP works wonderfully well. As each image is processed and saved to the guest's directory (created by the processing application), WinSCP picks the change up and transfers the data to the web server. However, in production the keepuptodate process starts OK and states that it is watching 6500 directories, but as soon as a new JPG is added to a directory the process falls over (in both the GUI and command-line interfaces).

Can you confirm that this is too much for WinSCP, or is this possible and I just need to adjust my approach?

batch abort
confirm off
Connecting to 192.169.XX.XX
Connected with 192.168.XX.XX Waiting for welcome message...
Connected
Starting the session...
Reading remote directory...
Session started.
Active session: [1] dashimage@192.168.XX.XX
/var/www/dashboard/uploads/photos
d:\ftp
winscp> keepuptodate "d:\ftp" "/photos"
Watching for changes, press Ctrl-C to abort...
Scanning 'd:\ftp' for subdirectories...
Watching for changes in 6655 directories...
Change in 'd:\ftp\1307601' detected.
Error listing directory '/photos/1307601'.
Could not retrieve directory listing
Failed to change directory.
(A)bort, (R)etry, (S)kip: Abort

Kind Regards

Michael Fanning
michael.fanning@perisher.com.au