
# Topic review

KJS_vt

## Log file

I reviewed the seven jobs that ran Friday night and each weekend night, and they were all successful. I'm confident the issue is resolved and the topic can be closed. Thanks so much for your help!
martin

KJS_vt

## Log file

I think that did the trick. Of the seven jobs that download (FTP) a large file Friday night, only one failed and I think that was an unrelated issue. I would like to have these run one more time (Friday 3/12) before declaring this issue resolved. I will reply back on 3/15 with a verdict. Thanks.
KJS_vt

## Log file

I was able to run the process last night, and it ran for only about 3 hours and 20 minutes, as opposed to the 5 days and 1 hour it took the last time. The log file shows that it disconnected once, then restarted and finished. I think it looks good. I updated the other two similar processes we have to create log files and use the /ini=nul option. They are all scheduled to run Friday night, so I will see Monday whether this was completely successful. I'm hopeful after last night's success. I am attaching last night's log file as proof.
KJS_vt

## Log file

Sorry, I've had Mon-Wed off this week. I will look into setting that inline parameter. Unfortunately, we cannot currently connect to the FTP site. I believe the host of the site changed the password and did not notify us. Once I am able to test I will capture another log file and see if the issue persists. Thanks!
martin

## Re: Log file

I should have noticed this before: You do not have transfer resume enabled:

```
< 2021-02-23 08:23:37.991 Script: Using configured transfer settings different from factory defaults.
...
. 2021-02-23 08:23:38.632 Copying 2 files/directories to local directory "\\AHS\AHSSOFT\BACKUP\SQL\AHSSQLW01D\Backups\OMS" - total size: 12,193,277,952
. 2021-02-23 08:23:38.632 PrTime: Yes; PrRO: No; Rght: rw-r--r--; PrR: No (No); FnCs: N; RIC: 0100; Resume: N (102400); CalcS: No; Mask:
```

You should use /ini=nul to isolate your script from configuration changes made in the GUI:
https://winscp.net/eng/docs/scripting#configuration
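
As a hedged illustration of that advice, the invocation could pass /ini=nul together with a session log; the script and log paths below are placeholders, not anything from the attached logs:

```
rem Illustrative invocation; script and log paths are placeholders.
rem /ini=nul ignores the GUI's stored configuration; /log records a session log.
winscp.com /ini=nul /script=C:\scripts\download.txt /log=C:\logs\winscp.log
```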
KJS_vt

## Log file

Here is the updated log file. It ran for about 22 hours.
KJS_vt

## Log file

I made the update to create the debug log file. I started the process about four hours ago. I have already seen that the file was downloaded and then started to download it again. I will let the process run until it eventually errors so that you get as much info as possible. I will post the log tomorrow morning (my time).
martin

## Re: Log file

Thanks. Please set the logging level to Debug 1 and post a new log.
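
A hedged sketch of such an invocation, with placeholder paths: WinSCP's documented /loglevel switch raises the session log verbosity, with the value 1 corresponding to Debug 1.

```
rem Illustrative; script and log paths are placeholders.
rem /loglevel=1 corresponds to the "Debug 1" logging level.
winscp.com /log=C:\logs\winscp.log /loglevel=1 /script=C:\scripts\download.txt
```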
KJS_vt

## Log file

You're absolutely right. I'm not sure where or how I created that file. Real Log file attached. Sorry about that.
martin

## Re: Log file

What you have attached is not a log file, just a transcript of the script output. Please attach an actual session log file.

To generate the session log file, use the /log=C:\path\to\winscp.log command-line argument. Submit the log with your post as an attachment. Note that passwords and passphrases are not stored in the log. You may still want to remove other data you consider sensitive, such as host names, IP addresses, account names, or file names (unless they are relevant to the problem). If you do not want to post the log publicly, you can mark the attachment as private.
KJS_vt

Hello,

I just started having an issue pulling a file from an SFTP site. This issue started just after the WinSCP software was updated to the latest version (as of a couple weeks ago) from a much older (unknown) version.

Description of the process.

Question:
1. Is there a setting that can reconnect and not start the download from the beginning?
2. If #1 is not possible, is there a better combination of settings that can be used to accommodate connection issues via automation?

I have attached the log file that was created this past Friday night. I have obfuscated the file paths, site name, and user/password names. As you can see, it downloads 99% of the file and then times out. It then attempts to reconnect, or perhaps downloads the file again until it reaches 99% (that part isn't clear to me, since there are no timestamps). It then errors out at 19%, meaning it had started downloading the file from the beginning again.
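
For question #1, WinSCP's scripting documentation describes a reconnect timeout option and a per-transfer resume switch. A hedged script sketch under those assumptions; the host, host key fingerprint, credentials, and file paths are all placeholders:

```
# Illustrative WinSCP script; host, fingerprint, and paths are placeholders.
# reconnecttime: seconds to keep retrying after a lost connection.
option reconnecttime 120
# Open the session (the host key fingerprint must match the real server's).
open sftp://user:password@example.com/ -hostkey="ssh-rsa 2048 xxxxxxxx..."
# -resumesupport=on resumes an interrupted download instead of restarting it.
get -resumesupport=on /remote/path/largefile.dat C:\local\path\
exit
```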