Remove remote file only if it exists without error


da_chicken

I'm setting up an automated transfer. The remote SFTP site is owned by a vendor, and we have somewhat unusual permissions: we can create new files and delete existing files, but we can neither overwrite nor append to existing files. This means that when we need to send a new file, the first thing we have to do is delete the existing copy.

My script looks like this:
# Set batch settings
option batch abort
option confirm off
 
# Connect
open ....
 
lcd "D:\SSIS\Package Name\Output"
 
rm /datafile1.csv
put -nopreservetime -nopermissions -transfer=ascii datafile1.csv
...
 
rm /datafile2.csv
put -nopreservetime -nopermissions -transfer=ascii datafile2.csv
...
 
close
exit
In reality there are about a dozen files that are sent.

This works fine in normal operation. The trouble comes when anything goes wrong. It's possible for one of the data files to be removed by another process, or for this process to fail after removing the data file (e.g., from a network-related error). When that happens, the task fails and retries a few minutes later. Now, however, the rm command fails because the file no longer exists to delete, and WinSCP treats that as an error.

I need an option to rm an explicitly named remote file, and have WinSCP not fail if the file already doesn't exist. There doesn't seem to be a way to do that.

I've tried option failonnomatch off, but I'm not using a wildcard, so that option does nothing, and there are enough similarly named files that using a wildcard would be a bad idea.

I've tried switching to batch on or batch continue before removing the file and then switching back to batch abort, and that works... except WinSCP still returns a non-zero exit code, so the SSIS package that calls WinSCP thinks the whole operation failed. That might seem acceptable: the file uploaded successfully, so the failure triggers a retry, which would then find the file in place to remove. In practice, though, it introduces a new race condition. The vendor's system will sometimes block the file deletion for some length of time – presumably during their ingestion process – so our process gets stuck in a failure loop and times out.
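For reference, the batch-continue variant I tried looks roughly like this, repeated per file (a sketch, not my exact production script):

```
# Tolerate a failed rm for just this one command
option batch continue
rm /datafile1.csv
option batch abort
put -nopreservetime -nopermissions -transfer=ascii datafile1.csv
```

The upload side behaves, but the failed rm still makes WinSCP exit non-zero, which is what trips up the calling SSIS package.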

The only thing I can think to do would be to rewrite this in PowerShell, but I'd rather avoid that to keep this package consistent with the others that we operate.
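For what it's worth, if I did go the PowerShell route, my understanding is that the WinSCP .NET assembly exposes Session.FileExists, which is exactly the delete-only-if-present check I'm after. A rough sketch – host name, credentials, fingerprint, and DLL path below are all placeholders:

```powershell
# Load the WinSCP .NET assembly (path is a placeholder)
Add-Type -Path "WinSCPnet.dll"

$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol              = [WinSCP.Protocol]::Sftp
    HostName              = "sftp.vendor.example"       # placeholder
    UserName              = "user"                      # placeholder
    SshHostKeyFingerprint = "ssh-rsa 2048 ..."          # placeholder
}

$session = New-Object WinSCP.Session
try {
    $session.Open($sessionOptions)

    # Mirror the script's put options
    $transferOptions = New-Object WinSCP.TransferOptions
    $transferOptions.TransferMode = [WinSCP.TransferMode]::Ascii
    $transferOptions.PreserveTimestamp = $False

    # Delete only if the file is actually there,
    # so a missing file is not an error
    if ($session.FileExists("/datafile1.csv")) {
        $session.RemoveFiles("/datafile1.csv").Check()
    }
    $session.PutFiles("D:\SSIS\Package Name\Output\datafile1.csv",
        "/", $False, $transferOptions).Check()
}
finally {
    $session.Dispose()
}
```

But that's a dozen files' worth of per-file logic I'd rather not maintain separately from our other packages.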



