And that deletes files whose timestamp is more than 3 days old, but the vendor is sending me files that are over a month old with an updated timestamp. Because of this, they are sending hundreds of files instead of a handful. I need to download these files in six different regions to different servers, the connection is slow, and I need to do this hourly. When there are hundreds of files, the connection sometimes times out while listing the folder, or I don't finish downloading all the files in time.
I keep asking the vendor to stop sending me old reports with new timestamps, but months later they still have not fixed it on their end.
Can I create a PowerShell or VBScript script that connects, parses each file name to determine a date, compares that to the server date, and deletes the old files before I try to download everything in that folder?
For example, I have a file named:
The number string at the end is a ten-digit number, with two digits in order for each of:
MM = Month
DD = Day
YY = Year
HH = Hour
MI = Minute
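The parsing half is straightforward in PowerShell on its own. Here is a minimal sketch; the file name `Report_0315240930.csv` and the regex are assumptions for illustration, so adjust them to the real naming pattern:

```powershell
# Sketch: parse the trailing 10-digit MMDDYYHHMI stamp out of a file name.
# "Report_0315240930.csv" is a made-up example name.
function Get-DateFromFileName {
    param([string]$FileName)

    # Capture the last run of exactly 10 digits just before the extension.
    if ($FileName -match '(\d{10})\.\w+$') {
        # Map MM DD YY HH MI onto the captured digits.
        return [datetime]::ParseExact($Matches[1], 'MMddyyHHmm', $null)
    }
    return $null   # name does not match the expected pattern
}

$cutoff = (Get-Date).AddDays(-3)
$stamp  = Get-DateFromFileName 'Report_0315240930.csv'
if ($stamp -and $stamp -lt $cutoff) {
    Write-Host "Older than 3 days: $stamp"
}
```

Returning `$null` for non-matching names lets the caller skip files that don't follow the convention instead of mis-deleting them.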
In pure PowerShell or VBScript I can parse the file name to get a date, but how would I combine this with a WinSCP script to delete files more than 3 days old based on the filename, and then download only the past 3 days of files? Even then, I would preferably do a synchronize or a "get * -resume -neweronly" so I only download the latest files I don't already have locally.
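One way to combine the two steps is the WinSCP .NET assembly rather than the text scripting interface, since it lets PowerShell make the filename-based decision per file. The sketch below assumes WinSCPnet.dll is installed at the default path; the host name, credentials, fingerprint, paths, and regex are all placeholders to replace with real values:

```powershell
# Sketch using the WinSCP .NET assembly: delete remote files whose *name*
# encodes a stamp older than 3 days, then sync down only what's missing.
Add-Type -Path 'C:\Program Files (x86)\WinSCP\WinSCPnet.dll'

$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol              = [WinSCP.Protocol]::Sftp
    HostName              = 'ftp.example.com'        # placeholder
    UserName              = 'user'                   # placeholder
    Password              = 'password'               # placeholder
    SshHostKeyFingerprint = 'ssh-rsa 2048 ...'       # placeholder
}

$remotePath = '/reports/'
$localPath  = 'C:\Downloads\reports\'
$cutoff     = (Get-Date).AddDays(-3)

$session = New-Object WinSCP.Session
try {
    $session.Open($sessionOptions)

    # Pass 1: list once, delete anything older than the cutoff by
    # filename stamp, before attempting any transfer.
    foreach ($file in $session.ListDirectory($remotePath).Files) {
        if ($file.IsDirectory) { continue }
        if ($file.Name -match '(\d{10})\.\w+$') {
            $stamp = [datetime]::ParseExact($Matches[1], 'MMddyyHHmm', $null)
            if ($stamp -lt $cutoff) {
                $session.RemoveFiles(
                    $session.EscapeFileMask($remotePath + $file.Name)).Check()
            }
        }
    }

    # Pass 2: only the last 3 days remain remotely, so a one-way
    # synchronize pulls just the files not already present locally
    # (the rough equivalent of "synchronize local" / -neweronly).
    $session.SynchronizeDirectories(
        [WinSCP.SynchronizationMode]::Local,
        $localPath, $remotePath, $false).Check()
}
finally {
    $session.Dispose()
}
```

Deleting first keeps the directory listing and the synchronize pass small, which should help with the timeouts; scheduling this hourly per region is then just a matter of running the script with each region's session options.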