Error on keeping local directory up to date with remote directory


tmc8295

Error on keeping local directory up to date with remote directory

I will admit I am not as well versed in this as I would like, and this may be a very simple problem.

I am testing the existing "Keep local directory up to date with remote directory" script to... you guessed it... automate keeping a local directory up to date with a remote directory.
For the remote directory, I am using an FTPS connection to the Citrix ShareFile service my company uses.
The local directory is obviously just where I want the folders and files to end up.

I first tested the script with a plain old FTP connection to a computer running FileZilla Server, just so I could try the script out and make sure it did everything I needed it to. It worked fine: the script ran at my designated intervals, etc. Perfect!


So I swapped in the necessary details to connect to the Citrix service instead. It connects fine and synchronizes the files and folders just like I want... and then it gets an error instead of waiting for its designated interval.

The error is:
"Error: Cannot bind argument to parameter 'ReferenceObject' because it is null."
"Press any key to exit..."


The only thing that changed between my test and the real run was going from an FTP connection to an FTPS connection... and it does work the first time, just not afterwards at the designated interval.
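
From searching the error message, my guess is that Compare-Object throws exactly this when its -ReferenceObject ends up null, which is easy to reproduce in a plain PowerShell session (this repro is my own, not taken from the script, and the path is just a placeholder):

    # Get-ChildItem on an empty folder returns nothing, so $files becomes $null
    $files = Get-ChildItem -Recurse -Path C:\some\empty\folder
    # Binding $null to -ReferenceObject fails with the same message:
    # "Cannot bind argument to parameter 'ReferenceObject' because it is null."
    Compare-Object -DifferenceObject @(1) -ReferenceObject $files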

Is it just something stupid I am doing, or something more complicated?

Thank you,
Thomas



tmc8295

Re: Error on keeping local directory up to date with remote directory

So after messing around with it on my own, I fixed it by removing these lines of code:

if ($delete)
{
    # scan for removed local files (the $result does not include them)
    $localFiles2 = Get-ChildItem -Recurse -Path $localPath
    $changes = Compare-Object -DifferenceObject $localFiles2 -ReferenceObject $localFiles

    $removedFiles =
        $changes |
        Where-Object -FilterScript { $_.SideIndicator -eq "<=" } |
        Select-Object -ExpandProperty InputObject

    # Print removed local files
    foreach ($removedFile in $removedFiles)
    {
        Write-Host ("{0} deleted" -f $removedFile)
        $changed = $True
    }

    $localFiles = $localFiles2
}

if ($changed)
{
    if ($beep)
    {
        [System.Console]::Beep()
    }
}
else
{
    Write-Host "No change."
}



Ideally I would like to keep this in the script if possible, so if anyone could point me in the right direction, that would be great.
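
In case it helps, here is my untested guess at a null-safe version of the same block (the @() wrapper, the explicit null check, and the count guard are my own additions; everything else is from the snippet above):

    if ($delete)
    {
        # @() forces an array even when the directory is empty,
        # so Get-ChildItem cannot hand us $null
        $localFiles2 = @(Get-ChildItem -Recurse -Path $localPath)
        if ($null -eq $localFiles) { $localFiles = @() }

        # Compare-Object refuses null or empty bindings, so only call it
        # when there is something on both sides to compare
        if (($localFiles.Count -gt 0) -and ($localFiles2.Count -gt 0))
        {
            $changes = Compare-Object -DifferenceObject $localFiles2 -ReferenceObject $localFiles

            $removedFiles =
                $changes |
                Where-Object -FilterScript { $_.SideIndicator -eq "<=" } |
                Select-Object -ExpandProperty InputObject

            # Print removed local files
            foreach ($removedFile in $removedFiles)
            {
                Write-Host ("{0} deleted" -f $removedFile)
                $changed = $True
            }
        }

        $localFiles = $localFiles2
    }

This still skips the corner case where every local file has been removed (one side empty), so those deletions would not be reported, but it should at least avoid the binding error.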


tmc8295

Re: Error on keeping local directory up to date with remote directory

Awesome! I am glad to know it is fixed; I tested the command out and it does indeed work on my end now. Perhaps you could answer another question for me?

I didn't see any obvious way of doing so, and I am still getting used to all the ins and outs of WinSCP. Is there a way I can schedule the WinSCP.com application to run this extension with Windows Task Scheduler? I know the script is set up so I can tell it to run every X amount of time, but I only plan on having it sync maybe every week or so. Using Task Scheduler would also ensure that it keeps syncing in the event of shutdowns or reboots, without having to make sure the program is running.

Thanks again for the assistance!

Thomas


martin
Site Admin

Re: Error on keeping local directory up to date with remote directory

So just remove the loop from the script and run the script from the Windows Task Scheduler.

But for such a trivial operation, you actually do not need a PowerShell script. A simple WinSCP script with the synchronize command will do too.

Start here:
https://winscp.net/eng/docs/guide_schedule
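
For example, a minimal sketch of such a script, with placeholder host, credentials and paths that you would replace with your own ShareFile details:

    open ftps://username:password@example.com/
    synchronize local "C:\local\path" "/remote/path"
    exit

Save it as, say, C:\scripts\sync.txt and point the scheduled task's action at winscp.com (assuming the default installation path):

    "C:\Program Files (x86)\WinSCP\winscp.com" /ini=nul /script=C:\scripts\sync.txt /log=C:\scripts\sync.log

The /ini=nul switch keeps the script independent of your GUI configuration, and /log gives you something to check after unattended runs.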


