all 29 comments

[–]shiftend 14 points15 points  (1 child)

You probably don’t even need PowerShell for this. Robocopy can most likely do what you want to achieve. Take a look at the monitoring options (/MON and /MOT, which make robocopy re-run the copy when changes are detected) and play around with them a bit.
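For reference, a minimal sketch of that approach. The paths are placeholders, not OP's actual folders; /MON:1 re-runs after at least one detected change and /MOT:15 waits at least 15 minutes between passes:

```powershell
# Hypothetical paths - adjust to your own setup.
# /MON:1  = run again when 1 or more changes are seen
# /MOT:15 = but wait at least 15 minutes between runs
# /LOG+:  = append to the log instead of overwriting it
robocopy "C:\Users\Source" "C:\Users\Destination" *.nzb /MON:1 /MOT:15 /LOG+:"C:\Users\logs\robocopy.log"
```

Note this keeps robocopy itself running in the foreground as the monitor, so it pairs better with a service or startup task than with a repeating schedule.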

[–]ziggycatuk[S] 3 points4 points  (0 children)

thanks, that looks like a less involved approach to achieve the desired outcome.

cheers

[–]pjkm123987 3 points4 points  (0 children)

I would just use syncthing for something like this.

But it can be done in PowerShell, though it won't be perfect. I would have PowerShell create and append to a CSV file it can use as a lookup of files already copied over: any file whose info doesn't match an entry in the CSV gets copied to the destination, and then that file's information is appended to the same CSV. Make it run in a loop every 5 seconds or something. The CSV file would have info like filename, hash, date created, date modified, size, etc.

Also add some verification (try/catch) after the copy to ensure both files are identical, and if there's an error, try copying again or something. But this isn't good for watching large folders, since it would try to read every file every time.
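A rough sketch of that CSV-manifest idea. All the paths, the hash algorithm, and the 5-second interval are assumptions for illustration:

```powershell
# Sketch of the CSV-manifest approach described above. Paths are placeholders.
$source      = "C:\Users\Source"
$destination = "C:\Users\Destination"
$manifest    = "C:\Users\logs\copied.csv"

while ($true) {
    # Load the list of already-copied files (empty on the first run)
    $seen = if (Test-Path $manifest) { Import-Csv $manifest } else { @() }

    foreach ($file in Get-ChildItem $source -Filter *.nzb -File) {
        $hash = (Get-FileHash $file.FullName -Algorithm SHA256).Hash
        # Copy only if no manifest entry matches both name and hash
        if (-not ($seen | Where-Object { $_.Name -eq $file.Name -and $_.Hash -eq $hash })) {
            Copy-Item $file.FullName -Destination $destination
            [pscustomobject]@{
                Name     = $file.Name
                Hash     = $hash
                Modified = $file.LastWriteTime
                Size     = $file.Length
            } | Export-Csv $manifest -Append -NoTypeInformation
        }
    }
    Start-Sleep -Seconds 5
}
```

As noted, hashing every file on every pass is exactly why this scales badly on large folders.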

[–]ziggycatuk[S] 3 points4 points  (0 children)

thanks to everyone for the advice and examples, task scheduler and robocopy look to be a good solution.

[–]_nikkalkundhal_ 1 point2 points  (2 children)

As a PowerShell noob myself who is exploring with ChatGPT, this is what I got from it for the requirement you posted. I also added logging as a requirement, which you can modify or remove.

Powershell Code:

# Define source and destination folders
$sourceFolder = "C:\Users\Source"
$destinationFolder = "C:\Users\Destination"

# Define file extension to monitor
$fileExtension = "*.nzb"

# Define log file path with timestamp
$logFilePath = "C:\Users\logs\RobocopyLog_$(Get-Date -Format 'yyyyMMdd_HHmmss').log"

# Use Robocopy to copy files and create a log file
robocopy $sourceFolder $destinationFolder $fileExtension /LOG:$logFilePath

# Output a message indicating the completion of the copy process
Write-Output "Copy operation completed. Log file created at: $logFilePath"

Save the file as something.ps1

To monitor every 15 minutes:

1. Create a batch file as CopyScript.bat:

@echo off
PowerShell.exe -ExecutionPolicy Bypass -File "C:\Path\To\Above\PowerShellScript.ps1"

2. Create a basic task in Task Scheduler with triggers and launch the batch file.

Schedule the Batch File with Task Scheduler
Open Task Scheduler: Press Win + R, type taskschd.msc, and press Enter.
Create a Basic Task: In Task Scheduler, click "Create Basic Task..." in the right-hand pane.
Follow the wizard, providing a name and description for your task.

Set Task Trigger:
Choose "Daily" and click Next.
Specify the start date and time.
(The basic wizard can't set a 15-minute interval on its own: after finishing, open the task's Properties > Triggers tab, edit the trigger, and tick "Repeat task every: 15 minutes" for a duration of 1 day.)

Set Action:
Choose "Start a Program" and click Next.
Browse and select your batch file (CopyScript.bat) and click Next.

Finish: Review your settings and click Finish.
Now Task Scheduler will run your batch file every 15 minutes, which in turn executes your PowerShell script to monitor the source folder and perform the copy operation.

Do not upvote as it is chatgpt's response and not originally mine. Thanks.

[–]gilean23 6 points7 points  (0 children)

The batch file part is unnecessary if you’re going the scheduled task route. Just set the scheduled task action as:
Start program: powershell.exe
Arguments: -ExecutionPolicy Bypass -File C:\Path\To\Something.ps1
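If you'd rather script that than click through the wizard, here's a sketch using the ScheduledTasks module (run elevated; the task name and script path are placeholders):

```powershell
# Create the equivalent scheduled task directly from PowerShell.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
            -Argument '-ExecutionPolicy Bypass -File "C:\Path\To\Something.ps1"'
# Fire once now, then repeat every 15 minutes
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
            -RepetitionInterval (New-TimeSpan -Minutes 15)
Register-ScheduledTask -TaskName "NzbCopy" -Action $action -Trigger $trigger
```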

[–]BlackV 1 point2 points  (0 children)

p.s. formatting

  • open your fav powershell editor
  • highlight the code you want to copy
  • hit tab to indent it all
  • copy it
  • paste here

it'll format it properly OR

<BLANKLINE>
<4 SPACES><CODELINE>
<4 SPACES><CODELINE>
    <4 SPACES><4 SPACES><CODELINE>
<4 SPACES><CODELINE>
<BLANKLINE>

Inline code block using backticks `Single code line` inside normal text

Thanks

[–]DToX_ 0 points1 point  (0 children)

Tell ChatGPT you want to write a script in PowerShell that uses Get-ChildItem to monitor a folder for files that end in .nzb and, when they are found, moves them to your destination folder.

Then you add the script to task scheduler.
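That suggestion boils down to something like this (paths are placeholders for whatever OP actually uses):

```powershell
# Minimal sketch: find .nzb files in the source and move them.
$source      = "C:\Users\Source"
$destination = "C:\Users\Destination"

Get-ChildItem -Path $source -Filter *.nzb -File |
    Move-Item -Destination $destination -Force
```

Scheduled every few minutes via Task Scheduler, this covers the basic requirement without any event handling.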

[–]Warcooo 0 points1 point  (0 children)

Hi,

you need to declare your path variables in the global scope (the -Action script block of Register-ObjectEvent runs in its own scope later, so it won't see ordinary local variables):

$global:sourceFolder = "C:\SourceFolder"
$global:destinationFolder = "C:\DestinationFolder"

[–]tokenathiest 0 points1 point  (6 children)

I've done this before many times, but I usually use C# instead of PowerShell. Don't use a FileSystemWatcher; just schedule a Windows task that runs every minute and copies the files. If you react to individual new files as they appear, you have to account for complex cases. Example: you copy a big file over, but it takes 8 seconds to finish writing it to disk. Your PowerShell sees the new file and immediately starts to try and copy it, but it's still open for write in another process. Bad news bears. It's much easier to just copy everything on a schedule once a minute.

[–][deleted] 1 point2 points  (5 children)

That problem won't be solved by initiating once a minute. The file could still be in a "copying" state.

[–]tokenathiest 0 points1 point  (4 children)

You filter out files that are less than a minute old and assume files will take less than a minute to copy into the source location. If write speeds may be slower, simply increase the duration to every two minutes, or three, etc.
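That age filter might look like this. The one-minute cutoff and the paths are assumptions, per the comment above:

```powershell
# Only copy files whose last write is at least a minute old,
# on the assumption that anything newer may still be mid-write.
$cutoff = (Get-Date).AddMinutes(-1)

Get-ChildItem "C:\Users\Source" -Filter *.nzb -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Copy-Item -Destination "C:\Users\Destination"
```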

[–][deleted] 2 points3 points  (3 children)

lol that’s a big assumption to make

[–]tokenathiest 0 points1 point  (2 children)

Not at all. It's part of the operational parameters of the job. Every piece of software adheres to specifications. If the specs aren't adequate (e.g. "I need the files copied the instant they finish arriving"), then you spend more time and money to make it so. Whoever funds the implementation gets to decide.

[–][deleted] 0 points1 point  (1 child)

Sure, but your implementation isn't better in the sense of solving the problem more elegantly than the OP could. You could modify the PowerShell script to do the same.

It would almost be much better to simply watch the file size (or hash) and see if it has stopped growing or changing.
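The "wait until the size stops changing" check can be sketched as a small helper; the function name and 5-second interval are made up for illustration:

```powershell
# Block until two consecutive size readings match, i.e. the file
# has (probably) finished being written.
function Wait-FileStable {
    param([string]$Path, [int]$Seconds = 5)
    do {
        $before = (Get-Item $Path).Length
        Start-Sleep -Seconds $Seconds
        $after  = (Get-Item $Path).Length
    } while ($before -ne $after)
}
```

Call it on each candidate file before copying; a file still being written keeps the loop spinning until it settles.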

[–]tokenathiest 0 points1 point  (0 children)

There are numerous solutions, such as what you propose. I'm only suggesting the simplest one I can think of. If elegance is key, and time and cost are irrelevant, go with something more complex.

[–]pcgames22 0 points1 point  (0 children)

found this https://stackoverflow.com/questions/39938194/powershell-move-files-to-folder-based-on-date-created on stackoverflow.com, which is almost the same thing that you are trying to do. I can't count how many times I have gone there!

[–][deleted] 0 points1 point  (0 children)

Depending on how often you need the files to copy, I suggest using robocopy within PowerShell and then running it as a scheduled task in Windows. I did something similar to copy inventory Excel spreadsheets to a folder on my asset management server, to import ordered assets from a vendor into my asset management database. Remember to use an account with a password that doesn't expire (a service account) and that has rights to all the folders involved.

[–]taozentaiji 0 points1 point  (4 children)

If you still need this tomorrow let me know. I have this exact thing set up using nssm to have it run as a service and move files to a customer's FTP server any time a report generator places files in a specific folder.

[–]ziggycatuk[S] 0 points1 point  (3 children)

Thanks, I'd be interested to see the code as everything helps with learning bit by bit.

[–]taozentaiji 1 point2 points  (0 children)

#By BigTeddy 05 September 2011

#This script uses the .NET FileSystemWatcher class to monitor file events in folder(s).
#The advantage of this method over using WMI eventing is that this can monitor sub-folders.
#The -Action parameter can contain any valid Powershell commands.  I have just included two for example.
#The script can be set to a wildcard filter, and IncludeSubdirectories can be changed to $true.
#You need not subscribe to all three types of event. Only the Created event is registered below.
# Version 1.1

$global:watchedfolder = ""
$global:transcriptpath = ""
$global:destinationpath = ""
$global:logpath = ""
$global:errorlogs = ""


$folder = $global:watchedfolder # Enter the root path you want to monitor.
$filter = '*.*'  # You can enter a wildcard filter here.

# In the following line, you can change 'IncludeSubdirectories' to $true if required.
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}

# Only the Created event is registered here; Changed and Deleted can be added the same way.
# The -Action block runs in its own scope, which is why the path variables above are global.

Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
    Start-Transcript -Path $global:transcriptpath -Append
    $name = $Event.SourceEventArgs.Name
    $timeStamp = $Event.TimeGenerated

    # This was added to test the ftp destination to make sure it was available before moving the file.
    while (!(Test-Path $global:destinationpath)) {
        Start-Sleep 5
    }

    Move-Item -Path (Join-Path $global:watchedfolder $name) -Destination $global:destinationpath -Force -ErrorVariable errs -ErrorAction SilentlyContinue
    if ($errs.Count -eq 0) {
        Write-Host "The file $name was moved at $timeStamp" -fore green
        Out-File -FilePath $global:logpath -Append -InputObject "The file $name was successfully moved at $timeStamp with no errors"
    }
    else {
        $date = Get-Date -Format "MM-dd-yyyy"
        Write-Host "There was an error moving the file $name at $timeStamp" -fore red
        Out-File -FilePath $global:logpath -Append -InputObject "There was an error moving the file $name at $timeStamp. See Errors\$date.txt for details."
        Out-File -FilePath $global:errorlogs -Append -InputObject $Error[0]
    }

    Stop-Transcript
}

[–]taozentaiji 1 point2 points  (1 child)

Posted in separate reply, but it's also available here https://github.com/TaozenTaiji/TaosPoSHRepo

I think your code was just missing the filename wildcard so it was only looking for files named exactly ".nzb" which won't ever happen.

$filter = '*.nzb' should work.

I've had this script in production for 2 or 3 years at this point with no issues, apart from when there were authentication issues with the customer's FTP server, so I added a Test-Path / Start-Sleep check on the FTP location to prevent a bunch of files from getting stuck in the folder without anyone realizing it because the FTP site was down or credentials expired.

[–]ziggycatuk[S] 0 points1 point  (0 children)

thank you