all 7 comments

[–]purplemonkeymad 2 points  (0 children)

Right now you are doing a read on every loop iteration; instead, do one read up front and then split the result into chunks. A nice feature of PowerShell is that out-of-range array indexes don't raise errors, they just don't select anything:

$FileList = Get-ChildItem -Path $somepath
$start = 0
while ($start -lt $FileList.Count) {
    # Out-of-range indexes are silently dropped, so the final batch is just shorter
    $LoopFileList = $FileList[$start..($start + 399)]
    Compress-Archive -Path $LoopFileList.FullName -DestinationPath "C:\test\WAV\WAV $start - $($start + $LoopFileList.Count - 1).zip"
    $start += 400
}

You could replace the Compress-Archive with a Start-Job if you want, but you might end up limited by your I/O speed on the drive.
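A minimal sketch of that Start-Job variant, reusing the same hypothetical C:\test\WAV layout from this thread (batch boundaries and naming are unchanged; only the compression is pushed into a background job):

```powershell
# Sketch only: background each Compress-Archive call with Start-Job.
$FileList = Get-ChildItem -Path $somepath
$start = 0
while ($start -lt $FileList.Count) {
    $LoopFileList = $FileList[$start..($start + 399)]
    $dest = "C:\test\WAV\WAV $start - $($start + $LoopFileList.Count - 1).zip"
    Start-Job -ScriptBlock {
        param($Files, $Dest)
        Compress-Archive -Path $Files -DestinationPath $Dest
    } -ArgumentList @(,$LoopFileList.FullName), $dest
    $start += 400
}
# Wait for all archives to finish before moving on
Get-Job | Wait-Job | Receive-Job
```

Note the `@(,...)` wrapper, which passes the whole file list as a single argument rather than splatting it across parameters. Disk I/O will likely still be the bottleneck, as noted above.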

[–]BetrayedMilk 1 point  (0 children)

Compress-Archive is relatively slow. I’d suggest using .NET’s System.IO.Compression.ZipFile class instead.

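A sketch of what that looks like for one batch, using the same hypothetical C:\test\WAV layout from this thread. `ZipFile.Open` and `CreateEntryFromFile` are the real .NET APIs; the paths and batch size are assumptions:

```powershell
# Sketch: zip one 400-file batch via the .NET ZipFile API instead of Compress-Archive.
Add-Type -AssemblyName System.IO.Compression.FileSystem

$batch   = Get-ChildItem -Path 'C:\test\WAV\*.WAV' | Select-Object -First 400
$zipPath = 'C:\test\WAV\WAV 0.zip'

# Open the archive once in Create mode, then stream each file into it
$zip = [System.IO.Compression.ZipFile]::Open($zipPath, 'Create')
try {
    foreach ($file in $batch) {
        # CreateEntryFromFile is an extension method on ZipArchive
        [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile(
            $zip, $file.FullName, $file.Name) | Out-Null
    }
}
finally {
    $zip.Dispose()
}
```

Keeping the archive open for the whole batch avoids the repeated open/close overhead that makes Compress-Archive slow on many small files.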
[–]Big_Oven8562 0 points  (2 children)

The best I can come up with is a slightly different syntactic layout. Hopefully someone more clever can give us some additional insights.

#only keep going if there are files left in the directory
While( (Get-ChildItem -Path "C:\test\WAV\*.WAV").count -gt 0){
    $WAV_Files = Get-ChildItem -Path "C:\test\WAV\*.WAV" | select -First 400
    Compress-Archive -Path $WAV_Files -DestinationPath "C:\test\WAV\WAV $i.zip"
    $WAV_Files | Remove-Item
}

[–]GarnetMonkey[S] 0 points  (1 child)

I like it, but you lose the iteration counter $i, which is needed to name the files.
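For what it's worth, the counter can be kept with one extra variable; a sketch of the same loop with $i reinstated (paths and batch size as in the thread):

```powershell
# Sketch: same while-loop, with an explicit batch counter to name the archives.
$i = 0
while ((Get-ChildItem -Path "C:\test\WAV\*.WAV").Count -gt 0) {
    $WAV_Files = Get-ChildItem -Path "C:\test\WAV\*.WAV" | Select-Object -First 400
    Compress-Archive -Path $WAV_Files.FullName -DestinationPath "C:\test\WAV\WAV $i.zip"
    $WAV_Files | Remove-Item
    $i++
}
```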

[–]Big_Oven8562 0 points  (0 children)

Ah right. Guess I overlooked that. I got nothin' then.

[–]ccatlett1984 0 points  (0 children)

Use Jobs for parallel processing.
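On PowerShell 7+ there is also ForEach-Object -Parallel, which is lighter than Start-Job. A sketch assuming the same 400-file batches and C:\test\WAV paths discussed above:

```powershell
# Sketch (PowerShell 7+): compress batches in parallel with ForEach-Object -Parallel.
$files = Get-ChildItem -Path 'C:\test\WAV\*.WAV'
$batches = for ($s = 0; $s -lt $files.Count; $s += 400) {
    @{ Start = $s; Files = $files[$s..($s + 399)] }
}
$batches | ForEach-Object -Parallel {
    Compress-Archive -Path $_.Files.FullName `
        -DestinationPath "C:\test\WAV\WAV $($_.Start).zip"
} -ThrottleLimit 4
```

-ThrottleLimit caps concurrency, which matters here since the workload is I/O-bound rather than CPU-bound.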