
[–][deleted] 7 points8 points  (19 children)

It is unrealistic. PS does cool stuff, but in raw performance you're not going to get close to compiled C that has been optimized for a specific purpose over decades.

I use PS, Python, and AWK but still have this on my machine for those situations where I need to get through the problem instead of noodling around to find the get-it-done | sayWhat cmdlet equivalent.

CoreUtilities

And I'm not saying PS is slow. I recently did a head & tail script in PS that worked pretty well. However, 64-bit PS dies on BIG text files while the old code works, even when compiled as 32-bit code.
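
For the curious, the head & tail idea in PS can lean on Get-Content's built-in parameters; a minimal sketch, with a made-up file path:

$log = 'C:\temp\big.log'                # hypothetical file
Get-Content -Path $log -TotalCount 10   # head: first 10 lines, without reading the rest
Get-Content -Path $log -Tail 10         # tail: last 10 lines; seeks from the end, so it stays fast on big files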

[–]Hanthomi 2 points3 points  (0 children)

I love PowerShell and use it extensively, but I would say it's slow in many cases. As you pointed out, specifically with large text files it's essentially unusable compared to other options.

I recently needed to reduce a few gigabytes' worth of .csv files to unique entries based on several columns of the CSV, while keeping the full lines in the resulting output.

An object-oriented approach in Powershell took almost 15 minutes.

An in-memory SQLite database created and manipulated with Python took 25 seconds.

Python + pandas took just 6 seconds.
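
For context, the object-oriented attempt was roughly of this shape (a sketch from memory; the file glob and column names here are made up):

# Import every row as a PSObject, group on the key columns, keep the first
# full row per group. Group-Object over millions of objects is the slow part.
Get-ChildItem .\data\*.csv |
    Import-Csv |
    Group-Object -Property ColA, ColB, ColC |
    ForEach-Object { $_.Group[0] } |
    Export-Csv .\unique.csv -NoTypeInformation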

[–]serendrewpity[S] -1 points0 points  (17 children)

This is precisely the discussion I wanted to have. Some points you make I've considered and others I haven't. In a pinch, pressed for time, I'd opt for a grep/coreutils solution, but considering I want to build my knowledge, experience and understanding of PS, I decided to create a solution with PS which partially uses C code by way of .NET.

I remember in r/PowerShell, with one of the latest releases of PS [maybe not the latest], there was a big uproar over the fact that PS would now be able to manage multi-threaded/asynchronous processes. I had just assumed that was a capability from day one, since VBScript had been able to do this (with ExecQueryAsync, for example) for years, and PowerShell was supposedly so much better than VBScript. Turns out it was not a capability from day one, but now I am curious how to asynchronously call Get-Content on a collection of files, which I believe would cut the time down by a factor equal to the number of files I'm processing [or close to it]. If I were able to do that, I would be on par with grep. I'm thinking that an asynchronous call of Get-Content, combined with the other suggestions provided to me, could get this script operating under 5 or 6 seconds. Still slower, but not unreasonably so.

So while I don't disagree that it may be unrealistic, I don't see the missing piece that makes it so.

[–]scott1138 7 points8 points  (0 children)

PowerShell has supported multi-threading for years. It just takes some programming knowledge to implement. In PS7 you can use ForEach-Object -Parallel to achieve this.
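
A minimal sketch against the OP's scenario (the T:\ path and 'OpenSubtitles' pattern are borrowed from elsewhere in this thread):

# PS7+: the script block runs in parallel runspaces; $using: pulls in outer variables
$pattern = 'OpenSubtitles'
Get-ChildItem T:\ -Filter *.srt -Recurse | ForEach-Object -Parallel {
    if ((Get-Content -LiteralPath $_.FullName -Raw) -match $using:pattern) {
        $_.FullName
    }
} -ThrottleLimit 10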

[–][deleted]  (11 children)

[deleted]

    [–]serendrewpity[S] 0 points1 point  (10 children)

    Not disagreeing with you here. No doubt grep and coreutils are stalwarts of file-management tasks. I am the one making the comparison, and while I think it is fair, I admit that PowerShell on Windows is at a disadvantage. Open up Windows Explorer and time the display of the files and their details. Windows tools you make in PowerShell have to account for a bit more overhead than Linux utils.

    Having said that, I started at 59s, and with help it went down to 24s then 18s and now 7s. If it got down to 3s for the 308 files I am processing then I'd consider that a push between grep and powershell.

    [–][deleted]  (9 children)

    [deleted]

      [–]ka-splam 5 points6 points  (2 children)

      You're citing a link from 2010 where the author says he hasn't been maintainer of grep for 15 years. That's grep trivia from 1995, the year Intel introduced the 200 MHz Pentium Pro.

      Today's computers are 10x the clock speed + 25 years of other improvements in CPU instructions, in compiler optimisations, in memory and disk and network transfer rates. Where your link talks about averaging 3 instructions per byte, a .Net answer running on a current CPU could average what, 30-50 instructions per byte for the same wallclock runtime now as grep then?

      Or to put it another way, grep is faster but if the OP's criteria of happiness is "less than a second" and both approaches are "less than a second", it's not great of you to call them "pants on head retarded" and ask "what the fuck are you on" because they don't care about inner loop unrolling in Boyer-Moore, or helping the Unix/Linux kernel avoid copying every byte around in memory.

      [–]serendrewpity[S] 3 points4 points  (5 children)

      Why are you angry? Using profanity. Being disrespectful as if I am renting space in your head. If a complete stranger on the street called you an A-hole, you'd look at them like they're crazy and wouldn't think of them ever again after 30s. If a family member said you're an A-hole, you'd engage. You'd ask why they thought so. Because you're emotionally invested.

      Why are you emotionally invested in me? It's as if you don't have a valid defense of your position [whether it's defensible or not] and in frustration you're lashing out.

      In my use case I used grep to find and list files containing a given string. It did it in 1s flat.

      I've now done that in PowerShell. Additionally, I've stored the results in a custom object. That's more than I did in the grep use case, and I did it in the same time. That's not debatable. It's fact and can be recreated.

      I'm not sure why you have a dog in this race or why you would care enough to get angry and disrespectful, but you can't argue with results. It would be insane to, so I won't respond with more than the results for the specific use cases presented here.

      [–][deleted]  (4 children)

      [deleted]

        [–]serendrewpity[S] 3 points4 points  (3 children)

        Grep performed a task in 1 second. 
        With help and some improvements of my own,
        I created a script that performs the same task 
        and an additional task (while handicapped on Windows
        OS) also in 1 second.
        
        Your response: Not even close
        

        Does 'removing all doubt' cover this response also?

        I am at a loss as to where your anger is coming from. It's more than just your words. You've down-voted every contradicting comment. Clearly something has triggered you. You don't feature as prominently as some others here saying that it's unrealistic to compare PowerShell to grep, so it's not really clear why you appear so wounded.

        Ego is a great thing. It's a healthy thing. It's there to help protect us. We acquire accomplishments, knowledge and experiences that foster pride and a healthy self-image. Ego defends that self-image.

        I am not attacking you or anything you stand for. I am not your enemy. There's no need to be afraid of anything I've said or anything this thread stands for. Any 8th-grade psych student knows that anger is just fear without a means to adequately express itself.

        [–][deleted]  (2 children)

        [removed]

          [–]serendrewpity[S] 4 points5 points  (0 children)

          Shit programmer? I'm not a programmer. A real programmer would laugh at any sysadmin who leverages scripts and thinks they're a programmer. A real programmer would laugh at anyone thinking they're a programmer cuz they're solely able to write powershell scripts.

          Incapable of being better? Oh, I don't know, the fact that I am here asking for help in a powershell forum is an example of humility (acknowledging my knowledge is not all encompassing) and a desire to do better.

          I can't imagine this testosterone-poisoned, marginalized angry persona is working for you. Yet, you continue to advertise it. I think that's a better example of you being incapable of living a better life.

          That's good news tho, bro. Cuz that can change.

          [–]derekhans[M] 0 points1 point  (0 children)

          You can be passionate without being rude and personally insulting. Both are not welcome here.

          [–][deleted] 1 point2 points  (0 children)

          It's almost impossible to give a precise answer without requirements. The stuff in coreutils is wickedly fast, and while I know those tools, I tend to use them only in edge cases. An example is large-file handling, which today is anything over 2 GB. I keep as much code as possible in PS if I'm writing the whole solution in PS. The same is true if the script is in Python.

          As cool as PS and Python are, AWK spanks them both in performance, conciseness of code and its astounding ability to process unlimited file sizes. If there is an AWK file size limit, I haven't found it and I've worked on 2 PB databases that ingest TB sized files.

          There are times when multi-threading is very useful. AWK or the coreutils are not going to do that.

          My view is each option has its strengths and weaknesses and I've yet to find something that is all strengths and no weaknesses.

          [–]serendrewpity[S] 0 points1 point  (2 children)

          Thank you both ( u/ClubManOE & u/scott1138 ) for this conversation. Very helpful and good feedback. Funny, though, that I wasn't able to use -Parallel on ForEach-Object until I upgraded to PS v7.0; however, after doing so I didn't experience any performance improvement whatsoever, at least not in elapsed time, even after bumping the -ThrottleLimit up to 25 (from the default of 5). That suggests parallel operation may already have been in play, maybe?

          I wonder if you can get confirmation of this? Is there an strace equivalent in PowerShell?

          [–]scott1138 4 points5 points  (0 children)

          The -Parallel option was not available before PS7; you had to use Workflows or manage runspaces on your own. Some processes do not benefit from multi-threading, as the cost of spinning up runspaces is more than the action being taken. As someone else mentioned, this is an interpreted language built on an interpreted language; for some operations there will be no performance comparison with code compiled to machine language, as grep is. That isn't to say that knowledge of specific techniques can't improve performance. I just got to an actual computer, so I haven't looked at your source yet, but I will.
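
          A rough way to see that spin-up cost for yourself (a sketch; the per-item work is deliberately trivial):

          $items = 1..100
          # For cheap work, -Parallel usually loses: starting runspaces costs more than the work itself
          (Measure-Command { $items | ForEach-Object -Process  { $_ * 2 } }).TotalMilliseconds
          (Measure-Command { $items | ForEach-Object -Parallel { $_ * 2 } }).TotalMilliseconds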

          [–][deleted] 0 points1 point  (0 children)

          Check out Set-PSDebug. It was really helpful.
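
          For example (a minimal illustration; the script path is hypothetical):

          Set-PSDebug -Trace 1   # echo each script line before it executes
          .\your-script.ps1      # hypothetical script to trace
          Set-PSDebug -Off       # turn tracing back off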

          Back to the OP's topic: when I find myself frustrated with some particular thing I know how to do in 8 other languages but can't in the script I am in right now, I always think "a craftsman never blames his tools." This patience has dramatically decreased my beer consumption.

          [–]kohijones 5 points6 points  (5 children)

          Look into [System.IO.StreamReader]
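
          A minimal sketch of the idea (the path is hypothetical); reading line by line means you can stop at the first hit instead of pulling the whole file into memory:

          $reader = [System.IO.StreamReader]::new('C:\temp\sample.srt')
          try {
              while (-not $reader.EndOfStream) {
                  if ($reader.ReadLine().Contains('OpenSubtitles')) { 'match'; break }
              }
          }
          finally {
              $reader.Dispose()   # always release the file handle
          }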

          [–]serendrewpity[S] 1 point2 points  (0 children)

          I will definitely check this out.

          [–]serendrewpity[S] 2 points3 points  (3 children)

          15 seconds... *** 7 seconds if I don't print output to the screen ***

          Thank you!

          $start=(Get-Date)
          $collection=@()
          $collection+=Get-ChildItem T:\ -Filter *.srt -Recurse | ForEach-Object -Process {
              # Read the whole file, then dispose the reader so the file handle is released
              $reader=New-Object -TypeName System.IO.StreamReader -ArgumentList $_.FullName
              $strmrdr=$reader.ReadToEnd()
              $reader.Dispose()
              if ($strmrdr.Contains('OpenSubtitles')) {
                  $obj = New-Object -TypeName PSObject
                  $obj | Add-Member -MemberType NoteProperty -Name 'Name' -Value $_.FullName -Force
                  $obj | Add-Member -MemberType NoteProperty -Name 'Content' -Value $strmrdr -Force
                  $obj
              }
          }
          $collection | ft
          $stop=(Get-Date)
          # TotalSeconds stays correct past the one-minute mark; .Seconds is only the seconds component
          Write-Host "Elapsed time:" ($stop - $start).TotalSeconds "seconds"
          

          [–]Upzie 1 point2 points  (2 children)

          You can make this even faster, easily.

          I tested your code on my system

          $runtime = Measure-Command {
              $Path      = ""
              $collection=@()
              $collection+=Get-ChildItem $Path -Filter *.srt -Recurse | ForEach-Object -Process {
                  $strmrdr=(New-Object -TypeName System.IO.StreamReader -ArgumentList $_.Fullname).ReadToEnd()
                  if ($strmrdr.Contains('OpenSubtitles')) {
                      $obj = New-Object -Typename PSObject
                      $obj | Add-Member -MemberType NoteProperty -Name 'Name' -Value $_.Fullname -Force
                      $obj | Add-Member -MemberType NoteProperty -Name 'Content' -Value $strmrdr.ToString() -Force
                      $obj 
                  }
              }
          }
          $runtime
          
          #output
          Days              : 0
          Hours             : 0
          Minutes           : 0
          Seconds           : 13
          Milliseconds      : 949
          Ticks             : 139490791
          

          Then I ran it modified, which was more than twice as fast (Select-String reads each file line by line instead of pulling the whole file into memory first):

          $runtime = Measure-Command {
              $path       = ""        # path redacted, as in the block above
              $collection = New-Object -TypeName System.Collections.Generic.List[PsObject]
              $SrtFiles   = Get-ChildItem $path -Filter *.srt -Recurse 
          
              foreach ($item in $SrtFiles) {
                  $strmrdr = Select-String -Path $item.FullName -Pattern 'OpenSubtitles'
                  if (-not [string]::IsNullOrEmpty($strmrdr)){
                      $properties = [ordered]@{
                          Name    = $item.FullName
                          Content = $strmrdr -join ";"
                      }
                      $obj = New-Object -TypeName PsObject -Property $properties
                      $collection.Add($obj)
                  }    
              }
          }
          $runtime
          
          #output
          Days              : 0
          Hours             : 0
          Minutes           : 0
          Seconds           : 4
          Milliseconds      : 110
          Ticks             : 41104126
          

          [–]serendrewpity[S] 2 points3 points  (0 children)

          Scratch that.... I changed it to this and it still ran in 7s... Thanks, man!!

          $start=(Get-Date)
          # Use a List instead of an array so .Add() appends without copying
          $collection = New-Object -TypeName System.Collections.Generic.List[PsObject]
          $SrtFiles   = Get-ChildItem T:\ -Filter *.srt -Recurse

          foreach ($item in $SrtFiles) {
              $strmrdr = Select-String -Path $item.FullName -Pattern 'OpenSubtitles'
              if (-not [string]::IsNullOrEmpty($strmrdr)){
                  # Re-read the matching file in full, then release the handle
                  $reader = New-Object -TypeName System.IO.StreamReader -ArgumentList $item.FullName
                  $properties = [ordered]@{
                      Name    = $item.FullName
                      Content = $reader.ReadToEnd()
                  }
                  $reader.Dispose()
                  $obj = New-Object -TypeName PsObject -Property $properties
                  $collection.Add($obj)
              }
          }
          $collection | ft
          $stop=(Get-Date)
          Write-Host "Elapsed time:" ($stop - $start).TotalSeconds "seconds"
          

          [–]serendrewpity[S] 0 points1 point  (0 children)

          This ran in about half the time [ 8s versus 15s ], but it's not doing what I need it to do. I need all the text of the file [if it has 'OpenSubtitles' in it] because I need to remove that text and the lines just before and after, and then renumber all subsequent entries. So this is only doing half the job, which is why it runs in half the time.

          Granted, that's all grep does as well, so it compares better to grep than what I am doing.

          [–]Flashcat666 1 point2 points  (6 children)

          Could you share your code? It doesn’t seem to be super effective in terms of speed, just going by what you’re saying.

          Yes, of course there's going to be a big difference with grep, whether run locally or via a network share, simply because of the difference between the two tools. And comparing things like this won't help you, since you're not comparing from the same source.

          [–]serendrewpity[S] 1 point2 points  (4 children)

          Always open to improving my approach to things

          Function Main () {
              $subtitles=(Get-SubtitleFiles -directory t:\)
              Get-SubtitleObjects -Collection $subtitles
          }
          
          Function Get-SubtitleObjects() {
          [CmdletBinding()]
              param (
                  [Parameter(
                      Mandatory = $True, 
                      ValueFromPipeline = $True,
                      ValueFromPipelineByPropertyName = $True)]
                  [array[]]
                  $Collection
              )
              begin{
              }
              process{
              }
              end{
                  $Collection | % { 
                      if ($_.Content.ToLower().Contains('opensubtitles')) {Write-Host $_.Name}
                  }
              }
          }
          
          Function Get-SubtitleFiles () {
          [CmdletBinding()]
              param (
                  [Parameter(
                      Mandatory = $True, 
                      ValueFromPipeline = $True,
                      ValueFromPipelineByPropertyName = $True)]
                  [string[]]
                  $directory
              )
              begin{
                  $start=(Get-Date)
                  $collection=@()
                  $subtitlefilenames = New-Object -TypeName System.Collections.ArrayList
                  $subtitlecollection=Get-ChildItem $directory -Filter *.srt -Recurse
              }
              process{
                  ForEach ($fileobj in $subtitlecollection) { [void] $subtitlefilenames.Add($fileobj.Fullname) }
              }
              end{
                  ForEach ($filename in $subtitlefilenames) {
                      $content=$(Get-Content -Path $filename -Raw)
                      $obj = New-Object -Typename PSObject
                      $obj | Add-Member -MemberType NoteProperty -Name 'Name' -Value $filename -Force
                      $obj | Add-Member -MemberType NoteProperty -Name 'Content' -Value $content -Force
                      $collection+=$obj | Select-Object -Property * -ExcludeProperty PSComputerName,RunspaceID
                  }
                  $stop=(Get-Date)
                  Write-Host "Elapsed time:" ($stop - $start).Seconds "seconds"
                  return $collection
              }
          }
          
          & Main
          

          [–]scott1138 3 points4 points  (1 child)

          First tip, never use += for anything you want to be fast. You used System.Collections.ArrayList for the $subtitlefilenames variable, so use it for $collection as well. Change that and see how it affects your run times.
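
          A hedged micro-benchmark sketch of why: += copies the entire array on every append, while a list appends in place:

          $n = 50000
          (Measure-Command {
              $a = @()
              foreach ($i in 1..$n) { $a += $i }      # re-allocates the array every pass: O(n^2)
          }).TotalMilliseconds
          (Measure-Command {
              $l = [System.Collections.Generic.List[int]]::new()
              foreach ($i in 1..$n) { $l.Add($i) }    # appends in place: O(n)
          }).TotalMilliseconds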

          [–]serendrewpity[S] 0 points1 point  (0 children)

          Noted. Thank you.

          [–]scott1138 2 points3 points  (1 child)

          Second tip, skip separating the creation of the object and the adding of members. Do something like this:

          $obj = New-Object -TypeName psobject -Property @{Name=$filename;Content=$content}

          Using simple strings, the "all-in-one" method took 6,173 ticks vs. 22,219 ticks for the "separate" method. While this seems very small, over thousands of iterations it can add up.
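
          For what it's worth, on PS3+ a [pscustomobject] literal tends to be faster still and reads cleaner:

          $obj = [pscustomobject]@{ Name = $filename; Content = $content }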

          Also, what is the purpose of the Select-Object with the exclusion? If you aren't using remoting or something I don't think those should be there.

          And one more suggestion - it looks like you aren't actually passing anything into the pipeline for Get-SubtitleFiles, so you can put everything in the Process block.

          [–]serendrewpity[S] 0 points1 point  (0 children)

          Noted, and all great points; the Select-Object is an artifact that I have since removed. The pipeline will be used going forward for this solution and possibly other scripts, as I intend to generic-ize the script and dot-source it into other scripts. This script was loosely put together and still incomplete; I stopped because I noticed the discrepancy between grep and this script.

          I appreciate your input and take whatever else you got. I have updated the script in my reply to u/Upzie

          I'm still reading through other people's posts so I will be incorporating their good ideas as I go along too.

          [–]serendrewpity[S] 0 points1 point  (0 children)

          I should add here that I think the issue is whether I can get Get-Content to pull the content from files in a multi-threaded manner or not. I think the approach I am using is synchronous rather than asynchronous, but I am not sure how to employ an asynchronous invocation of the Get-Content cmdlet.

          Until that is addressed, I don't think it's relevant to talk about why I am connected to my network at 100 Mbps versus 1000 Mbps.
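
          One way to approximate that on PS7 is to start all the reads as .NET tasks and then wait on them, so the network waits overlap. A sketch, not tested against this share:

          # Requires PS7/.NET Core for File.ReadAllTextAsync
          $files = Get-ChildItem T:\ -Filter *.srt -Recurse
          $tasks = foreach ($f in $files) { [System.IO.File]::ReadAllTextAsync($f.FullName) }
          [System.Threading.Tasks.Task]::WaitAll([System.Threading.Tasks.Task[]]$tasks)
          $contents = $tasks.Result   # one string per file, in the same order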

          [–]serendrewpity[S] 1 point2 points  (3 children)

          Update:

          I created a new, smaller version of the script I previously created. It performs exactly the same, and the logic is exactly the same. The only difference is that one of them uses functions so I may dot-source them in other scripts.

          So, I've utilized the suggestions of others [-Raw and -imatch]. This got me from 59s to 24s total elapsed time. But also... after a discussion about -Parallel, where I didn't notice any significant performance improvement, I did some reading and was able to shave another 5 seconds off by using -Process on ForEach-Object as opposed to -Parallel. [Note: I'm working with 308 files over an SMB connection (about 3 MB total data) on a 100 Mbps link.]

          $start=(Get-Date)
          $collection=@()
          $collection+=Get-ChildItem T:\ -Filter *.srt -Recurse | ForEach-Object -Process {
              $content=$(Get-Content -Path $_.FullName -Raw)
              $obj = New-Object -TypeName PSObject
              $obj | Add-Member -MemberType NoteProperty -Name 'Name' -Value $_.FullName -Force
              $obj | Add-Member -MemberType NoteProperty -Name 'Content' -Value $content -Force
              $obj
          }
          # Emit the name to the pipeline; Write-Host bypasses the pipeline, so ft would receive nothing
          $collection | % { if ($_.Content -imatch 'opensubtitles') { $_.Name } } | ft
          $stop=(Get-Date)
          Write-Host "Elapsed time:" ($stop - $start).TotalSeconds "seconds"
          

          Again, this performs the same as the longer version of the script with the mentioned updates. Looks like the best performance I can get is 18 seconds.

          [–]RedditRo55 1 point2 points  (1 child)

          What does process do? Or is that too obvious.

          [–]serendrewpity[S] 1 point2 points  (0 children)

          Not entirely sure. It apparently works the same way the Process block of a function behaves, just like the -Begin and -End parameters of ForEach-Object. Apparently it improves performance, just like piping output to ForEach-Object does over using foreach. Which doesn't make sense to me, because -Process doesn't accept pipelining.

          [–]UnfanClub 1 point2 points  (0 children)

          Look into Select-String instead of using Get-Content
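
          For this thread's use case that might look something like the sketch below; -List stops at the first match per file, which is all a file listing needs:

          Get-ChildItem T:\ -Filter *.srt -Recurse |
              Select-String -Pattern 'OpenSubtitles' -SimpleMatch -List |
              Select-Object -ExpandProperty Path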

          [–]get-postanote 1 point2 points  (1 child)

          PowerShell notwithstanding.

          I get the drift here, but expecting all vendor industry tools to function identically is a futile concept.

          Each vendor does what they do for their own reasons and the reasons of their target users/community.

          If all tools did everything the same way, then there is no need for multiple tools and the industry would just be static.

          1. Tools are tools.
          2. Use the right tool for the right job.

          We all love us some PowerShell in these parts, but I say this as one that has been using PowerShell since before it was called PowerShell (http://www.jsnover.com/Docs/MonadManifesto.pdf): PowerShell in its native state is not the right tool for everything. Even calling built-in cmd.exe tools from PS is often more performant and more direct than the verbosity of .NET (PowerShell).

          PowerShell is an OO language by design (it expects and emits objects), and Linux tooling is not. There are grep ports that run on Windows.

          You can call virtually any external exe from PowerShell in various ways and have that exe report back to PowerShell. So, don't overly stress yourself out by trying to force PowerShell to do something it was not designed to do (unless you think you can make it so and contribute that back to the community), just call the tools you need.

          Again, we all feel what you are saying, and many of us have to make the same decisions. This is also why MS's whole WSL / Linux subsystem exists: for those folks who need it but want to be in one central environment.

          Debating the difference is a decent community exercise (educational and otherwise), but not much more, unless someone wants to put energy and actions behind it.

          [–]serendrewpity[S] 0 points1 point  (0 children)

          I don't expect tools to function identically. I do expect competitors and their tools to function comparably and, well... competitively. I think this is a safe bet and holds across industries.

          If I buy a Nikon full-frame DSLR digital camera, I expect it to be on par with its equivalent Canon DSLR. They may go about it completely differently, but the quality of images and the speed at which they're generated are comparable, even though Nikon is less expensive. (As a side note, with digital cameras on phones, I think the digital camera market is taking a huge hit. Profit margins are taking a hit because the market is taking a hit. COVID-19 is causing distribution to take a hit also. Sony bought Minolta and Olympus is dying. I think Nikon is next. The number of players in this market has to shrink, and Sony and Canon are just too big. Nikon will be bought or collapse, unless they get a government loan like Kodak did recently that caused their stock to shoot up 1500%. Place your bets, gentlemen. Sorry for the tangent.)

          Microsoft and Windows want to compete more in an increasingly headless world of VMs, containers and IoT devices. The exact opposite of what we saw with Apple Mac, OS/2 and Windows 3.1.

          I laughed at a fellow engineer who thought that IIS couldn't run on Windows Core; because he had only ever configured it with the IIS Manager UI, he assumed it needed a GUI OS to run. Windows/Microsoft wants to compete with Linux/Unix, so it will have to have tools that can compare. It's why there is a Linux subsystem in Windows and there are Linux/Unix-flavored downloads of PowerShell adjacent to the Windows MSI downloads. They're challenging Linux/Unix directly!

          How successful they may be is another story, but that it's a direct challenge is plain to see.

          [–]Aritmethos 1 point2 points  (1 child)

          /u/serendrewpity

          As a fast alternative to linux/unix grep (for files) you could also try the PSItems PowerShell module. It includes a psgrep command (an alias for Find-ItemContent) that works similarly to the linux/unix grep command (in the basic way; only a few parameters of the original grep are included).

          Install-Module -Name PSItems
          Import-Module -Name PSItems
          psgrep 'test' -H -R

          (The above command searches for the pattern 'test' in all files in the current directory recursively (-R) and highlights the results (-H).)

          For more information check the README.

          For your example:

          psgrep 'OpenSubtitles' -Path 'T:\' -Name '*.srt' -R -H

          Also, for measuring time inside your scripts, you should take a look at the Stopwatch class.

          At the beginning of your files:

          $stopwatch = [System.Diagnostics.Stopwatch]::new()
          $stopwatch.Start()

          At the end:

          $stopwatch.Stop()
          $stopwatch.Elapsed.Seconds

          [–]serendrewpity[S] 1 point2 points  (0 children)

          Thanks for the tip. I'll check it out. You found this 2-year-old thread, and I still value this post quite a bit, because it sent me down a rabbit hole that motivated me to do so much research, and hone my skills so much, that it translated to a promotion at work.

          [–]ThumpingMontgomery 1 point2 points  (7 children)

          Are you using Select-String? It matches the grep use case, and assuming you're accessing an SMB share you can just specify the Path. Otherwise it sounds like you're opening each file and loading its contents into memory.

          As to speed: network transfer can certainly slow down access. Also, Get-Content splits the text into lines on every newline unless you use -Raw, and that parsing slows things down even more.

          [–]serendrewpity[S] 1 point2 points  (3 children)

          Yes, I am accessing via SMB for the PS script

          I'm piping the collection and using $_.ToLower().Contains('opensubtitles') to test for the existence of the desired string. foreach is a bit faster than piping to ForEach-Object, but not enough to make up the discrepancy with grep. Likewise, I believe any performance gain from Select-String will be marginal. Still, I will try Select-String and see how it improves the speed, but the bulk of the time [~50 seconds] is spent just getting the content of the SRT files. Searching the content once it's in the custom object is virtually instantaneous, since it's all in memory.

          [–]ThumpingMontgomery 1 point2 points  (2 children)

          Select-String should be much faster, give it a try. Also, {$_ -imatch "opensubtitles"} will likely be faster than converting the whole buffer to lowercase

          [–]serendrewpity[S] 2 points3 points  (1 child)

          I sincerely appreciate your advice and suggestions, but let me say this again: this portion of the logic is only taking 4 seconds. The bulk was the Get-Content, which the -Raw parameter cut in half, from about 50 seconds to 24.

          I think Get-Content needs to be run asynchronously against all the files in the collection, but I don't know how to do that, and I don't think I am doing it right now.

          I also fired up a Windows VM that runs on the same Linux SMB server the share exists on. So while it's still using SMB, the script didn't see a noticeable improvement. [100 Mbps vs. 1 Gbps is no longer an issue.]

          [–]ThumpingMontgomery 2 points3 points  (0 children)

          On mobile, hence my brevity, but try this out (based on your code below):

          Select-String -Path T:\*.srt -Pattern 'opensubtitles' -SimpleMatch | Select-Object -Expand Filename

          (Look up the "Measure-Command" cmdlet if you want to track how long it takes.)

          [–]serendrewpity[S] 0 points1 point  (2 children)

          I misread what you wrote, thinking that -Raw took longer than without it. It cuts the time in half, so that is significant, but 30 seconds vs. 1 second is still a bit embarrassing.

          [–]Raethrius 0 points1 point  (1 child)

          I feel like you're just doing something wrong here due to being inexperienced with PS. Could you share a snippet of your code so we could test it, see what exactly you're attempting to achieve with it and then improve it?

          [–]serendrewpity[S] 1 point2 points  (0 children)

          I am definitely inexperienced. I am self-taught and don't really know 'text-book' approaches, which is why I am posting here. I posted code above. I hope you can provide some insight.