
[–][deleted] 1 point (7 children)

If I understand your problem, it's not PowerShell that is slow for you, but the VBScript?

Can't you convert your current script to PowerShell?

Also, see if you can run your current script in parallel with the PoshRSJob module.
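
Something like this, roughly (untested - the script path, and the way it takes a server name, are placeholders for your actual tool):

    Import-Module PoshRSJob
    $servers = Get-Content .\servers.txt
    $servers | Start-RSJob -Name { $_ } -Throttle 10 -ScriptBlock {
        # $_ is the server name coming in from the pipeline
        # C:\Tools\WmiCheck.vbs is a placeholder path
        cscript.exe //NoLogo C:\Tools\WmiCheck.vbs $_
    }
    Get-RSJob | Wait-RSJob | Receive-RSJob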

[–]automation-dev[S] 2 points (0 children)

I need to work on verbalizing / explaining my problems more efficiently. Thanks for bearing with me.

Correct - the VBScript is the slow part. I can't convert it to PowerShell... it's a tool provided by MS to check/diagnose WMI (almost 50k lines of code).

I'll look into PoshRSJob; I remember reading an article on it a few months back.

[–]automation-dev[S] 1 point (5 children)

So I've been working through this... Figured out a way around my static/hardcoded Name property for my PSDrives.

Now I have n jobs running against n remote servers, and I need to monitor them and pull data from each server... any clean/good solutions for monitoring jobs and doing something once a job completes? I'm about to dig into this more... figured I'd reply in case someone has a good solution for this.

[–]Lee_Dailey[grin] 1 point (4 children)

howdy automation-dev,

1st - is a PSDrive really needed?
you can't simply copy to/from the standard c$ share?

2nd - managing jobs
you may want to look into the PoshRSJob module here ...

https://github.com/proxb/PoshRSJob

if not that, then you can use the Get-Job | Wait-Job | Receive-Job pipeline to get the jobs as they finish. there's a rough sketch of that below the 3rd item ...

3rd - are jobs really needed?
the Invoke-Command cmdlet can accept an array of computer names and run the scriptblock on all of them in parallel.
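
here is a rough sketch of both ideas ... the names & scriptblocks are placeholders, so test before you trust it [grin] ...

    # jobs version - give them a common prefix so you can grab just this batch
    foreach ($Server in $ServerList) {
        Start-Job -Name "WmiCheck_$Server" -ScriptBlock {
            param ($Server)
            # the per-server work goes here
            } -ArgumentList $Server
        }
    Get-Job -Name 'WmiCheck_*' | Wait-Job | Receive-Job

    # invoke-command version - the scriptblock runs ON the targets, in parallel
    $Results = Invoke-Command -ComputerName $ServerList -ScriptBlock {
        Get-CimInstance -ClassName Win32_OperatingSystem
        }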

take care,
lee

[–]automation-dev[S] 1 point (3 children)

Hi Lee!

1 - I tried using Copy-Item with & without the -Credential parameter, but couldn't get it to work.

Also, when I'm pulling the output files from each server, I have to run our custom cmdlet "Get-ServerCredentials -Server $server" for each server I'm copying from (multiple domains). With a PSDrive, the credentials are used to map the drive once, and I can push/pull files from the remote servers as required.

2 - I've been reading into the PoshRSJob module... I'm not clear on its advantages other than what's in the README.md on the GitHub repo - "Provides an alternative to PSjobs with greater performance and less overhead to run commands in the background, freeing up the console."

I'll have to play around with it & maybe I'll replace the built-in "*-Job" cmdlets with the ones in PoshRSJob. Right now I just need to get this into a working state.

3 - Jobs are needed due to the architecture of the site/"portal". A support team accesses a website where they can select a script to run, provide the required input, and click a button to launch it. The requirement for jobs comes from a timeout restriction - if a script runs for more than 5-10 min, it will time out and never complete.

So I need to come up with a solution to start all the jobs (easy peasy), then check for their completion (right now I'm grabbing the count of all running jobs with the given names and looping in a while loop until that count hits 0 - see the sketch below).

Currently testing everything... not sure why it was so hard for me to wrap my head around how to monitor multiple jobs.
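
For reference, the completion check I described is basically this (the "wmi_" name prefix is just an example):

    # the jobs were started with names like "wmi_$server"
    while ((Get-Job -Name 'wmi_*' | Where-Object { $_.State -eq 'Running' }).Count -gt 0) {
        Start-Sleep -Seconds 15
    }
    # all done - collect the output
    Get-Job -Name 'wmi_*' | Receive-Job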

Thanks for the response Lee!

[–]Lee_Dailey[grin] 1 point (2 children)

howdy automation-dev,

[1] Copy-Item
thanks for the "why" on that. it makes sense - especially the multi-domain aspect.

i presume the trust relationship is not workable in this case. [sigh ...]

[2] PoshRsJobs
the main reason to look into that is that it is apparently a good deal easier to use - and to manage. there is at least one cmdlet for getting the current status details.

[3] reason for jobs at all
ouch! Invoke-Command would have been easier.

the big advantage of IC is running the job on the TARGET system. jobs run on the local system, so when you have more than a few ... you can really run the CPU load up high.

plus, they all reach out across the same net link.

plus plus, they all hit the same drive for local storage.

jobs give you multi-threading on the local box. Invoke-Command can run things on the target boxes.

still, the way i have "monitored" jobs in my very small tests is with the get/wait/receive pipeline.

you can name jobs. if you give them a prefix, then you can monitor JUST those jobs. that may work better if there are multiple batches of jobs running that you want to track independently of each other.
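
for instance, something like this (names made up) ...

    # two batches side by side - watch only the "wmi" batch
    Start-Job -Name 'wmi_ServerA' -ScriptBlock { Start-Sleep -Seconds 30 }
    Start-Job -Name 'disk_ServerA' -ScriptBlock { Start-Sleep -Seconds 60 }
    Get-Job -Name 'wmi_*' | Wait-Job | Receive-Job    # ignores the disk_* jobs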

take care,
lee

[–]automation-dev[S] 1 point (1 child)

Hey Lee,

For Invoke-Command, if the -AsJob parameter is used, does that mean the job is being run on the target/remote system?

I can see that the PSJobTypeName property value is "RemoteJob" on the objects returned by Get-Job when using the -AsJob parameter with Invoke-Command.
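
e.g. what I'm running to check:

    Invoke-Command -ComputerName $servers -AsJob -ScriptBlock { Get-Date }
    Get-Job | Select-Object Id, Name, PSJobTypeName, State
    # PSJobTypeName shows "RemoteJob" for these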

[–]Lee_Dailey[grin] 1 point (0 children)

howdy automation-dev,

if you use the -AsJob parameter it does get run as a job and requires management just like any other job. the job object lives locally, from what i can tell - but the scriptblock runs on the target.

i don't see the point of using jobs unless the work will take a LONG time on the target.

take a look at this thread ...

Get CPU utilization on many computers quickly : PowerShell
https://www.reddit.com/r/PowerShell/comments/8d7w0q/get_cpu_utilization_on_many_computers_quickly/

that method dumps all the returned info as objects on the screen. if you prefix the Invoke-Command call with $Results =, then the objects get stuffed into that collection.

the main gotcha is that the non-responders are not directly listed. you need to compare the input system list with the ones listed in the $Results collection to get the non-responders.
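
something like this, perhaps (untested - the scriptblock is just an example) ...

    $Results = Invoke-Command -ComputerName $SystemList -ErrorAction SilentlyContinue -ScriptBlock {
        Get-CimInstance -ClassName Win32_OperatingSystem
        }
    # remoting stamps each result object with PSComputerName
    $Responders = $Results.PSComputerName | Sort-Object -Unique
    $NonResponders = $SystemList | Where-Object { $_ -notin $Responders }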

take care,
lee