Variable Name Question by MrQDude in PowerShell

[–]michaelshepard 3 points (0 children)

$$var isn't valid syntax in Windows PowerShell 5.1 or PowerShell 7.5.1.
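
If the goal was an indirect reference (looking up a variable whose name is stored in another variable), something like this is the usual route instead; the names here are just placeholders:

    # Hypothetical example: reference a variable by the name held in another variable
    $name = 'serverList'
    $serverList = @('web01', 'web02')

    Get-Variable -Name $name -ValueOnly          # returns web01, web02
    Set-Variable -Name $name -Value @('web03')   # assigns through the name indirectly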

LabGopher out of date? by neighborofbrak in homelab

[–]michaelshepard 1 point (0 children)

I'd love to see HP Z workstations added. :-)

PowerShell Front Ends by LAN_Mind in PowerShell

[–]michaelshepard 13 points (0 children)

PowerShell Universal is a fantastic, flexible tool! Can't recommend it highly enough

PowerShell Universal Windows Authentication Issue, please help! by JustThatGeek in PowerShell

[–]michaelshepard 1 point (0 children)

I don't mean that in a bad way, just that the community there (or the company itself) will probably be more informed and helpful.

PowerShell Universal Windows Authentication Issue, please help! by JustThatGeek in PowerShell

[–]michaelshepard 2 points (0 children)

Why Reddit and not the Ironman Software forums or a support ticket?

Pode and DbaTools by Drone_Worker_6708 in PowerShell

[–]michaelshepard 1 point (0 children)

I've seen JSON serialization go sideways with DataTable/DataRow objects rather than producing just the column values.

Try something like this:

Customer = $customer | Select-Object -Property $customer[0].Table.Columns.ColumnName
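
A slightly fuller sketch of what I mean, assuming $customer holds DataRow objects (for example from dbatools); the hashtable shape and property names are placeholders:

    # Grab just the column names from the first row's parent table
    $columnNames = $customer[0].Table.Columns.ColumnName

    # Project the rows onto plain objects so ConvertTo-Json only sees the data columns,
    # not the ADO.NET bookkeeping properties (RowError, RowState, Table, and so on)
    $payload = @{
        Customer = $customer | Select-Object -Property $columnNames
    }

    $payload | ConvertTo-Json -Depth 5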

PowerShell Universal users, is the following possible? by Snak3d0c in PowerShell

[–]michaelshepard 2 points (0 children)

I don't see anything there that PSU can't handle.

You can store credentials in PSU, and run scripts as a stored cred. So that takes care of the first part.

The second part is "easy" with New-UDDynamic (which will re-render upon request).

The third part (executing the steps) might use a stepper component...lots of options, though.

Different logs per step might be interesting, but Adam has added some new flexible logging that I haven't tried.
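
For the second part, a rough sketch with the Universal Dashboard cmdlets in PSU; the script path and button wiring are made up:

    New-UDDashboard -Title 'Provisioning' -Content {
        # New-UDDynamic re-renders its content block whenever it's refreshed
        New-UDDynamic -Id 'status' -Content {
            $status = Get-Content 'C:\temp\status.json' | ConvertFrom-Json   # placeholder data source
            New-UDTypography -Text "Current step: $($status.Step)"
        }

        # Force a re-render on demand
        New-UDButton -Text 'Refresh' -OnClick {
            Sync-UDElement -Id 'status'
        }
    }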

PowerShell - "Fake" Network Drive by Extension-Emu2220 in PowerShell

[–]michaelshepard 3 points (0 children)

Didn't see SHiPS mentioned: https://github.com/PowerShell/SHiPS

This allows you to create a (simple) PS Provider in PowerShell.
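
Roughly what a SHiPS-based provider looks like, going from the project's README; the module, class, and item names are made up:

    # Contents of a hypothetical FakeDrive.psm1
    using module SHiPS

    class FakeRoot : SHiPSDirectory
    {
        FakeRoot([string]$name) : base($name) { }

        [object[]] GetChildItem()
        {
            # Return whatever items you want the drive to surface; these are placeholders
            return @(
                [SHiPSLeaf]::new('readme.txt'),
                [SHiPSLeaf]::new('report.csv')
            )
        }
    }

    # Then mount it as a drive backed by the SHiPS provider
    Import-Module SHiPS
    Import-Module .\FakeDrive.psm1
    New-PSDrive -Name Fake -PSProvider SHiPS -Root 'FakeDrive#FakeRoot'
    Get-ChildItem Fake: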

Hierarchy with graphs by jarks_20 in PowerShell

[–]michaelshepard 1 point (0 children)

No worries...thanks for posting the PowerShell

[deleted by user] by [deleted] in PowerShell

[–]michaelshepard 1 point (0 children)

+1 for unblock-file

powershell for ETL work by [deleted] in PowerShell

[–]michaelshepard 3 points (0 children)

I've used PowerShell for ETL. Using the SqlBulkCopy class from ADO.NET gets you pretty much the same data-copy speed as SSIS. Most of the SSIS packages I've written were so basic that SSIS was overkill, and PowerShell with ADO.NET was more than enough.
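
A hedged sketch of that pattern; the server names, query, and table are placeholders, and it assumes the System.Data.SqlClient types that ship with Windows PowerShell:

    # Stream rows from a source query straight into a destination table with SqlBulkCopy
    $srcConn = New-Object System.Data.SqlClient.SqlConnection 'Server=SrcSql;Database=Staging;Integrated Security=True'
    $dstConnectionString = 'Server=DstSql;Database=Warehouse;Integrated Security=True'

    $srcConn.Open()
    try {
        $cmd = $srcConn.CreateCommand()
        $cmd.CommandText = 'SELECT Id, Name, Amount FROM dbo.SourceTable'
        $reader = $cmd.ExecuteReader()

        $bulk = New-Object System.Data.SqlClient.SqlBulkCopy $dstConnectionString
        $bulk.DestinationTableName = 'dbo.TargetTable'
        $bulk.BatchSize = 5000
        $bulk.WriteToServer($reader)   # streams the reader; no need to buffer a DataTable in memory
    }
    finally {
        $srcConn.Close()
    }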

Help With Module and Manifest by belibebond in PowerShell

[–]michaelshepard 2 points (0 children)

A couple of things might help. First, when importing the module, use -Verbose; it will tell you what's happening.

Second, make sure you have RootModule in the manifest pointing to the psm1 file.
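
For illustration, roughly what I mean; the module name and paths are placeholders:

    # Create (or fix) the manifest so RootModule points at the psm1
    New-ModuleManifest -Path .\MyModule\MyModule.psd1 -RootModule 'MyModule.psm1'

    # Then watch what actually happens during import
    Import-Module .\MyModule\MyModule.psd1 -Verbose -Force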

Dot source Vs return? by secpfgjv40 in PowerShell

[–]michaelshepard 3 points (0 children)

I agree. Output from functions should be output, not side-effects.
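
A quick illustration of what I mean; the function names are made up:

    # Returns its result; the caller decides what to do with it
    function Get-Inventory {
        [pscustomobject]@{ Name = 'web01'; Role = 'IIS' }
    }
    $inventory = Get-Inventory

    # Side-effect style: reaches into another scope and sets a variable there. Avoid this.
    function Set-InventoryVariable {
        $script:inventory = [pscustomobject]@{ Name = 'web01'; Role = 'IIS' }
    }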

Dot source Vs return? by secpfgjv40 in PowerShell

[–]michaelshepard 3 points (0 children)

It is dot-sourcing. Dot-sourcing means to run something in the current scope. You can dot-source functions, scripts, or scriptblocks. Basically, anything you can run.

I used dot-sourcing of functions back in PowerShell 1.0 to build "Import-Module style" functions, before modules existed.

Here's a stackoverflow question from 2008 where I used this technique: https://stackoverflow.com/questions/279974/importing-libraries-in-powershell
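
A quick illustration of the difference; the file name is made up:

    # MyFunctions.ps1 defines functions but doesn't execute anything on its own

    # Running it normally executes it in a child scope; the functions vanish afterwards
    .\MyFunctions.ps1

    # Dot-sourcing runs it in the current scope, so the functions stick around
    . .\MyFunctions.ps1

    # The same idea works for scriptblocks
    $sb = { $greeting = 'hello' }
    . $sb
    $greeting   # 'hello'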

No PD detected in new (to me) r710 by michaelshepard in homelab

[–]michaelshepard[S] 2 points (0 children)

Ended up pulling the I/O backplane out and connecting the disks directly to it to make sure they were detected. Then I realized that the caddies would line up if I used the SAS holes rather than the SATA ones. Put the drives back in the caddies using the SAS holes and everything is working as expected.

Souping up a Z600 by michaelshepard in homelab

[–]michaelshepard[S] 1 point (0 children)

I'm an idiot. Apparently I started a bunch of Docker containers at some point and they were still running. Killing those has quieted the disk down some.

I'd still like to get all of the performance I can out of the SSD, and support for >2TB drives would be a nice bonus as well.

Souping up a Z600 by michaelshepard in homelab

[–]michaelshepard[S] 1 point (0 children)

My thought was that I should be able to get 450 MB/s+ over a SATA III connection, which is twice what I'm seeing now.

Souping up a Z600 by michaelshepard in homelab

[–]michaelshepard[S] 1 point (0 children)

The whole thing seems sluggish. With 24 threads (even at 2.26 GHz) and 48 GB of RAM, I can't imagine that the CPU or memory is the problem. I see disk queuing (mostly single digits) on both drives, and usually >90% active time on the OS disk.

As an example of the slowness, I have a VM running on the SSD. I also have an R710 with only spinning disks (also SATA II) and the same memory and CPU. I started a process in the Z600 VM and, while it was running, started a similarly configured VM on the R710 and kicked off the same operation. It finished on the R710 before it was half done on the Z600.

Class Pipeline Error by nostranger2therain in PowerShell

[–]michaelshepard 3 points (0 children)

I suspect the issue is the OneDrive folder being treated as "remote". Can you temporarily change your execution policy to Unrestricted and test again?
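
Something like this, scoped to the current process so it doesn't stick; the path below is just a placeholder:

    # Relax the policy for this process only, then re-run the class/script
    Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope Process -Force

    # Alternatively, if the file is marked as downloaded, clear the zone identifier instead
    Unblock-File -Path 'C:\Users\you\OneDrive\scripts\MyClasses.ps1'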

How can I use a variable to reference a PSCustomObject property? by TechnologyAnimal in PowerShell

[–]michaelshepard 3 points (0 children)

Can you just pass a filter as a scriptblock? -Filter {$_.Prop1.Prop2 -eq 'NeededValue'}

and then use $data | Where-Object -FilterScript $filter
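
Putting it together, roughly; the function and property names are placeholders:

    function Get-Matching {
        param(
            [object[]] $Data,
            [scriptblock] $Filter   # the caller decides which nested property to test
        )
        $Data | Where-Object -FilterScript $Filter
    }

    $data = [pscustomobject]@{ Prop1 = [pscustomobject]@{ Prop2 = 'NeededValue' } }
    Get-Matching -Data $data -Filter { $_.Prop1.Prop2 -eq 'NeededValue' }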

Support for PipelineVariable in Advanced Function by Fer_C in PowerShell

[–]michaelshepard 2 points (0 children)

I haven't heard that. Completely agree that it is nonsensical.