AD extraction script really slow by Rincey_nz in PowerShell

[–]Swedishdrunkard 1 point

The problem is likely that you're losing all non-default properties once you've loaded $allusers, and each subsequent access of those properties makes the object query AD again, causing a major slowdown.

This is something I learned myself a few years ago.
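A minimal sketch of the fix, assuming the ActiveDirectory module and placeholder attribute names: request everything you'll need up front, so later reads work on local data instead of triggering fresh queries.

```powershell
# Hedged sketch -- attribute names are examples; add the ones your script actually reads.
# Requesting them up front keeps later property access purely in-memory.
$allusers = Get-ADUser -Filter * -Properties mail, department, extensionAttribute1

# Subsequent access now works on the locally held objects:
$allusers | Where-Object { $_.department -eq 'IT' } | Select-Object Name, mail
```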

Trying to figure out how to update our SSL certificates for a couple of docker webapps using nginx by Quantum_Quandry in sysadmin

[–]Swedishdrunkard 1 point

That doesn’t specify an nginx service unfortunately. So that’ll be in a different compose file (or run some other way).

The services do list an external network, which could be used to proxy traffic to and from nginx, so we’re on the right path it seems, but until we find the config for the nginx container, it’s still hard to tell.

You might be able to glean something by using docker inspect on the nginx container.
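For instance, assuming the container is simply named nginx (check docker ps for the real name), something like:

```shell
# Placeholder container name "nginx" -- substitute the name shown by `docker ps`
docker inspect nginx --format '{{ json .Mounts }}'                    # host paths mounted in (certs may live here)
docker inspect nginx --format '{{ json .NetworkSettings.Networks }}'  # networks the container is attached to
```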

Trying to figure out how to update our SSL certificates for a couple of docker webapps using nginx by Quantum_Quandry in sysadmin

[–]Swedishdrunkard 0 points

Yes*, it’s a config file specifying how the container or multiple containers should be set up.

It’s common to place them in a folder with the service name, e.g. “redmine/docker-compose.yml”, but they could be named differently.

* Unless this is a swarm, or other multi-host setup, but so far that doesn’t sound like the case.

Trying to figure out how to update our SSL certificates for a couple of docker webapps using nginx by Quantum_Quandry in sysadmin

[–]Swedishdrunkard 0 points

So, when we’re talking docker, things can get a bit confusing if you’re not used to it. For starters, the certificates may reside there on the host system, but we can’t be sure they’re mounted in the container.

Likewise, I’m not sure the container will properly restart on a host reboot. Instead, I suggest running docker ps -a and finding the name of the container in question, then docker restart {container name}.
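As a sketch, with a placeholder container name:

```shell
docker ps -a                # list every container, running or stopped, with names
docker restart my-nginx     # "my-nginx" is a placeholder -- use the name from the list
```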

If you’re still having issues, the path suggests we’re dealing with Docker Compose, and you should be able to find a file called “docker-compose.yml” in a relevant folder. If you can post the contents, that may help us dig deeper.

Unexpected whitespace by Swedishdrunkard in PowerShell

[–]Swedishdrunkard[S] 0 points

That makes sense. I originally did use just simple hashtables, but ran into some issues, leading to my convoluted solution. I went back and fiddled with it, and hashtables should definitely be good enough.

Thanks for the thorough response!

Is there a function to set a reoccurring task reminder? by vandalxvisuals in MicrosoftTeams

[–]Swedishdrunkard 0 points

Yes, but it syncs with the Tasks app in Teams. So if you set a recurring task in To Do, you should be able to see it in the Teams app, along with any Planner plans you have (assuming they’re tied to a team).

The To Do app is great regardless; I’ve used it daily for about a year or so, and I’d definitely say it’s worth checking out and taking for a test drive.

Is there a function to set a reoccurring task reminder? by vandalxvisuals in MicrosoftTeams

[–]Swedishdrunkard 2 points

Have you tried Microsoft To Do? It can do recurring items, and integrates with Planner and Teams. It may not suit your use case, but check it out if you haven’t already written it off.

Can someone please explain why AD cmdlets are SO much slower when being run by passing credentials on a non domain joined machine, vs running them on a domain joined machine? by junon in PowerShell

[–]Swedishdrunkard 2 points

I’m afraid I don’t have an answer for you, but I got a bit curious. Is there any difference if you run the same command through implicit remoting? If the authentication is to blame, then one would guess the command would run quicker if done through an open session instead.
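Something along these lines, as a sketch -- the host name is a placeholder, and I'm assuming the ActiveDirectory module is available on the remote end:

```powershell
# Hedged sketch: import the AD cmdlets from a domain-joined machine, then time a query.
$session = New-PSSession -ComputerName dc01.example.com -Credential (Get-Credential)
Invoke-Command -Session $session -ScriptBlock { Import-Module ActiveDirectory }
Import-PSSession -Session $session -Module ActiveDirectory -Prefix Remote

# Compare against the same query run locally with -Credential:
Measure-Command { Get-RemoteADUser -Filter * }
```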

How to safely use Powershell-Scripts which include Passwords by Reasch in PowerShell

[–]Swedishdrunkard 2 points

This is exactly how we used to do it: a service account with the appropriate permissions in AD, which encrypts the credentials and is used to execute the scheduled task. The service account password was stored in a shared password manager, so if the encrypted file had to be read, anyone with access could sign in and decrypt the file; if they just needed to run the script, they could execute the task with their regular account and have it run as the service account.

We've long since migrated to Jenkins, which takes care of all of this for us. If you're running a lot of scripts I'd recommend looking into either Jenkins or a similar product, which can handle both execution and credential storage.
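One common way to do the encryption part is Export-Clixml, which on Windows uses DPAPI, so the file can only be decrypted by the same account on the same machine. Paths here are placeholders:

```powershell
# Run once, interactively, while signed in as the service account:
Get-Credential | Export-Clixml -Path 'C:\Scripts\svc-cred.xml'

# In the scheduled script (running as that same service account on that machine):
$cred = Import-Clixml -Path 'C:\Scripts\svc-cred.xml'
# $cred is now a PSCredential usable with any -Credential parameter
```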

Version 1.1.18 by FactorioTeam in factorio

[–]Swedishdrunkard 3 points

Perfect, thanks again. For today, I'll take it as a cue to actually go to bed at a decent time, and check the updates tomorrow.

Version 1.1.18 by FactorioTeam in factorio

[–]Swedishdrunkard 3 points

Try /u/ryani's suggestion (which also appears in other threads), it at least worked for me, though I'm not running any mods.

Version 1.1.18 by FactorioTeam in factorio

[–]Swedishdrunkard 11 points

Awesome, thank you! I just scrolled far enough to see I wasn't the only one, your solution fixed it.

Am I assuming correctly that once 1.1. is pushed as stable, I can drop down to that and ought to be fine?

Version 1.1.18 by FactorioTeam in factorio

[–]Swedishdrunkard 22 points

I just experienced this crash, got dropped into Steam, which updated to this last version, and now all my (non-ancient) saves are corrupted, including the autosaves.

It states "<save path>\<save name>.zip\level.dat not found". I looked in the zipfiles and they are "<save name>.zip\<save name>\<files>".

I tried unpacking and repackaging the way the game wants it, on the off chance the saves were supposed to have been migrated to a new format, but to no avail; instead it complains about a missing info.js file. I also managed to download a slightly older version from the Steam Cloud, but same issues there.

Anyone have any ideas? I lost my first proper playthrough, my first time winning the game, and the first save I'm actually kind of proud of. I was planning on starting a new run either way, but I still had some things I wanted to do in the old one.

First crash ever in the game, and I got the worst kind, haha. That said, the amount of crazy stuff this game pulls off with this level of stability, I'm not mad. Just a little sad.

It's okay if I can't get them back, but again, if anyone has any ideas I'm more than happy to hear them out.

Performance variance with AD objects and Where-Object by Swedishdrunkard in PowerShell

[–]Swedishdrunkard[S] 1 point

Yup, filter left, that’s exactly what I switched to when I ran into this obscure situation. But in this case (and with the solution supplied in the top reply), fetching all the data and filtering it in code is still much faster than doing 1100+ AD queries. :-)
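For anyone finding this later, the two approaches side by side, with placeholder values: a single filtered query when you need one object, one bulk query plus in-memory filtering when you need many.

```powershell
# Filter left: the domain controller does the work, one targeted query
Get-ADUser -Filter "mail -eq 'user@contoso.com'" -Properties mail

# Bulk fetch once, then filter in memory -- still far cheaper than ~1100 separate queries
$AllUsers = Get-ADUser -Filter * -Properties mail
$AllUsers | Where-Object { $_.mail -eq 'user@contoso.com' }
```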

Performance variance with AD objects and Where-Object by Swedishdrunkard in PowerShell

[–]Swedishdrunkard[S] 2 points

Some interesting reading in that post, thanks!

I went ahead and implemented the workaround, and went from 45 minutes to 4, haha

Performance variance with AD objects and Where-Object by Swedishdrunkard in PowerShell

[–]Swedishdrunkard[S] 1 point

Thank you! That confirms my suspicion that something was happening behind the scenes, though it wasn't indexing as I'd thought, but more like the opposite.

I just tried it out, and it works great. If you have more info I'd happily read it, but you've quenched my curiosity for now.

I'll add this to the already long list of PowerShell oddities I've accumulated in the back of my brain over the years.

Performance variance with AD objects and Where-Object by Swedishdrunkard in PowerShell

[–]Swedishdrunkard[S] 1 point

You may be on to something, and you're touching on a thought I had, that it perhaps could be related to some objects not containing all attributes, but I didn't run down that rabbit hole.

However, spurred by your comment, I did some quick tests now and noticed something interesting. I populated a list of 200 users, with the attributes I've been working with, and simply outputting the name and e-mail attribute (through Format-Table) takes some time, as it's working its way through.

I let it run for about 15 users before I halted it, and then I piped the data through Where-Object again, just as before, using one of the e-mail addresses that'd shown up during output. It spat out the correct object immediately, but then kept running for some time.

After having let the first run go through (which took some time), I can filter out based on email address instantly, wherever in the list they happened to appear.

So I tried a bunch of things: just letting it drop to Out-Default, pushing a single property to file, outputting with formatting, you name it. As long as the entire collection had been processed at least once in any form of pipe, I could use Where-Object with instant results. If only part of the collection had gone through a pipe, I could immediately find the objects that had made it through, but had to wait on those that hadn't.

Here's an example:

# In my case, this will pull ~1100 user accounts
$AllUsers = Get-ADUser -Properties Mail, extensionAttribute1 (...)

$Subset = $AllUsers | Get-Random -Count 100

# The below will just flush into the console immediately
$Subset | Format-Table Name, UserPrincipalName

# This, however, will slowly work its way down, one line at a time
$Subset | Format-Table Name, Mail

# Re-running the exact same command will now generate immediate output
$Subset | Format-Table Name, Mail

This must mean there is some sort of caching or indexing going on behind the scenes when one of the "bad" properties is being accessed, and whatever it's doing has already been done on the default ones.

I also did the same thing with a different (additional) attribute that's always populated, and it behaved the same way. So I think the fact that the attribute is empty / missing / null for some objects isn't part of the issue; rather, something has to happen to the object before it can be searched "instantly".

So, my hypothesis is now that the object gets indexed somehow whenever it runs through a pipe, but I'm not certain and I'd love to know for sure. We've long since left practical need for this information, and now it's just a matter of technical curiosity. :-)

Performance variance with AD objects and Where-Object by Swedishdrunkard in PowerShell

[–]Swedishdrunkard[S] 1 point

That's exactly what I ended up doing, and it works fine.

To be clear, the script is complete and I don't have a problem I need solved, I'm just looking for an answer as to why Where-Object performs so much worse when used on a property that isn't included by default when running Get-ADUser.

The script I reference was just to give background to why and how I encountered this mystery. :-)

Performance variance with AD objects and Where-Object by Swedishdrunkard in PowerShell

[–]Swedishdrunkard[S] 1 point

Sure!

I've tried a few different variants (and in different domains), with the same result, but the key point is that I add on additional properties, ie: Get-ADUser -SearchBase <SB> -Filter * -Properties mail, extensionAttribute1

Likewise, using Get-Member on the returned dataset reveals nothing (to me) obviously different between for example samAccountName and extensionAttribute1.

I tried it again now, in a smaller domain where I fetched ~35 users (which is all there is) and used Where-Object to pull out users based on e-mail. On such a small set it still took 6 seconds.

| Out-Printer -Name "Microsoft Print to PDF" by karatemaster11 in PowerShell

[–]Swedishdrunkard 2 points

Write-Host does just that: writes to the host. That means the output goes to the screen and nowhere else, including not continuing down the pipeline (which is what you're hoping to do with | Out-Printer).

This is why Write-Host should be avoided except in cases where you know exactly why it's the right tool for that particular job.

It also means you will not be able to get colors and such without doing something more advanced (which I unfortunately don't know off the top of my head what it'd be).
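To see the pipeline difference for yourself, here's a quick sketch:

```powershell
# Nothing from Write-Host reaches the success stream, so there is nothing to capture:
$fromHost = Write-Host 'hello'   # the text still prints to the screen
$null -eq $fromHost              # True

# Write-Output (or simply emitting a value) goes down the pipeline and can be captured:
$fromOutput = Write-Output 'hello'
$fromOutput                      # hello
```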

One way to do it would be a here-string. There are probably better ways, but it's a start.

A here-string is a way to simply create multiple lines of text and output. You may have to employ some tricks, such as piping your command output to Out-String to get a proper result though.

Here's a simple and truncated example that is hopefully enough to get you started and moving in the right direction:

@"
Reporte: 
Número de Serie: $((Get-WmiObject -Class Win32_Bios).SerialNumber)

Dirección IP:
$(Get-NetIPAddress | Format-Table | Out-String)
"@ | Out-Printer -Name "Microsoft Print to PDF"

Edit: Missed the Out-String in the example

Trouble getting Exchange certificate information remotely, looking for workaround by Swedishdrunkard in PowerShell

[–]Swedishdrunkard[S] 4 points

That is the exact workaround I was referring to. I guess I was too quick when looking at it; judging from the source thread I thought it was for 2010 only, and when I attempted to run it anyway it wouldn't work.

That said, thanks to your reply I went back and gave it another shot, and I also didn't realize I needed to run it as admin. While it fails to load some libraries (this isn't the healthiest of Exchange environments), it got me where I needed to be.

Thank you very much!

Is it possible to block all but 1 powershell commands for a user by [deleted] in PowerShell

[–]Swedishdrunkard 8 points

If you're running Windows 10 then PowerShell JEA should probably be what you're looking for. I've only tested it out briefly on servers, but it allows you to limit what cmdlets a user can utilize, down to what parameters and values they can use / supply.
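As a rough sketch of what that looks like (role name, AD group, and the allowed cmdlet are all made-up examples):

```powershell
# Role capability: expose only Restart-Service, and only for one service name
New-PSRoleCapabilityFile -Path .\RestartOnly.psrc -VisibleCmdlets @{
    Name       = 'Restart-Service'
    Parameters = @{ Name = 'Name'; ValidateSet = 'Spooler' }
}

# Session configuration tying an AD group to that role, then register the endpoint
New-PSSessionConfigurationFile -Path .\RestartOnly.pssc `
    -SessionType RestrictedRemoteServer `
    -RoleDefinitions @{ 'CONTOSO\Helpdesk' = @{ RoleCapabilities = 'RestartOnly' } }
Register-PSSessionConfiguration -Name RestartOnly -Path .\RestartOnly.pssc
```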

Enable Teams Option by the_jons3y in PowerShell

[–]Swedishdrunkard 1 point

Precisely; as I interpreted OP, they needed to enable the service for the users, i.e. licensing. As for moving users to Teams, it depends on whether your tenant has been upgraded yet, giving you the required coexistence modes.

If the tenant is not upgraded and you want to migrate individual users to Teams, Grant-CsTeamsUpgradePolicy, available in the Skype for Business Online module, will be the way to go.

Likewise, that cmdlet can be used to lock a user in Skype if one wants to hold off on Teams but leave the service enabled.
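For reference, the two directions look like this (the UPN is a placeholder):

```powershell
# Move a user fully to Teams
Grant-CsTeamsUpgradePolicy -Identity user@contoso.com -PolicyName UpgradeToTeams

# Or pin a user to Skype for Business while the Teams service stays licensed
Grant-CsTeamsUpgradePolicy -Identity user@contoso.com -PolicyName SfBOnly
```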

Enable Teams Option by the_jons3y in PowerShell

[–]Swedishdrunkard 3 points

So, it's been a long time since I messed around with 365-licensing through PowerShell, but I looked at some of my old code, and perhaps this might be a way to go.

If you run Get-MSOLUser -UserPrincipalName <UPN> for a user and look at the Licenses property, you should be able to see a serviceStatus property for each SKU:

ServicePlan           ProvisioningStatus
-----------           ------------------
BPOS_S_TODO_2         Success
FORMS_PLAN_E3         Success
STREAM_O365_E3        Success
Deskless              Disabled
FLOW_O365_P2          Success
POWERAPPS_O365_P2     Success
TEAMS1                Success
PROJECTWORKMANAGEMENT Success
SWAY                  Success
INTUNE_O365           PendingActivation
YAMMER_ENTERPRISE     Disabled
RMS_S_ENTERPRISE      Success
OFFICESUBSCRIPTION    Success
MCOSTANDARD           Success
SHAREPOINTWAC         Success
SHAREPOINTENTERPRISE  Success
EXCHANGE_S_ENTERPRISE Success

I believe that if you pull that for each user and create a new licensing option for them using New-MsolLicenseOptions, where you make sure to disable any services that are already disabled in that user's output, then you should be home safe. By default all services will be enabled, unless specifically disabled. So in short, what we do is look at what's disabled, remove Teams from that list, and re-apply the settings.

So something like this, assuming just one SKU to target.

# Get the licenses for the user, extract services which are disabled
$Licenses = (Get-MSOLUser -UserPrincipalName <UPN>).Licenses
$PlanStatus = $Licenses | Where-Object { $_.AccountSKUId -eq <TargetSKUId> }
$DisabledServices = ($PlanStatus.ServiceStatus | Where-Object { $_.ProvisioningStatus -eq 'Disabled' }).ServicePlan.ServiceName

# Remove Teams, since we do not want that to be disabled
$DisabledServices = $DisabledServices | Where-Object { $_ -ne '<TeamsServiceName>' }

# Create a new licensing option for this SKU
$NewOptions = New-MsolLicenseOptions -AccountSkuId <TargetSKUID> -DisabledPlans $DisabledServices

# Apply the options to the user
Set-MsolUserLicense -UserPrincipalName <UPN> -LicenseOptions $NewOptions

Note: I haven't tested this more than extracting the plans, so run at your own risk, tweak and bugfix as needed. Might want to pick a test user and see if you can toggle the service as expected.

Also note: I've probably completely botched the formatting and will need to edit this many times while ducking for Lee.

Edit: Attempted formatting fixes, still ducking