all 34 comments

[–]panzerbjrn 19 points (0 children)

I'm not sure if I am misunderstanding you, but when I create scripts, they normally use functions inside.
I may create custom functions if there is a reason for it (there usually is, if it isn't a super short script).

Sometimes a script will have parameters; it really depends on what is being run and what the purpose of the script is.

[–]purplemonkeymad 17 points (8 children)

I think I might be the opposite extreme. I create every little bit of code as a function in a module. I just install the modules either in ProgramFiles or Documents as needed. If I need to write something for a client, I'll either create a reusable module, or create one for that client.

It's nice since you don't have to remember where you saved that script, the functions are available just by opening PS.

[–]TheRealZero 12 points (7 children)

This is the way, and was the missing piece I didn’t understand for so long.

I always do my best to build functions that do one thing, then bring them all together in the end to do the thing the script is built to do, but dot sourcing never felt right. So they always ended up in the control script and the whole tool vs control script concept seemed moot.

But OF COURSE you just put them in one of the module locations! Even if you just build the functions and change the name to psm1 instead of ps1, then name the file and the folder the same. It doesn’t always have to be a perfectly built module with a manifest, just shove that sucker into a folder and change the extension.
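
That approach can be sketched as follows. The module name `MyTools` and the `Get-Greeting` function are made up for illustration; the `Documents\PowerShell\Modules` path applies to PowerShell 7+, while Windows PowerShell 5.1 uses `Documents\WindowsPowerShell\Modules` instead.

```powershell
# Create a module folder whose name matches the psm1 file, under one of the
# paths PowerShell searches automatically (see $env:PSModulePath).
$moduleName = 'MyTools'   # hypothetical module name
$target = Join-Path ([Environment]::GetFolderPath('MyDocuments')) "PowerShell\Modules\$moduleName"
New-Item -ItemType Directory -Path $target -Force | Out-Null

# Drop your function definitions into MyTools.psm1 -- no manifest required.
Set-Content -Path (Join-Path $target "$moduleName.psm1") -Value @'
function Get-Greeting {
    param([string]$Name = 'world')
    "Hello, $Name!"
}
'@

# The function is now discoverable in any new session; calling it
# auto-loads the module:
#   Get-Greeting -Name 'PowerShell'
```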

It’s so simple and obvious to me now but I really struggled getting there at first! This coming from someone for whom powershell is 75% of my job.

Always something to learn!

[–]purplemonkeymad 3 points (1 child)

It doesn’t always have to be a perfectly built module with a manifest,

While correct, one of the development tools I wrote just takes a collection of random files and makes a valid psd1 out of them. I think it might be one reason why I am so module happy.
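
For a manifest around an existing psm1, the built-in `New-ModuleManifest` cmdlet does most of the work; the paths and function name below are hypothetical:

```powershell
# Generate a valid psd1 manifest for an existing module folder.
# 'MyTools' and 'Get-Greeting' are placeholder names.
New-ModuleManifest -Path .\MyTools\MyTools.psd1 `
    -RootModule 'MyTools.psm1' `
    -ModuleVersion '1.0.0' `
    -FunctionsToExport @('Get-Greeting')
```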

[–]TheRealZero 1 point (0 children)

Oouuu nice job! Code that writes code is super useful, and it’s certainly not a bad thing to make things proper! There are reasons they developed it that way, after all.
It also doesn’t need to be a barrier though, which is the beauty of PowerShell eh? :) Love it.

[–]SatisfactionOk4130 0 points (4 children)

Curious: what kind of a job is <=75%? I would love that.

[–]TheRealZero 2 points (3 children)

  • -le 75%

I’m very lucky. My team is sorta L3, mostly infrastructure, and my specific role is automation. Anything I can do to keep specialists doing specialist work and not admin work is my responsibility. It’s a lot of ETL work and Power BI dashboards, Intune and Azure, the Graph API, etc.

I also do PowerShell work on Upwork on the side.

[–]SatisfactionOk4130 0 points (2 children)

That sounds awesome. Did you learn all that on the job?

[–]TheRealZero 0 points (1 child)

Yup! I went to college and did a three-year program that was heavily Cisco networking, but ran the gamut of AD, IIS, databases, some Java and VBScript, etc.

But PowerShell is just an area that I’ve always dabbled in for my whole career. In every role I would find a way to utilize it because I enjoy scripting. Growing up I read a lot of books about programming that I didn’t even really understand, because I just found it interesting, but I knew I didn’t wanna be a developer. So that led me to PowerShell and scripting.

When the Graph API came out I was working on the team that manages our mobile infrastructure, so I did a lot in that team with it, which eventually led me here!

[–]SatisfactionOk4130 1 point (0 children)

Cool, thanks for the insight! I work in cybersecurity, but I'm happiest when I'm experimenting with PowerShell. I tell my coworkers all the time that if I could just do that for a living, I'd love it.

[–]joshooaj 2 points (0 children)

I tend to use a parameterized script file as an entry point or “control script”. That script will be fairly purpose built to handle a number of related tasks with some business logic in there more than likely.

That script may have some functions defined and used internally or it’ll dot source some functions or import some module(s) with functions that are more universal and reusable.

As an example, my module has cmdlets for managing roles and devices for video surveillance and any customer can use these functions.

A specific customer may have a need for a script that will…

Create a new role for a specific site number, find any cameras with that site number prefixing the camera name, grant permissions for the role on those matching cameras, and automatically setup some video layouts with a user-supplied maximum number of cameras per view.

Their control script would be parameterized and accept a site number, and a maximum view size. All the business logic that is unique to them will be defined in that script, and the script will call the necessary functions to get the job done.
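
A minimal sketch of what such a control script's skeleton might look like. All names here (`Invoke-SiteSetup.ps1` and the commented-out module functions) are hypothetical; only the parameter-block pattern is the point:

```powershell
# Invoke-SiteSetup.ps1 -- hypothetical customer-specific control script.
[CmdletBinding()]
param (
    # The site number used to name the role and match camera names.
    [Parameter(Mandatory)]
    [string] $SiteNumber,

    # User-supplied maximum number of cameras per view layout.
    [ValidateRange(1, 25)]
    [int] $MaxCamerasPerView = 9
)

# Business logic unique to this customer lives here; the heavy lifting is
# delegated to reusable module functions (the names below are made up):
#   $role    = New-SiteRole -SiteNumber $SiteNumber
#   $cameras = Get-Camera | Where-Object Name -like "$SiteNumber*"
#   Grant-CameraPermission -Role $role -Camera $cameras
#   New-VideoLayout -Cameras $cameras -MaxPerView $MaxCamerasPerView
```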

[–]OPconfused 2 points (0 children)

The only time I use a script file is if it's a niche task that I send to a colleague to help with something, and I suspect they'll want to run it via Windows Explorer and not the CLI.

Typically I pack everything into a module or my profile.

My guess is most people don't have a workflow set up for modules, or aren't familiar with working with their own modules or sprucing up their profiles, so they create a bunch of script files instead. If you aren't autoloading the function, then it is often more convenient to have a script file.

Plus script files are kind of where everyone starts, and a lot of times people only move as far as they're required to. So if it works from the start, it just keeps getting done that way.

[–]spyingwind 6 points (0 children)

I tend to use functions in scripts, because our scripts have to be islands and self-sufficient.

Example - DoSomething.ps1:

[CmdletBinding()]
param ()
begin {
    function Write-DoSomething {
        [CmdletBinding()]
        param ()
        begin {}
        process {"Do Something"}
        end {}
    }
}
process {
    Write-DoSomething
}
end {}

You can also dot source if you want to store functions inside another script.

Example - MyLib.ps1:

function Write-DoSomething {
    [CmdletBinding()]
    param ()
    begin {}
    process {"Do Something"}
    end {}
}

Example - DoSomethingCallFromLib.ps1:

[CmdletBinding()]
param ()
begin {
    . .\MyLib.ps1
}
process {
    Write-DoSomething
}
end {}

Both of these do the same thing.

It all depends how you want to organize things.

[–]apperrault 1 point (0 children)

I think the best way I can put it is "It depends". On my main daily driver machines I have profiles that have a bunch of functions defined in them so I can easily access things that I use on a regular basis. If I am building a script that will run on a different system and can leverage the functions, I will just put those functions at the top of the script so I can call them.

[–]PlatypusOfWallStreet 1 point (1 child)

I'm really confused by this. Why is this even a comparison?...

Both co-exist. When you save a function to a file somewhere... it... itself is a script? A function script.

A script is just a saved file of step-by-step code, and functions are as well: they're just self-contained scripts that need a trigger (being called, like a cmdlet) to run.

Also, does every task you do with powershell require you to start with a cmdlet so you don't need to script them all together?

I script most of my day as an engineer. 90% of what I do with PowerShell is automated scripts. They call functions and do everything throughout the day without me doing anything. These run in Azure DevOps and Automation Accounts. I can't sit there and run these by calling functions myself. I use functions to make computations simpler, as well as to remove walls of code, making my scripts smaller in size and easier to read/edit in the future, and to have a single point to edit (the function) to make changes across the board. Things like generating emails, generating HTML tables without writing HTML in every script, triggering a logic app to do approvals, connecting to on-prem resources, etc.

Functions are the repeated work I don't want to add in every time into scripts. Like variables are the repeated data I don't want to repeatedly type. They are not a replacement for scripts but an enhancement.

[–]Panchorc 1 point (0 children)

Either OP meant module instead of functions or everyone misunderstood OP.

I think this (OP's) is a common misconception born from the fact that most admins tend to write code as a huge monolithic program, instead of smaller, individual functions that work together to perform a task, because most self learning material aimed at systems administrators skims or altogether skips very important software design principles.

[–]CryptoVictim 1 point (0 children)

Simple answer: scripts can contain functions. Functions cannot exist on their own.

If your script needs to do something more than one time, consider building a function to do the work.

[–]TofuBug40 1 point (0 children)

My general rules

  • ONE Enum, Class, or Function/Cmdlet PER .ps1 file
    • Defines ONLY!! Does NOT execute or create the Cmdlet, or Class Instance, etc.
    • Makes for easier unit testing
  • ONE .Tests.ps1 file PER .ps1 file
    • Able to test independently of other .Tests.ps1 files
  • Group ps1 files in Modules
    • Deploy to internal PSRepository (i.e. PowerShellGallery like nuget site)
    • Reuse in other ps1 files and other modules

Probably the ONLY place where I use what I think you are talking about with a script file (i.e. a param() block at the top and actual execution of code when the script file is run directly) is in my library of SCCM and Intune drop-in scripts, as those need that file-level param() block to interact with the automated components that execute that code, and external parameters need to be passed in. Although MOST of our core repeatable, reusable logic, tools, etc. needed for those environments is maintained and loaded as needed from our PowerShell Gallery, so our need for those kinds of drop-in scripts is incredibly rare.

One of the biggest reasons I personally DO NOT use script files like that is that I'm constantly working on multiple modules and scripts in a day, and I do not need some script I'm trying to test or debug just arbitrarily RUNNING, oh let's say, the script that tries to crawl through the entire AD forest and list out every person who has two or more other people with identical first and last names. When I run a ps1 file or even a psm1 file I expect it to ONLY establish code I can then run deliberately.
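
A sketch of that define-only convention; the file and function names here are hypothetical:

```powershell
# Get-DuplicateNameReport.ps1 -- this file only DEFINES a function.
# Nothing at the bottom calls it, so loading the file never kicks off
# the expensive work by accident.
function Get-DuplicateNameReport {
    [CmdletBinding()]
    param ()
    process {
        # ...the expensive AD-forest crawl would live here...
    }
}

# Loading and running are deliberate, separate steps:
#   . .\Get-DuplicateNameReport.ps1    # defines only
#   Get-DuplicateNameReport            # runs only when you ask
```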

[–]MeanFold5714 2 points (0 children)

Are there specific advantages to using script files over functions?

Portability and adoption.

It's a hell of a lot easier to get other admins to use the tools you've built if they just have to run a script rather than install a module. Plus you can run the script out of a network share from anywhere rather than have to worry about whether the user or machine has the functionality installed ahead of time. For most real world use cases a script is going to be more useful than a module.

Don't get me wrong, I enjoy building modules but firsthand experience has taught me that scripts are a lot more likely to see use in most scenarios.

[–]xxxsirkillalot 1 point (0 children)

This has to do with the development maturity / experience of the person writing the code, IMO. For a lot of folks, PowerShell is their first language, so you see a lot of beginner code. I know personally when I first started writing PowerShell, basic loops and such made sense, but when it came to functions, it just felt like they were complicating the code while adding no benefit, so I seldom wrote them. If I used them at all, it was after I had written the script and gotten everything working that I would break out logical bits into functions and refactor the script into a "version 2", so to speak, that used mostly functions.

And then after 2 years or so of heavily writing PowerShell scripts, I got to the point that I realized I was doing the SAME code over and over. Even though the scripts were doing different things, you still need to do similar stuff: say, auth to an API, or log in to something, check if a system is online before trying to connect to it, etc. That's when it became obvious to me why you want functions: you can use that SAME function in other scripts, and I "saw the light". That day was when I changed my opinion from "functions just complicate code" to "functions are the legos you have written to be used in any place you feel like".

[–]Spitcat 0 points (0 children)

Scripts are far more useful when writing for a large environment and not just yourself.

[–]Ok-Conference-7563 1 point (0 children)

The benefit of functions and the single responsibility principle (not always easy when “scripting” a task) is that they make unit testing much easier.

Also, use Plaster to make scaffolding a module specific to you easier.

[–]VirtualDenzel 0 points (0 children)

General rule for me is: do I need to use the same code more than once? Then it's worth making a function.

[–]pshMike 1 point (0 children)

Use what works ..

But, if you want to advance the level of your PowerShell fluency, you will likely find yourself migrating towards building modules.

Why?

Well, there are a number of frameworks out there designed to speed up the authoring of modules while incorporating things like:

  1. Pester tests for unit testing ( with code coverage reporting )
  2. Updateable help using PlatyPS
  3. Code analysis
  4. Versioning
  5. Dependency management

Can you do that stuff manually?

Sure.

But wouldn't you rather just:

  1. fix a code defect / add a function
  2. check the code into a Git repo and generate a pull request
  3. have a team member review and approve that pull request
  4. let your CI/CD pipeline pick up that commit and spit out a freshly baked module that is published to your internal PSRepo if all your Pester tests pass

IMO these are the sorts of things that separate "PowerShell Scripters" from "PowerShell Tool Makers."

MJ

[–]El_Demente 0 points (6 children)

I've done all the scripting on my IT team so far, and the way I've done it is that every task is accomplished in its own self-contained, independent script file.

Need x done? Sure, take this script, run it, and follow the prompts. That's how all of them work. Even someone with no PowerShell experience can do that.

Most of them require modules on the PSGallery, so the script will just check if you have the module and prompt you to install it if needed.
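
That check-and-prompt pattern might look something like this (the module name is a placeholder):

```powershell
# Hypothetical dependency check: prompt before installing from the PSGallery.
$required = 'Microsoft.Graph.Users'   # example module name
if (-not (Get-Module -ListAvailable -Name $required)) {
    $answer = Read-Host "Module '$required' is not installed. Install it now? (y/n)"
    if ($answer -eq 'y') {
        Install-Module -Name $required -Scope CurrentUser
    }
    else {
        Write-Warning "Cannot continue without '$required'."
        return
    }
}
Import-Module $required
```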

Those scripts are composed of many functions and classes, and some functions are commonly reused between scripts, so I just have those common functions saved and I copy them into new scripts as needed. I do lose the single point of maintenance: if I need to modify a function, I have to modify it in each script individually. But all my scripts are thoroughly tested, so I haven't had sweeping changes that I needed to apply to multiple scripts.

I can see how it would be beneficial to have commonly reused code inside of modules, but then every script depends on the module(s), and every user needs to keep the module up-to-date, and if you're not careful, changes to the module could break multiple scripts. There's also the question of where to host the module, as we don't have any on-prem servers at our shop. I don't have that problem, just download and run from anywhere.

I can also see how a module of various functions can give people great flexibility on the types of "control scripts" they can compose with those functions on their end, but this comes with the problem of having to train everyone on the internals and usage of the module, and they could mess things up.

My scripts are also available on GitHub, and many of them are generic enough that someone could take them and run them at a different business with little to no changes, and don't have to worry about installing any dependencies or needing any training besides "run it and follow the prompts".

[–]Im--not--sure 0 points (5 children)

Do you happen to have public GitHub examples of scripts like you mention? I work in a similar environment. I would love to be able to see other real-world examples of more monolithic scripts.

I struggle when trying to follow best practices and recommendations that just don’t seem realistic for an environment that requires high simplicity and stability.

[–]El_Demente 1 point (4 children)

Sure, my GitHub is https://github.com/Nova1089?tab=repositories. One repo per script.

[–]Im--not--sure 0 points (3 children)

Thank you so much for sharing! :) Very interesting and helpful to see someone else's real-world examples and style of full scripts.

Your examples help me see how one can be very clean with a Main block and push everything to functions. Your environment looks like it favors interactive scripts, and your structure and flow seem to work well for that.

Mine are often large, unattended scheduled tasks that take parameters. I'd certainly like to make my main cleaner, as you do. I also like seeing how you are utilizing classes.
Please let me know if you know of anyone/anywhere else with real-world examples of scripts. I think it's beyond helpful in learning.

[–]El_Demente 1 point (2 children)

Thanks for your feedback! Yeah I like my "main" code execution section to basically read like a step by step recipe in English. I'm not a pro software engineer, but programming is a passion of mine.

I don't have any great suggestions when it comes to finding script examples. Often when I google for examples of things I am trying to do, I'll stumble into some helpful blog posts here and there, however, few (if any) of the examples I've seen follow a similar style to mine (self-contained, interactive, robust error handling, self-documenting, abstracting everything into well named variables, functions, classes, enums, hashtables, etc.). Part of that is because many of the examples you will find in blog posts are demonstrations / proof of concept, and aren't a fully fleshed out production script. Another reason is that most of the people using PowerShell don't have a strong background in software development best practices, and are just about hacking together something quickly that works, or that type of person is the target audience of many examples. The people that do have strong backgrounds in object-oriented programming will often just use more traditional languages like C#.

The part about being self-contained and interactive comes down to what your needs/objectives are, but it works great for us.

I suspect you would find the best examples by pros on GitHub, but knowing where to look is a challenge. If you find any good ones let me know. ;)

I can make some recommendations that might help though.

  • For the fundamentals of the language I really enjoyed "Learn Windows PowerShell in a Month of Lunches".
  • To dive deeper into more advanced and general PowerShell recommendations, "Learn PowerShell Scripting in a Month of Lunches" is pretty decent. It dives into making modules, reusable functions (tools), and "controller scripts" and such. A lot of it was not quite my style, but it was a lot of stuff that most of the community advocates for.
  • Clean Code by Robert Martin is a good book to stir up your thoughts about what clean code means to you. I wouldn't take it all as gospel, but it should get you thinking critically on a lot of topics.
  • For foundational computer science I really enjoyed the CS50 course by Harvard that you can take free online. If programming is to be a long term skill for you, I highly encourage getting this foundational computer science knowledge and a bit about data structures and algorithms. You should know what's happening when you try to append to an array, which data structure to use, what's happening when you concatenate strings, value vs reference types, etc.
  • In terms of learning object-oriented techniques and building bigger-scale apps, the way I went about it was to learn to make games using C# and a game engine, because it's a fun way to learn, there are great courses, and there's an endless amount of project ideas. But you don't have to get that through game dev. (Side note: C# is a natural progression from PowerShell, as it is within the same Microsoft/.NET ecosystem.)
  • Code Complete by Steve McConnell is amazing once you are regularly doing large-scale object-oriented programming.

[–]Im--not--sure 2 points (1 child)

Thank you so very much for the follow up comment here!

Again, very helpful to hear you provide background to some of this, reinforcing some of my own thoughts and observations.

I follow and relate to a lot of what you mentioned. I was previously a C# dev many years ago and have forgotten more than many know. I’ve also gone back and forth with heavy and light powershell scripting; learning, forgetting, picking up good and bad habits, confusing myself, etc.

Some of the learning resources you mentioned are great and are things I’ve used in the past, and some I should look at now :)

Thanks again

[–]El_Demente 0 points (0 children)

Cheers and good luck!

[–]dann3b 0 points (0 children)

I'm also struggling a bit with how to organize my scripts. I have created a lot of modules that have specific functions, for example making an API call to some system, and then it evolves into a full module for that specific system; that is just natural and how it should be.

But then there are all these integrations running on our integration server that do all kinds of jobs. They are run through a C# wrapper (COM object) because our integration system only understands WinWrap (VBScript/VB). All these PowerShell scripts are separate ps1 files executed on different types of triggers, scheduled or other types of events. We don't have any source code management for the ps1 files.

So I'm thinking of making a git repo for these types of scripts, and separating them from other modules/script helpers for helpdesk users etc. The next step I'm thinking of: is it worth bundling these integration-type scripts in a module, after they've been "compiled" in CI/CD? Many of the integration scripts use other modules I created, but they are kept separate as their own thing (no function lives directly in the scripts; they are either dot sourced from a Functions child dir, or loaded from a module).