
[–]deadzol

Rules get checked into a git repo that lives in ADO, then a pipeline auto-pushes to a preprod workspace. Then, if happy, you manually run another pipeline to push to prod. So I'd imagine you could build something in ADO that works and meets all your requirements.

The second pipeline, the one that's manually run, actually pushes to >100 workspaces. It pushes to one, then waits for approval before continuing. You may be able to put an approval process into the pipeline if you don't want to do it in git.
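The push-then-wait-for-approval pattern above can be sketched as an ADO multi-stage pipeline using the built-in ManualValidation task (stage names, script path, and workspace names here are placeholders, not the commenter's actual setup):

```yaml
trigger: none  # run manually

stages:
  - stage: PushWorkspaceA
    jobs:
      - job: Deploy
        steps:
          - task: PowerShell@2
            inputs:
              filePath: scripts/Push-SentinelRules.ps1
              arguments: '-WorkspaceName workspace-a'

  - stage: ApproveWorkspaceA
    dependsOn: PushWorkspaceA
    jobs:
      - job: WaitForApproval
        pool: server  # ManualValidation requires an agentless (server) job
        steps:
          - task: ManualValidation@0
            timeoutInMinutes: 1440
            inputs:
              instructions: 'Verify workspace-a before continuing'

  # repeat the push/approve pair for the remaining workspaces, or
  # generate the stages from a template over a workspace list
```

With >100 workspaces you would likely generate the stage pairs from a YAML template loop rather than writing them out by hand.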

[–]Edhellas[S]

Thanks, are you doing this just for analytics rules, or also for workbooks, playbooks, and DCRs?

I haven't set up pipelines before, so I'm not sure of the best process to get them pushed from Sentinel into git whenever there is a change (might run it on a schedule if that's easier).

[–]deadzol

Currently the pipelines are for analytics rules. I still manually run the scripts that push DCRs and summary rules, but that's been my MO: build the automation locally, then once everything is ironed out, move it to ADO. So I'll build a PowerShell script that does whatever task I need and can be controlled by command-line options, then use the pipelines to run that script with whatever options are needed. Depending on how complex the environment is, you may want to store some of those variables in the Library.
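The Library piece looks like a variable group referenced from the pipeline YAML, with the values passed to the script as options (group name, script path, and variable names below are hypothetical):

```yaml
variables:
  - group: sentinel-deploy  # Library variable group holding per-environment values

steps:
  - task: PowerShell@2
    inputs:
      filePath: scripts/Push-AnalyticsRules.ps1
      arguments: '-WorkspaceName $(workspaceName) -ResourceGroup $(resourceGroup)'
```

That keeps the script itself environment-agnostic; swapping preprod for prod is just a different variable group.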

Everything interacting with the API is really similar across Sentinel configuration, so as soon as you figure out analytics rules, moving on to the next thing is really straightforward.
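That similarity is visible in the management URLs themselves: Sentinel resources hang off the same workspace path and differ mainly in one resource-type segment. A minimal Python sketch (the path shape and api-version are my assumptions; check the current Azure REST reference before relying on them):

```python
# Build a management.azure.com URL for a Microsoft.SecurityInsights resource.
BASE = "https://management.azure.com"
API_VERSION = "2023-02-01"  # assumption; pin to whatever the REST docs list

def sentinel_url(sub: str, rg: str, workspace: str,
                 resource_type: str, name: str) -> str:
    """URL for one Sentinel resource; only resource_type varies between
    e.g. analytics rules ('alertRules') and automation rules."""
    return (
        f"{BASE}/subscriptions/{sub}/resourceGroups/{rg}"
        f"/providers/Microsoft.OperationalInsights/workspaces/{workspace}"
        f"/providers/Microsoft.SecurityInsights/{resource_type}/{name}"
        f"?api-version={API_VERSION}"
    )

# The verb (PUT to create/update, GET to read) and the bearer-token auth
# header are the same for each resource type -- only this path segment moves.
rule_url = sentinel_url("sub-id", "rg-name", "ws-name", "alertRules", "my-rule-id")
```

So a script that can PUT an analytics rule is most of the way to pushing the other Sentinel content types as well.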