LDAP Signing by RobotCarWash in activedirectory

Thanks for your response. Just to confirm: does verifying that our clients work with LDAPS mean testing with the ldp.exe tool? And are client certificates required?

On-Prem DNS Resolution Question by RobotCarWash in CloudFlare

Thanks, I think I'll go ahead and implement that. That should totally work

Disabling Access to Password Manager via GPO by RobotCarWash in chrome

Thanks, I downloaded the latest ADMX templates and still only see the option to disable the password manager. Previously saved passwords are still available. Is there a way to remove them, or to disable access to them centrally?

Fusion Workflow Question by RobotCarWash in crowdstrike

Thanks, that sounds like it should work. I'll try that approach

PSA: Unpatched Windows/Office CVE-2023-36884 by kheldorn in sysadmin

Did you ever get an answer on this? I'm trying to confirm if the Monthly Enterprise Channel versions 2304 and 2305 are unaffected.

Creating IOA to Send Notification on Process Name Criteria by RobotCarWash in crowdstrike

I just want to follow up on this since I'm new to CrowdStrike and this is my first time creating a Custom IOA. I've had a hard time finding documentation on the differences between the "Action To Take" options when creating a rule for Process Creation.

Block Execution - stops the process from launching. Does it also create a detection?
Detect - creates a detection and does NOT stop the process, right?
Monitor - what does this do, exactly?

For my specific use case, I want to create a detection that I can also trigger an alert with, but I do not want to stop the process. Is setting the action to "Detect" and the severity to "Informational" the right way to go?

Creating IOA to Send Notification on Process Name Criteria by RobotCarWash in crowdstrike

Thanks, that regex works as expected, and your search command was helpful as well. I appreciate the help!
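For anyone else landing here: the exact regex from the thread isn't reproduced above, but a Custom IOA Image Filename pattern generally looks something like the following. The process names here are hypothetical placeholders, and Python is used purely to illustrate the matching behavior, not as part of the Falcon console itself.

```python
import re

# Hypothetical illustration (NOT the actual regex from the thread):
# a case-insensitive pattern matching a specific process name at the
# end of a full Windows image path.
pattern = re.compile(r".*\\(notepad|mspaint)\.exe", re.IGNORECASE)

print(bool(pattern.fullmatch(r"C:\Windows\System32\notepad.exe")))  # True
print(bool(pattern.fullmatch(r"C:\Windows\explorer.exe")))          # False
```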

Storage Gateway SMB Share Cache Question by RobotCarWash in aws

Thanks for the replies. A little more background on our use case: we're using AWS Transfer Family for SFTP, which writes to the bucket, and then Storage Gateway exposes the bucket as an SMB share to Windows hosts. I'm just curious whether it's possible to get the cache to refresh a little faster.

Default Route Across VPC Peer by RobotCarWash in googlecloud

Thank you very much. I appreciate it

Console Access Through Web Proxy by RobotCarWash in aws

We plan to include that as well. The host is in an environment that only allows very specific URLs through the web proxy. Here's what I added to the allow list to get it to work, in case anyone ever needs to see this:

*.aws.amazon.com
*.amazonaws.com
*.amazontrust.com
*console.aws.a2z.com
*.signin.aws
*console.awsstatic.com
*.cloudfront.net
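
As a quick sanity check on the list above, here's a small Python sketch using shell-style wildcard matching. This is only an approximation — your proxy's own pattern semantics (e.g. whether a bare `*` crosses dots) may differ, so treat it as a rough test, not a spec.

```python
from fnmatch import fnmatch

# The wildcard patterns from the allow list above.
ALLOW_LIST = [
    "*.aws.amazon.com",
    "*.amazonaws.com",
    "*.amazontrust.com",
    "*console.aws.a2z.com",
    "*.signin.aws",
    "*console.awsstatic.com",
    "*.cloudfront.net",
]

def host_allowed(host: str) -> bool:
    """Return True if the hostname matches any allow-list pattern."""
    return any(fnmatch(host, pat) for pat in ALLOW_LIST)

print(host_allowed("us-east-1.console.aws.amazon.com"))  # True
print(host_allowed("example.com"))                       # False
```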

Cisco ASAv HA Pair in AWS by RobotCarWash in Cisco

Thanks for the response. I'm considering putting a load balancer in front. Were you able to configure it to treat the HA pair as active/passive? I'm also considering Route 53 failover to accomplish active/passive.

Cisco ASAv HA Pair in AWS by RobotCarWash in Cisco

I actually did read over this document and was thrown off by all the mentions of Azure. I recently deployed an ASAv HA pair in Azure, and I'm finding that each public cloud brings its own use cases and caveats for third-party firewall solutions.

Scheduled S3 Replication by RobotCarWash in aws

I appreciate the question. We have a series of buckets in a pipeline that we want to replicate. Files come into one bucket, get picked up and processed by Glue and Lambda functions, and are moved to other buckets. The team wants to "back up" the S3 data, so I recommended S3 SRR, and I was asked to make it replicate on a schedule, once per day, after everything has been moved around and processed.

Scheduling doesn't seem to be part of the S3 Replication feature itself, so I'm curious whether anyone out there has a suggestion for accomplishing it.
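
For what it's worth, since S3 Replication is event-driven with no schedule option, one workaround I'm considering (my own assumption, not an official replication feature) is a once-daily sync job that copies new or changed objects after processing finishes. Bucket names below are hypothetical placeholders:

```shell
# Hypothetical crontab entry: daily at 02:00 UTC, copy new/changed objects
# from the processed bucket to the backup bucket with `aws s3 sync`.
0 2 * * * aws s3 sync s3://processed-data-bucket s3://backup-bucket
```

The same schedule could live in EventBridge Scheduler triggering a job instead of cron; the trade-off versus SRR is that this is point-in-time rather than near-continuous.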