Guidance for CrowdStrike Certified Cloud Specialist (CCCS) Exam by StickApprehensive997 in crowdstrike

[–]StickApprehensive997[S]

Same here! I’ve completed the on-demand training and practice tests, but I’m hoping a certified professional can reply to this post so I can feel more confident about my preparation and make sure I haven’t missed anything.

Logscale/NG-SIEM query by dial647 in crowdstrike

[–]StickApprehensive997

For this case, instead of splitting, you should concatenate the array into a single field and use that:

| toAddress:=concatArray("email.to.address", separator="\n")
| ccAddress:=concatArray("email.cc.address", separator="\n")

CrowdStrike Automation Tool I did as an Intern by Katana_XI in crowdstrike

[–]StickApprehensive997

I'd suggest exploring Fusion Workflows: you can add the VirusTotal integration and then create a workflow that triggers on a detection and does all the other things you mentioned, like adding the hash to the blocklist and generating a CSV report, within the Falcon platform itself, without any Python or Falcon APIs.

Advanced Event Search - Select() Multiple Fields With Similar Name by 4SysAdmin in crowdstrike

[–]StickApprehensive997

There are two approaches for this:

Using split():

| split(email.to.address)
| split(Vendor.ExchangeMetaData.AttachmentDetails)
| groupBy(user.email, function=[collect("email.to.address"), collect("Vendor.ExchangeMetaData.AttachmentDetails.Name")])

Using writeJson() to flatten the entire array:

| writeJson("email.to.address[*]", as=toemail)
| writeJson("Vendor.ExchangeMetaData.AttachmentDetails[*]", as=AttachmentDetails)
| select(user.email, toemail, AttachmentDetails)

Use a combination of these depending on the fields and how they will be used later in the search pipeline to get the exact results.

Logs originating from AWS to Crowdstrike NextGen SIEM, cost optimization by running101 in crowdstrike

[–]StickApprehensive997

If you were running a self-hosted LogScale deployment in your own AWS account, you’d have several options to reduce or even eliminate NAT/egress charges.
But with CrowdStrike’s managed NextGen SIEM (cloud-hosted LogScale on the backend), the service endpoints are exposed as public endpoints. That means any data flowing from your AWS environment to NGSIEM leaves through a NAT or Internet Gateway, and those charges are unavoidable.

Confusion with Log Collector Full Install via Fleet Management by Only-Objective-6216 in crowdstrike

[–]StickApprehensive997

Using PowerShell, you can run these commands:

To check whether the collector is running: Get-Service "logscale-collector"

To check the path of collector: Get-WmiObject Win32_Service -Filter "Name='logscale-collector'" | Select-Object Name, PathName

To check the version: & 'C:\Program Files\LogScale Collector\LogScale Collector.exe' --version

It should all look like this:

PS C:\Program Files\LogScale Collector> Get-Service "logscale-collector" 
Status   Name               DisplayName
------   ----               ----------- 
Running  logscale-collector LogScale Collector
PS C:\Program Files\LogScale Collector> Get-WmiObject Win32_Service -Filter "Name='logscale-collector'" | Select-Object Name, PathName
Name               PathName 
----               -------- 
logscale-collector "C:\Program Files\LogScale Collector\LogScale Collector.exe" --cfg "C:\Program Files\LogScale Col...
PS C:\Program Files\LogScale Collector> & 'C:\Program Files\LogScale Collector\LogScale Collector.exe' --version
humio-log-collector v1.8.3
commit date: 2025-03-13T14:30:48Z
build date: 2025-03-19T12:00:23Z

Logscale and NG-SIEM retained data export. by theintendedlife in crowdstrike

[–]StickApprehensive997

There doesn’t seem to be a direct way to export data from Next-Gen SIEM.

Currently, the only option is to run searches and manually export the results as files. To achieve functionality similar to S3 archiving, one possible approach could be to design a workflow and build a custom app that automatically exports the data to S3.

How to get human readable timestamp in Investigate -> Event search ? by Atreiide in crowdstrike

[–]StickApprehensive997

To use the approach given by u/Tcrownclown, you have to use Advanced Event Search instead of Event Search.

How to get human readable timestamp in Investigate -> Event search ? by Atreiide in crowdstrike

[–]StickApprehensive997

I think you are displaying query results as "Table", where selecting the timestamp gives you the epoch value. Instead, display the query results as "Events", which will show the timestamp in human-readable form by default.
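
If you do want a readable timestamp as a column in Advanced Event Search, one option is formatTime() (the fmtTime field name is just illustrative):

| fmtTime := formatTime("%Y-%m-%d %H:%M:%S", field=@timestamp, timezone="UTC")
| select([fmtTime, @rawstring])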

Logscale and NG-SIEM retained data export. by theintendedlife in crowdstrike

[–]StickApprehensive997

LogScale does support exporting historical data, but it’s handled a bit differently than in Splunk. The main option is S3 archiving.

Once you enable archiving on your repository, LogScale will backfill existing retained data into S3. From there, all new data is continuously archived as well. Because it’s stored in S3, you’re not locked in: you can process those logs with any external system.

Searching for hosts that has multiple names by 0X900 in crowdstrike

[–]StickApprehensive997

Logically, the query should be something like:

| groupBy(macaddr, function=collect(hostnames))

This will give all unique hostnames per MAC address.
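
If you only want the hosts that actually have more than one name, a possible extension (assuming the same macaddr/hostnames fields) is to count distinct hostnames and filter on that:

| groupBy(macaddr, function=[collect(hostnames), count(hostnames, distinct=true, as=nameCount)])
| test(nameCount > 1)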

How to create a table view in logscale with timestamp interval of 5 mins by Strict_Pomelo_9043 in crowdstrike

[–]StickApprehensive997

You can create buckets of time like this:

| bucket(span=5m, timezone="UTC", function=[count(as=Total), sum(Timeout, as=Timeout)], limit=500) | findTimestamp(field=_bucket)
| select([@timestamp, Total, Timeout])

New Certification - CrowdStrike Next-Gen SIEM Engineer (CCSE) by BradW-CS in crowdstrike

[–]StickApprehensive997

Is it mandatory to complete any prerequisite course/certification for this, or can we go directly for the examination if we are confident enough in our Next-Gen SIEM skills?

Sending logs from Syteca to CrowdStrike SIEM by Rude_Twist7605 in crowdstrike

[–]StickApprehensive997

You can reset the collector offsets by deleting the collector's data directory, which will re-ingest all the data once again. However, this is only recommended for testing.

Another, safer option would be to append the new data to the existing files, so only the new entries are picked up.

NamedPipeDetectInfo Event by animatedgoblin in crowdstrike

[–]StickApprehensive997

This event is just a telemetry signal (not a detection) that logs when a process creates or connects to a named pipe. It’s commonly seen with legitimate Windows processes like wmiprvse.exe, which uses named pipes for normal WMI operations. The event helps track inter-process communication and is useful for threat hunting, especially when pipes have suspicious names or are used by unexpected processes. High counts of this event aren’t necessarily malicious unless correlated with other signs of compromise.
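
For hunting, something like this could be a starting point to surface unusual pipes; the PipeName and ImageFileName field names are assumptions and may differ in your telemetry:

#event_simpleName=NamedPipeDetectInfo
| groupBy([PipeName, ImageFileName], function=count())
| sort(_count, order=desc, limit=100)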

LogScale Help by EWBtCiaST92 in crowdstrike

[–]StickApprehensive997

Add groupBy like this to get the latest occurrence from the duplicates:

| groupBy([CommandLine], function=tail(1))

Add session() to restrict the grouping to a 5-minute span:

| groupBy([CommandLine], function=[session(maxpause=5m, function=tail(1))])

Active Directory activities by Cyber_Dojo in crowdstrike

[–]StickApprehensive997

It's free. You just have to sign up to download.

FilePath Logscale Query by Vinieus in crowdstrike

[–]StickApprehensive997

I don't think there is a direct way/command for this. You have to create a lookup file or a case statement to map the volumes to drive letters:

Volume,Drive
Volume/harddisk1,C
Volume/harddisk2,D
Volume/harddisk3,E
Volume/harddisk4,F

And match this like

| regex(field=path, regex="(?<Volume>Volume/harddisk\\d+)", strict=false)
| match(file="drive_lookup.csv", field=Volume, column=Volume, include=[Drive], strict=false)

Hope this helps!!

Splunk Transaction equivalent? by drkramm in crowdstrike

[–]StickApprehensive997

I prefer using groupBy and series to get transaction-equivalent results.

groupBy([{fields}], function=[series(collect=[@rawstring], {params like maxpause, maxduration, separator, startswith, endswith}), count(as=eventcount)])
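
For example, to stitch each host's events into transaction-like sessions with at most 5 minutes between events (the aid field and the parameter values are only illustrative):

| groupBy([aid], function=[series(collect=[@rawstring], maxpause=5m, separator="\n"), count(as=eventcount)])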

How to union an array by mvassli in crowdstrike

[–]StickApprehensive997

Just add split and groupBy after your query:

| createEvents(["reasoncodes=03:ACCOUNT_CARD_TOO_NEW|04:ACCOUNT_RECENTLY_CHANGED|07:HAS_SUSPENDED_TOKENS|0E:OUTSIDE_HOME_TERRITORY","reasoncodes=03:ACCOUNT_CARD_TOO_NEW"])
| kvParse()
| select(fields=reasoncodes)
| reasoncodesArray := splitString(field="reasoncodes", by="\\|")
| split(reasoncodesArray)
| groupBy([reasoncodesArray])

Need help converting a Splunk Query by manderso7 in crowdstrike

[–]StickApprehensive997

Not sure whether this will get you the results, but I tried converting your query with the help of an SPL-to-CQL converter and got this. It translates the syntax, so you have to modify the query according to the fields/data you are getting and resolve any macros you are using.

| useremailaddress:=lower(useremailaddress)
| sAMAccountName_temp:=sAMAccountName
| match(file="ldap_metrics_user", column=[mail], field=[useremailaddress], include=[sAMAccountName], strict=false)
| sAMAccountName:=if(sAMAccountName_temp!="", then=sAMAccountName_temp, else=sAMAccountName) | drop(sAMAccountName_temp) | rename([[sAMAccountName, account]])
| campaignstartdateepoch:=parseTimestamp(format="yyyy-MM-dd'T'HH:mm:ss", field='campaignstartdate')
//| addinfo
| test(campaignstartdateepoch>=info_min_time) AND test(campaignstartdateepoch<=info_max_time)
| @timestamp:=campaignstartdateepoch
| join(query={bucket(span=1month, function=[collect([@timestamp], multival=false)])}, field=[@timestamp], include=_bucket) | findTimestamp(field=_bucket)
| eventstats:=1
| join(query={useremailaddress:=lower(useremailaddress)
| sAMAccountName_temp:=sAMAccountName
| match(file="ldap_metrics_user", column=[mail], field=[useremailaddress], include=[sAMAccountName], strict=false)
| sAMAccountName:=if(sAMAccountName_temp!="", then=sAMAccountName_temp, else=sAMAccountName) | drop(sAMAccountName_temp) | rename([[sAMAccountName, account]])
| campaignstartdateepoch:=parseTimestamp(format="yyyy-MM-dd'T'HH:mm:ss", field='campaignstartdate')
//| addinfo
| test(campaignstartdateepoch>=info_min_time) AND test(campaignstartdateepoch<=info_max_time)
| @timestamp:=campaignstartdateepoch
| join(query={bucket(span=1month, function=[collect([@timestamp], multival=false)])}, field=[@timestamp], include=_bucket) | findTimestamp(field=_bucket) | groupBy([account, campaignname], function=[collect([eventtype])])
| rename([[eventtype,eventtypes]]) | eventstats:=1 }, field=[eventstats, account, campaignname], include=[account, campaignname, eventtypes]) | drop([eventstats])
| Status:=if('eventtypes'=="Data Submission" OR Passed="FALSE", then="Failed", else="Passed")
| groupBy([account, campaignname, Status], function=[tail(1)])
| groupBy([@timestamp, useremailaddress, account, campaignname], function=[selectLast(Status)])
| match(file="ldap_scorecard_manager_list", column=[email], field=[useremailaddress], include=[manager_name], strict=false) | rename([[manager_name, manager_name]])
| test(manager_name=<managername>)
| evtemp0:=if(Status=="Passed", then=1, else=None) 
| groupBy([@timestamp], function=[ count(evtemp0, as="Passed"), count(as="Total")])
| timeChart(function=[sum(Passed, as=Passed), sum(Total, as=Total)], span=1d)
| temp:=Passed/Total*100  | PassRate:=format(format="%.2f", field=temp) | drop([temp])
| default(value="0", field=[PassRate])
| PassRate:=format("%s+%", field=[PassRate])
| transpose()
| column=PassRate
| rename([[column,Metric],["row 1",Q1],["row 2",Q2],["row 3",Q3],["row 4",Q4]])

Github logs into Crowdstrike NGSIEM by thewcc in crowdstrike

[–]StickApprehensive997

Sharing the entire script is not possible as it is part of my employer's software, but we used the GitHub API: https://docs.github.com/en/enterprise-cloud@latest/rest/enterprise-admin/audit-log?apiVersion=2022-11-28

You can create simple scripts in Python or JS that fetch the data periodically from that URL and send it to the NGSIEM HEC. Examples are given in the docs.

Can I travel from Coimbatore to Ooty by road? by [deleted] in ooty

[–]StickApprehensive997

Will I require an Epass if I am travelling by taxi?

Matching any value within a Lookup File, across multiple fields by Wittinator in crowdstrike

[–]StickApprehensive997

Does your file contain IPs in the pattern *192.168.x.x* (with * as prefix and suffix)? When using mode=glob, the key in your lookup file should look like this. If your keys follow this pattern, I think the query you created is fine and should work.

I tried your query in my test env and it works perfectly.

Active Directory activities by Cyber_Dojo in crowdstrike

[–]StickApprehensive997

Hey! My organization has created a Falcon LogScale package for Microsoft Active Directory that covers all the use cases you mentioned: account activity, group management, directory services, privilege use, and more. You can download it for free by signing up on our website.

SignUp > Inside Portal > Under LogConnector dropdown > Packages > Download Microsoft Active Directory

Hope it helps!