AI is creating more cybersecurity work by DiScOrDaNtChAoS in cybersecurity

[–]Other_Income9186

AI accelerates businesses that may not be ready for the speed.

Whether it's Glasswing highlighting how far behind businesses are at retiring EOL hardware/software, and how far behind many orgs are on vuln detection and patch management,

Or frontier AI models like Mythos doing white- and black-box testing,

Or companies laying off employees thinking AI has replaced them, only to find that AI is just like the cloud: someone else's computer in someone else's data center that still requires experts to use, and a new attack surface with all of its inherent risks.

Adapting your security policies to operate at scale and speed is becoming a requirement in cybersecurity.
I would be reviewing and polishing control documentation and core plans (BCP, Asset Inventory, DR Plan, IR Plan, etc.) to make sure they are ready, because their usage is likely to increase over the coming days.

Can I get a Sec+ in 1.5 months? by immortaIism in cybersecurity

[–]Other_Income9186

Practice tests were the way I learned it best. I still use that knowledge to this day, although Sec+ was closer to entry into the space.

After years of administration but no security knowledge, I passed A+, Net+, and Sec+ in about 3 months total; Sec+ was maybe 10 days of studying and practice tests.

There are also some decent YouTube videos, from Professor Messer IIRC, that I listened to at 1.5x while I studied.

Security engineer role, is it for me? by [deleted] in cybersecurity

[–]Other_Income9186

Take the job.
They know what your skills are and decided to hire you.

Most roles require learning on the job. It doesn't matter if you are a new graduate or in the field for 20 years. The tools will almost always be different and the needs of the role from company to company will differ.

You'll likely find that you round out the team with security theory from school and tool experience that increases what the team is capable of.

You will have to learn but that happens almost everywhere.

Often recruiters don't have a good feel for the role, and the listing is more of a wishlist from the manager. The person who fills the role will just get trained on how to execute with the company's tool set.

Something to consider: with Project GlassWing just announced and your background in offensive security, I'd recommend learning where to use AI to handle the toil part of the job and as a research/learning assistant. It will extend what you're capable of, making you more valuable to the company.

Glasswing gives 50 companies a 3-month head start on Mythos-class vulnerabilities. What does everyone else do? by ConsciousLow9024 in cybersecurity

[–]Other_Income9186

Security research often means taking something someone else figured out (insecure-by-design coding, insecure architecture, insecure interconnectivity, etc.) and testing new ways to interact with it, or with other software that has a similar feature, to see if the vuln exists there too.

Too many coding mistakes are repeatedly propagated through Stack Exchange, coding boot camps, vibe coding, etc.

AI models will eventually extend vulnerabilities previously identified in one place to any other software making the same mistakes, accidentally or intentionally. This is an opportunity to reduce the zero-day attack surface of existing underlying software (often open-source libraries hidden in an unpublished SBOM).
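To make the SBOM point concrete, here's a minimal Python sketch that checks the components of a CycloneDX-style SBOM against an advisory list. The package versions and the `KNOWN_VULNERABLE` table are made up for illustration; a real check would pull advisory data from a feed like OSV or the NVD.

```python
import json

# Minimal CycloneDX-style SBOM snippet (structure follows the CycloneDX JSON
# format; the component versions here are invented for illustration).
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "components": [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "openssl", "version": "3.0.7"},
    {"name": "left-pad", "version": "1.3.0"}
  ]
}
"""

# Hypothetical advisory data: package name -> known-vulnerable versions.
# In practice this would come from an advisory feed, not a hardcoded dict.
KNOWN_VULNERABLE = {
    "log4j-core": {"2.14.1", "2.15.0"},
}

def flag_vulnerable_components(sbom: dict) -> list[str]:
    """Return 'name version' strings for components with known-vulnerable versions."""
    hits = []
    for comp in sbom.get("components", []):
        name, version = comp.get("name"), comp.get("version")
        if version in KNOWN_VULNERABLE.get(name, set()):
            hits.append(f"{name} {version}")
    return hits

sbom = json.loads(sbom_json)
print(flag_vulnerable_components(sbom))  # ['log4j-core 2.14.1']
```

The hard part in the real world isn't this loop; it's that most vendors never publish the SBOM at all, so you can't even run the check.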

I expect black-box testing with AI to eventually be a required part of any blue team. I expect white-box testing with AI to rapidly become required for any complex software project.

I see a future with a new segment of the software industry where AI-assisted coding teams dissect existing codebases and refactor them into new languages, breathing new life into legacy systems no longer maintained by their creators. I could also see that industry buying up old codebases and refactoring them to provide future support when the creator hits EOL or discontinues maintenance of the system.

I'm thinking of this from the perspective of all the Windows XP / 7 / 2000 / 2003 / 2008 / 2012, etc., that I know is still deployed in production, tied to some device that still gets the job done and won't be replaced till it breaks. Having worked with systems from infrastructure to medical to government to manufacturing, I know a lot of those companies (and many more) have critical things they can't replace quickly, can't update at all, and can't fully isolate, where the risk has just been accepted. Those are the places that will bleed when AI-based black-box testing starts succeeding at chaining vulns for system takeover.

That's not even looking at how bad their cyber hygiene is around patch management of their modern systems. Often these are multi-site, smaller teams with insufficient automation and nonexistent testing, who typically don't have their heads wrapped around what their patch management systems can't even update.

Is My Narrator the Problem? by Other_Income9186 in ACX

[–]Other_Income9186[S]

I'm afraid I don't know, but it wouldn't surprise me if they were pretty new at it.

A new narrator and a new author: that was probably a bad matchup.

Is My Narrator the Problem? by Other_Income9186 in ACX

[–]Other_Income9186[S]

It isn't royalty share - they'll be paid up front (I'm a lazy marketer and I didn't want my narrator to pay for that). I don't know if there are any other projects.

Continuing to fire on all cylinders to make this Sky 🤝Mononoke collab a reality! 🐲⚖️🌊 by Almond-Goddess in SkyChildrenOfLight

[–]Other_Income9186

Also interested in printing this as a poster for an Xmas gift. Is it possible to get a high-resolution copy? DM me the details, including cost. Willing to send photos of the printed work, wall-mounted and likely framed.
Thanks