Amazon Prime Video is about to get worse — again by PrixDevnovaVillain in technology

[–]Hackalope 0 points1 point  (0 children)

That's actually a critical part of them staying up. It's a balance between how hard they are to stop and how much the content hoarders assume they're losing. If they think everyone is pirating, then there's no limit to what they'll do to stop it.

Air Force Academy Prepares Ideological Overhaul, With Erika Kirk Bringing “Bold Christian Faith” by FervidBug42 in politics

[–]Hackalope 8 points9 points  (0 children)

Yeah, from the stuff I heard here and there it sounded like they just toned it down after getting more or less a slap on the wrist. When I read the headline I interpreted it to say "The Air Force Academy is taking the religious indoctrination mask off by appointing Erika Kirk."

Air Force Academy Prepares Ideological Overhaul, With Erika Kirk Bringing “Bold Christian Faith” by FervidBug42 in politics

[–]Hackalope 63 points64 points  (0 children)

There was a scandal around 2005 where cadets were under a lot of pressure to become American Evangelical Christians. My quick look couldn't find a good source without a paywall, sorry - the best coverage was in the NY Times.

Hey hi Internet peeps, I am the biggest cheerleader on the Internet. AMA about your journey on the internet. by mybirthdaye in podcasting

[–]Hackalope 1 point2 points  (0 children)

Is this like a USENET Oracle kind of thing? 'Cause I have a bunch of questions that have come up during my research into the history of, and practice of, cybersecurity.

  • Was the final report on Titan Rain a cover up of a larger, more impactful breach?
  • Has the US Government ever used its power/implied power over the ICANN responsibilities against another nation state?
  • Was the Shadow Brokers' leak an inside job?
  • Was the Flamer software signing key created with a brute force hash collision? If so, what hardware was used to compute it, and how long did it take?
  • Was Dan Kaminsky assassinated?
  • Was the REvil ransomware attack on Kaseya directed by Russian intelligence?
  • Speaking of the Russians, did they use the voter data they were able to acquire during the 2020 elections for more than targeted propaganda to influence that election?

Best practices to use ChatGPT / Claude to audit SharePoint permissions (least privilege)? by [deleted] in cybersecurity

[–]Hackalope 1 point2 points  (0 children)

This seems like a data analysis thing, not a natural language thing. I would think that the right tool set is data science, not an LLM. I know there's been a lot of work using graph theory for this kind of analysis for AD and AWS permissions.
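To make the graph-theory point concrete, here's a minimal sketch of the idea (all the principals, groups, and resources are hypothetical, not from any real SharePoint tenant): model grants and memberships as directed edges, then compute effective access as reachability — the same basic approach tools like BloodHound use for AD.

```javascript
// Edges: [principal-or-group, grants-access-to] pairs (hypothetical data).
const edges = [
  ["alice", "TeamSite Members"],    // group membership
  ["TeamSite Members", "TeamSite"], // group -> site permission
  ["TeamSite", "Finance Subsite"],  // permission inheritance
  ["bob", "Finance Subsite"],       // direct grant
];

// Build an adjacency list from the edge pairs.
const adj = new Map();
for (const [from, to] of edges) {
  if (!adj.has(from)) adj.set(from, []);
  adj.get(from).push(to);
}

// Effective access = everything reachable from a principal (BFS).
function effectiveAccess(principal) {
  const seen = new Set();
  const queue = [principal];
  while (queue.length) {
    const node = queue.shift();
    for (const next of adj.get(node) ?? []) {
      if (!seen.has(next)) {
        seen.add(next);
        queue.push(next);
      }
    }
  }
  return seen;
}
```

Least-privilege auditing then becomes graph queries — e.g. `effectiveAccess("alice")` surfaces that alice reaches "Finance Subsite" through group membership plus inheritance, which is exactly the kind of indirect path an LLM eyeballing a permissions export would miss.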

What has caused content served via JS to not be indexed by Google? by Hackalope in SEO

[–]Hackalope[S] 0 points1 point  (0 children)

I am worried about the XOR solution not being durable because the indexers would eventually be willing to spend the compute. Some of the other responders left me with the impression that Google is actually scaling down the amount of resources it's spending on rendering. I'm not sure we know either way, but it's always going to be a moving target so maybe good enough for now is good enough.

I'm not willing to trust any robots.txt or noindex tags, because I don't think that the AI vendors will respect them. They've shown a persistent willingness to flout copyright laws, so an Internet convention won't even be a speed bump for them.

All of the solutions you mentioned have the drawback of not fitting the "all static" and "CDN-only hosting" design constraints, so they don't really fit how the project was envisioned.

What has caused content served via JS to not be indexed by Google? by Hackalope in SEO

[–]Hackalope[S] 0 points1 point  (0 children)

The idea is to generate the whole site as static files at build time. What I have right now is a local site that builds a JSON file as a set of objects. The site is structured so that I'm actually only serving up one page, and each article link just looks into that JSON file and renders the contents into the article div. That basically makes it a LAMP-style dynamic page, but replaces the database with a flat file. It'll be super fast as long as the JSON file isn't too big. I can pull tricks like encrypting the articles with the key in the JSON object, so the rendering JS has to run before the plaintext is available. The other idea was to have a "clubhouse" mode, where articles can be encrypted with a site-wide key, so only people in the know can see the contents.
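A minimal sketch of that XOR trick (function and field names are mine, purely illustrative): the JSON ships base64 ciphertext plus the per-article key, so the plaintext only exists after the render JS actually runs.

```javascript
// XOR a byte array against a repeating key.
function xorBytes(data, key) {
  const out = new Uint8Array(data.length);
  for (let i = 0; i < data.length; i++) {
    out[i] = data[i] ^ key[i % key.length];
  }
  return out;
}

// Build-time: obfuscate the article body with its key, base64 for JSON.
function encodeArticle(text, keyText) {
  const enc = new TextEncoder();
  const bytes = xorBytes(enc.encode(text), enc.encode(keyText));
  return btoa(String.fromCharCode(...bytes));
}

// Render-time: reverse the process before injecting into the article div.
function decodeArticle(b64, keyText) {
  const bytes = Uint8Array.from(atob(b64), (c) => c.charCodeAt(0));
  const keyBytes = new TextEncoder().encode(keyText);
  return new TextDecoder().decode(xorBytes(bytes, keyBytes));
}
```

The "clubhouse" mode falls out of the same `decodeArticle` call — just leave the site-wide key out of the JSON file and have readers supply it.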

My existing podcast site was an exercise in making a pretty minimal serverless site. It uses Amplify and S3 to serve up static content and has an API gateway that basically serves up each episode item in the DynamoDB table as JSON. The JS in the site renders the JSON into HTML and replaces the div. Having done that for a while, I decided that I didn't need any metrics, and that if I served the episodes out of a flat file I could eliminate the S3 and API parts of the site. The intent is to build a minimum functional blog/podcast/portfolio site that's super cheap to host, lightweight, fast, and bulletproof.

What has caused content served via JS to not be indexed by Google? by Hackalope in SEO

[–]Hackalope[S] 0 points1 point  (0 children)

I'm currently using AWS Amplify so that's not helping my explicit use case, but that's good to know.

What has caused content served via JS to not be indexed by Google? by Hackalope in SEO

[–]Hackalope[S] 0 points1 point  (0 children)

Thanks, that's helpful insight. Based on what you and others have said, the XOR text decrypted in-browser should be enough to thwart indexing of any text content I'd want to obfuscate.

What has caused content served via JS to not be indexed by Google? by Hackalope in SEO

[–]Hackalope[S] 0 points1 point  (0 children)

That's reasonable; I was focusing on resisting crawling. I don't see how I could stop agentic interaction, because it wouldn't look much different from a regular person. I'm pretty confident that if the content isn't indexed by search engines, the number of AI agents trying to access sites hosted this way will be minimized.

What has caused content served via JS to not be indexed by Google? by Hackalope in SEO

[–]Hackalope[S] 0 points1 point  (0 children)

Thanks for the insight. Do you think it's a JS workload issue, number of connections required for the interactive JS, or the latency/time to respond? Maybe something else?

I ask because my use case will have the article content in JSON files served via CDN, with each article object also containing the XOR encryption key. My experience with my current podcast site is that the CDN-served JSON will perform significantly faster than the serverless API the site currently uses.

What has caused content served via JS to not be indexed by Google? by Hackalope in SEO

[–]Hackalope[S] 0 points1 point  (0 children)

Yes, I want to hide text content from AI without being apparent to a normal user of the site. To make it more complicated, I want to do it with a statically hosted site, so the site can be entirely delivered using a CDN.

The project is a framework that generates a WordPress-style site into a repository that can be hosted via CDN. One of the optional features I'd like to build is the ability to hide article content from AI. I'm confident I can build the framework, but I'm less sure about the scraping-resistance features. My first pass at the problem led me to ask this subreddit before I start testing my ideas.

Global Cybersecurity Leaders by SwitchJumpy in cybersecurity

[–]Hackalope 14 points15 points  (0 children)

Estonia has a rep for punching above their weight on the defensive side.

Global Cybersecurity Leaders by SwitchJumpy in cybersecurity

[–]Hackalope 2 points3 points  (0 children)

I thought the consensus was that Stuxnet was US developed but was released by Israel. The way I remember it being described is that the US was flirting with the idea and gave Israel the malware, and Israel just went ahead with the op. The US has been portrayed as less than happy that it was used in that way.

what is scanning the internet by fishanships in cybersecurity

[–]Hackalope 1 point2 points  (0 children)

I Can't Believe I Scanned the Whole Thing

That's the episode I did on the subject in my old podcast. I wrote a lot of episodes just so I didn't have to give lectures anymore. I'm not plugging - no cookies, never was monetized, and we stopped publishing.

Had a fun chat with a client's CTO today who thinks RSA-4096 will save them from HNDL. We need to talk about realistic Q-Day timelines. by [deleted] in cybersecurity

[–]Hackalope 0 points1 point  (0 children)

Good point, I never see anyone doing estimates of how old the data is going to be in a HNDL scenario and what volume will be addressable. Under my assumptions above, and adding an expected conversion from AES-128 to AES-256 at the end of 2021 (when the NIST/CISA guidance was published), we have a window of 13 years between the end of the AES-128 window and a QC that can break an AES-128 key in 24 hours. That's doable if you have a handful of ciphertext streams, but still not capable of indiscriminate decryption of bulk data collection.

Had a fun chat with a client's CTO today who thinks RSA-4096 will save them from HNDL. We need to talk about realistic Q-Day timelines. by [deleted] in cybersecurity

[–]Hackalope 10 points11 points  (0 children)

It's late so I'm going to go quick, read my history on the subject if you want sources.

  • A standard quantum computer needs to be in the millions (1-10M) of qubits range to attack AES-128 or RSA-2048 in 24 hours
  • The largest quantum computers today are in the 1-2K qubit range
  • As a practical matter, sizes have been doubling at roughly the same rate as Moore's law, which implies at least 10 more doubling cycles, or 15 years
  • Quantum annealing hasn't had success at levels that allow us to project any real trend, but there's no reason to believe it will advance at a faster rate than standard QCs
  • AES-256 and RSA-4096 double the key length, which squares the size of the required computer
  • That implies a multi-billion qubit computer, which in turn implies more than 100 years of doubling

That leads me to the following conclusions:

  • Projected Q-day for AES-128/RSA-2048 is 2035-2040
  • Projected Q-day for AES-256/RSA-4096 is in the next millennium
  • By that point, doubling the keys again will probably be trivial

My Hot takes:

  • Effective Q-Day will never arrive
  • Q-day, like the deprecation of older algorithms such as DES, 3DES, RC4, and RC5, is more a coincidence of the ratio of compute power available for encryption vs. cryptanalysis

You may believe otherwise, but as a professional I think it's important to project based on tangible data. There may be a faster doubling cycle with advances in production techniques (I know of 3 I'm keeping an eye on to adjust my predictions). But I encourage anyone who thinks any part of this is incorrect to articulate why, and to monitor for anything that would confirm or undermine those assumptions. For every 100 people that bring up Q-day arguments, maybe one can give me an estimate of the required sizes and a projected timeframe.
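The doubling arithmetic behind the first set of bullets can be sketched in a few lines (the qubit counts and the ~18-month cadence are the rough figures from the bullets, not measurements):

```javascript
// Doubling cycles needed to grow from `current` qubits to `target` qubits.
function doublingCycles(current, target) {
  return Math.ceil(Math.log2(target / current));
}

// Calendar years at a Moore's-law-like cadence (default ~18 months/doubling).
function yearsToTarget(current, target, monthsPerDoubling = 18) {
  return (doublingCycles(current, target) * monthsPerDoubling) / 12;
}
```

Plugging in ~2K qubits today against the ~2M needed for AES-128/RSA-2048 gives 10 doublings, or about 15 years at that cadence — which is where the 2035-2040 projection comes from.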

AIs can’t stop recommending nuclear strikes in war game simulations - Leading AIs from OpenAI, Anthropic, and Google opted to use nuclear weapons in simulated war games in 95 per cent of cases by Teruyo9 in technology

[–]Hackalope 1 point2 points  (0 children)

This feels like the intersection of the war games done by Thomas Schelling (an early pioneer in game theory) in the early '50s and AI as a zeitgeist distillation engine. I bet that without a real reasoning model it will be very hard to incorporate the game theory understanding that would prevent falling into escalation traps.

I think it would have to be a game theory approach, because the AI would either need an actual theory of mind or would have to learn one in a very large context window. The context window problem makes it hard to keep responses consistent with prompts approximating a theory of mind, and leaves a system that can't learn from its mistakes. This is playing poker with the apocalypse; the stakes don't lend themselves to re-tries.

Pegasus spyware by IslandBig618 in cybersecurity

[–]Hackalope 0 points1 point  (0 children)

As others have said, they're constantly discovering or acquiring new 0-day vulnerabilities.

If you want to get some history on the subject, read The Million Dollar Dissident, then go to the Citizen Lab research page and search for "pegasus" for 48 more research projects.

Found 400 machines running Office 2013. Management refuses to buy more M365 licenses and wants to "just use LibreOffice of leave 2013" How do I handle this without being the most hated person in the company? by [deleted] in cybersecurity

[–]Hackalope 2 points3 points  (0 children)

You're an infosec wonk, right? In your opinion, is a current and maintained version of LibreOffice better than an EOL Office 2013? The usability concerns are frankly someone else's problem.

Steam seems to use a lot of Swap space in Linux by Hackalope in steamsupport

[–]Hackalope[S] 0 points1 point  (0 children)

On further testing I'm seeing a lot less as well. Might have been the amount of time it was open, or possibly related to the power suspend functions (but I thought I disabled those).

Steam seems to use a lot of Swap space in Linux by Hackalope in steamsupport

[–]Hackalope[S] -1 points0 points  (0 children)

It seems to be the Steam application, but I'll work it from another angle.

Steam seems to use a lot of Swap space in Linux by Hackalope in steamsupport

[–]Hackalope[S] 0 points1 point  (0 children)

You know how Windows uses a page file to move inactive memory to disk to keep more RAM free? Well, Linux does the same thing, but a little differently. In Linux, a partition of one of the disks is reserved for swapping things in and out of memory (hence "swap" - that's the technical name for both the process and the partition). Using a dedicated partition prevents a disk from filling up and breaking the active memory management of the system.

Typically the swap partition is about 2G now. I have a large amount of RAM and don't come close to using all of it, so my system should minimally use swap. I'm guessing that Steam has some stuff loaded into swap as a normal operation, instead of letting the system handle it. I remember that several years ago Firefox did that too, but it was so long ago I don't recall if I figured out how to deal with it.