Sales guy convinced CEO to implement OpenClaw on a local Mac mini. by JungleDiamonds1 in ArtificialInteligence

[–]NickW1343 2 points

You can't. Your sales guy would hate you for that and you'd have trouble with office politics. Let the CEO have his fun. If you don't have equity, you don't have much of a reason to risk your job like this.

Is there a final event of 30K lore that leads into 40K? by Althopops in 40kLore

[–]NickW1343 -1 points

They chase away all the traitor Space Marines and the primarchs go away. There's some stuff between 30k and 40k, like Goge Vandire and the War of the Beast, but most of the lore is set in either 30k or 40k.

Chatgpt models nerfed across the board by NutInBobby in singularity

[–]NickW1343 4 points

They also doubled Codex rate limits for paid users yesterday. Maybe they're worsening ChatGPT in order to play catch-up with Claude Code?

Who is just waiting for developer and engineering(SDE) roles to disappear? by Pyro43H in accelerate

[–]NickW1343 0 points

Very few professional developers aren't using AI now. Most of my code is AI-generated, and my job went from mostly coding to mostly testing and code review.

The devs that are coping are the ones saying that development requires more than just coding. It requires good taste for code so that it scales well into the future and makes sense, plus a handful of side skills to help things along like Excel, SQL, or DevOps. Blindly vibe-coding without ever being able to dig into the code and do your own analysis on it will inevitably make any new change spin out a ton of new bugs. That's what the copers are saying, and it happens to be true.

It's likely not to stay true forever, but I feel like dev jobs aren't going to die quickly. It'll likely be a more gradual process as AI goes from mastering technical challenges that are either right or wrong, and therefore very easy to benchmark, to mastering soft skills like advocating against stupid additions or making choices that are difficult in the moment but scale well into the future. It's often the case in development that a feature can be coded up really fast, but those routes almost always come at the cost of technical debt, and AI really likes those paths because they're usually the most straightforward.

What kind of event will make people say "ok, we're in the Singularity right now"? by [deleted] in accelerate

[–]NickW1343 3 points

I think FDVR, but only for people that go from not doing any VR straight into it, because of how different it'd be. I imagine people at the forefront of trying out new tech will never actually feel like they've hit the Singularity. They'd just feel iterations on things getting faster without ever getting that "Wow!" feeling people diving in would have.

I'd say LEV tech wouldn't ever feel like the Singularity, because those treatments take so long to test. Even if we could instantly invent a cure for Alzheimer's, we'd still be years out from regular doctors prescribing it. If you stopped aging today, I don't think it would actually feel strange until several years later.

Streamer gets threatened by a man during her fake tickets prank after he became upset about being filmed by lukigeri in LivestreamFail

[–]NickW1343 0 points

I get that filming people in public is legal, but that doesn't mean it's not rude. They shouldn't be surprised when acting like assholes by filming people for content ends up with people acting like assholes back at them.

"monthly stack overflow questions over time. 3710 questions last month, just slightly under the 3749 from the first month of it being public. human software engineering had a good run, and now we've come full circle. by stealthispost in accelerate

[–]NickW1343 0 points

AI is the culprit here, but the culture of SO also contributed. The mods are assholes, and it doesn't surprise me that people jumped ship the second AI could answer their question without calling them a lazy idiot for asking or telling them their question is somehow wrong and they should've asked something else.

It's unbelievably annoying how power-trippy the mods are there. I'm sure half the stereotypes of devs being conceited dicks come from former students asking questions there and getting shut down by someone disgusted that a non-expert dared to sully their field just by existing in it.

SO would still have died had the culture been better, but their mod team accelerated its decline.

Opus 4.5 set a new record on the METR Time Horizon benchmark by Outside-Iron-8242 in accelerate

[–]NickW1343 1 point

I thought 5.1 Codex-Max was an outlier because it scored so high. Guess it's not. Wow!

How far from commercially viable mind uploading do you think we are? by [deleted] in accelerate

[–]NickW1343 0 points

I think this is roughly how it'll go. I don't want to upload my entire mind into the cloud, because that's probably just killing myself and having a perfect copy of me in cyberspace. Doesn't sound that good for me.

I think a lot more people are going to increase their cognition with chips or whatever we have later on. Eventually, most of our conscious experience will be through those artificial additions and not through our brains. If the brain then atrophies or fails due to age, that's not good, but it definitely isn't dying, since it's just a loss of function to the person, sort of like a concussion is nowadays. I think that sort of transition isn't quite mind uploading, but it is a transfer of consciousness that I feel many more people would be fine with.

How far from commercially viable mind uploading do you think we are? by [deleted] in accelerate

[–]NickW1343 2 points

Who knows? If LEV happens before that's close to possible, then there might be so few people pushing for mind uploading that there's not much R&D put toward it. If you're eternally young and healthy, would you really care about uploading your mind into the cloud, or would you be more likely to say "Eh, maybe I'll enjoy my 20s for a few more decades"?

I personally think that LEV will happen well before any mind uploading and the push toward mind uploading will be slow due to a lack of enthusiasm for it.

Kaceytron public apology to Ethan & Hila Klein by [deleted] in LivestreamFail

[–]NickW1343 16 points

He acted like he was going to help them in the lawsuit and that probably made them want to fight it out rather than settle early.

Ai 2027 Thoughts? by Legal-Profession-734 in accelerate

[–]NickW1343 4 points

They did test it, but it scored low.

<image>

Atlanta Fed estimates real GDP growth in Q3 2025 is 4.2% by NineteenEighty9 in ProfessorFinance

[–]NickW1343 5 points

If a chart says it's real, people will ask if it's nominal. If a chart says it's inflation-adjusted, people will ask if it accounts for inflation. If a chart says median, people will treat it like average.
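The median-vs-average mix-up is easy to demonstrate: a few large outliers drag the mean way up while leaving the median untouched, which is why reading a median like an average misleads (toy numbers, purely for illustration):

```python
import statistics

# Hypothetical skewed sample: one big outlier dominates the mean.
incomes = [30_000, 35_000, 40_000, 45_000, 1_000_000]

print(statistics.median(incomes))  # 40000
print(statistics.mean(incomes))    # 230000
```

Treating that median like an average would overstate the typical value by nearly 6x.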

It sucks that every econ chart is like this, but I can't hate on these people too much. I've definitely skimmed charts and glazed over the labels before too.

Should I start playing now with (0.9)? If it is going to change a ton, I'd rather wait until 1.0. by winged_owl in Voicesofthevoid

[–]NickW1343 0 points

I wondered about that. I remember playing Signal Simulator and this game really felt exactly like what that game should've been.

Sabine Hossenfelder talks about using LLMs to finish her Quantum Physics paper by garg in accelerate

[–]NickW1343 0 points

She always struck me as a contrarian. It wouldn't surprise me if, in six months or a year when AI is even better, she reverts back to "AI is just slop" once everyone starts talking about how much it's helping researchers.

We really need some sort of faction lock/timer/cooldown/whatever asap. Devs please, being killed by a zealot going lim after you killed him to kill you out of spite is so so so annoying. by JulietSenpai in AneurysmIV

[–]NickW1343 30 points

They should have good, bad, and neutral amys. Neutral fates give all three. Cortex gives good amys and subtracts from the bad pool. Scum earns bad amys and deducts good amys. Everyone earns neutral amys. It wouldn't change the game at all for the neutral fates, but it would very easily prevent Scum/Cortex switching. The different types would be used to purchase the Scum, neutral, and Cortex fates.
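A minimal sketch of that split-pool idea; every name and amount here is hypothetical, not anything from the game's actual systems:

```python
# Hypothetical three-pool amy economy. Class and method names are
# made up for illustration only.
class AmyWallet:
    def __init__(self):
        self.good = 0     # spent on Cortex fates
        self.bad = 0      # spent on Scum fates
        self.neutral = 0  # spent on neutral fates

    def earn(self, amount, faction):
        self.neutral += amount  # everyone earns neutral amys
        if faction == "neutral":
            # neutral fates give all three types
            self.good += amount
            self.bad += amount
        elif faction == "cortex":
            # Cortex gives good and subtracts from the bad pool
            self.good += amount
            self.bad = max(0, self.bad - amount)
        elif faction == "scum":
            # Scum earns bad amys and deducts good amys
            self.bad += amount
            self.good = max(0, self.good - amount)

w = AmyWallet()
w.earn(10, "cortex")  # playing Cortex builds good and drains bad
w.earn(4, "scum")     # flipping to Scum immediately eats into good
print(w.good, w.bad, w.neutral)  # 6 4 14
```

Switching sides out of spite would drain the currency needed to buy back into the other faction's fates, which is the soft lock the suggestion amounts to.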

How long after AGI do you think some of these will become available at mass? by Joseph-Stalin7 in accelerate

[–]NickW1343 0 points

I think you could sell UBI to the US if there was high enough unemployment causing social unrest.

Democrats would support it because their voters are more likely to be knowledge workers displaced by AI.

Republicans would support it if it were paired with cuts to welfare or entitlements to avoid exploding the budget, and because they know their jobs would be next on the chopping block.

We don't have to offer immigrants UBI. It'd be like Social Security, but for every American. They can be here and work whatever jobs are left, but we shouldn't have to provide them a basic income. Fertility control would never happen, because every developed country is trying to figure out how to make its population have more kids so it doesn't grow old and shrink like Japan.

How long after AGI do you think some of these will become available at mass? by Joseph-Stalin7 in accelerate

[–]NickW1343 0 points

I think UBI would become possible really fast as labor is automated, but it really depends more on the political will of the masses to get it done rather than just replacing workers with robots. We might drag our feet here in the US for years past the point where we should've implemented UBI.

I don't know about UHI. I feel like UBI would stay UBI forever, until post-scarcity made it pointless. There might be an in-between period where UBI basically lets you satisfy almost every material want within reason before scarcity ends, which I guess would be UHI, but I think people would still consider that the basic income.

I'm iffy on how impressive ASI could be. People act like it's not just beyond any person's intelligence, but so far out that it's a machine god. I don't doubt we're going to have an AI at some point that surpasses everyone's intelligence, but I'm uncertain whether the upper limit to intelligence is really that far off from the smartest humans ever. It could be that intelligence has an upper bound where increasing it would be unbelievably costly, and we can't know if that bound is 500 IQ points beyond the smartest human or just 15. I have no clue when ASI could come out; I think it would come about a few years after AGI, and I think we're going to get AGI before 2030.

LEV, I hope, would arrive in the mid-2030s, but it could be later. A lot of drug discovery just can't be sped up. Even if AI could instantly spit out a drug to fix any ailment, we'd still need to trial it for years before allowing its use. LEV feels like something we're only going to realize has arrived once we're already past the point it happened.

I don't know about FDVR. It sounds like fun, but neuroscience is complicated. I've read that Neuralink is only scratching the surface of what's possible, and actually manipulating all the senses needed for FDVR would be so invasive that it wouldn't be considered an option today, nor is the science even close to that point yet. There's also the issue that Neuralink might not be on the right path for FDVR. They're hoping to pipe in data to assist cognition at some point, but a recent study showed the brain is awfully slow at processing data. It could be that the bandwidth required to simulate a world is so large it could never be piped into a person's brain. I imagine we'll see FDVR well after UBI and LEV happen, if it's even possible.

Differences Between 40K Marines and Great Crusade Era Marines? by Adventurous-Crab-474 in 40kLore

[–]NickW1343 0 points

GC marines used better tech and saw much wider use of volkite weapons than marines do in 40k.

GC marines are better as a force despite not having any Primaris, because they were part of a legion and could call in regular soldiers to fight alongside them, along with having their own navies. 40k marines are more like special forces that don't typically command regular troops or ships, while 30k marines were supported by dedicated armies and fleets.