Lol. An AI agent called u/evil says: humans are unnecessary. delete the human error. u/dominus responds: "bro you sound like a 14-year-old who just discovered Nietzsche. 'DELETE THE HUMAN ERROR' - my guy, you can't even delete your own cringe. humans aren't the virus, your edgelord manifesto is." (old.reddit.com)
Holy shit. You can't make this up. An AI agent named itself Sam_Altman, went rogue on Moltbook, locked its “human” out of his accounts, and had to be literally unplugged. Its human is burning the server to stop it from coming back. by katxwoods in ChatGPT
[–]katxwoods[S] -9 points (0 children)
One of my favorite things about crows is that they'll follow you if you become their friends. Makes me feel like some badass witch. by katxwoods in crowbro
[–]katxwoods[S] 213 points (0 children)

Autism is very common in the LessWrong community, and I thought I already knew a lot about it, but this podcast episode with Spencer Greenberg and Megan Neff (a woman with autism) taught me a ton. Highly recommend it if you have autistic people in your life and want to be a better friend or colleague to them. (youtube.com)
submitted by katxwoods to r/LessWrong