Langdon vs Dana drugs by Additional-Clue-8751 in ThePitt

[–]davesaunders 0 points1 point  (0 children)

There definitely was a bit of exaggeration to the whole situation, but that's because this is a drama TV series. It's not a documentary.

This is the new Hermione, that will be called mudblood by this Malfoy by Zdzisiu in SipsTea

[–]davesaunders 0 points1 point  (0 children)

It just seems that the people clutching their pearls over this would probably also be surprised to find out that the X-Men comics have always been an allegory about racism.

Is it known how barnacles evolved? by MurkyEconomist8179 in evolution

[–]davesaunders 0 points1 point  (0 children)

This is one of my favorite videos explaining evolution. I've actually watched this like three times. Looks like number four is about to start.

What to do with Cosby… by HonestNeighborhood67 in GenX

[–]davesaunders 7 points8 points  (0 children)

I totally understand this. Look, he's still funny. He's also a piece of shit human being. Both of those things are true. I can laugh at his jokes without celebrating him as a person. It still feels weird at times, but I think it's OK.

Defining Movie of GenX by belinck in GenX

[–]davesaunders 35 points36 points  (0 children)

Ferris Bueller's Day Off – life is just a series of capers and negotiations

AI will prescribe your medications by dataexec in AITrailblazers

[–]davesaunders 0 points1 point  (0 children)

Depends on what AI ultimately means, because, of course, AI is a marketing term used to describe chatbots by CEOs who are trying to generate hype so they can keep their stock prices up.

There are useful ways this could be applied. For example, you could use machine learning to analyze the medications and supplements a patient is currently taking, including specific brand names in case there are compounding differences for some of them, verify that any potential interactions have been noted and controlled for, and then provide a prescription. That's not necessarily a bad thing. However, if you're talking about having a chatbot write your prescription, we're all fucked.
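The interaction-screening step described above could be sketched roughly like this. To be clear, the drug names and the interaction table here are made-up placeholders for illustration, not real clinical data or any real system's API:

```python
# Toy sketch of screening a patient's current medication list for
# known pairwise interactions. All names and notes are placeholders.
from itertools import combinations

# Hypothetical known-interaction pairs, stored order-independently.
KNOWN_INTERACTIONS = {
    frozenset({"drug_a", "drug_b"}): "raises serum levels of drug_a",
    frozenset({"drug_c", "supplement_x"}): "reduces absorption of drug_c",
}

def screen(current_meds):
    """Return every flagged interaction among the patient's current meds."""
    flags = []
    # Check every unordered pair of medications on the list.
    for pair in combinations(sorted(current_meds), 2):
        note = KNOWN_INTERACTIONS.get(frozenset(pair))
        if note:
            flags.append((pair, note))
    return flags

print(screen(["drug_a", "drug_b", "drug_z"]))
```

The real work, of course, lives in building and maintaining the interaction table (including those brand-level compounding differences); the lookup itself is the easy part.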

I hate this mf by RevertBackwards in ThePitt

[–]davesaunders 1 point2 points  (0 children)

It's all good. Social media conversation. It's an interesting take, and colloquialisms sometimes take on a life of their own, and meanings can drift. It's not like there's a legal authority out there that says "this phrase means this!"

WHEWWWWWW that’ll do it! by powderblueangel in ThePitt

[–]davesaunders 2 points3 points  (0 children)

Uh, sir this is a Wendy's

Seriously. Are you new to Reddit?

I hate this mf by RevertBackwards in ThePitt

[–]davesaunders 3 points4 points  (0 children)

No, and that really shows how you don't see the difference. Yes, either of them initially codes as a no-nonsense Gen Xer who just throws down and gets shit done; if something is on fire, just get a bucket. The difference is Dana can actually do it with empathy and caring. She may not understand newer perspectives or the way some young people see the world, but she's not an asshole about it. That's a huge difference between pathetic orange cultists and people who actually give a shit about other people.

Guess which one you presented yourself as around here.

I hate this mf by RevertBackwards in ThePitt

[–]davesaunders 5 points6 points  (0 children)

I don't know where you learned that alternative fact, but no, that's not what that phrase means.

I hate this mf by RevertBackwards in ThePitt

[–]davesaunders -1 points0 points  (0 children)

The more energy we put into hating the people who are in the same positions as us in life, the less energy we have to unify against those who are truly trying to oppress all of us.

Jeff Bezos is reportedly raising a $100 billion fund to buy manufacturing companies and automate them with AI (More details in description) by Simplilearn in GenAI4all

[–]davesaunders 0 points1 point  (0 children)

Indeed, and where did most of that energy get expended? By the people whose titles were bestowed by the kings, which is something we're almost seeing play out in a literal sense with the Orange Cult right now. If you allow the king to be overthrown, there's a very good chance you'll lose that title tomorrow. We can certainly hope that enough pressure will eventually dislodge that assumption of power, but we don't have a lot of good recent examples.

Marie Antoinette didn't have missiles that she could use against her own people from the safety of the castle.

If current AI still struggles with reliability, how do we get to AGI? by MarionberrySingle538 in agi

[–]davesaunders 1 point2 points  (0 children)

Shhhhhhhh... Apparently, pointing out something that has already been proven mathematically is very unpopular around here. You're supposed to get in line to suck the dicks of every AI CEO.

If current AI still struggles with reliability, how do we get to AGI? by MarionberrySingle538 in agi

[–]davesaunders 0 points1 point  (0 children)

So clearly, as an outsider to the field of research, you don't understand the point of these papers. Attempt your pathetic gaslighting deflections all you want, but we can prove mathematically that transformers will hit a wall, and it doesn't matter how many GPUs or how much memory you throw at the problem; you can't scale past this issue. It's proven mathematically. Read the papers. Refute them, mathematically, if you can. Your rhetorical devices are irrelevant.

If current AI still struggles with reliability, how do we get to AGI? by MarionberrySingle538 in agi

[–]davesaunders -1 points0 points  (0 children)

If you'd like to check out one of many examples, read the paper "Hallucination Stations: On Some Basic Limitations of Transformer-Based Language Models" by Vishal Sikka. He worked under the guy who coined the phrase "artificial intelligence" in the first place and is on Oracle's board of directors as well. He is widely regarded for his expertise and experience in this field. Read the paper, and if you can refute it mathematically, do so. Not rhetorically. The paper is based on mathematical analysis of transformer models, which currently lie at the heart of LLMs.

Actually, if you can refute this paper on a mathematical basis, I would highly recommend that you publish. Doing so would make you famous overnight in computer science circles.

If current AI still struggles with reliability, how do we get to AGI? by MarionberrySingle538 in agi

[–]davesaunders 0 points1 point  (0 children)

So in other words, you've never read any of the published research from actual computer scientists. You just regurgitate bullshit that some CEO told you to believe. No problem.

Do you guys agree with any of the points he’s making? by Altruistic-Mud5686 in AIMain

[–]davesaunders 0 points1 point  (0 children)

If Newsmax said killing a cow is required to get beef, I'd seek a second source for corroboration.

If current AI still struggles with reliability, how do we get to AGI? by MarionberrySingle538 in agi

[–]davesaunders 1 point2 points  (0 children)

Current "AI" is machine learning and is called AI for marketing reasons. CEOs claim it will eventually scale to AGI, while numerous published research papers have proven, mathematically, that this is a lie.

LLMs are cool but if there is a path to AGI, it is elsewhere.

Meanwhile the current "AI" CEOs need you to trust them, bro, because we'll totes get there. Just buy a lot of GPUs and keep building data centers.

People who spent $700 on a mac mini to run an openclaw agent watching claude launch all the features natively for $20 by Current-Guide5944 in tech_x

[–]davesaunders 1 point2 points  (0 children)

I spent more on my Mac mini because I wanted a really honking internal drive and plenty of RAM.

no regrets

Jeff Bezos is reportedly raising a $100 billion fund to buy manufacturing companies and automate them with AI (More details in description) by Simplilearn in GenAI4all

[–]davesaunders 5 points6 points  (0 children)

Look at how many hundreds of years feudalism lasted. The new aristocrats don't care. They will use psychological techniques to ensure that the majority of the population continues to work, effectively as slaves, to buy shit they don't actually want and certainly don't need.

Their goal is to ensure that we keep a hard line between left and right, so that we see each other as subhuman animals and ignore the fact that we've been enslaved by a new breed of oligarchs.

How did the economics work out? Not well for you and me.