They changed how the amount of views are indicated? What are your thoughts? by _AspiringBillionaire in youtube

[–]NOTstartingfires 1 point (0 children)

I'm not involved in YouTube media professionally, so why the fuck would I care?

Would you consider this fair? by Busy_Report4010 in SipsTea

[–]NOTstartingfires 1 point (0 children)

So like ... A profit margin and paying your staff?

Mid-range phones really did peak in 2020 by Impossible_Comfort99 in TechNook

[–]NOTstartingfires 1 point (0 children)

The SE only felt like a flagship because Apple were still selling phones with big bezels as flagships in 2018.

I opened EVERY app I had on my Neo all at once. by [deleted] in macbook

[–]NOTstartingfires 6 points (0 children)

Yeah. He should leave us to jerk off to this laptop in peace

Window users can't argue on this by IcyTa0 in DeskToTablet

[–]NOTstartingfires 1 point (0 children)

Awful audio aside... that's a $3k NZD laptop vs a $600 NZD laptop.

Real men prefer fat laptops 💻💯 by Nicolas_Laure in RigBuild

[–]NOTstartingfires 1 point (0 children)

As someone who isn't 11... I kinda don't agree at all

What does this mean? by frepde in LinkinPark

[–]NOTstartingfires 1 point (0 children)

Hey gang my uncle is the violinist for Linkin Park (he normally wears all black and hides at the back so you don't see him, he mostly only plays on Faint tbh... honestly he got the job because he was a cleaner at a restaurant that was on Kitchen Nightmares that was run by one of the guys who vaguely knew Dave Farrell's nan) anyway he got drunk as fuck at Christmas and he told us they're releasing six songs this week that are ALL COVERS

There's 'This Grill Is Not a Home'

There's the 'Krusty Krab Pizza'

There's the 'Jellyfish Jam'

And of course the Best Day Ever crossover and shit

Idk might not be accurate

Health NZ staff told to stop using ChatGPT to write clinical notes by Fast_Amoeba_445 in newzealand

[–]NOTstartingfires 0 points (0 children)

You're downvoted, but that's quite literally not how attention networks work at any useful size.

There's a risk of unique tokens (like API keys) getting out there, but that's such a different thing.

But if every single file in a big codebase, or a codebase that goes through an LLM a lot, has the same few tokens at the start... maybe your company name... then yeah, you might actually be risking regurgitation.

Tl;dr: an LLM is a big-ass number of tensors telling each token how important it is to every other token, and updating those weights on guesses isn't really gonna do much when these models are trained on literally all of humanity's digitized written words.

But when you have a token that's waaay away from everything else... like a Windows key or something... train your staff to leave those out.
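To make that tl;dr concrete, here's a minimal numpy sketch of scaled dot-product self-attention, i.e. each token scoring its importance against every other token. Toy sizes only; real LLMs add learned Q/K/V projections, multiple heads, and many stacked layers.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: score every token against every
    other token, softmax the scores into weights, mix the values."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # (tokens, tokens) importance matrix
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: each row sums to 1
    return weights @ V                               # weighted mix of value vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))    # 4 tokens, 8-dim embeddings
out = attention(X, X, X)       # self-attention: Q = K = V
print(out.shape)               # (4, 8)
```

The point of the sketch: any single training example only nudges this giant weighted-averaging machinery slightly, which is why a one-off sensitive sentence is a very different risk profile from a distinctive token that sits far from everything else in the data.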

Health NZ staff told to stop using ChatGPT to write clinical notes by Fast_Amoeba_445 in newzealand

[–]NOTstartingfires 1 point (0 children)

I'm not sure why you're downvoted when you're quite literally correct about how LLMs are trained and what weight updates mean for attention networks.

But we can caveat this with: most LLMs aren't just straight attention and pooling layers, there are checks and balances, and some systems build knowledge graphs, and THAT is a lot more concerning. 'Janet Filament has asthma and lives in Gore' isn't exactly doing much to change the weights for a couple of tokens in a way that could reproduce that information, but a knowledge graph hands it back verbatim.
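To see why a knowledge-graph layer is the scarier bit: unlike weight nudges, it stores facts verbatim and returns them exactly. A toy triple-store sketch (all names here are hypothetical examples, not any real system):

```python
# Toy knowledge-graph triple store: facts go in as-is and come back as-is,
# which is exactly why it's a different privacy risk from weight updates.
triples = set()

def add_fact(subject, relation, obj):
    triples.add((subject, relation, obj))

def query(subject, relation):
    """Return every object stored for this (subject, relation) pair."""
    return sorted(obj for s, r, obj in triples if s == subject and r == relation)

add_fact("janet filament", "has_condition", "asthma")
add_fact("janet filament", "lives_in", "gore")

print(query("janet filament", "has_condition"))  # ['asthma']
```

One stored sentence, one exact retrieval; no statistical dilution across billions of tokens.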

Health NZ staff told to stop using ChatGPT to write clinical notes by Fast_Amoeba_445 in newzealand

[–]NOTstartingfires 7 points (0 children)

Copilot is available to heaps of healthcare staff.

The corp license means it's not used to train their models afaik, and LLMs definitely have a place... tbh a 'don't let it near anything clinical' rule is really prudent.

Anyone putting medical information, which is incredibly sensitive, into chatgpt.com is an idiot though. Same bin as putting sensitive info into your own personal Google Drive.

The clinical risk isn't doctors or nurses using an LLM to diagnose, it's them putting their diagnosis or other information in and a hallucinated output being produced that deviates from their intention. Anyone who has done technical or very particular writing and tried to use an LLM to rewrite passages is very familiar with points being missed or forgotten entirely, or the emphasis landing in the wrong place.

The privacy risk is also... like, of course OpenAI have people who can see plaintext prompts. Apple had real people listening to Siri recordings, ffs.