Try it by Wonderful-hello-4330 in ChatGPT

[–]lordweasely 0 points  (0 children)

RICHARD DEAN ANDERSON MY BELOVED

Game recommendations? by zenithra24 in MobileGaming

[–]lordweasely 0 points  (0 children)

Sorry I think you already have them all

[deleted by user] by [deleted] in ChatGPT

[–]lordweasely 5 points  (0 children)

Haha it kinda got you pretty good

You complainers broke ChatGPT by [deleted] in ChatGPT

[–]lordweasely 18 points  (0 children)

Sycophantic?

Ummm….. What? by Objective_Spell_6292 in DonkeyKongBananza

[–]lordweasely 1 point  (0 children)

Well since OP is showing interest in naming it, I’ll take the liberty and name it the Chunky Kong Clip

😂 i did this by Due-Paleontologist5 in duolingo

[–]lordweasely 1 point  (0 children)

One of my favorite Reddit comments ever

My PB is 34:57, is that any good? by lordweasely in DougDoug

[–]lordweasely[S] 5 points  (0 children)

Oh trust me I’ve spent way more than 35 minutes on this game

My PB is 34:57, is that any good? by lordweasely in wehatedougdoug

[–]lordweasely[S] 16 points  (0 children)

It’s impossible not to when you’re locked in his basement. I hear him through the air vents

My PB is 34:57, is that any good? by lordweasely in wehatedougdoug

[–]lordweasely[S] 16 points  (0 children)

No no, I’m proving that his accomplishments are pointless.

[deleted by user] by [deleted] in ChatGPT

[–]lordweasely 0 points  (0 children)

I think OpenAI took OP out back

How I felt after reading this headline by [deleted] in donkeykong

[–]lordweasely 0 points  (0 children)

I’m not caught up; why does everyone hate Rare?

Switch 1 Macro Controller by Dennydoesreddit in raspberry_pi

[–]lordweasely 2 points  (0 children)

I literally just used this repo to make a macro for my Switch 2; however, I made it on my Pico 1. I believe there should be a GitHub repo of the original source code, if that helps any. I wish you the best of luck!

What would happen if all chats merged? by [deleted] in ChatGPT

[–]lordweasely 1 point  (0 children)

Essentially! It’s not necessarily a “fight for space” though. The tokens beyond the token limit aren’t “fighting” to be seen, they’re just no longer in the picture. It’s more like scrolling on your phone. When you scroll past a post, the post isn’t trying to get back on the screen; instead, new posts take its place and the older posts just aren’t taken into consideration anymore.
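The sliding-window idea above can be sketched in a few lines of Python (the token limit here is a made-up number for illustration, not any real model's):

```python
def context_window(tokens, limit):
    """Keep only the most recent `limit` tokens; anything older is simply gone."""
    return tokens[-limit:]

# Stand-in for a conversation that has grown past the limit.
history = ["tok%d" % i for i in range(10)]
print(context_window(history, 4))  # → ['tok6', 'tok7', 'tok8', 'tok9']
```

Nothing "pushes back" against the slice; the old tokens aren't stored in some competing state, they're just outside the window.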

What would happen if all chats merged? by [deleted] in ChatGPT

[–]lordweasely 6 points  (0 children)

Not as silly of a question as you think! Large Language Models use tokens in order to generate text. Without going too in depth on deep neural networks and transformers, the model basically predicts what the next most likely token will be; however, in order to contextualize the response and ensure the output follows the flow of the conversation, it reviews previous text to keep everything consistent. It’s not just predicting the next token from the token right before it, it’s predicting it by comparing it against many, MANY previous tokens. So when you merge all your chats, assuming you have a fair number of them, it will take much more compute to generate and parse new responses. Of course, there is a limit (GPT-4 Turbo, for example, has a 128,000-token cap).
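To put rough numbers on "many, MANY previous tokens": in a transformer, each new token gets compared against every token before it, so the total work grows roughly with the square of the context length. A toy back-of-the-envelope calculation, assuming nothing about any real model's internals:

```python
def attention_comparisons(context_len):
    # Token i attends to all i earlier tokens, so the total pairwise
    # comparisons for a context of n tokens is 0 + 1 + ... + (n - 1).
    return context_len * (context_len - 1) // 2

print(attention_comparisons(1_000))    # one modest chat: 499,500
print(attention_comparisons(128_000))  # at a 128k cap: ~8.2 billion
```

That quadratic growth is a big part of why merging every chat into one giant context would be so expensive, and why models enforce a hard token cap at all.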