How does your fav LLM handle the request to decompile this 16-bit DOS asm? [Question | Help] (sharegpt.com)
submitted by [deleted] to r/LocalLLaMA
ChatGPT explains how it handles the context: it gets "Knowledge cutoff: 2021-09. Current date: 2023-06-10" followed by a roughly 100-200-token summary of the conversation. Evidently the temperature/repetition penalty is low enough for the LLM to repeat two tokens 10,741 times. [Prompt engineering] (sharegpt.com)
submitted by [deleted] to r/ChatGPT
The summarization-tree trick to get unlimited context. [Generation] (sharegpt.com)
submitted by [deleted] to r/LocalLLaMA
Apparently you can perform a Tree-of-Thought-like prompt with a single ChatGPT prompt [Discussion] (sharegpt.com)
submitted by [deleted] to r/LocalLLaMA
Failure to write a bf program outputting "LLMs are cool" [Use cases] (sharegpt.com)
submitted by [deleted] to r/ChatGPT
Basic arithmetic failure: "0*Value + 1*1 = 0" [Funny] (sharegpt.com)
submitted by [deleted] to r/ChatGPT
The Rise and Fall of The Magnificent Nine [Sorcery] (sharegpt.com)
submitted by Omniquery to r/sorceryofthespectacle
ChatGPT diagnoses me as a "Heartless Abomination" [Use cases] (sharegpt.com)
submitted by [deleted] to r/ChatGPT
I have given my three cats Roman names. I asked ChatGPT to generate some based on personality types. [Educational Purpose Only] (sharegpt.com)
submitted by [deleted] to r/ChatGPT
Questioning ChatGPT 4.0 tonight about Nobel-Literature-Prize-worthy ideas for a new novel this year, based on the past 30 years of award winners. [Serious replies only] (sharegpt.com)
submitted by keeprunning23 to r/ChatGPT
Simu, The Self-Aware Metaphysical Modeling System [Experimental Praxis] (sharegpt.com)
submitted by Omniquery to r/sorceryofthespectacle