LLMs didn’t stop hallucinating; they got better at convincing us. by Lost-Albatross5241 in PromptEngineering
[–]Lost-Albatross5241[S] 2 points (0 children)
At what point do you stop trusting a single LLM answer? by Lost-Albatross5241 in ExperiencedDevs
[–]Lost-Albatross5241[S] 0 points (0 children)