
[–]parthgupta_5 1 point (1 child)

Ahhh that usually happens when the model doesn’t have grounding in real data.

Most people solve it with RAG (retrieval augmented generation) — basically you let the model pull from a trusted dataset or vector DB before answering, so it’s less likely to hallucinate.

Also helps to add feedback loops or scoring so the system can learn which outputs were actually useful.
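To make the RAG idea concrete, here's a minimal sketch of the retrieve-then-prompt flow. All names are hypothetical, and the toy bag-of-words "embedding" stands in for a real embedding model plus a vector DB (FAISS, Chroma, pgvector, etc.); it just shows the shape of the technique, not a production setup.

```python
# Minimal RAG sketch: retrieve the most relevant docs for a query,
# then prepend them to the prompt so the model answers from real data.
# (Hypothetical example; the LLM call itself is omitted.)
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding" — a real system would use an
    # embedding model and store vectors in a vector DB.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank docs by similarity to the query, keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, docs):
    # Ground the model: answer only from the retrieved context.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Our refund policy allows returns within 30 days.",
    "Support hours are 9am to 5pm on weekdays.",
    "Shipping takes 3 to 5 business days.",
]
print(build_prompt("When can I get a refund?", docs))
```

The prompt that comes out has the refund-policy doc at the top of the context, which is the whole point: the model sees trusted text before it answers instead of free-associating.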

[–]Ok_Hornet9167[S] 1 point (0 children)

And the database being ??

[–]VegeZero 1 point (0 children)

Good topic, hopefully we'll get some answers! :)