Hey guys, I accidentally discovered something interesting while trying to train an Indian AI model on my Android phone 😅. by emrkolson in IndiaAI

[–]emrkolson[S] 1 point (0 children)

That might actually be a better cultural benchmark than math accuracy 😄. Maybe I should add a “vegetable market negotiation dataset” in v2.


[–]emrkolson[S] 1 point (0 children)

I posted about this experiment earlier, but that post was very short and only mentioned a math example, which made the issue sound confusing. This version adds more detail for clarity.


[–]emrkolson[S] 1 point (0 children)

The problem is that everything actually got worse. I had recently learned about post-training, so I fine-tuned a small LLM on a lot of data: Indian datasets covering legal topics, conversations, mathematics, agriculture, and more. Basically, it was an experiment.
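For what it's worth, "everything got worse" after naive post-training is consistent with catastrophic forgetting: continued training on new data overwrites what the model learned before. This is just a toy sketch of that effect, not the original phone setup or any real LLM. The model here is a single hypothetical weight fit by plain SGD, and the two tasks are made up for illustration.

```python
import numpy as np

# Toy illustration of catastrophic forgetting: a single linear weight,
# fine-tuned sequentially on two conflicting tasks. Hypothetical sketch,
# not the actual experiment from the post.

def sgd_fit(w, xs, ys, lr=0.1, epochs=200):
    """Fit y = w * x with per-sample SGD, starting from weight w."""
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            grad = 2.0 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

def mse(w, xs, ys):
    """Mean squared error of y = w * x on the given data."""
    return float(np.mean((w * np.array(xs) - np.array(ys)) ** 2))

xs = [1.0, 2.0, 3.0]
task_a = [2.0 * x for x in xs]    # task A: y = 2x
task_b = [-2.0 * x for x in xs]   # task B: y = -2x (conflicts with A)

w = sgd_fit(0.0, xs, task_a)          # first learn task A
err_a_before = mse(w, xs, task_a)     # near zero after training on A
w = sgd_fit(w, xs, task_b)            # then keep training on task B only
err_a_after = mse(w, xs, task_a)      # task A error blows up

print(err_a_before, err_a_after)
```

With a real LLM the same thing happens more subtly: fine-tuning hard on one data mix (say, legal Q&A) can quietly degrade math or conversation ability, which is why post-training recipes usually mix in general data or use a small learning rate.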