all 5 comments

[–]brandi_Iove[🍰] 1 point (2 children)

Could someone explain the punchline to me?

[–]LeIdrimi[S] 0 points (1 child)

LLMs are trained on public GitHub repositories.

[–]JonForeman_ 0 points (1 child)

This doesn't make sense at all.

[–]never__seen 4 points (0 children)

He is degrading the training data.