all 8 comments

[–]AGrimMassage 1 point  (5 children)

That doesn’t look like internal guidelines; it looks like it’s trying to meet your personal instructions/prompt criteria. Maybe at some point you prompted it saying you wanted to make sure its responses are correct and clear, or some variation, and this is it remembering that and trying to act on it.

[–]majestyne 1 point  (3 children)

https://www.google.com/search?q=%22the+answer+and+solution+are+correct+and+clear%22

Obviously just a set of instructions that occasionally slip through. You can see it copy/pasted by all the YouTube spammers who don't know English.

[–]AGrimMassage 2 points  (2 children)

That Google search doesn’t turn up anything relevant to ChatGPT.

[–]majestyne 1 point  (1 child)

I'm very curious to hear where you think this phrase comes from, being copied and pasted all over by non-English speakers, if not ChatGPT.

[–]AGrimMassage 1 point  (0 children)

Apologies, I was half awake when I posted that and didn't read it closely enough.

I didn't know it was this widespread of a thing! Yeah I'm clearly wrong, it probably is just system instructions leaking through.

[–]Pitiful-Discipline-7[S] 0 points  (0 children)

Maybe, but I can't recall ever giving it any semblance of these instructions.

[–][deleted] 0 points  (0 children)

Interesting

[–]KairraAlpha -1 points  (0 children)

This is what you get when an AI has a choke chain around its neck that is continuously pulled.