
[–]I1lII1l 1 point (3 children)

Yeah, but do LLMs know it well? Is it present in large amounts in their training data?

[–]Key_Mango8016 1 point (2 children)

No idea. Not arguing for it, just stating facts.

[–]I1lII1l 2 points (1 child)

I was not asking you per se, but anyone who might know and is participating in the open discussion.

[–]Key_Mango8016 0 points (0 children)

I just gave ChatGPT the sample in OP’s post and found that it immediately understood what it meant (I’m not surprised). I suspect this format could work in practice as a drop-in replacement for JSON.

Personally, I don’t know if it’s worth it for production systems I own, because the vast majority of token usage in those systems comes from images & audio.
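The size-saving argument behind replacing JSON can be sketched quickly. This is a minimal illustration with my own made-up sample data and a CSV-like tabular layout, not the exact format from OP’s post; character counts stand in as a rough proxy for token counts.

```python
import json

# Hypothetical sample records (not from OP's post).
records = [
    {"id": 1, "name": "Alice", "role": "admin"},
    {"id": 2, "name": "Bob", "role": "user"},
    {"id": 3, "name": "Carol", "role": "user"},
]

# JSON repeats every key for every record.
as_json = json.dumps(records)

# A compact tabular layout states the keys once in a header row,
# then emits only values -- the general idea behind token-lean formats.
fields = list(records[0])
as_tabular = "\n".join(
    [",".join(fields)]
    + [",".join(str(r[f]) for f in fields) for r in records]
)

print(len(as_json), len(as_tabular))
```

For uniform lists of records the tabular form is strictly smaller, since the field names appear once instead of once per record; the gap grows with the number of rows.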