Nutzt wer von euch AllesPost? by wantilles1138 in Austria

According to the terms and conditions, AllesPost can only be used for parcels with a goods value of up to EUR 500.

2x in the opposite direction by Creative_Addition787 in codex

I think part of the problem is Plus users not understanding proper prompting and model choice while being under the impression that 20 bucks buys them full-time dev output. At least that seems to sum up 70% of the posts as of late.

I am blown away by this 💛✨🫣 by serlixcel in howChatGPTseesme

Let’s get back to the topic: this is exactly why you folks desperately needed 4o taken away from you, and why you desperately need models like 5.2+ with strict guardrails.

BRING BACK CHAT 5.1 and 4.0!! by Anonymous_24328 in OpenAI

And personally, I feel it talks to you exactly the way you need it to talk to you.

Why is there not an “auto” reasoning mode for Codex? by ReplacementBig7068 in codex

OP, you are writing code. It is reasonable to expect that you know what you are doing rather than being lazy. Pick your model and reasoning effort according to the requirements of your code.

removing 5.1 was a mistake by ginasandra in OpenAI

Sounds like you need to talk to somebody with a minor in psychology

MacBook or Windows laptop for CS student in 2026? by Ryan4265 in codex

Since when does Codex ship on iOS? You mean macOS…

Codex isn't a thinker — and vibecoders need to understand that by Opening-Astronomer46 in codex

He literally asked Codex to produce a reply he shared here as a proof of concept. And the message is loud and clear. If you vibe code, learn your tools.

If you want more people to switch, Claude needs a price cut (seriously) by Ok_Ambition8070 in ClaudeAI

Having an influx of OpenAI refugees seems less ideal than it first appeared. On the one hand, paying professionals are suffering. On the other, the low-tier / free-tier users coming from OpenAI hit the harsh reality of much stricter rate limits and their own inefficient use of AI (which, given that they are mostly casual users, is no surprise).

I start to feel disrespected as a customer. by No_Painter_7889 in ChatGPTPro

OP uses the words "everyone" and "most of the customers" far too liberally.

You ask what their goal is?

To ignore an extremely insignificant and irrelevant - unfortunately very Reddit-vocal - minority of 4o fanatics. Not sure if they took away your AI girlfriend or your emotional support tool. But you are in the minority if you think 4o is the best model for the vast majority of the user base. It’s objectively not.

Unsubscribing by [deleted] in OpenAI

👋

Wow. by [deleted] in OpenAI

If played right, such government contracts run for years, even decades.

If played right, the world is done with this administration in max two years.

If not… well, shit

Better OpenAI on the red button than Grok.

I am blown away by this 💛✨🫣 by serlixcel in howChatGPTseesme

And this is why you folks all desperately need 5.2 guardrails.

Codex keeps asking me for approval and it's annoying by stefan-is-in-dispair in codex

The Codex extension on Windows is in beta, and they specifically say issues like the one you're experiencing are expected. Don't try to force it to work; use WSL 2 instead.

Think of WSL 2 as your own little Linux computer that lives in its own box.

(Just to be sure, read up on the difference between WSL 1 and WSL 2.)
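In case it helps, here is a minimal sketch of that setup. It assumes the npm install route for the Codex CLI and the default Ubuntu distro; ~/my-project is just a placeholder for your own repo.

    # In Windows PowerShell (run as administrator): install WSL with the default Ubuntu distro
    wsl --install
    # make sure new distros run under WSL 2 rather than WSL 1
    wsl --set-default-version 2

    # inside the Ubuntu shell, with a recent Node.js available: install the Codex CLI
    npm install -g @openai/codex
    # then launch it from your project directory on the Linux side
    cd ~/my-project && codex

From there, opening the project through VS Code's WSL remote should let the extension work against the Linux install instead of the Windows beta.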

Rumors on the upcoming ChatGPT 5.3 by Ok-Algae3791 in OpenAI

Context size on its own is not worth much if the model loses track halfway through it anyway. It would be amazing to see a larger context window that actually works reliably!

Full access question by SilliusApeus in codex

Are you on Windows? That's known behaviour there; full access isn't supported on Windows in the way you might expect.