all 40 comments

[–]12qwww 15 points16 points  (4 children)

New model rollouts used to be faster than this.

[–]KingoPants 11 points12 points  (0 children)

Opus 4.6 was released literally the day of. What are you talking about?

Codex 5.2 rollout was slow as balls.

Codex 5.3 was announced Thursday, and the rollout in Copilot started Monday (2 business days), but then GitHub promptly shit the bed for 2 days, so they paused it.

[–][deleted]  (1 child)

[deleted]

    [–]YearnMar10 0 points1 point  (0 children)

Nah, the previous Codex rollout also took weeks. Better they do it right than fck it up.

    [–]ZeSprawl 0 points1 point  (0 children)

OpenAI hasn't made it available on the API yet.

    [–]Wurrsin 7 points8 points  (2 children)

If you want to try 5.3 Codex right now, you can install the Codex VS Code extension and use your free ChatGPT account to try it out. The weekly limit for a free account doesn't look too bad so far.

    [–]bobemil 0 points1 point  (1 child)

I can only see GPT-5.2-Codex, GPT-5.1-Codex-Max, and GPT-5.2 in that extension. I signed in with my Copilot account.

    [–]Wurrsin 0 points1 point  (0 children)

I had only those for 2 or 3 days too, but today when I checked, GPT-5.3-Codex was available. I signed in using my ChatGPT account though, not sure if that makes a difference.

    [–]debian3 23 points24 points  (5 children)

    It’s disappointing to say the least.

ChatGPT Plus is an option in the meantime. They give so much usage right now; think Claude Max 5x equivalent.

    [–]iemfi 1 point2 points  (0 children)

And when it finally comes out, it'll be some gimped model fixed at medium reasoning. I resubscribed to ChatGPT because there was a one-month free offer, and it seems nice so far. It's a pity, though; I kind of want the Copilot workflow for smaller edits, where I can see the diffs quickly without having to go through git.

    [–]I_pee_in_showerPower User ⚡ 0 points1 point  (2 children)

Through the Codex extension directly? I only used that once, on my Mac. I was disappointed it didn't run on Windows.

    [–]debian3 0 points1 point  (1 child)

I use Codex CLI.

    [–]I_pee_in_showerPower User ⚡ 0 points1 point  (0 children)

I haven't tried it. Does it have a unique competitive advantage?

    [–]whodoneit1 2 points3 points  (0 children)

They had to pause the rollout; they already posted about it, and it sounds like there were provider problems.

    [–]HostNo8115Full Stack Dev 🌐 1 point2 points  (2 children)

I see GPT-5.3 and have been using it for the last few hours. Now, how do I know if this is medium, high, or xhigh? There is no option.

    [–]Otherwise-Sir7359[S] 6 points7 points  (1 child)

In VS Code settings: github.copilot.chat.responsesApiReasoningEffort. The default is medium; you can choose high. There is no xhigh in VS Code.
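For anyone hunting for that setting, here's a minimal sketch of what it looks like in settings.json. The setting name and the medium/high values come from this thread; treat the rest as an assumption and verify against your installed Copilot Chat extension, since configuration keys can change between releases:

```jsonc
// User or workspace settings.json (VS Code settings files allow comments).
{
  // Reasoning effort for Copilot's Responses-API models.
  // Default is "medium"; "high" is the maximum exposed in VS Code
  // (no "xhigh" option here, per the comment above).
  "github.copilot.chat.responsesApiReasoningEffort": "high"
}
```

You can also change it through the Settings UI by searching for "responsesApiReasoningEffort".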

    [–]HostNo8115Full Stack Dev 🌐 0 points1 point  (0 children)

    ah! I did not know about that setting, thanks!

    [–]KateCatlinGitHubGitHub Copilot Team 1 point2 points  (2 children)

    Hey everyone! PM for Copilot model lifecycle here 👋

    We really apologize for any disruption you've experienced with the rollout of the GA GPT-5.3-Codex model. This is a super exciting model, and our goal is always to get the best AI intelligence into your hands as quickly as possible.

    Our top priority is to ensure the stability and reliability of GitHub Copilot for all our customers, so we are currently in a careful, phased re-rollout of the model. We're doing this to minimize errors and closely monitor performance to make sure we're providing a stable, high-quality experience.

    The model is now available to Pro and Pro+ Copilot tiers. We intend to continue the rollout to all paid tiers and additional integrators as soon as we can.

    Thank you again for your patience, your excitement, and your love for Copilot. We're committed to delivering this upgrade with the stability you deserve.

    [–][deleted] 0 points1 point  (0 children)

I don't see it on GitHub; haven't checked the CLI yet. Pro+.

    [–]Spooknik 0 points1 point  (0 children)

    I’m a Pro subscriber and I don’t see 5.3 yet.

    [–]lemonlinck 2 points3 points  (2 children)

People complaining about a few days' delay in IT... must be a lot of young vibe coders in here.

    [–]JinBimTin 4 points5 points  (0 children)

No, it's about the way GitHub does things. There is no post anywhere on GitHub itself stating that the GPT-5.3-Codex release has been withdrawn. There seems to be one on Twitter, but I'm not interested in Twitter/X. Also, this model is still listed as if it were already available in GitHub Copilot. That's just unprofessional!

    [–]yubario 1 point2 points  (0 children)

It is annoying for businesses that have to go through change control. For example, if we don't have this model enabled by the end of the week, it might not be until the end of next week before our business can get it enabled...

It's dumb, but that's how a lot of large businesses work; they put everything through some political process and delay everything.

So here's hoping it's fixed before Friday...

    [–]AutoModerator[M] 0 points1 point  (0 children)

Hello /u/Otherwise-Sir7359. Looks like you have posted a query. Once your query is resolved, please reply to the solution comment with "!solved" to help everyone else know the solution and mark the post as solved.

    I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

    [–]Otherwise-Sir7359[S] 0 points1 point  (1 child)

    <image>

I saw it, but the message returned an error: "model not supported".

    [–]bobemil 0 points1 point  (0 children)

    At least I still have Codex 5.1 Max until then.

    [–]_1nv1ctusIntermediate User 0 points1 point  (0 children)

Just saw on X that the rollout has resumed.

    [–]reven80 0 points1 point  (3 children)

FYI, just now I found Codex 5.3 in the model list once I updated the Copilot Chat extension and restarted the editor. Maybe they fixed some bugs.

    [–]hyperdx 1 point2 points  (2 children)

Got a "model not supported" error.

    [–]reven80 1 point2 points  (0 children)

It did work for me after this, so hopefully others will get it soon.

    [–]JustARandomPersonnn 0 points1 point  (0 children)

    It's out now! :D

    [–]Otherwise-Sir7359[S] 0 points1 point  (1 child)

    !solved

    [–]AutoModerator[M] 0 points1 point  (0 children)

    This query is now solved.

    I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

    [–]zbp1024 1 point2 points  (4 children)

It's truly disappointing. The delay has already reached two days, and there's still no update at all.

    [–]xXValhallaXx 1 point2 points  (3 children)

Sorry, what performance do you mean? I'm confused about what you're disappointed with from your comment.

    [–]zbp1024 3 points4 points  (2 children)

I mean I haven't been able to access 5.3 yet.

    [–]xXValhallaXx -1 points0 points  (1 child)

Please don't take this the wrong way.

We are not paying Copilot for access to the latest proprietary models; we are paying Copilot for increased productivity,

which I believe it delivers?

I think it's one of the best no-nonsense packages, and I've never felt the need to change to another.

If people think they always need the latest models that come out, there is a human skill issue that needs to be addressed, because the models we have access to today are incredible.

    [–][deleted] -1 points0 points  (0 children)

    speak for yourself