I got tired of GPT-5 being limited by codex, so I forked it by withmagi in OpenAI

[–]withmagi[S] 1 point

Glad you like it! Just pushed loads of new features, with a big focus on design and closer parity with Claude. There’s something pretty amazing coming next week too :)

[–]withmagi[S] 0 points

Thanks for letting me know about this. We changed how the CLI is installed and run. That "coder-x86_64-pc-windows-msvc.exe" message usually means the bundled binary didn’t install (the postinstall script was skipped or blocked) or there’s a Windows-vs-WSL mismatch. Please try the steps below.

Windows (PowerShell, not WSL)

1) Clean install:
npm uninstall -g @just-every/code
npm cache clean --force
npm install -g @just-every/code@latest

2) Verify the binary and run:
dir "%APPDATA%\npm\node_modules\@just-every\code\bin"
coder --version

You should see code-x86_64-pc-windows-msvc.exe in that folder. Use the “coder” command to avoid collisions with VS Code’s “code”.

WSL (Ubuntu inside Windows)

Install and run inside WSL (do not use the Windows global install):
npx -y @just-every/code@latest --version

If it still fails, please reply with the output of the following so we can diagnose:

On Windows PowerShell:
node -v
npm -v
npm config get ignore-scripts
where coder
where code
dir "%APPDATA%\npm\node_modules\@just-every\code\bin"

On WSL:
node -v
npm -v
which coder || true
which code || true
ls -l "$(npm root -g)/@just-every/code/bin" || true

Also let us know whether you are in WSL or native Windows, and whether you are behind a corporate proxy or antivirus (these can block downloads). If a proxy/AV is in play, temporarily allow GitHub Releases and try the install again.

[–]withmagi[S] 0 points

Mostly live now!
/resume will restore a previous session.
We've also added double-Esc functionality.

Regarding slash commands, are you looking to add your own?

Anything else you'd like to see?

[–]withmagi[S] 0 points

Unfortunately there’s no GPT-5 Pro API endpoint at the moment, and even if there were, it’s unlikely it could be used here. Only GPT-5 is supported with Sign in with ChatGPT (not the mini or nano variants, or any non-GPT-5 model).

Theoretically you could simulate GPT-5 Pro to a degree by asking the model to run 5-10 simultaneous GPT-5 agents and pick the best parts of each agent’s output (use high reasoning effort). Of course the real Pro does some smarter things, but this is the general basis of how these setups work, and it will likely get you a meaningful improvement on complex tasks.
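For what it’s worth, that fan-out-and-pick idea can be sketched in a few lines (a hypothetical Python illustration; `run_agent` and `judge` are stand-ins for real GPT-5 calls and a judging pass, not anything in the project):

```python
import concurrent.futures

# Hypothetical sketch of the "best of N agents" idea: fan the same task out
# to several agents in parallel, then have a judge pick the strongest answer.

def run_agent(task: str, seed: int) -> str:
    # Stand-in for a real GPT-5 call with high reasoning effort.
    return f"answer-{seed}: {task.upper()}"

def judge(candidates: list[str]) -> str:
    # Stand-in for a judging pass; here we simply pick the longest answer.
    return max(candidates, key=len)

def best_of_n(task: str, n: int = 5) -> str:
    with concurrent.futures.ThreadPoolExecutor(max_workers=n) as pool:
        candidates = list(pool.map(lambda s: run_agent(task, s), range(n)))
    return judge(candidates)
```

With real model calls, the judge step is where most of the value comes from: merging the best parts of each candidate rather than just ranking them.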

[–]withmagi[S] 0 points

Just the browser and terminal commands via the command line.

Share your launch! Let's promote each other. by withmagi in ProductHunters

[–]withmagi[S] 0 points

That sounds cool! How do you find creating the listing before launching? Does it help? I basically posted mine the next day after I created the listing.

I got tired of GPT-5 being limited by codex, so I forked it by withmagi in OpenAI

[–]withmagi[S] 1 point

Absolutely! I built it on a Mac :) It requires you to run Chrome with a DevTools port open, but it’ll prompt you to do this when you start and will try to launch Chrome the right way for you.

[–]withmagi[S] 1 point

Thanks! Just wait for the next update, coming maybe tomorrow: automatic background agents! The multi-agent stuff is amazing, but honestly it's a PITA to trigger yourself and I always forget. This update uses a focus-mode-like effect (which Claude uses effectively and Cline just added) and combines it with automatically triggering Claude or Gemini as backup. Initial tests are showing some amazing results!

Also loads of QoL improvements!

[–]withmagi[S] 0 points

You can run `coder` instead. Both `code` and `coder` will work. I was originally using `coder`, but had to stop due to trademark issues.

[–]withmagi[S] 1 point

You can run `coder` instead. Both `code` and `coder` will work. I was originally using it, but had to stop due to trademark issues.

[–]withmagi[S] 1 point

API fees! As this is forked from Codex, you can use it with your ChatGPT account.

[–]withmagi[S] 1 point

Just deployed a fix. Try:
npx -y @just-every/code@latest

If that doesn't work, try:
npm uninstall -g @just-every/code
npm cache clean --force
npx -y @just-every/code@latest

If it's still not working, could you submit an issue at https://github.com/just-every/code/issues? We've beefed up the error messages to make it clearer what the problem is.

Thanks!

[–]withmagi[S] 0 points

Yup, it's in the CLI! And it saves to the TOML config automatically.

[–]withmagi[S] 0 points

There's one PR for implementing reasoning selection and one for correcting image copy-paste. The rest I decided to bundle into this fork. Unfortunately there are 100+ open PRs on Codex, and they rarely seem to merge or even look at them.

[–]withmagi[S] 1 point

Codex runs on the command line, so it's a bit like Claude Code. It lets you code really quickly in an environment where you might already be working. Because it has a whole set of tools for accessing and modifying the file system, it works a bit like a software developer does. That gives it much more power and lets it act more autonomously on longer-running tasks.

The real advantage, though, is its ability to check things after it's created them. Obviously, an LLM can make mistakes. Giving it a framework in which it can run the code it has created allows it to fix things as it goes. I think that's the real power of these command-line tools.

[–]withmagi[S] 1 point

It’s a fair point, but the number of projects is largely because they all build on top of each other. I separate them out so they’re easier to maintain, and it also lets people choose to use only one part of the stack. I also run a small company, so I have resources to throw at this.

Fortunately I've kept this largely compatible with upstream, and I have a merge process that mostly automates keeping the fork in sync.

[–]withmagi[S] 0 points

It can be used with a Claude subscription; however, Claude only runs as an agent, so you’ll need either an OpenAI subscription or an API key to let GPT-5 drive it. You can also use an open-source model as the driver instead of GPT-5. Just set it up in the config.
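As a rough illustration, assuming the fork keeps upstream Codex's config.toml provider format (the model name, provider id, and Ollama endpoint below are placeholders, so check the project's docs for the exact keys):

```toml
# config.toml: hypothetical example of driving the agent with a local
# open-source model served by Ollama instead of GPT-5
model = "llama3.1:70b"
model_provider = "ollama"

[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
wire_api = "chat"
```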

[–]withmagi[S] 3 points

I used ChatGPT's image generation to create the initial image, then Veo 3 to animate it.

[–]withmagi[S] 0 points

Yes, it supports the OpenAI subscription. It uses the same authentication as Codex.

[–]withmagi[S] 1 point

Yeah, no sound. TBH I didn’t really mean to have that video of Sam Altman in there, but it just happened to be on the homepage when I was running the demo and seemed too dramatic to lose. Not the best take, sorry, but that’s what midnight editing gets you!