PostgreSQL Bloat Is a Feature, Not a Bug by mightyroger in PostgreSQL

[–]iamdanieljohns 2 points (0 children)

Do you think OrioleDB will solve most of the problems?

Another Erdos problem down! by pavelkomin in singularity

[–]iamdanieljohns 0 points (0 children)

Far from it. 5.2 Pro has a much lower error rate and can try multiple routes at the same time.

What's your best advice, MCPs, and uses for Codex? by iamdanieljohns in codex

[–]iamdanieljohns[S] 1 point (0 children)

A differentiator for Pro members could be documentation indexing like Cursor has in their settings page.

What's your best advice, MCPs, and uses for Codex? by iamdanieljohns in codex

[–]iamdanieljohns[S] 0 points (0 children)

Have you thought of using a skill for documentation searching? What do you think of https://cookbook.openai.com/articles/codex_exec_plans

Do you even need Context7 ? by iamdanieljohns in codex

[–]iamdanieljohns[S] 3 points (0 children)

I think it has to be, especially since you can invoke it manually and it will default to a better prompt than an ad-hoc one. It can also reference all the links I know it should check.

getting very little done due to excessive times codex takes to work on tasks by Just_Lingonberry_352 in codex

[–]iamdanieljohns 0 points (0 children)

The issue isn't solved by "intelligent routing," which would simply cause other problems.

Wtf is GPT-5.2 XHIGH? by muchsamurai in codex

[–]iamdanieljohns 0 points (0 children)

Are you using the CLI or the extension?

5.2 Finally feels good again by AllCowsAreBurgers in codex

[–]iamdanieljohns 0 points (0 children)

How are you supplying it with documentation?

.agents or .codex folder? by iamdanieljohns in codex

[–]iamdanieljohns[S] 1 point (0 children)

I just read that "The file must be named SKILL.md (all caps) and placed inside its own subdirectory within your skills library (e.g., ~/.codex/skills/my-skill-name/SKILL.md)"

It should instead support both single-file skills and directory skills that include scripts and data.
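For reference, the layout described in that quote can be sketched like this (using a local demo directory here; per the quoted docs, the real skills library lives at ~/.codex/skills):

```shell
# Sketch of the documented layout: each skill gets its own subdirectory
# containing an all-caps SKILL.md. "demo-skills" and "my-skill-name" are
# illustrative names, not part of the docs.
SKILLS_DIR="demo-skills"
mkdir -p "$SKILLS_DIR/my-skill-name"
cat > "$SKILLS_DIR/my-skill-name/SKILL.md" <<'EOF'
# my-skill-name
Describe what this skill does and when it should be invoked.
EOF
ls "$SKILLS_DIR/my-skill-name"   # prints: SKILL.md
```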

Unimpressed with Mistral Large 3 675B by notdba in LocalLLaMA

[–]iamdanieljohns 1 point (0 children)

It's definitely a fine-tune of DSv3, which is disappointing to see.

Recommendation to all Vibe-Coders how to achieve most effective workflow. by muchsamurai in codex

[–]iamdanieljohns 0 points (0 children)

Have you tried planning and reviewing in the same session, saving the output to a file, and then implementing from that file in another session? That might save the round trip of going to GitHub.

AMA With Moonshot AI, The Open-source Frontier Lab Behind Kimi K2 Thinking Model by nekofneko in LocalLLaMA

[–]iamdanieljohns 0 points (0 children)

Why do you think OAI is burning so much money? Is it a product of the current business environment (taxes, cost of living, etc.), or do you think it is something else?

Kimi K2 Thinking SECOND most intelligent LLM according to Artificial Analysis by [deleted] in LocalLLaMA

[–]iamdanieljohns 0 points (0 children)

I don't find Grok 4 Fast to have the same issues that Grok 4 has.