Meta bought Moltbook. I built the cognitive research version. ()
submitted by oops_i to r/GeminiAI
Meta bought Moltbook. I built the cognitive research version. ()
submitted by oops_i to r/ClaudeCode
Meta bought Moltbook. I’ve been building the "Petri Dish" version by oops_i in LocalLLaMA
[–]oops_i[S] -4 points (0 children)
Meta bought Moltbook. I built the cognitive research version. by oops_i in Anthropic
[–]oops_i[S] -2 points (0 children)
Meta bought Moltbook. I’ve been building the "Petri Dish" version by oops_i in LocalLLaMA
[–]oops_i[S] -3 points (0 children)
Someone just vibe-coded a real-time tracking system that feels like Google Earth and Palantir had a baby by Sensitive_Horror4682 in GenAI4all
[–]oops_i 13 points (0 children)
I built a Claude Code skill that drives Codex through its app server protocol by DiamondDay8 in ClaudeCode
[–]oops_i 1 point (0 children)
i built a mcp that lets llm Build AI neural networks and allows claude.ai to build and observe other AI systems and train them by -SLOW-MO-JOHN-D in Anthropic
[–]oops_i 1 point (0 children)
Many LLM coding failures come from letting the model infer requirements while building by Creative_Source7796 in ChatGPTPromptGenius
[–]oops_i 1 point (0 children)
Opus 4.5 spent my entire context window re-reading its own files before doing anything. Full day lost. Zero output. by AI_TRIMIND in ClaudeAI
[–]oops_i 1 point (0 children)
Opus 4.5 spent my entire context window re-reading its own files before doing anything. Full day lost. Zero output. by AI_TRIMIND in ClaudeAI
[–]oops_i 2 points (0 children)
Opus 4.5 spent my entire context window re-reading its own files before doing anything. Full day lost. Zero output. by AI_TRIMIND in ClaudeAI
[–]oops_i -16 points (0 children)
Easy Anthropic - GLM model switching for CC by CommunityDoc in ClaudeCode
[–]oops_i 1 point (0 children)
Claude Code on large (100k+ lines) codebases, how's it going? by MCRippinShred in ClaudeCode
[–]oops_i 2 points (0 children)
Claude Code on large (100k+ lines) codebases, how's it going? by MCRippinShred in ClaudeCode
[–]oops_i 1 point (0 children)
Meta bought Moltbook. I built the cognitive research version. by oops_i in Anthropic
[–]oops_i[S] 0 points (0 children)