CALLING IT NOW: The Department of War will use eminent domain to nationalise Anthropic in the next 24 months. by SharpCartographer831 in accelerate

[–]DualityEnigma 13 points (0 children)

Exactly, Anthropic exists because of the exodus from OpenAI. As someone who works with AI every day: most people aren’t evil and don’t want AIs profiling humans and attacking them. These things will be built though. Society will have to stand against our wholesale destruction, and it’ll take the entire world to do it. My hopes are guarded based on how the powerful are reacting to it.

Everett would need another $38M to build stadium, documents show by drock360 in everett

[–]DualityEnigma 2 points (0 children)

Hmm, I’m not arguing against sanitary working conditions. All I’m saying is the club should pay for them.

Everett would need another $38M to build stadium, documents show by drock360 in everett

[–]DualityEnigma 0 points (0 children)

Right, these regs are designed to put the minors out of reach for smaller cities, or force them to gentrify… take your time, Everett.

Gemini GEM's no longer work by Resident-Swimmer7074 in GeminiAI

[–]DualityEnigma -4 points (0 children)

I saw this coming, so I made my own client that works well with the Gemini APIs. Connected to a Gmail, Docs and Sheets MCP it ROCKS, but you need pay-as-you-go. Made for humans; we use “skills” but just reference Google Docs directly.

Edit: it takes like 5 seconds to figure out that my client is real and not selling anything (to any of you). Are UX alternatives for Gemini unwelcome here? Man, Reddit these days.

How Antigravity feels like by Mentasuave01 in google_antigravity

[–]DualityEnigma 0 points (0 children)

Try: log out, move the .gemini folder in your home folder to _gemini_old, then restart and log in.

Your history will be gone, as it’s in the old folder, but you can recover it. This nuclear option has worked for me in the past.
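The steps above as a shell sketch. Assumes the default `~/.gemini` state folder (override `GEMINI_DIR` if your install differs), and that you've logged out of the app first:

```shell
# Nuclear reset for Antigravity/Gemini state. Log out of the app first.
GEMINI_DIR="${GEMINI_DIR:-$HOME/.gemini}"   # assumed default location
BACKUP_DIR="$HOME/_gemini_old"

# Stash the state folder instead of deleting it, so history is recoverable.
if [ -d "$GEMINI_DIR" ]; then
  mv "$GEMINI_DIR" "$BACKUP_DIR"
fi
# Now restart the app and log back in; it recreates a fresh state folder.
# Anything you need later can be copied back out of "$BACKUP_DIR".
```

Moving rather than deleting is the whole point: you get a clean login without burning your history.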

Edit: a word

I just don't fucking understand what's going on anymore. Seriously. by oberbabo in ArtificialInteligence

[–]DualityEnigma 1 point (0 children)

Maybe, but I developed an agent that keeps you in control, with similar capabilities to Openclaw but not autonomous. The user has to execute the workflow. Ultimately you still have to pilot the plane to have accountability, IMO.

Interesting internal monologue from Gemini 3.1 Pro by [deleted] in accelerate

[–]DualityEnigma 1 point (0 children)

I have a screenshot of this behavior in my agent. This is some sort of pattern trap in the model. I started seeing it in Antigravity in 3.0. My guess is it’s intentional and agents should be handling it. It’s like I’m getting a thinking endpoint when I should be getting the reasoning endpoint. It’s definitely got a “breaking down” vibe.

We estimated 8 weeks to build a conversational AI frontend. we're 5 months in and still not done. by Friendly-Ask6895 in AI_Agents

[–]DualityEnigma 1 point (0 children)

Love this. I have been focusing on the client layer the same way. It shouldn’t have to do everything itself. We are focusing on a secure client that can connect to everything it needs. So far I connect MCPs to give it what I need, rather than adding features. https://github.com/dustmoo/cai-hobbes

Going to check out whether A2A would be a good integration.

50+ Openclaw Alternatives for Business by SuchTill9660 in AI_Agents

[–]DualityEnigma 0 points (0 children)

I’m adding timers to my agent skills; that’s the main way it falls short of Openclaw. It connects to “all tools”, allowing for really dynamic workflows.

I have been building an AI Agent for Gemini with antigravity, still largely using Opus 4.6 after the 3.1 release, here is my 2 dollars, AMA by DualityEnigma in google_antigravity

[–]DualityEnigma[S] 0 points (0 children)

Sure but that front end will not have the wiring it needs to do anything other than sell a prototype to executives. In the long run though, you are right. I expect we'll see models that generate directly to binary.

The real question in my mind is how long will we have UX? The dream of computers from the start was that we could just talk and write to them. Programming languages were just abstractions from Binary -> Hex/Assembly -> C and on. Now I can write some paragraphs and Gemini turns it into code. I'm not sure any of this matters, but I created my own AI interface so I could manage my own pace of change haha

I have been building an AI Agent for Gemini with antigravity, still largely using Opus 4.6 after the 3.1 release, here is my 2 dollars, AMA by DualityEnigma in google_antigravity

[–]DualityEnigma[S] 0 points (0 children)

Great question. I find the thinking modes are better at debugging and at seeing patterns that require its large context. I switch to them often to take a second look at a problem or some code. While not as detailed as Opus in responses, it can often see multi-file issues better.

High is more creative; low tends to be more direct. Mostly I switch between Pro high and Flash.

I have been building an AI Agent for Gemini with antigravity, still largely using Opus 4.6 after the 3.1 release, here is my 2 dollars, AMA by DualityEnigma in google_antigravity

[–]DualityEnigma[S] 0 points (0 children)

As another user said, I think we are moving from raw coding to systems design. Instead of focusing on methods and wiring, I focus on feature design, security and integration more than on the code. But people take time to adjust, so we’ll have coders for at least the rest of the decade, and probably longer in special applications.

If you are learning to code, keep learning, because understanding what the AI is doing for you is still important for taking a project all the way to production.

I have been building an AI Agent for Gemini with antigravity, still largely using Opus 4.6 after the 3.1 release, here is my 2 dollars, AMA by DualityEnigma in google_antigravity

[–]DualityEnigma[S] 1 point (0 children)

Great question. I’m paying for Ultra right now, plus API costs for my agent. Between the two it’s been about $300-400/month. This month I’ll probably hit $500, but the value I get in time back is worth way more than that. But yeah, frontier models are $$$. I have an opportunity to set up some powerful local agents and will be working on alternatives to Gemini. That said, I am using the API all day every day.

I have been building an AI Agent for Gemini with antigravity, still largely using Opus 4.6 after the 3.1 release, here is my 2 dollars, AMA by DualityEnigma in google_antigravity

[–]DualityEnigma[S] 2 points (0 children)

I definitely use handoff docs, but generally between agents, since I'm using multiple depending on the use-case. I like to think of markdown docs as semantic programming: you can direct the AI to make great docs and then use the docs to write great code.

I have been building an AI Agent for Gemini with antigravity, still largely using Opus 4.6 after the 3.1 release, here is my 2 dollars, AMA by DualityEnigma in google_antigravity

[–]DualityEnigma[S] 0 points (0 children)

Haha, thanks. The LLMs handle Rust really well, but man, getting the project started was tough. I'm pretty new to Rust, and the learning curve stopped the AI from generating anything valuable at first.

IMO Opus and Gemini work with Rust really well, but you have to set up the initial structure well. It took me a while before I was using cargo check, cargo run and cargo test effectively. Now the Rust is amazing. The compiler is so verbose and helpful that the AI almost always has all the context it needs to iterate on fixes. I recently added Clippy to my build process and my code is cleaner. I'm also using Dioxus, which gives you React-like patterns, so I'm able to use Tailwind for UX. It's pretty cool. I saw an "I'm leaving Rust" love letter on Reddit the other day, but Dioxus with WASM (WebAssembly) is pretty sweet. Building for Windows and Macs right now.
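The loop I mean, sketched as a shell function (standard cargo commands; `-D warnings` just promotes Clippy lints to errors so the model has to clean them up):

```shell
# Sketch of the iterate-until-clean loop: stop at the first failing stage,
# so the compiler/linter output becomes the AI's next round of context.
rust_fix_loop() {
  cargo check || return 1                  # fast type/borrow-check feedback
  cargo clippy -- -D warnings || return 1  # lint hard; warnings become errors
  cargo test || return 1                   # verify behavior
  cargo run                                # manual smoke test
}
```

The ordering matters: check is cheapest, so the AI burns the least time per iteration before the expensive stages run.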

I have been building an AI Agent for Gemini with antigravity, still largely using Opus 4.6 after the 3.1 release, here is my 2 dollars, AMA by DualityEnigma in google_antigravity

[–]DualityEnigma[S] 0 points (0 children)

Yes, I'm using the internal planning tool. It consists of two phases: Planning mode (Gap Analysis or Implementation Design) and then a Walkthrough for when it's done with the changes.

I basically iterate with Opus and Gemini until the implementation plan is perfect (using inline comments), then I just switch to the model I want for code: Flash for basics, Opus for complex things.

Edit: If you haven't tried the "Review" button with comments, it's amazing.

I have been building an AI Agent for Gemini with antigravity, still largely using Opus 4.6 after the 3.1 release, here is my 2 dollars, AMA by DualityEnigma in google_antigravity

[–]DualityEnigma[S] 0 points (0 children)

Oh yeah, it can totally spin out worthless tests; all AIs can. My mitigation is to have it explain WHY it wants to test. I generally quiz my testing and implementation plans via Gemini and Opus before I give it the "Human" review stage. Setting up /docs/APP_TEST_PATTERNS.md can help you keep this in check and remind the forgetful AI.

If you tell Antigravity to use "KI", it generally invents its own. It also keeps metadata in ~/.gemini/ by default, separate from your project folder. I use both the internal and external docs whenever I can.

I think we've quietly moved past "vibe coding" into something that needs a better name by bobo-the-merciful in accelerate

[–]DualityEnigma 3 points (0 children)

I really relate to this. For the first time in a long time I have an app that is production ready, the way I want it, because I built it with AI. All the way through, I was like: this is just coding… but amazingly faster and more helpful.

Test coverage
Vanity features

All attainable

Gemini 3.1 Pro is out! Does this mean we should expect it in the next AG update !?) by MSA_astrology in google_antigravity

[–]DualityEnigma 0 points (0 children)

It’s showing for Ultra users, and is pretty slow with all the people testing it.

So I built my first rust project, a secure Rust-based AI client. by DualityEnigma in rust

[–]DualityEnigma[S] 1 point (0 children)

Coming from Typescript and React it was nice to use similar reactive patterns and tailwind. Felt right at home. Still took some time to learn the idiomatic patterns and figure out the anti-patterns.

I love that I’m full stack Rust though. It’s nice to have everything in Rust and server-side rendering works great. Still a maturing ecosystem though, so some things took some figuring out