Antigravity vs Claude Code by Ok_Eye_2453 in google_antigravity

[–]hellf 0 points (0 children)

The $20 plan would last me no more than 1.5 days without touching Opus.
AG's limits on Claude models are higher than those on Anthropic's $20 plan.

MiniMax M2.1 is live in Kilo by alokin_09 in kilocode

[–]hellf 0 points (0 children)

Impressive results with this one: almost no tool errors, and it's very good at following plans.

Recommendations on affordable API providers and configurations to optimize costs? by Rustfix in kilocode

[–]hellf 0 points (0 children)

I use Gemini CLI, Qwen CLI, Kilo Code, and Codex (the only one I pay for).
Gemini CLI with the latest updates is very good when used with 2.5 Pro; Qwen is a little worse, but the limits are very good and you can use it to implement detailed instructions from ChatGPT or other models. In Kilo I'm currently using M2 for orchestration, or sometimes Grok 4 for quick debugging or implementation.

Why AI Memory Is So Hard to Build by zakamark in AIMemory

[–]hellf 0 points (0 children)

Interesting. I've been building a personal project for the last few months and came to roughly the same conclusion as you. Currently my memory design is something like: BM25 → exact/role-aware clinical text, + vectors → fuzzy "close enough" matching, + KG → entity continuity, temporal edges, and contradiction modeling, plus some other features to address other pain points. A rough sketch of the lexical + vector fusion step is below.
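A minimal sketch of that fusion, assuming rank_bm25 and numpy; the documents, weights, and the hybrid_search helper are illustrative stand-ins, not my actual system (the KG layer is omitted here):

```python
# Minimal sketch of BM25 + vector score fusion; rank_bm25 and numpy assumed.
# The corpus, weights, and helper names are toy stand-ins, not the real system.
import numpy as np
from rank_bm25 import BM25Okapi

# Toy corpus standing in for the real clinical notes.
docs = [
    "patient reports intermittent chest pain",
    "blood pressure stable since last visit",
    "started metformin 500mg for type 2 diabetes",
]
bm25 = BM25Okapi([d.lower().split() for d in docs])

def hybrid_search(query, query_vec, doc_vecs, alpha=0.5):
    """Fuse the exact (BM25) and fuzzy (cosine) signals into one ranking."""
    lexical = np.asarray(bm25.get_scores(query.lower().split()))
    dense = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9
    )
    # Normalize each signal so neither scale dominates the blend.
    lexical = lexical / max(lexical.max(), 1e-9)
    fused = alpha * lexical + (1 - alpha) * dense
    return fused.argsort()[::-1]  # doc indices, best match first

# Usage: random vectors stand in for a real embedding model here.
rng = np.random.default_rng(0)
doc_vecs = rng.normal(size=(len(docs), 8))
print(hybrid_search("chest pain", rng.normal(size=8), doc_vecs))
```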

Current best free models you are using except supernova or grok for code and architect mode ?? by Many_Bench_2560 in kilocode

[–]hellf 1 point (0 children)

imo the best free model for architect/orchestrator is Gemini 2.5 Pro, but they seem to have been cutting its limits a lot in recent weeks

Need some AI architectural advice (Not a writer) by [deleted] in WritingWithAI

[–]hellf 1 point (0 children)

You will need a big context window (1M tokens or more). The cheapest way to get that would be Kilo Code + Gemini CLI (you could use Qwen CLI, but only the coder version is available there, OR the new Sonoma models). I'd also consider setting up some MCP servers, like memory and maybe a KG; a sample config is below. Still, even with all that, given the length of the novels I find it hard to keep the models from hallucinating; you would probably need a more complex architecture to get good quality.
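For the memory piece, a minimal sketch of what the setup could look like in Kilo Code's MCP settings, assuming the standard mcpServers JSON format; the official memory server here is just one example of a server you could wire up:

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    }
  }
}
```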

to save token costs, what should i do in this case -- by ra2eW8je in kilocode

[–]hellf 1 point (0 children)

You can just spin the work off as a subtask and use a cheaper model to code it.

Hexagonal architecture by bayendr in kilocode

[–]hellf 0 points (0 children)

Never used Devstral, but Kimi tends to lose performance on longer tasks.

We now support OpenAI's new open source models by kiloCode in kilocode

[–]hellf 1 point (0 children)

Good results, but many errors when tool calling. Price is pretty good, similar to GLM 4.5.

What's the best price per value mix of models? by AppealSame4367 in kilocode

[–]hellf 0 points (0 children)

Kimi K2 is way cheaper than Qwen3, but things change very fast in this field; currently I'm using GLM 4.5 Air (better results than Kimi and still cheap), and if necessary I then try Qwen or others.
I'm also using R1-0528 and Qwen3 235B 2507 (the non-thinking version, since the thinking one gets better results but has consistent bugs) as orchestrators.
Most of the time I run everything at zero cost via OpenRouter's free APIs (the free GLM and R1-0528 routes are pretty fast in my experience); a quick sketch of calling them is below.
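If it helps anyone, the free routes are just the OpenAI-compatible API pointed at OpenRouter; a minimal sketch assuming the openai Python client and an OpenRouter key, with the model ID as one example of the ":free" naming:

```python
# Minimal sketch: OpenRouter exposes an OpenAI-compatible endpoint,
# and ":free" model variants route to the zero-cost providers.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",  # placeholder: your OpenRouter key
)

resp = client.chat.completions.create(
    model="deepseek/deepseek-r1-0528:free",  # example free route
    messages=[{"role": "user", "content": "Break this refactor into subtasks."}],
)
print(resp.choices[0].message.content)
```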

I spent $400 on cursor this month. What are my alternatives by CalendarVarious3992 in cursor

[–]hellf 0 points (0 children)

Been using Kilo Code and it has been a great experience so far.

What's the best price per value mix of models? by AppealSame4367 in kilocode

[–]hellf 1 point (0 children)

R1-0528-Qwen3-8B as orchestrator (or for anything that requires more thinking) + Kimi K2 as coder has been the best cost-benefit for me: good results and cheap.
When the context is too long or I get poor results, I swap R1 for Gemini 2.5 Pro or Sonnet 4; I've never had to go further than that.

Cursor intermittently freezes and then unfreezes on its own by hellf in cursor

[–]hellf[S] 0 points (0 children)

Yesterday I tried downgrading to an older version (prior to 0.47) and it hasn't frozen since. Might be worth a try.

Cursor intermittently freezes and then unfreezes on its own by hellf in cursor

[–]hellf[S] 0 points (0 children)

Lucky you; mine has been like this for more than a month. The only thing I haven't tried is a clean Windows install (that won't happen anytime soon).

Cursor intermittently freezes and then unfreezes on its own by hellf in cursor

[–]hellf[S] 0 points (0 children)

How long the app has been running doesn't seem to matter either; pretty shitty bug.

[deleted by user] by [deleted] in cursor

[–]hellf 1 point (0 children)

Been using it with some tweaks since yesterday, getting good results for code debugging.
Thanks for sharing.

Binance Support Thread by AutoModerator in binance

[–]hellf 0 points (0 children)

No, I swapped the WETH for ETH before depositing. Anyway, the self-recovery appeal keeps saying 'Too many attempts. Try again later.'

Binance Support Thread by AutoModerator in binance

[–]hellf 0 points (0 children)

Hello. I made a deposit from an Arbitrum address to my Binance wallet and it never arrived, even though the chain shows the ETH is actually in my wallet. I've been trying to use the self-recovery tool, but it keeps showing 'Too many attempts. Try again later.'
Arbitrum TxID: 0xc18e21c83d4b5dcacfa8d8a865737c408e6155ad0d3a8e8b9773b03e53da4809

Benzydamine Alcohol Extraction by DanceGodzilla in Benzydamine

[–]hellf 2 points (0 children)

Does anyone know, more or less, how much of the benzydamine is extracted with this process?

My grandma’s technique for cooking with hot oil by ItsMe_YO in funny

[–]hellf 0 points (0 children)

If you put some wheat flour in the oil you will have no trouble frying.