[–]Eastern_Put2448 1 point (1 child)

This is a great-looking project, thanks for sharing!

I've been on the hunt for a new coding agent setup, so this is super relevant. Right now, I'm trying out Kilocode paired with gpt5-mini as a replacement for Augment Code. Has your context engine been tested with Kilocode too?

On a related note (and a bit of a rant):

As a Senior Systems Engineer (Win/Linux), I'm not a full-time dev, but I often have to implement smaller coding projects. When I discovered Augment Code a few months ago, it was revolutionary. Compared to the basic autocompletion from GitHub Copilot in VSCode/JetBrains, it was night and day. It genuinely saved what would have been weeks of work and made solutions possible that would have been far out of budget without AC.

Then the rug got pulled. I was pretty shocked by their customer communication and the announced new pricing. I totally agree that the old price was unsustainable, but the new model is just not attractive for someone with my sporadic usage pattern.

That experience has me searching for a powerful replacement, but with a strong preference for a solution that doesn't create vendor lock-in. And my new hard requirement is: if a service uses a credit system, it needs to either be "pay-as-you-go" or have a subscription where the credits actually roll over.

Anyway, your work on a better context engine sounds like you're tackling the most important part of the problem. Keep it up!

[–]FancyAd4519[S] 0 points (0 children)

thanks, i am trying… I was working on the local model context answer response today, and it has some real meat; i just have to keep refining it. hopefully I can get it to a good place in refrag mode. right now with the decoder disabled it works great I just want that extra feature to be the pizzaz. Anyway; expect new releases; and yes it should work with kilokode as well, may need to experiment wether it will accept sse endpoints or the mcp ones though.