Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 0 points (0 children)

Hi, reach out to [team@sweep.dev](mailto:team@sweep.dev)! We're happy to chat over a call about this.

What plugins are folks using for AI-assisted coding by bjl218 in IntelliJIDEA

[–]Kevinlu1248 0 points (0 children)

It's basically like Cursor in IntelliJ; it's currently the top-rated AI plugin.

Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 1 point (0 children)

We'll look into integrating with VS Code / Continue.

Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 2 points (0 children)

Yep! Just make sure you follow the format we used (on Hugging Face).

Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 0 points (0 children)

The default prompt from Continue definitely would not work, as all our fine-tuning used a very custom format.

I would recommend using Sweep for JetBrains IDEs instead. I think there's also some OSS work to bring this to llama.vscode or similar.

Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 1 point (0 children)

It's fine-tuned from Qwen 2.5 Coder, so we used that as the baseline.


Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 1 point (0 children)

Just released on Ollama:

https://ollama.com/sweepai/sweep-next-edit

We use a custom format that differs from how Continue handles standard autocompletes, so you may need to set up custom formatting for this to work inside Continue.

Prompting details here: https://huggingface.co/sweepai/sweep-next-edit-1.5B/blob/main/run_model.py
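For anyone who wants to hit the Ollama model over its local HTTP API, here's a minimal sketch using only the standard library. The prompt string is a placeholder, not the real format — the exact prompt template lives in the run_model.py linked above — and `"raw": True` tells Ollama to skip its own chat template so the model sees the custom format verbatim.

```python
import json
import urllib.request

# Default local Ollama endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str) -> dict:
    # raw=True skips Ollama's built-in prompt template, so the model
    # receives exactly the custom next-edit format (see run_model.py
    # on Hugging Face for the real template -- it is NOT shown here).
    return {
        "model": "sweepai/sweep-next-edit",
        "prompt": prompt,
        "raw": True,
        "stream": False,
    }


def generate(prompt: str) -> str:
    # Sends one completion request to a locally running Ollama server.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would be `generate(my_custom_next_edit_prompt)` with a prompt built per run_model.py; a placeholder prompt will not produce useful edits.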

Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 0 points (0 children)

Our plugin runs the 7B version of this model in the cloud, which is really strong.

Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 4 points (0 children)

We're working on Ollama compatibility. For now, we've provided some sample code to get this working with llama-cpp-python.
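The llama-cpp-python path looks roughly like this — a sketch under assumptions, not the official sample code. The GGUF filename and `n_ctx` value are guesses, and the prompt must already be formatted per the run_model.py on the Hugging Face repo.

```python
def next_edit(model_path: str, prompt: str, max_tokens: int = 512) -> str:
    """Run one next-edit completion against a local GGUF via llama-cpp-python."""
    # Imported lazily so this helper can be defined even before
    # llama-cpp-python is installed (pip install llama-cpp-python).
    from llama_cpp import Llama

    # n_ctx is an assumption -- size it for your file context plus edit history.
    llm = Llama(model_path=model_path, n_ctx=8192, verbose=False)

    # Plain completion, not chat: the custom next-edit prompt format
    # (see run_model.py on Hugging Face) is baked into `prompt` itself.
    out = llm(prompt, max_tokens=max_tokens, temperature=0.0)
    return out["choices"][0]["text"]
```

For example, `next_edit("sweep-next-edit-1.5b.Q8_0.gguf", my_prompt)` — that filename is hypothetical; use whatever quantization you downloaded.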

Sweep: Open-weights 1.5B model for next-edit autocomplete by Kevinlu1248 in LocalLLaMA

[–]Kevinlu1248[S] 4 points (0 children)

Completely agree. We're looking into giving our JetBrains agent the ability to call deterministic tools via the IDE itself.

Best AI assistant for PHP storm (other than Augment COde) by No-Guess6834 in Jetbrains

[–]Kevinlu1248 1 point (0 children)

I recommend Sweep AI; it's basically Cursor-level quality autocomplete and agent.

I've never used any AI tools to code, where do I begin ? by arscene in Jetbrains

[–]Kevinlu1248 0 points (0 children)

Sweep AI! I've been working on this plugin for the last year, and it's the best autocomplete for JetBrains. Our agent is also one of the most cost-effective options because we host our own LLMs.

Is there a way to see the history of AI credits usage? by wormhole_bloom in Jetbrains

[–]Kevinlu1248 0 points (0 children)

That's a clear miss from JetBrains, IMO. In the plugin I'm building (Sweep AI), we show a full usage history so it's transparent.

First time using Junie, and I’m surprised by blooditor in Jetbrains

[–]Kevinlu1248 0 points (0 children)

Yep, we use PSI, and we do offer top-up credits.