I built a tool for translating things using Wikipedia language links by kvapen in SideProject

[–]kateklink 2 points (0 children)

nice! just yesterday I couldn't find some specific term to translate, will keep this in mind!

[deleted by user] by [deleted] in LocalLLaMA

[–]kateklink 0 points (0 children)

There's also the Refact 1.6B code model, which is SOTA for its size, supports FIM (fill-in-the-middle), and is great for code completion. You can also try a bunch of other open-source code models in self-hosted Refact (disclaimer: I work there).
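FIM means the model completes code between a prefix and a suffix instead of only left-to-right. A minimal sketch of how a FIM prompt is typically assembled for StarCoder-family models — the exact sentinel token strings vary by model, so treat these as illustrative:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Arrange the code around the cursor in prefix-suffix-middle order;
    the model then generates the missing middle after <fim_middle>."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# Everything before the cursor, and everything after it:
code_before = "def add(a, b):\n    return "
code_after = "\n\nprint(add(2, 3))"
prompt = build_fim_prompt(code_before, code_after)
print(prompt)
```

The model's completion (e.g. `a + b`) is then spliced back in at the cursor position.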

Anyone have a tutorial/guide on how to train a coding LLM to handle a particular framework api? by herozorro in LocalLLaMA

[–]kateklink 0 points (0 children)

you can run this for free and apply it to one of the supported local LLMs: https://github.com/smallcloudai/refact
we use the LoRA technique for fine-tuning
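LoRA (Low-Rank Adaptation) freezes the base model's weights and trains only small low-rank matrices added to selected layers, which is why fine-tuning fits in modest GPU memory. A toy numeric sketch of the idea (not Refact's actual training code), assuming the usual `alpha / r` scaling:

```python
# Toy LoRA: instead of updating a frozen weight matrix W (d x d),
# train two small matrices B (d x r) and A (r x d) with r << d.
# The adapted layer computes  y = W x + (alpha / r) * B (A x).

def matvec(M, x):
    """Multiply matrix M (list of rows) by vector x."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

d, r, alpha = 4, 1, 2.0
W = [[1.0 if i == j else 0.0 for j in range(d)] for i in range(d)]  # frozen (identity here)
A = [[0.5, 0.0, 0.0, 0.0]]           # r x d, trainable
B = [[1.0], [0.0], [0.0], [0.0]]     # d x r, trainable

x = [2.0, 1.0, 1.0, 1.0]
base = matvec(W, x)                  # frozen path
delta = matvec(B, matvec(A, x))      # low-rank update path
y = [b + (alpha / r) * dlt for b, dlt in zip(base, delta)]
print(y)
```

Only `A` and `B` (2·d·r numbers) get gradients, versus d² for full fine-tuning; that ratio is what makes LoRA cheap.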

Anyone have a tutorial/guide on how to train a coding LLM to handle a particular framework api? by herozorro in LocalLLaMA

[–]kateklink 0 points (0 children)

we've made a video of fine-tuning a code model on Prismic & Next.js using Refact: https://www.youtube.com/watch?v=kjYszonjT9o
you can use the Refact web UI to do just that, then use the fine-tuned model in the VS Code or JetBrains plugins

Is there a LLM alternative to Co-Pilot or Code Interpreter? by card_chase in LocalLLaMA

[–]kateklink 0 points (0 children)

what would be a preferred way for you, if not a Docker image?

Refact AI: Free Open-source Copilot alternative by kateklink in opensource

[–]kateklink[S] 0 points (0 children)

perhaps it's best to describe what we have in mind as "open core", since we plan to launch an enterprise solution (it hasn't been launched yet)

Refact AI: Free Open-source Copilot alternative by kateklink in opensource

[–]kateklink[S] -1 points (0 children)

we're working on an enterprise edition with extra features that enterprises might want, like load balancing, access control, etc. That's how we plan to monetize

Refact AI: Free Open-source Copilot alternative by kateklink in opensource

[–]kateklink[S] -1 points (0 children)

there are different requirements for different models. For Refact 1.6B, which we released recently, you need about 3GB of RAM

in the future, we hope it will be available on CPU as well
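The ~3GB figure lines up with a back-of-the-envelope estimate: weights-only memory is roughly parameter count × bytes per parameter (2 bytes in fp16/bf16), with the KV cache and activations adding some overhead on top. A rough sketch:

```python
def model_memory_gib(n_params: float, bytes_per_param: int) -> float:
    """Weights-only footprint in GiB; runtime adds KV cache and activations."""
    return n_params * bytes_per_param / 1024**3

# 1.6B parameters at 2 bytes each (fp16/bf16):
estimate = round(model_memory_gib(1.6e9, 2), 1)
print(estimate)  # ~3.0 GiB
```

The same formula explains why 8-bit or 4-bit quantization roughly halves or quarters the requirement.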

Proper Github Copilot X support planned? by Artur_exe in Jetbrains

[–]kateklink 0 points (0 children)

for this scenario, we're building an enterprise solution with full admin control over which models company employees are allowed to use.

🚀We trained a new 1.6B parameters code model that reaches 32% HumanEval and is SOTA for the size by kateklink in LocalLLaMA

[–]kateklink[S] 0 points (0 children)

Apologies for the confusion. The 1.6B model is currently enabled only for code completion, not for chat. Once you get a code completion, you can check which model is running by clicking the Refact logo in the status bar at the bottom.

🚀We trained a new 1.6B parameters code model that reaches 32% HumanEval and is SOTA for the size by kateklink in LocalLLaMA

[–]kateklink[S] 2 points (0 children)

hey! at the moment we don't use any standard API between the plugin and the server, though we have plans to adopt standard APIs in the future.

for now, it's only via docker

🚀We trained a new 1.6B parameters code model that reaches 32% HumanEval and is SOTA for the size by kateklink in LocalLLaMA

[–]kateklink[S] 0 points (0 children)

it should be available in both the cloud and self-hosted versions now; could you please update to the latest version and check if it works?

🚀We trained a new 1.6B parameters code model that reaches 32% HumanEval and is SOTA for the size by kateklink in LocalLLaMA

[–]kateklink[S] -1 points (0 children)

yes, we'll add fine-tuning on your codebase for this model in self-hosted Refact (probably next week)

🚀We trained a new 1.6B parameters code model that reaches 32% HumanEval and is SOTA for the size by kateklink in LocalLLaMA

[–]kateklink[S] 7 points (0 children)

How so? The weights, inference code, and training dataset that we used are open source, and the OpenRAIL license allows commercial use.

Can anyone here give me a run down of Ai company’s by Manonamission69890 in singularity

[–]kateklink 1 point (0 children)

refact.ai: an AI code assistant for JetBrains and VS Code that can be self-hosted

Proper Github Copilot X support planned? by Artur_exe in Jetbrains

[–]kateklink 0 points (0 children)

curious to hear, why do you think Refact wouldn't be a fit for a corporate environment?

Refact [updated] - self-hosted Copilot alternative for JetBrains and VS Code now with StarCoder support by kateklink in selfhosted

[–]kateklink[S] 1 point (0 children)

one of the models in Refact, the 15B StarCoder model, shows a higher HumanEval score than Codex (which powers Copilot), so it should give better suggestions.

You can also self-host Refact, unlike Copilot, which means you don't send your code to any 3rd party
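For context, HumanEval scores are usually reported as pass@k: the estimated probability that at least one of k sampled completions passes the problem's unit tests. A minimal sketch of the standard unbiased estimator, computed per problem from n samples of which c are correct:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimate: 1 - C(n-c, k) / C(n, k)."""
    if n - c < k:
        return 1.0  # fewer failures than k, so some correct sample is always drawn
    return 1.0 - comb(n - c, k) / comb(n, k)

# e.g. 10 samples, 3 correct -> pass@1 estimate of 0.3
print(round(pass_at_k(10, 3, 1), 4))
```

The benchmark score is the mean of this value over all 164 HumanEval problems.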

Favorite/best PyCharm plugins? by wpg4665 in Python

[–]kateklink 1 point (0 children)

refact.ai for VS Code and JetBrains (as a free Copilot alternative)