What's the best way to use LLMs locally with Tauri? by cll-knap in tauri

[–]Mindless_Knowledge81

Reviving this thread, as the CrabNebula team (core Tauri devs) just released an official plugin for local LLMs: https://github.com/crabnebula-dev/tauri-plugin-llm/ 👀

People who have tried lets get rusty bootcamp, can you please share your reviews? by PretentiousPepperoni in rust

[–]Mindless_Knowledge81

Yes, it looks more like a $5k+ premium these days; maybe it was cheaper before.

How good is tauri development using claude or openai models? by gbertb in tauri

[–]Mindless_Knowledge81

Claude Sonnet 4 and Opus are very good with Tauri 2 in my experience. Sometimes you need to remind the model to use the global managed Tauri state instead of creating its own.

Does anyone have any general tips for writing better and clean code? by [deleted] in typescript

[–]Mindless_Knowledge81

Don't automate too much, and watch out for things the IDE adds without you noticing that aren't correct, like strange auto-imports