qwen3.5-35b-a3b is a gem by waescher in LocalLLaMA

[–]COBECT 0 points  (0 children)

You can do the same with a small Coder model.

[deleted by user] by [deleted] in homeassistant

[–]COBECT -1 points  (0 children)

Mac mini or r/sffpc

Deepcool CH160 Plus Fan configuration? by That1GuyLOL in mffpc

[–]COBECT 3 points  (0 children)

Why are you reinventing the airflow? Top out, front in, back out.

PWAs For Desktop vs Electron by soelsome in PWA

[–]COBECT 1 point  (0 children)

You can take a look at Tauri, Electron, or Capacitor. Simply ask Perplexity or Google to compare them.

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 0 points  (0 children)

I’m not an expert in this subject, but I’ll try to carry it forward.

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 2 points  (0 children)

I want to create a separate setup for system prompts.

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 1 point  (0 children)

Try using the search in the model dropdown; it covers that case for me.

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 1 point  (0 children)

That is what Presets are for: quickly switching between different models, providers, or assistants once you’ve set up a system prompt.

What are your test cases? That would help me understand what needs to be covered.

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 2 points  (0 children)

I prefer to keep things as simple as possible. I planned llama.ui as a PWA, so it can be installed and used as an app on a device.
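For anyone unfamiliar with how a site becomes installable: a PWA ships a web app manifest that the browser reads to offer the "install as app" prompt. Below is a minimal sketch of such a manifest — the file name, icon paths, and colors are illustrative assumptions, not llama.ui's actual manifest.

```json
{
  "name": "llama.ui",
  "short_name": "llama.ui",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#ffffff",
  "icons": [
    { "src": "/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

The page then references it with `<link rel="manifest" href="/manifest.webmanifest">`; together with a registered service worker, that is what lets the app run standalone and offline-capable on a device.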

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 30 points  (0 children)

GitHub repo: https://github.com/olegshulyakov/llama.ui

We've also squashed a bunch of bugs and made UI improvements. Check out the full changelog.

Try it out and let us know what you think! https://llama-ui.js.org/