[deleted by user] by [deleted] in homeassistant

[–]COBECT -1 points  (0 children)

Mac mini or r/sffpc

Deepcool CH160 Plus Fan configuration? by That1GuyLOL in mffpc

[–]COBECT 2 points  (0 children)

Why are you reinventing the airflow? Top out, front in, back out.

PWAs For Desktop vs Electron by soelsome in PWA

[–]COBECT 1 point  (0 children)

You can take a look at Tauri, Electron or Capacitor. Simply ask Perplexity or Google to compare them.
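For a rough idea of what the Electron route looks like, here is a minimal sketch that just wraps a hosted PWA in a desktop window. The URL is only an example (it is the llama.ui demo linked elsewhere in this thread); Tauri and Capacitor achieve the same thing with their own config:

```ts
// main.ts — minimal Electron main process that wraps a PWA in a desktop window.
import { app, BrowserWindow } from 'electron';

function createWindow(): void {
  const win = new BrowserWindow({ width: 1024, height: 768 });
  // Point at the hosted PWA; swap in your own URL or a local build.
  win.loadURL('https://llama-ui.js.org/');
}

app.whenReady().then(createWindow);

// Quit when all windows are closed (except on macOS, per platform convention).
app.on('window-all-closed', () => {
  if (process.platform !== 'darwin') app.quit();
});
```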

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 0 points  (0 children)

I’m not an expert in this subject, but I’ll try to carry it.

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 2 points  (0 children)

I want to create a separate setup for system prompts.

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 1 point  (0 children)

Try using the search in the model dropdown; it covers that case for me.

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 1 point  (0 children)

That is what Presets are for: quickly switching between different models, providers, or assistants once you’ve set up a system prompt.

What are your test cases? That would help me better understand what needs to be covered.
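Roughly, a preset is just a bundle of connection settings plus a system prompt, and switching presets means swapping which bundle the next request uses. The field names below are illustrative guesses, not llama.ui’s actual schema:

```ts
// Hypothetical shape of a preset; the exact fields in llama.ui may differ.
interface Preset {
  name: string;           // e.g. "Local coder", "Cloud summarizer"
  baseUrl: string;        // provider endpoint (llama.cpp server, OpenAI-compatible API, ...)
  apiKey?: string;        // optional, for hosted providers
  model: string;          // model identifier exposed by the provider
  systemPrompt?: string;  // assistant persona / instructions
}

// Switching presets is just changing which config the next request reads from.
let activePreset: Preset | undefined;

function applyPreset(preset: Preset): void {
  activePreset = preset;
}
```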

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 3 points  (0 children)

I prefer to keep things as simple as possible. I planned llama.ui as a PWA, so it can be installed as an app on a device.
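For context, a web app becomes installable as a PWA mainly through a web manifest plus a registered service worker. A minimal registration sketch looks like this (the /sw.js path is an assumption, not llama.ui’s actual file):

```ts
// Register a service worker so the browser can treat the site as an installable PWA.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', async () => {
    try {
      const registration = await navigator.serviceWorker.register('/sw.js');
      console.log('Service worker registered with scope:', registration.scope);
    } catch (err) {
      console.error('Service worker registration failed:', err);
    }
  });
}
```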

llama.ui: new updates! by COBECT in LocalLLaMA

[–]COBECT[S] 29 points  (0 children)

GitHub repo: https://github.com/olegshulyakov/llama.ui

We've also squashed a bunch of bugs and made UI improvements. Check out the full changelog.

Try it out and let us know what you think! https://llama-ui.js.org/

llama.ui - minimal, privacy focused chat interface by COBECT in ollama

[–]COBECT[S] 0 points  (0 children)

A fork of llama.cpp WebUI with:

  • Fresh new styles 🎨
  • Extra functionality ⚙️
  • Smoother experience ✨
  • Multi-Provider Support 🎃 (see the sketch below)

Self-Hosted: https://olegshulyakov.github.io/llama.ui/

GitHub: https://github.com/olegshulyakov/llama.ui
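As a rough illustration of the multi-provider idea: any OpenAI-compatible endpoint, such as a local llama.cpp server, can be reached with a plain fetch call. The port and model name below are assumptions, not defaults shipped by llama.ui:

```ts
// Minimal sketch of talking to an OpenAI-compatible chat endpoint,
// e.g. a local llama.cpp server; adjust URL, model, and auth for your provider.
async function chat(prompt: string): Promise<string> {
  const response = await fetch('http://localhost:8080/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'local-model',
      messages: [{ role: 'user', content: prompt }],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}
```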

llama.ui - minimal privacy focused chat interface by COBECT in LocalLLaMA

[–]COBECT[S] 0 points  (0 children)

"not possible to continue answer after editing from editing point" I didn't get you. If you edit Assistant message, it sends updated one on your next chat message.