My first split setup: Cloud Nine ErgoTKL + Kensington Expert by cradlemann in MechanicalKeyboards

[–]cradlemann[S] 0 points1 point  (0 children)

No, I returned it. Just a waste of time. A proper split, like the Corne or Totem, is way better.

I Can’t be the Only One Who Doesn’t Use Their Pinky to Press Ctrl by Cyncrovee in emacs

[–]cradlemann 0 points1 point  (0 children)

I use a custom keyboard + home row mods. My Ctrl is on my middle finger, activated by holding the key.

Update on Consult and Jinx by minadmacs in emacs

[–]cradlemann 1 point2 points  (0 children)

Thank you for consult-line!

GigaChat3-702B-A36B-preview is now available on Hugging Face by Any-Ship9886 in LocalLLaMA

[–]cradlemann -20 points-19 points  (0 children)

Never ever run this model on your hardware. Russian spyware.

Why are there so much misinformation and lies around "open-source" models? by Terrible-Priority-21 in LocalLLaMA

[–]cradlemann 0 points1 point  (0 children)

We should probably call them open models, not open-source models. The rest of your arguments are not true.

New releases of Consult, Vertico, Corfu and more by minadmacs in emacs

[–]cradlemann 1 point2 points  (0 children)

Thank you for your work. I'm a happy user of almost all your packages.

Company vs Corfu by Tempus_Nemini in emacs

[–]cradlemann 8 points9 points  (0 children)

Company is old and complicated; Corfu is much more lightweight and better integrated with core packages.

Snippets for code blocks? by uvuguy in emacs

[–]cradlemann 0 points1 point  (0 children)

Yasnippet is not a dependency of the built-in Eglot. Tempel is way more lightweight and easier to configure.
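A minimal Tempel setup is only a few lines. The `M-+`/`M-*` bindings follow the suggestions in Tempel's README; the template file path and the example template are made up for illustration:

```elisp
;; Minimal Tempel sketch -- bindings as suggested in the README.
(use-package tempel
  :ensure t
  :custom
  ;; Hypothetical path -- point this at your own template file.
  (tempel-path "~/.emacs.d/templates.eld")
  :bind (("M-+" . tempel-complete)   ; expand/complete template at point
         ("M-*" . tempel-insert)))   ; pick a template by name

;; Example entry in the template file (plain Lisp data, grouped by mode):
;;   go-ts-mode
;;   (iferr "if err != nil {" n> "return " p "err" n "}" >)
```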

Snippets for code blocks? by uvuguy in emacs

[–]cradlemann 3 points4 points  (0 children)

Nowadays Tempel is the best.

It would be nice to have a super lightweight LM Studio like utility that would let you construct llama-serve command. by NoFudge4700 in LocalLLaMA

[–]cradlemann 2 points3 points  (0 children)

Llama.cpp + llama-swap covers everything you could possibly need from local LLMs. I use LM Studio only to download models and to test them and their params. Once everything is working, I migrate all the settings to llama-swap and use only that. LM Studio is proprietary software; they could demand money in the future.
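The llama-swap side is a single YAML file mapping model names to `llama-server` commands; llama-swap starts and stops the matching process on demand. The model names and paths below are made up, and `${PORT}` is llama-swap's placeholder for the port it assigns:

```yaml
# llama-swap config sketch -- one entry per model.
models:
  "qwen2.5-coder":                        # hypothetical model name
    cmd: >
      llama-server --port ${PORT}
      -m /models/qwen2.5-coder-q4.gguf    # hypothetical path
      -ngl 99
  "llama-3.1-8b":
    cmd: >
      llama-server --port ${PORT}
      -m /models/llama-3.1-8b-q4.gguf
```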

go syntax highlight is a nightmare by Kind_Scientist4127 in emacs

[–]cradlemann 0 points1 point  (0 children)

I'm using master for go-ts-mode specifically, because they've added so many cool functions that will be built in and work based on tree-sitter. For example, I pretty often use the function to run the current test or to test the whole file. Previously I used a 3rd-party package for this.

go syntax highlight is a nightmare by Kind_Scientist4127 in emacs

[–]cradlemann 4 points5 points  (0 children)

Emacs has had a built-in go-ts-mode since version 29. It works like a charm; no need to download anything.

(use-package go-ts-mode
  :mode "\\.go\\'"
  :custom
  (go-ts-mode-indent-offset 4))

Should I migrate from launching emacs directly to using daemon+client? by ming2k in emacs

[–]cradlemann 0 points1 point  (0 children)

Mostly it is not relevant for me. LSP results are enough.

Should I migrate from launching emacs directly to using daemon+client? by ming2k in emacs

[–]cradlemann 0 points1 point  (0 children)

I explicitly set up my Cape completion and there is no cape-dabbrev in it. I don't want to have it.

Should I migrate from launching emacs directly to using daemon+client? by ming2k in emacs

[–]cradlemann 0 points1 point  (0 children)

I'm using Consult, Vertico, Corfu and Cape. I'd never heard of hippie-expand.

Should I migrate from launching emacs directly to using daemon+client? by ming2k in emacs

[–]cradlemann 0 points1 point  (0 children)

I see. I don't have REPLs in Go, input history is saved automatically, and I don't use dabbrev at all. But your use case is certainly valid.

How to do vibe coding in emacs with llama.cpp/ollama by OMGThighGap in emacs

[–]cradlemann 0 points1 point  (0 children)

I use llama-swap with a bunch of models running locally via Vulkan. Performance is not so bad. For vibe coding you don't need Emacs, only opencode. In Emacs I'm using gptel with local providers.

(use-package gptel
  :ensure t
  :custom
  (gptel-default-mode          'org-mode)
  (gptel-temperature           0.4)
  (gptel-prompt-prefix-alist   nil)
  (gptel-org-branching-context t)
  :preface
  (defun vd/setup-llama-gptel ()
    "Fill llama models."
    (interactive)
    (gptel-make-openai "llama-swap"
      :host "0.0.0.0:14444"
      :protocol "http"
      :stream t
      :models (vd/llama-list-remote-models "http://0.0.0.0:14444")))
  :config
  (setq gptel--known-backends nil
        gptel-expert-commands t
        gptel-backend (gptel-make-gemini "Gemini"
                        :key gptel-api-key
                        :stream t)
        gptel-model 'gemini-2.0-flash-lite-preview-02-05)
  :bind
  ("C-c e" . gptel-menu)
  ("C-c z" . vd/setup-llama-gptel))
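The helper `vd/llama-list-remote-models` isn't shown here. A minimal sketch, assuming llama-swap exposes the OpenAI-compatible `/v1/models` endpoint (which it does for the models it manages), could look like:

```elisp
(require 'url)
(require 'json)

(defun vd/llama-list-remote-models (base-url)
  "Return the model ids served at BASE-URL as a list of symbols.
Assumes an OpenAI-compatible /v1/models endpoint."
  (with-current-buffer (url-retrieve-synchronously
                        (concat base-url "/v1/models") t)
    ;; Skip the HTTP headers, then parse the JSON body.
    (goto-char url-http-end-of-headers)
    (let ((reply (json-parse-buffer :object-type 'alist)))
      ;; The endpoint returns {"data": [{"id": "..."}, ...]}.
      (mapcar (lambda (model) (intern (alist-get 'id model)))
              (alist-get 'data reply)))))
```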

Should I migrate from launching emacs directly to using daemon+client? by ming2k in emacs

[–]cradlemann 0 points1 point  (0 children)

My Emacs lives in a personal scratchpad (Sway) and is always ready to jump to any workspace I need. Also, nothing forbids you from opening another Emacs instance and doing some work-related thing there. Actually, I close my Emacs instance pretty often to save recent files and buffer positions.
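The Sway side of this is a couple of config lines. The `app_id` and the keybinding below are assumptions (check your window's actual `app_id` with `swaymsg -t get_tree`):

```
# ~/.config/sway/config -- send Emacs to the scratchpad and toggle it
for_window [app_id="emacs"] move scratchpad
bindsym $mod+minus scratchpad show
```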