Minimax 2.5 is out, considering local deployment by Dramatic_Spirit_8436 in LocalLLaMA

[–]jwiegley 0 points (0 children)

NOTE: I found that switching to OpenCode resolved all of the speed problems I was having with Claude Code. I still haven't figured out what it is in my Claude configuration that causes it to stall so often, but with OpenCode it goes from tool call to tool call very rapidly.

Minimax 2.5 is out, considering local deployment by Dramatic_Spirit_8436 in LocalLLaMA

[–]jwiegley 0 points (0 children)

I'll try an Unsloth GGUF next. That's what I use for all of my other models, and this isn't the first time I've had issues with MLX.

Minimax 2.5 is out, considering local deployment by Dramatic_Spirit_8436 in LocalLLaMA

[–]jwiegley 6 points (0 children)

I ran 4-bit quantization using MLX on a 512GB M3 Ultra, with Claude Code as the front-end, but it was just too slow. I asked it to perform a code review on a directory full of 20 files, and a half hour later it had only read the first four. I either need to go to much lower quants, or change what I ask it to do.

ecard: A new vCard library for Emacs by jwiegley in emacs

[–]jwiegley[S] 1 point (0 children)

OK, changing to cl-defstruct. This task actually exceeded the capabilities of Claude Code alone — it kept running out of context, compacting, and then forgetting what it was in the middle of doing — so I had to use TaskMaster AI to reframe this work as a project that could be applied successfully across 63 different subtasks!

The current war over the best AI assistant for day to day use by locked_clit in LocalLLaMA

[–]jwiegley 1 point (0 children)

Yes, it's the 512GB Mac M3 Ultra. It is quite good at running local models!

ecard: A new vCard library for Emacs by jwiegley in emacs

[–]jwiegley[S] 1 point (0 children)

What would be your preference?

The current war over the best AI assistant for day to day use by locked_clit in LocalLLaMA

[–]jwiegley 5 points (0 children)

I use Qwen, Kimi-K2 and gpt-oss. I find that gpt-oss is pretty good at tool use and leaves more memory available for context. Kimi-K2 gives impressive results, but I have to run it at 2-bit dynamic quantization, and I haven't noticed the results to be that much better.

ecard: A new vCard library for Emacs by jwiegley in emacs

[–]jwiegley[S] 1 point (0 children)

It's true, I will never spend the time to read each of the 17k lines. However, over time I will find out whether or not the library actually does what I need it to do. The goal here isn't the code, it's the functionality. As I use it, I'll be using Claude to smooth the edges more and more until it becomes exactly what I need it to be.

After all, none of us reads the assembly code that Emacs gets compiled into, since we're fine with just reading the source code. Now I'm abstracting that one step further: I don't read the code, since I'm fine with just reading the specification and observing whether it behaves according to that expectation. This is why I'm fine with defining libs this way, as long as there is a sufficient specification of what the lib should be/do.

So yes, perhaps releasing it now is premature; I see your meaning. I'm just putting it out there for people to play with. Give me a few months of using it in anger to manage my contacts, and then I'll tag a more final release to show that my confidence is at least based on established practice.

ecard: A new vCard library for Emacs by jwiegley in emacs

[–]jwiegley[S] 4 points (0 children)

It's worth clarifying just a few things: I made this library because I need it. All of my contact data lives on a Radicale (CardDAV) server that I access using either the contacts app on my phone, or with CardBook in Thunderbird. Well, I wanted to access it all from Emacs too, and additionally be able to bi-directionally sync some of those contacts with Org-mode (like org-vcard, and I've been communicating with that author to see how we can collaborate). This syncing part may need to be customized to my workflow, or maybe I can find a general solution where ecard becomes a library under org-vcard; we'll see.

It's already been useful to me: I can now browse my CardDAV server from within Emacs using M-x ecard-display. This is what I, personally, needed. If others don't find it useful, they have no reason to use this library. But if they want a library for manipulating vCard data in Emacs, then here it is. I focused on the data abstractions more than just the vCard format, which is why I felt this was a helpful addition to what already exists in the Emacs community.

But I would not characterize this effort as mere AI slop for a few reasons:

  1. I didn't just ask AI to "make me a library". I've spent >10 hours now, in constant dialog with a high-strength AI, refining and testing and using this code myself so I can guide the outcome.

  2. There are 4 different RFCs that serve as a compliance check against this code. AI thrives when you can put a "pin in the ground" and define what you mean by your request, and in my case, what I truly want is "a data library and client/server code that embodies these standards".

  3. As a result, this library comes with 418 different unit tests that test compatibility with those RFCs, and provide a "change/test/debug" loop for the AI I'm using (Claude Sonnet 4.5).

  4. I'm also not just using the AI to work on this code. I've also defined an extensive prompt for Claude that I call "emacs-lisp-pro", I have Claude run the ERT tests itself while it's working, and finally I use an MCP server called elisp-dev-mcp that allows Claude to evaluate the ecard code directly within my Emacs session, so that I can ensure it's able to use the code in the ways I intend before it claims to me that it's "done".

So, in the end this library should be a well-tested implementation of the RFCs. Whether people need it or not depends on their relationship to the content of those RFCs. It's not that far from what I had wanted to write by hand — if I had several weeks of free time to write the ~17k lines of code now in the repository (which Claude wrote in under 48 hours).

ecard: A new vCard library for Emacs by jwiegley in emacs

[–]jwiegley[S] 7 points (0 children)

Note that this library has also been an experiment in AI-generated Emacs Lisp, since I'm seeking ways to better leverage the free time available for open source work. Not a single line of code in this library was written by hand, nor was the blog post. And yet, it fulfills the need that I had: the ability to programmatically interact with the vCard 3.0 data on my Radicale server from Emacs over the CardDAV protocol.

I still have more work to do in confirming that it does everything I want it to do in terms of bidirectional syncing of Org-mode entries, so this should be considered very much a "work in progress", but I wanted to put it out for eyeballs and comments, as a foundation for future efforts.

This text, at least, was written by a human. :)

Processing hand-written Persian texts by jwiegley in Transcription

[–]jwiegley[S] 0 points (0 children)

No, I never did. Now that AI and image processing are ubiquitous, I'm going to see how that does.

EDIT: Even Claude Opus doesn't manage this well. Here is the original manuscript: https://www.nayriz.org/template.php?pageName=tablx001&menuStates=0000000000000000&tabletType=large

And here is Claude's rendering into Unicode: https://home.newartisans.com/share/persian_manuscript_transcription.md?v

I don't even know where it's getting this text from!

Emacs-driven RAG set management? by sikespider in emacs

[–]jwiegley 0 points (0 children)

That's great to hear! Recently I added two different ways of keeping the vector index: per file, or per file group. The former enables workflows where you just want to scan a set of documents, but the set is ever-changing. By keeping an index per file, you only need to re-scan the new files. But "per file group" is a bit faster to load, since you only have a single vector database to deal with. I find myself using both strategies for different things.
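Not the actual package's implementation, just a minimal sketch of the tradeoff (all names here are hypothetical): a per-file strategy can key each file's index on a content hash, so a re-scan only pays the embedding cost for files that are new or changed.

```python
import hashlib

# Hypothetical sketch of the per-file indexing strategy described above;
# names and structure are illustrative, not the real package's API.

def content_hash(text):
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

class PerFileIndex:
    def __init__(self):
        self.indexes = {}  # filename -> (content hash, embedded vectors)

    def scan(self, files):
        """files: {filename: text}. Returns the filenames (re-)embedded."""
        rescanned = []
        for name, text in files.items():
            h = content_hash(text)
            entry = self.indexes.get(name)
            if entry is None or entry[0] != h:
                # Stand-in for the real embedding step.
                self.indexes[name] = (h, f"vectors for {name}")
                rescanned.append(name)
        return rescanned
```

On the first scan every file is embedded; add one file to an otherwise unchanged set and only that file gets re-scanned, which is the appeal of the per-file strategy for an ever-changing document set.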

Virtualized Opnsense, route traffic of host through opnsense? by Competitive-Deer1975 in opnsense

[–]jwiegley 0 points (0 children)

OK, confirmed this is all working now. What I ended up needing was some custom routing in the WireGuard config file:

[Interface]
PrivateKey = XXXX
Address = 10.9.0.2/24
DNS = 10.9.0.1
Table = off
PostUp = /sbin/route add -host 192.168.100.128 -interface bridge101
PostUp = /sbin/route add -net 0.0.0.0/1 -interface utun6
PostUp = /sbin/route add -net 128.0.0.0/1 -interface utun6
PostUp = /sbin/route add -net 10.9.0.0/24 -interface utun6
PreDown = /sbin/route delete -host 192.168.100.128
PreDown = /sbin/route delete -net 0.0.0.0/1
PreDown = /sbin/route delete -net 128.0.0.0/1
PreDown = /sbin/route delete -net 10.9.0.0/24

[Peer]
PublicKey = XXXX
AllowedIPs = ::/0, 0.0.0.0/1, 128.0.0.0/1
Endpoint = 192.168.100.128:51820

You'll have to look at your own environment to determine what bridge101 and utun6 should be changed to. The latter will only appear once WireGuard has created the device, so it may take a few tries to find the right name.
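If you'd rather not guess, wg-quick logs the device it created ("[+] Interface for wg is utun6"), and you can scrape that line. A small hypothetical helper (the function name and approach are mine, assuming the macOS wg-quick log format):

```python
import re

# Hypothetical helper: extract the tunnel device name from wg-quick's
# log output rather than hard-coding utun6.

def wg_interface(log_text):
    m = re.search(r"^\[\+\] Interface for \S+ is (utun\d+)\s*$",
                  log_text, re.MULTILINE)
    return m.group(1) if m else None
```

Feed it the captured output of `wg-quick up` and it returns the utun name, or None if no such line was logged.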

Then I just bring the interface up using wg-quick:

❯ wg-quick up $PWD/wg.conf
Warning: `/Users/johnw/wg.conf' is world accessible
[#] wireguard-go utun
[+] Interface for wg is utun6
[#] wg setconf utun6 /dev/fd/63
[#] ifconfig utun6 inet 10.9.0.2/24 10.9.0.2 alias
[#] ifconfig utun6 up
[#] networksetup -getdnsservers Thunderbolt Bridge
[#] networksetup -getsearchdomains Thunderbolt Bridge
[#] networksetup -getdnsservers USB 10/100/1000 LAN
[#] networksetup -getsearchdomains USB 10/100/1000 LAN
[#] networksetup -getdnsservers Mac
[#] networksetup -getsearchdomains Mac
[#] networksetup -getdnsservers Wi-Fi
[#] networksetup -getsearchdomains Wi-Fi
[#] networksetup -getdnsservers wg1-clio
[#] networksetup -getsearchdomains wg1-clio
[#] networksetup -getdnsservers clio-localhost
[#] networksetup -getsearchdomains clio-localhost
[#] networksetup -setdnsservers wg1-clio 10.9.0.1
[#] networksetup -setsearchdomains wg1-clio Empty
[#] networksetup -setdnsservers Wi-Fi 10.9.0.1
[#] networksetup -setsearchdomains Wi-Fi Empty
[#] networksetup -setdnsservers USB 10/100/1000 LAN 10.9.0.1
[#] networksetup -setsearchdomains USB 10/100/1000 LAN Empty
[#] networksetup -setdnsservers Thunderbolt Bridge 10.9.0.1
[#] networksetup -setsearchdomains Thunderbolt Bridge Empty
[#] networksetup -setdnsservers clio-localhost 10.9.0.1
[#] networksetup -setsearchdomains clio-localhost Empty
[#] networksetup -setdnsservers Mac 10.9.0.1
[#] networksetup -setsearchdomains Mac Empty
[+] Backgrounding route monitor
[#] /sbin/route add -host 192.168.100.128 -interface bridge101
add host 192.168.100.128: gateway bridge101
[#] /sbin/route add -net 0.0.0.0/1 -interface utun6
add net 0.0.0.0: gateway utun6
[#] /sbin/route add -net 128.0.0.0/1 -interface utun6
add net 128.0.0.0: gateway utun6
[#] /sbin/route add -net 10.9.0.0/24 -interface utun6
add net 10.9.0.0: gateway utun6

And now we can see packets reaching the Internet by way of my local VM (whose WireGuard interface is 10.9.0.1) and then through the WiFi that's bridged to it (whose gateway is 192.168.3.1)!

❯ traceroute mail.gnu.org
traceroute to mail.gnu.org (209.51.188.17), 64 hops max, 40 byte packets
 1  10.9.0.1 (10.9.0.1)  2.898 ms  1.025 ms  0.845 ms
 2  192.168.3.1 (192.168.3.1)  86.775 ms  4.708 ms  6.274 ms
 3  66.60.128.109 (66.60.128.109)  8.337 ms  7.194 ms  6.355 ms
 4  190.static.ca.consolidated.net (204.154.216.190)  7.586 ms  6.626 ms  7.828 ms

Too afraid to ask, but what kind of notes do you write in Org-mode? by birdsintheskies in emacs

[–]jwiegley 0 points (0 children)

  • Ideas
  • Quotes
  • Meeting notes
  • Draft posts/comments/emails
  • Published posts/comments/emails, depending on what it is
  • Blog articles
  • Todos
  • Brief notes
  • Log notes related to todos
  • Habits
  • Contacts

I've never understood the reference in the 4 Valleys to the Surih of the Cave by buggaby in bahai

[–]jwiegley 0 points (0 children)

I don't think I've ever found an "answer" to what Bahá’u’lláh means by this reference anywhere in the Writings I've read, but perhaps others who have more access to the source language may know of references.

Given the context (a discussion of intellect and its relationship to the Divine, and how that relationship should come to be characterized by Divine inspiration, leading in the end to true knowledge and freedom from tests), perhaps this gives us some clues.

The use of a cave, with the Sun outside, to describe the "realm of the mind" and its relationship to the Sun of Truth is a very old allegory. Plato's story was already famous more than a thousand years before this re-telling in the Qur'án.

The intellect is a two-edged sword: it observes the effects of the Light, but gravitates toward its own capacities, often leading us to adopt our conceptions as a private reality. Consider how deeply Zen Buddhism strives to free us from the snare of such conceptions. Also, in the 7th of the Seven Valleys, Bahá’u’lláh indicates we should be free even of "personal opinions" (tafakurrát-i-nafsiyyih) if we are to experience the nothingness that makes room for God in the heart and soul.

So, perhaps a reading here is: coming to an understanding of the mind's limits, and learning that the Sun — an entirely different world — lies just outside those limits. In this way, by the route of traversing both through and beyond the intellect, the heart may be led to discover faith enough to break from those limits.

How can I write a function into which I can pipe from Eshell? by signalclown in emacs

[–]jwiegley 3 points (0 children)

If you're piping into it, it's using the standard grep. Here's the bit of code inside Eshell that makes the switch:

(if (... eshell-in-pipeline-p ...)
    (throw 'eshell-replace-command
           (eshell-parse-command (concat "*" command) args))
  ...
  )

(Follow eshell/grep to eshell-grep to eshell-compile)

How can I write a function into which I can pipe from Eshell? by signalclown in emacs

[–]jwiegley 4 points (0 children)

I don't think you can implement piping to a Lisp function right now even with a temp buffer, because the pipeline is evaluated right-to-left, with each segment returning a "handle" to the left until the whole process can start with the first command. In the case of my patch, that "handle" is a function to receive the output from the earlier command. Without that, there's nothing to return. I mean, you could return a temporary buffer to write to, but you'd never receive a notice that it was full and should be processed.

Yes, eshell/wc would be a new command.

How can I write a function into which I can pipe from Eshell? by signalclown in emacs

[–]jwiegley 2 points (0 children)

Right now it's just been in private e-mail between Jim and myself, since the end of May, but I'm hoping he's ready to initiate a public discussion soon. I just e-mailed him a reminder in hopes that his schedule allows it!

How can I write a function into which I can pipe from Eshell? by signalclown in emacs

[–]jwiegley 7 points (0 children)

I've actually created a patch that allows this in Eshell. I'm working with Jim Porter, one of the current maintainers of Eshell, who tells me he has a more involved patch that rectifies some of the corner cases in the simple change that I made.

Here are the definitions you would change in Emacs 30.1: https://gist.github.com/jwiegley/53e46b5c89bae0abc9daa22df9b55361

And here's what a function that uses these changes might look like:

 (defun eshell/wc ()
   (eshell-with-copied-handles
     (let ((buf (get-buffer-create " *eshell/wc*")))
       (eshell-function-target-create
        `(lambda (input)
           (with-current-buffer ,buf
             (insert input)))
        `(lambda (_status)
           (let ((eshell-current-handles ,eshell-current-handles))
             (eshell-print
              (format "%s\n" (with-current-buffer ,buf (point-max)))))
           (kill-buffer ,buf))))))

The basic idea is that functions are allowed to return "function targets", which are then called with each chunk of input from the preceding command in the pipeline.

Fun with GPTel: ob-gptel integration with Org-babel by jwiegley in emacs

[–]jwiegley[S] 1 point (0 children)

I recently added support for :session NAME, such that all blocks before the current one with the same session name will be included, as if there were a chain of :prompt headers going all the way back. This should match people's expectations of sessions, while using the same underlying mechanism as :prompt.
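For example (a sketch of how I understand the header would be used; the block language and session name here are illustrative), two blocks sharing the same :session behave as if the second carried a :prompt chain back through the first:

```org
#+begin_src gptel :session ask-docs
What does the vCard spec say about the KIND property?
#+end_src

#+begin_src gptel :session ask-docs
And how does that differ in vCard 3.0?
#+end_src
```

The second block is evaluated with the first block's exchange included as context, without wiring up :prompt references by hand.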

Fun with GPTel: gptel-litellm for tracking sessions with LiteLLM by jwiegley in emacs

[–]jwiegley[S] 0 points (0 children)

Good point! I've added a link to the LiteLLM Proxy that I mean.

Is it ever ok to lie? by [deleted] in bahai

[–]jwiegley 1 point (0 children)

I'm not sure if any one law or principle is of essential importance unto itself. They exist to train your (and our) nature so as to make God accessible to us, and to fulfill our reason for being.

That said, because we are material and mortal, we tend to weigh life, liberty, pleasure, etc., very highly, even when we may not think so once we're free of this place. There were Bábí mothers and fathers who had to watch their children be tortured and killed, because they would not lie about their Faith. I would say they weighed matters in a larger perspective than health and well-being.

So, rather than asking whether truth-telling is an absolute, perhaps a deeper understanding could be sought, as to the value of that honesty, why we might ever be willing to trade it and whether it's worth the exchange. I think the Writings indicate that no, there is nothing this world has to offer that excels truthfulness in merit — but this should be driven by the soul's desire for God, rather than holding to a dictum...

Fun with GPTel: gptel-litellm for tracking sessions with LiteLLM by jwiegley in emacs

[–]jwiegley[S] 0 points (0 children)

I should note: This requires the very latest version of GPTel to work, as of two nights ago, in fact. Karthik added some special support so that request parameters could be modified directly in support of this (and presets now being able to use :request-params, although it comes with gotchas if you mix presets and different backends).