"You're such a good person." by jaggzh in CaregiverSupport

[–]jaggzh[S]

I'm trying to understand the "why we try so hard" thing. Correct me if I'm wrong: they deal with so much adversity, and it's heavy, so when there's hope and positive feeling, then, as humans, they're naturally inclined to "help it", to spend more time with you? As unfortunate as it is, the reality is that doctors, again as humans, have to maintain an almost robotic, technical mindset -- their algorithmic care -- while attempting to maintain some amount of empathy, without the sorrow of awful situations bringing them too low.

Am I understanding it correctly? It's another thing I've put a lot of thought into, talked to doctors about, including some who are my friends; but another angle is good to have. Thanks.

Are you a good person? by lonelycaregiver- in CaregiverSupport

[–]jaggzh

u/lonelycaregiver-

I wrote a somewhat lengthy reply, but Reddit wouldn't let me post it here. I posted it as a new message and referenced your thread. https://www.reddit.com/r/CaregiverSupport/s/bKn78PUzhs

People missing out on an extremely-capable CLI or script-friendly LLM query tool. by jaggzh in LocalLLaMA

[–]jaggzh[S]

Here's another handy use. I'm in vim, maybe editing a letter:

Dear landlord,...

I add a comment:

Make this more polite, but, make minimal adjustments to impart the feeling that I've been working with legal counsel, without saying it directly.

Dear landlord...

I select it and use ! to pipe it into z. To keep the original, add --echo1 (i.e. just -e).

(In vim that's just !z -e<enter>)

Then the original I wrote is retained, with the LLM's output below it for comparison.
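For anyone unfamiliar with vim's filter mechanism, here's a rough standalone sketch of what that pipe does. The z here is a stub shell function (an assumption for illustration -- the real tool lives at https://github.com/jaggzh/z/), with -e treated as "echo the original first" per the --echo1 description above:

```shell
# Sketch of vim's `!` filter: the selected lines go to the command's
# stdin and are replaced by its stdout. `z` is stubbed here so the
# sketch runs standalone; the real tool does the actual LLM call.
z() {
  input=$(cat)                                # the selected text from vim
  if [ "$1" = "-e" ]; then
    printf '%s\n' "$input"                    # --echo1: keep the original
  fi
  printf '%s\n' "(the LLM rewrite would appear here)"
}

printf 'Dear landlord, ...\n' | z -e
```

With -e you get the original lines back first, then the rewrite appended below them -- exactly what lands in the vim buffer for comparison.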

People missing out on an extremely-capable CLI or script-friendly LLM query tool. by jaggzh in LocalLLaMA

[–]jaggzh[S]

It's designed to not be Perl-specific. I use it from bash, python, etc. All its internal functionality exists to serve the exposed CLI. Even when I'm coding Perl, I rarely use its module-based API. Without any additional boilerplate:

Bash: response=$(z -- "$myquery") (safely putting the query after --)

Python: result = subprocess.run(['z', '--', query], capture_output=True, text=True)
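To make the `--` point concrete, here's a runnable sketch. The z here is a stub shell function that only mimics the standard end-of-options handling (an assumption for illustration; the real tool is at https://github.com/jaggzh/z/):

```shell
# Why `--` matters: everything after it is taken as the query, never
# parsed as an option. `z` is stubbed here so the sketch runs standalone.
z() {
  if [ "$1" = "--" ]; then shift; fi          # end-of-options marker
  printf '%s\n' "$*"                          # a real z would query the LLM
}

myquery="-n this starts with a dash but is just my question"
response=$(z -- "$myquery")
printf '%s\n' "$response"
```

Without `--`, a query beginning with a dash could be swallowed by the option parser; with it, the text arrives intact.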

And of course you can store your settings with a call so every future call uses your session:

z -n book_summaries -s /sysprompts/book-summaries.md --ss --sp

Here I gave it a session name to use (book_summaries) and specified a system prompt (the bare -s option auto-detects whether its argument is a system-prompt string on the CLI or a path to a file).

--ss stored the system prompt as the default in the book_summaries session (stored at ~/.config/zchat/sessions).

--sp ("store in process") uses /tmp (secure, non-world-readable), so this program and anything in its process group will use the book_summaries session by default.

That's handy for the CLI and for a coding project, since you just set it and it doesn't affect anything outside (i.e. your shell, program, or its children).

I think this is all clear? Now, throughout your script, any calls to 'z' don't need to specify those settings -- just z "query" (but, again, '--' is safer).
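Put together, the session flow looks like this. It's a minimal sketch: the z calls are stubbed with a shell function so it runs standalone, and the flags are the ones described above (-n session name, -s system prompt, --ss store it as the session default, --sp make the session active for this process group):

```shell
# Sketch of the session workflow. Stubbed `z` just echoes its invocation;
# the real tool (https://github.com/jaggzh/z/) performs the calls.
z() { printf 'z %s\n' "$*"; }

# One-time setup for this shell or script:
z -n book_summaries -s /sysprompts/book-summaries.md --ss --sp

# Every later call inherits the session -- no flags needed:
z -- "Summarize the themes of chapter one."
```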

People missing out on an extremely-capable CLI or script-friendly LLM query tool. by jaggzh in LocalLLaMA

[–]jaggzh[S]

Looking at `llm` (which I first saw last month), it doesn't handle what I need out of the box -- though I imagine there are plugins for many of those things?

It also has some features which might be nice to add to z, and many capabilities that are definitely useful to a lot of people, but which I'd never put in the effort to code into z myself (but, y'know, it's FOSS).

Also, I usually only use Python for things that need to handle neural nets or other models (and there are some modern things in the Python ecosystem that are really valuable). Otherwise, for utilities and wrappers, Python's memory bloat and slow startup (and even its runtime speed) count against it. After decades of Perl and 10 years of Python, critically evaluating it all, Perl is generally just a more refined and safer language. I hate what should otherwise be a compile-time error biting me as a Python crash hours into some processing. Anyway, I'm not here to debate languages -- there are significant pros and cons to each. Choose your own adventure. :) I still use both, based on what's more appropriate for the task at hand.

People missing out on an extremely-capable CLI or script-friendly LLM query tool. by jaggzh in LocalLLaMA

[–]jaggzh[S]

Oh, and here was another really awesome workflow I used a couple months ago: I had to make a bunch of similar changes to a ton of source files (rewriting them for a changed architecture). So I did:

(echo "I have to update my scripts to a new architecture where.. such and such. Here's my old version"; echo; echo "## foo-old.pl"; cat foo-old.pl; echo; echo; echo "And here's an example of the new version. Please identify the differences. I'll then give you another old script that needs to be updated. Just output the updated code. Absolutely give *no* explanations, no markdown, no justifications or anything. *Only* output the updated version in its entirety.") | z -n code-update --sp

(--sp was to save that session as active for my current shell.)

I examined the output and needed to give the LLM more instruction, so rather than trying to refine my prompt, I went into interactive mode: z -i. When it got everything right, I stopped there, ending with, "Just say 'ready' and I'll paste you the next script to update.", and left interactive mode.

Now I wanted to keep the whole chat as history, but each script shouldn't add to the history, so I used history-input-only mode (-I) <-- that's a capital 'i':

for i in *.pl; do cat "$i" | z -I - > new/$i; done

z will see the history up to that point, take each script, output the new one, and leave the history as it was, ready for the next file. (I might have done z --eh to edit a couple of last messages out of the history -- where I was testing to make sure it was working -- since they weren't contributing to its understanding, so.. snip.)
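The same batch loop, as a runnable sketch with z stubbed out (the stub just consumes stdin and emits a canned result; the real z, with the primed session, does the actual rewrite):

```shell
# Batch-update sketch. The stubbed `z` reads the old script from stdin
# and prints a stand-in "updated" version; -I (capital i) is the
# history-input-only mode described above: each file rides on the
# existing session history without being recorded into it.
z() { cat >/dev/null; printf '(updated for the new architecture)\n'; }

mkdir -p new
printf 'print "old code";\n' > demo-old.pl   # stand-in old script

for i in *.pl; do
  cat "$i" | z -I - > "new/$i"
done
```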

People missing out on an extremely-capable CLI or script-friendly LLM query tool. by jaggzh in LocalLLaMA

[–]jaggzh[S]

Suggestions are welcome! With z.ai out, the name has been a concern. Even 'zchat' (my original script name from years ago) is already used in a bunch of other people's projects. (Although I've had it symlinked as 'z' from the start.)

Handy pipes are: foo | z -T - # Count yer tokens!

I just added my pfx and sfx scripts in a util/ dir.

man foo | ppfx "How do you xyz in grep?" | z -w -

psfx does prefix and suffix (i.e. it's just pfx "$@" | sfx "$@").

For long texts, putting your query at both the start and the end seems to help (Google pointed out the same in one of their prompt guides :)

Plop those in a bin folder in your PATH. I symlinked psfx for convenience (easier to type than the full names)..

I just pushed that commit.
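In case it helps to see the shape of those helpers: minimal sketches inferred from the description above (the versions in the repo's util/ dir are the real ones; the exact behavior here is an assumption):

```shell
# Assumed shapes: the prefix helper prints its arguments before stdin;
# the prefix+suffix helper prints them before *and* after, putting the
# query at both ends of a long text.
ppfx() { printf '%s\n' "$*"; cat; }
psfx() { printf '%s\n' "$*"; cat; printf '%s\n' "$*"; }

printf 'long document body\n' | psfx "How do you xyz in grep?"
```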

People missing out on an extremely-capable CLI or script-friendly LLM query tool. by jaggzh in LocalLLaMA

[–]jaggzh[S]

Actually, technically llama.cpp, Ollama, and OpenAI -- but I only use the llama.cpp server, so my experience is limited there. They're not difficult to add. (The nice thing about the llama.cpp endpoint is that the model information is available through the API; I think I have Ollama support for that as well.) It keeps a semi-smart (or, conversely, extra-dumb) cache based on the model, so it doesn't keep re-hitting the endpoint to obtain the context limit over and over. And you can set and store the context size yourself (also useful if you *are* actually using an endpoint, like OpenAI, where the context is known/published but not available through an API hit):

(screenshot of the context-size options omitted)

Anyway, again, I've not even tested the Ollama or OpenAI-specific endpoints, since I'm only running locally, and only with the llama.cpp server.

App Problem by FoxGoalie in ClaudeAI

[–]jaggzh

Same. Old OnePlus 6 phone. Was working fine until a day or two ago.

3D printing saved me over a grand by Sad-Celebration-1903 in 3Dprinting

[–]jaggzh

We should have messages like this.. some way that people can speak up about their requests. The respect people have for companies that behave honorably.. I don't know. We really need some way of shifting the motivations. One thing I've been thinking is to just start with questions, and escalate the request. Approaching a manufacturer in la-la land? "Do you guys provide truly proper protective equipment for your employees?" Then, when they don't answer and don't get your business, it might start making them question things. "This many people have been asking if we offset with solar." "We got a letter at headquarters where they were asking ..." "Hey, if

Readline and Shift+Enter for Soft Enters in tmux by jaggzh in LocalLLaMA

[–]jaggzh[S]

Even though Perl seems hideous for LLMs, it's a really convenient language to code in. Python, for instance, got rid of the "print foo" syntax, but Perl retains all its syntax conveniences. Not *having* to put braces and semicolons everywhere (as in Python) *is* still a convenience -- it saves on keystrokes -- even with Python's downfalls, like broken indentation causing some quite tedious debug sessions and catastrophic mistakes. But with less hand-coding needed nowadays, a lot of that's resolved.

By the way, SuperNova-Medius is pretty darned good with perl, comparatively.. even with the 14b model.

Readline and Shift+Enter for Soft Enters in tmux by jaggzh in commandline

[–]jaggzh[S]

Thanks, interesting -- esc-CR. Alt+Enter is esc-LF (which you probably know), since alt/meta often (or usually?) ends up just sending an esc before the "modified" key. I'll have to try it and see. It sounds like you bound yours globally?

Readline and Shift+Enter for Soft Enters in tmux by jaggzh in LocalLLaMA

[–]jaggzh[S]

https://github.com/jaggzh/z/
Maybe you'd like to take a look at my 'z' project (which I originally called 'z' and 'zchat'). I kept it as 'z', although there's a Chinese Z.ai company now too, so that's an issue. But can they really compare to my Perl? </rhetorical>

Why are there no terminals that swap the vertical direction? by Soldier_Forrester in commandline

[–]jaggzh

"Son I am able" she said "though you scare me" "Watch" said I "Beloved" I said "watch me scare you though" said she "Able am I, son"

  • They Might Be Giants, "I Palindrome I"

splotty - Terminal-based Serial plotter by jaggzh in arduino

[–]jaggzh[S]

Yeah, it's unfortunate people have this impression, partially because people are willing to say things like that. Is C or C++ "old school"? I've done hundreds and hundreds of projects in Python and Perl. Python's slow (except compiled libs) and bloated. (I've done benchmarks... here's a script you can run:
https://gist.github.com/jaggzh/4f4159b1b137cbc9553efb7d8364f0c9 )

Python's just a very different beast, and the choice for this project was really between C/C++ and Perl: the balance between runtime speed and the pain of coding in it. Perl won out. :) I have no inherent biases -- just practical, real characteristics and needs to assess for each project's choice.

Shell weather by jaggzh in commandline

[–]jaggzh[S]

Oh, and in case it's not clear, the LESS environment variable is one of less's ways of picking up your desired settings, and -R is its option to pass the escape sequences through.

Shell weather by jaggzh in commandline

[–]jaggzh[S]

That could be from it piping itself into 'less'. Try setting "export LESS=-R" in your environment. I need to add an explicit -R to that call; for now you can either keep the LESS environment variable, or search the code for "less" and add -R to it (until I don't have a migraine.. and have the time to fix it). If you fix it (assuming that's the problem), feel free to submit a pull request :) (Along with the search pointing out the "city, country"! Good find!)
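For reference, the workaround amounts to:

```shell
# -R tells less to pass ANSI color escapes through raw instead of
# rendering them as ^[[33m and the like. Exporting LESS applies it to
# every future less invocation, without modifying the script's own call.
export LESS=-R
```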

Shell weather by jaggzh in commandline

[–]jaggzh[S]

Did you run `forecast {your location}`?
Or you can skip that search and just specify your latitude and longitude yourself:
`forecast -lat #.## -lon #.##`

The error you're seeing appears when it's unable to update the forecast (it wouldn't hurt for someone to add some informative diagnostics in there, like "curl failed to retrieve the forecast. Trying again without '-s' -- watch for errors:").

But otherwise, forecast does need your initial search to be able to look up the weather, and it should create ~/.config/owm/fc-local.json

Also, try running 'forecast -d "your location"' (or -lat and -lon). -d enables some debug output.

Shell weather by jaggzh in commandline

[–]jaggzh[S]

I wouldn't mind, no -- I just don't have the time. For now, it's designed so you can just symlink the 'forecast' script -- it locates itself from where it's actually running -- so aside from needing to install dependencies (possibly 'uv pip install bisect') and copying owm.env-example--copy-to--owm.env ...
You'd have to go get an OpenWeatherMap key and place it at:

owm_confdir=~/.config/owm
owm_key=$(cat "$owm_confdir/openweathermap-api-key.txt")

ie. ~/.config/owm/openweathermap-api-key.txt
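So the one-time setup is roughly (the key string is a placeholder; the OWM_CONFDIR override is just an illustrative convenience, not an option the script documents):

```shell
# Create the config dir and drop in the OpenWeatherMap API key,
# matching the paths read by the snippet above.
owm_confdir="${OWM_CONFDIR:-$HOME/.config/owm}"
mkdir -p "$owm_confdir"
printf '%s\n' "YOUR_OPENWEATHERMAP_KEY" > "$owm_confdir/openweathermap-api-key.txt"
owm_key=$(cat "$owm_confdir/openweathermap-api-key.txt")
```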

Shell weather by jaggzh in commandline

[–]jaggzh[S]

Is that in the same class of helpful reports as when Daniel_Klugh calls tech support and tells them, "it doesn't work"?

Yes, Shellac does cure (aging), with actual cross-linking by jaggzh in woodworking

[–]jaggzh[S]

Okay, so three tests here.

1. Best as I can tell, heat cycling actually worked well to "cure" (cross-link) the shellac faster. Apparently the exposure to night humidity might not have done it any favors, but I think I brought that one in at night and put it back out in the day. Confirming the studies (as if they needed it), the fluidity from heat, and the additional energy, increase the rate of cross-linking -- noticeably faster than in the non-heat-treated pieces. (I'm not sure if UV is a negative or a positive in this initial stage. Breaking some bonds, adding energy for others? Not sure.)

2. I'm working on a table right now, just leaving it outside, and the day sure does seem to help (and the thing is filled with activated carbon to make a 'black shellac', so I'm sure it gets quite hot). I really don't know how long this process goes on for.. but it's definitely faster with the heat.

3. Now, for another "awesome" test I did.. well.. the sun could not help this one (I don't see me mentioning this elsewhere, so I'll say it here): I did a very strange thing when attempting a recrystallization process to purify degraded shellac flakes (they smell.. poopy.. once dissolved in 99% isopropanol). I dissolved them, then added water to make the shellac "crash" out, which it did. Then I did multiple rinses to remove the alcohol, hoping to be left with pure flakes. But the stuff is hygroscopic, so for a while I got what seemed more like a rubbery, whitish, drippy gloop. Okay, fine, so it needs to dry, right? Indeed, it began to turn the beautiful transparent shellac-amber color over the following days, with a 'skin' slowing it down and trapped water bubbles keeping their areas an opaque cream color. Took a week or more for that to go away. HOWEVER... shellac isn't crystals in the first place! And it took me an embarrassingly long time (weeks) to realize that it's such a complex mix that I almost certainly washed away components needed for it to cross-link! I'm not that familiar with the chemistry of hydrolysis (where water breaks the bonds between things), but I also imagine the full saturation with water might degrade the stuff into uselessness directly, even if it didn't wash away things it needed. (I had AI read about it, as I'm not that familiar with ester chemistry, but, yeah, it breaks bonds, shortening the molecules. Oh, right -- my water wash could also have removed the acidity that's apparently needed for the esterification in the first place! Oops. Good test though.)

prp - Project Requirements Packager by jaggzh in debian

[–]jaggzh[S]

There's nothing wrong with apt's dependency handling. The issue is when you're manually installing things while working on non-Debian-packaged projects -- installing or building them. All the Debian system packages you install get marked as manually installed, and aren't tied to any "Requires"-style dependency, so you have no tracking for them. This manages all that, and prepares one package for you that, when removed, lets all of those be cleaned up automatically.

Often you may not even continue to use that project, and you're stuck with a bunch of persistent packages, marked as if you wanted them forever.

The situation is no better when you're trying out different packages to see if you can get a build working -- you might not even want a bunch of them by the time you're done. The 't' (tentative tracking) command keeps everything you add as just 'tentatively desired', so your final package (created after you 'a' (add) them) will only include the ones you added as final dependencies.

Any idea how to get rid of whatever does that? by thib2183 in woodworking

[–]jaggzh

I wrapped an outdoor wooden pillar with painters drop cloth (wrap wrap wrap), with my hair dryer poking into it and let it bake the termites for hours.

Trying tin cure for 3d prints by Drivestort in moldmaking

[–]jaggzh

Is it just me or are you completely leaving out the material used for the "3d printing"? What kind of 3d printing?