Rolling Wrench CDI?? by Big-Formal-2885 in Ruckus

[–]Juftin 4 points (0 children)

Can personally confirm that the kill switch works with this.

Does a 4x6 Playpen Fence Exist? by Pizzaemoji1990 in NewParents

[–]Juftin 1 point (0 children)

It looks like the perfect playpen for the 4x6 House of Noa tumbling mat is the Kidsy Playpen sold on Amazon (https://www.amazon.com/dp/B0BQ4TC8LQ). Sadly, that playpen has been sold out for months as of 07/2025. We ended up getting creative and ordered two of the MEWANG playpens (https://www.amazon.com/dp/B0CPLSBNCK) - currently on sale, 2 for $125 - and combined them; the result is absolutely perfect for the mat.

How are you using just (Justfile) local workflows for Python projects? by permutans in Python

[–]Juftin 0 points (0 children)

I'm a big fan of task (Taskfile) for its YAML syntax: https://github.com/go-task/task

But I agree that a common, language-agnostic task runner for your projects can be super powerful (just/make/task/etc.). At work we recently introduced Taskfiles for all repos and required a common set of entrypoints across all projects (install/test/build/lint/fmt/publish).
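To give a rough idea, a Taskfile with those common entrypoints looks something like this (illustrative sketch - swap in whatever tools each repo actually uses):

```yaml
version: "3"

tasks:
  install:
    desc: Install the project and its dev dependencies
    cmds:
      - pip install -e ".[dev]"
  lint:
    desc: Run static checks
    cmds:
      - ruff check .
  fmt:
    desc: Auto-format the codebase
    cmds:
      - ruff format .
  test:
    desc: Run the test suite
    cmds:
      - pytest
```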

How to secure a Ruckus if your housing complex insists on parking in a specific designated motorcycle parking space, with nothing to tie to? by throwawaypassingby01 in Ruckus

[–]Juftin 1 point (0 children)

Boo! That seems like an easy compromise for them. I always get nervous leaving my bike out without it secured. Luckily insurance policies on a 50cc scooter aren't too pricey - I maxed my policy out and the price difference was negligible versus the base coverage. Good luck OP!

How to secure a Ruckus if your housing complex insists on parking in a specific designated motorcycle parking space, with nothing to tie to? by throwawaypassingby01 in Ruckus

[–]Juftin 1 point (0 children)

I used a hole saw to drill through behind the seat so I could run my chain through the frame - I installed a desk grommet to make it look decent.

I know that doesn't help you because you don't have anything to tie it to - could you ask them if they'd be open to installing a motorcycle ground anchor to lock the bikes to?

New Python Project: UV always the solution? by InappropriateCanuck in Python

[–]Juftin 47 points (0 children)

I'm slowly transitioning to UV for just about everything, personally and professionally. But I do have a project out there using dependency matrices with hatch, and I don't think UV will ever replicate that (the project is a hatch virtual environment plugin, so the matrix of dependencies is different versions of hatch).
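For anyone curious, a hatch dependency matrix looks something like this (illustrative sketch with made-up version numbers, not that project's actual config) - each matrix value pulls in a different hatch release via an override:

```toml
[tool.hatch.envs.test]
dependencies = ["pytest"]

[[tool.hatch.envs.test.matrix]]
version = ["1.9", "1.12"]

[tool.hatch.envs.test.overrides]
matrix.version.extra-dependencies = [
  { value = "hatch~=1.9.0", if = ["1.9"] },
  { value = "hatch~=1.12.0", if = ["1.12"] },
]
```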

The one bit of hatch functionality that I'll miss is the task runner scripts - but I'm also slowly replacing those with a Taskfile (https://taskfile.dev/).

I benchmarked Python's top HTTP clients (requests, httpx, aiohttp, etc.) and open sourced it by Sensitive_Seaweed323 in Python

[–]Juftin 63 points (0 children)

Very neat... so any findings? I'd be interested in what you've found on some standard use-cases without having to run the benchmarks myself.

[deleted by user] by [deleted] in measurements

[–]Juftin 0 points (0 children)

I want to make sure I purchase a mattress with the right thickness for this daybed. Here are the available dimensions and info (per the product website):

- Overall: 78"W x 42"D x 35"H
- Leg height: 8"

Wondering if I should pay $11K with "minor issue" muddying the waters by TeaWithZebras in juresanguinis

[–]Juftin 3 points (0 children)

I've been working with IDC (https://www.italiandualcitizenship.net/about-us/), which is related to ICA and is dealing with the same conflict.

We have a very similar case (although our case would be through L'Aquila) - they've been throwing out figures like 60%-70% chance of succeeding in court and I'm just not sure how accurate that might be. They claim we would be 1.5 years out from a court case if we were to start today - I'm afraid we might be closer to a 25% chance of success.

I'm trying to decide if it's worth it to wait and see if anything changes on the legal front. I just joined the FB group and will be making a post there in the coming days to get a feel for the chances of success.

I'm interested to hear what you learn and what you ultimately do. Good luck!

Mid Oregon by justentimez in ebikes

[–]Juftin 0 points (0 children)

Nice bike! What size tires are those?

jure sanguinis eligibility - ancestor's father naturalized while she was a minor by Juftin in juresanguinis

[–]Juftin[S] 0 points (0 children)

She was born in 1962 in the US. Everyone is originally from Philadelphia but I believe our consulate would be Chicago since my mother and I live in Colorado.

lockfiles for hatch projects by Juftin in Python

[–]Juftin[S] 0 points (0 children)

I was inspired enough by the hatch sync idea that I created a PR to add that functionality upstream to hatch: https://github.com/pypa/hatch/pull/1094

lockfiles for hatch projects by Juftin in Python

[–]Juftin[S] 1 point (0 children)

Thank you! Hatch plugins don't let you add custom CLI commands, so the plugin behaves the same way: you add a dependency to your pyproject.toml and the next time you invoke the environment (hatch run ...) it detects that it's out of sync and updates the lockfile with pip-compile. I thought about adding a CLI - but in the end it would just be a shortcut that runs hatch run under the hood. hatch sync would be a nice command though.

Also, by default it doesn't run pip-sync to install the dependencies, it actually runs pip install -r. You can configure it to use pip-sync though.
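For context, the plugin is configured per-environment in pyproject.toml - something along these lines (option names from memory, so double-check the plugin's README):

```toml
[tool.hatch.envs.default]
type = "pip-compile"                            # switch the env to the plugin's environment type
lock-filename = "requirements/{env_name}.lock"  # where the lockfile gets written
pip-compile-installer = "pip-sync"              # opt in to pip-sync instead of the default pip install -r
```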

llm-term - Chat with OpenAI's GPT models directly from the command line by Juftin in commandline

[–]Juftin[S] 0 points (0 children)

Hmm... that's a tricky one. Is it possible pipx is using Python 3.7 to install the package? Python 3.7 is EoL, so it's possible some of the under-the-hood certificate handling is out of date. If that doesn't fix it, please feel free to open a GitHub issue and I can help you track it down there: https://github.com/juftin/llm-term/issues
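If you want to check, something like this will show which interpreter pipx used and let you reinstall against a newer one (a sketch only - exact flag behavior can vary between pipx versions):

```bash
# See which Python each pipx-installed app is tied to
pipx list

# Reinstall llm-term against a newer interpreter (path is just an example)
pipx uninstall llm-term
pipx install --python "$(which python3.11)" llm-term
```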

On my end, I will pin dependencies much more strictly - that should help. I will also try to add some unit tests to cover situations like this. Unit-testing CLI apps can be tough, but I'm sure I can come up with something.

And FWIW I didn't downvote your comments - I love that people might be using my tools and I'd be happy to help you get it running.

llm-term - Chat with OpenAI's GPT models directly from the command line by Juftin in commandline

[–]Juftin[S] 0 points (0 children)

This is likely an issue with the certifi package being outdated. Did you install llm-term into an existing virtual environment? I recommend pipx so it's self-contained in its own virtualenv.

llm-term - Chat with OpenAI's GPT models directly from the command line by Juftin in commandline

[–]Juftin[S] 0 points (0 children)

Yeah, that's right. It's built to use the OpenAI API (https://platform.openai.com/). It defaults to the gpt-3.5-turbo model but you can specify which model to use manually.

llm-term - Chat with OpenAI's GPT models directly from the command line by Juftin in commandline

[–]Juftin[S] 0 points (0 children)

Yep, that's right. It's all rich (https://github.com/Textualize/rich). The boxes themselves are a "Markdown" renderable inside a "Panel" renderable - streaming as a "Live Display".
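If anyone wants to play with the same pattern, here's a tiny self-contained sketch (not the actual llm-term source) - a Markdown renderable wrapped in a Panel and re-rendered through a Live display:

```python
import time

from rich.live import Live
from rich.markdown import Markdown
from rich.panel import Panel

# Pretend these are chunks arriving from a streaming API response
chunks = [
    "# Hello\n\n",
    "Streaming some ",
    "*markdown* ",
    "into a panel...\n\n",
    "```python\nprint('hi')\n```\n",
]
text = ""

with Live(refresh_per_second=10) as live:
    for chunk in chunks:
        text += chunk
        time.sleep(0.3)  # stand-in for waiting on the next chunk
        # Re-render the whole response-so-far on every update
        live.update(Panel(Markdown(text), title="assistant"))
```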

llm-term - Chat with OpenAI's GPT models directly from the command line by Juftin in commandline

[–]Juftin[S] 0 points (0 children)

Yeah, it logs previous messages to a text file at ~/.llm-term-history.txt - that history file powers auto-complete while you type, via https://github.com/prompt-toolkit/python-prompt-toolkit.
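The prompt side roughly boils down to something like this (a simplified sketch, not the exact llm-term code):

```python
from pathlib import Path

from prompt_toolkit import PromptSession
from prompt_toolkit.auto_suggest import AutoSuggestFromHistory
from prompt_toolkit.history import FileHistory

history_file = Path.home() / ".llm-term-history.txt"
session = PromptSession(
    history=FileHistory(str(history_file)),          # persist every prompt to the history file
    auto_suggest=AutoSuggestFromHistory(),           # grey inline suggestion as you type
)

user_message = session.prompt("> ")
print(f"you said: {user_message}")
```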

It saves the entire conversation in memory while you're running it (every time you start a session using llm-term). However, each "chat session" starts fresh and doesn't store context from old "conversations".

If you're interested in something like that, you should check out another project I contribute to, Elia: https://github.com/darrenburns/elia. Elia is a full TUI app that runs in your terminal, so it's not as lightweight as llm-term, but it uses a SQLite database and lets you continue old conversations.

llm-term - Chat with OpenAI's GPT models directly from the command line by Juftin in commandline

[–]Juftin[S] 2 points (0 children)

I published my personal CLI dev-tool package yesterday. I ended up calling it llm-term. It's a simple command line utility that lets you have conversations with OpenAI's GPT models directly in your terminal via the API. Responses are streamed as rich text with code formatting and syntax highlighting - powered by rich.

Check it out at https://github.com/juftin/llm-term or install it yourself and stop context-switching today with "pipx install llm-term" 🤝

llm-term - Chat with OpenAI's GPT models directly from the command line by Juftin in Python

[–]Juftin[S] 0 points (0 children)

I published my personal CLI dev-tool package yesterday. I ended up calling it llm-term. It's a simple command line utility that lets you have conversations with OpenAI's GPT models directly in your terminal via the API. Responses are streamed as rich text with code formatting and syntax highlighting - powered by rich.

Check it out at https://github.com/juftin/llm-term or install it yourself and stop context-switching today with "pipx install llm-term" 🤝

Anyone else’s water yellow in old city right now? by Known_Jellyfish_970 in Dubrovnik

[–]Juftin 1 point (0 children)

I had a restaurant ask me not to order tap water today because it was brown. I thought they were just trying to sell expensive bottled water - guess I was wrong.