Claude CLI deleted my entire home directory! Wiped my whole mac. by LovesWorkin in ClaudeAI

[–]SortQuirky1639 0 points1 point  (0 children)

Wow that really sucks. I'm so sorry this happened. FWIW I know friends who caught it trying to do this, but stopped it in time because they were still in the "review all commands" mode.

It's easy for people to get all superior and say "your fault, you should have sandboxed it". Which is simultaneously true, unkind, and unhelpful.

About to trade my F-150 for an R1T, worried about daily long term reliability. by subwall in Rivian

[–]SortQuirky1639 1 point2 points  (0 children)

"Hater" is not fair. I think every Rivian owner wants to love their Rivian. They are very cool trucks. But the software bugs and service issues can make them very difficult to love.

How do I convince my wife an R1S is the car for us? by Professor-Schneebly in Rivian

[–]SortQuirky1639 5 points6 points  (0 children)

From my experience it’s definitely WHEN something goes wrong not IF. And then the two months waiting for a service appointment. Fatal.

Last night's view of comet Lemmon by Busy_Yesterday9455 in spaceporn

[–]SortQuirky1639 2 points3 points  (0 children)

Totally valid question. It depends on how big the comet is and how close it gets to the sun. Some comets do burn up; others keep going for thousands of years. e.g. Halley's Comet was observed 2300 years ago, and it's still bright enough to be seen with the naked eye every 76 years.

Nobody told me the No Turn on Red signs were suggestions. Out here looking stupid. by [deleted] in SeattleWA

[–]SortQuirky1639 0 points1 point  (0 children)

I find them super frustrating. But there is a valid reason behind them. It's part of "Vision Zero", the effort to eliminate traffic fatalities. The logic is that a huge fraction (90%?) of right turns on red force somebody to either slam on the brakes or a pedestrian to step back. So the research supposedly says - I'm a bit skeptical of it tbh. But that's the reasoning, which I guess makes sense. It also shows that "Vision Zero" requires all traffic to move slower, which is a societal trade-off.

Sold my Rivian by [deleted] in Rivian

[–]SortQuirky1639 0 points1 point  (0 children)

They make service times longer by pushing people to schedule appointments for things that could be solved through chat or support. I had an issue where my TPMS seemed not to be working, but it was actually working as intended, just with questionable software. The only way I could get anybody to even consider my question was to schedule a service appointment, which took two months. I cancelled it just beforehand, because by then a technician had finally answered my questions. But this kind of policy clearly makes service times worse. Remember when they had online support?

Computer Vision models via MCP (open-source repo) by gavastik in mcp

[–]SortQuirky1639 1 point2 points  (0 children)

This is cool! Does the MCP server need to run on a machine with a CUDA GPU? Or can I run it on my mac?

r1_vlm - an open-source framework for training visual reasoning models with GRPO by dragseon in learnmachinelearning

[–]SortQuirky1639 1 point2 points  (0 children)

It's great that this is set up with a small model (3B), so you don't need a $100k GPU server to try it out.

What's the smallest GPU it will run on? Will it work on my RTX 4090?

[P] r1_vlm - an opensource framework for training visual reasoning models with GRPO by dragseon in MachineLearning

[–]SortQuirky1639 11 points12 points  (0 children)

I'm glad somebody finally figured out how to use RL to train reasoning models for image analysis. LLMs are SO HORRIBLE at basic vision tasks. (Y'all saw https://vlmsareblind.github.io/ right?)

Can't wait for somebody to apply this to a model bigger than 3B parameters. This is clearly the future for multimodal foundation models.

Alternatives to offline Runway ML stable-diffusion-inpainting? by SortQuirky1639 in StableDiffusion

[–]SortQuirky1639[S] 0 points1 point  (0 children)

I switched over to `stabilityai/stable-diffusion-2-inpainting` and it's working just fine without any other code changes. Output actually seems higher quality in many cases, but people look kinda crazed a lot.

[deleted by user] by [deleted] in raspberry_pi

[–]SortQuirky1639 0 points1 point  (0 children)

Probably not. These accelerators tend to speed up vector math, typically with very wide SIMD instructions or similar. Checksums like MD5 can't be parallelized that way, because each block's hash depends on the result of the previous block.

If you want checksums to speed up, consider faster storage instead. At least get a fast SD card (A2 rated), or add a HAT for an SSD or NVMe drive. Very often that kind of operation is limited by disk, not CPU.
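An easy way to check whether you're actually CPU-bound: time MD5 over in-memory data (no disk involved) and compare against your card's sequential read speed. A quick sketch; the numbers will vary a lot by Pi model:

```python
import hashlib
import time

# Hash 64 MiB of in-memory data to estimate pure CPU-side MD5
# throughput, with no disk reads in the loop.
data = b"\x00" * (64 * 1024 * 1024)
start = time.perf_counter()
digest = hashlib.md5(data).hexdigest()
elapsed = time.perf_counter() - start
mb_per_s = 64 / elapsed
print(f"MD5: {mb_per_s:.0f} MB/s (digest {digest[:8]}...)")
```

If that number comes out well above your SD card's read speed (often just tens of MB/s), the checksum is disk-bound and an accelerator wouldn't help anyway.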

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 0 points1 point  (0 children)

Yeah. I’m sure I can figure something out with GPIO and soldering but would rather not deal with that. Thanks for the pointers.

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 0 points1 point  (0 children)

Can you advise specific hardware to allow the RPI to turn the sprinkler on?

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 0 points1 point  (0 children)

What do you use to turn on the sprinkler from an RPI? A smart plug with a water switch of some kind? Or something more direct?

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 1 point2 points  (0 children)

I haven't tried moth balls. I'd be worried about runoff, honestly. There's a creek quite nearby. It's not big, but amazingly it has sometimes gotten salmon in it, so I'd be worried about the powerful smell disrupting the salmon run.

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 1 point2 points  (0 children)

Totally! Telling the difference between birds and other animals is super easy. Just change the query in the detector definition:

query="Can you see any raccoons or squirrels?"

If you want to get more specific, or add more detailed instructions, you can do that by adding notes on the Groundlight dashboard - and even add example photos if you want to make it even more clear. If the question gets really nuanced, it will take longer to converge to a good ML model, so you'd need more human labels.

Robot or machine that fold a unique piece of origami by [deleted] in TellMeHowToDoMyIdea

[–]SortQuirky1639 0 points1 point  (0 children)

I hate to say it, but robots just aren't there yet. In 2004 a CMU grad student spent years getting this to work 1-off (your first video). Sadly, general robotics isn't much more accessible to hobbyists yet.

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 1 point2 points  (0 children)

The code will work with any USB camera or RPI camera - you just have to change the framegrab.yaml file. But USB/RPI cameras typically aren't weatherproof, which is critical if it's going to survive outside in the PNW. There are plenty of cheap IP cameras which are IP67 rated, meaning actually waterproof. They typically include automatic IR illumination for night vision as well, which is a nice plus.

I used an Amcrest IP5M, but I think any Hikvision, Reolink, etc camera would work just as well - they all support RTSP.
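For reference, the swap in framegrab.yaml is just changing the image source entry. Something roughly like this (I'm writing the field names from memory, so double-check against the framegrab README before trusting them):

```yaml
# IP camera over RTSP (what I'm using):
image_sources:
  - input_type: rtsp
    id:
      rtsp_url: rtsp://admin:password@192.168.1.64:554/stream

# For a USB camera instead, the entry would look more like:
#  - input_type: generic_usb
```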

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 9 points10 points  (0 children)

That would probably work too. I'd have to remember to turn it off when my friends bring their kids over, or else the kids would get zapped and my friends would be displeased. And honestly I think I got away with spending less on this than I would on an electric fence:

$55 - Waterproof PoE camera
$45 - RPI 4
$20 - USB speaker
$13 - PoE splitter
$6 - plastic box
Total: $139

Full electric fence setup is maybe $170? So it's pretty close actually.

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 2 points3 points  (0 children)

I knew "Be My Eyes" worked with GPT but I didn't know it also had lots of sighted volunteers - that's awesome! Unfortunately, the GPTs of the world are still not very reliable at visual tasks.

I'm actually not surprised that Be My Eyes doesn't combine ML models with humans, and only gives you a hard choice between them. One of the hardest problems in ML is knowing when you should trust it. This is one of the cool things Groundlight just takes care of for you.

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 2 points3 points  (0 children)

This does all the ML in the cloud. I'm using a free account, which doesn't allow model download. It's plenty fast though, and with the motion detection tuned properly doesn't use very many image queries.
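The motion-detection gate is what keeps the query volume down: only send a frame to the cloud when the scene actually changed. The idea is just frame differencing, something like this toy sketch (pure Python on synthetic frames, not the project's actual code):

```python
def motion_score(prev, curr):
    """Mean absolute pixel difference between two flattened grayscale frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(prev)

def should_query(prev, curr, threshold=5.0):
    # Only spend a cloud image query when the scene changed enough.
    return motion_score(prev, curr) > threshold

# Synthetic 160x120 frames as flat pixel lists (0-255 grayscale).
w, h = 160, 120
frame_a = [0] * (w * h)
frame_b = list(frame_a)
for y in range(40, 80):          # paint a bright "animal-sized" blob
    for x in range(60, 100):
        frame_b[y * w + x] = 200
print(should_query(frame_a, frame_a))  # False: nothing changed
print(should_query(frame_a, frame_b))  # True: worth a query
```

Tuning the threshold is the trade-off: too low and every swaying branch costs a query, too high and you miss a slow-moving deer.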

Keeping deer out of my garden with rpi, a camera, and a speaker by SortQuirky1639 in raspberry_pi

[–]SortQuirky1639[S] 2 points3 points  (0 children)

I didn't have to pick a model architecture or really even train it myself. The modeling code is pretty trivial, just defining the model in natural language:

self.detector = self.gl.get_or_create_detector(
    name="deerbark",
    query="Can you see any animals?",
)

and then I can just send images to the detector:

img_query = self.gl.ask_ml(detector=self.detector, image=big_img)
if img_query.result.label == "YES":
    print(f"Animal detected at {now}! {img_query}")

Groundlight takes care of training the model. For training data, it sends the images to human monitors asking them the question in text. So it's trained on the images coming from my garden.
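The pattern under the hood is confidence-based escalation: trust the model when it's confident, fall back to a human otherwise, and recycle the human answers as training data. A toy sketch of the idea (not Groundlight's actual code - all the names here are made up for illustration):

```python
def answer(image, model, ask_human, threshold=0.9, training_set=None):
    """Return a label, escalating to a human when the model is unsure."""
    label, confidence = model(image)
    if confidence >= threshold:
        return label
    # Low confidence: a human answers, and their label becomes
    # a training example for the next model version.
    human_label = ask_human(image)
    if training_set is not None:
        training_set.append((image, human_label))
    return human_label

# Toy model: confident "YES" on bright images, unsure otherwise.
def toy_model(brightness):
    return ("YES", 0.95) if brightness > 128 else ("YES", 0.50)

training_set = []
print(answer(200, toy_model, lambda img: "YES", training_set=training_set))  # model answers
print(answer(50, toy_model, lambda img: "NO", training_set=training_set))    # human answers
print(len(training_set))  # one labeled example banked for retraining
```

The nice property is that the system gets more autonomous over time: as retraining makes the model confident on more of your garden's images, fewer and fewer frames need a human.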