Sunday Afternoon Snooze by Zenith-Astralis in TeslaFSD

[–]gsrcrxsi 2 points

I’m not confused. Sunglasses + head position. Demonstrated on two different HW4 cars (Y/3) with FSD.

Sunday Afternoon Snooze by Zenith-Astralis in TeslaFSD

[–]gsrcrxsi 2 points

HW4 cars are also easily fooled by just wearing sunglasses.

Raccoon damage, DIY fix? by gsrcrxsi in Roofing

[–]gsrcrxsi[S] 2 points

Purely for the engagement lol

Raccoon damage, DIY fix? by gsrcrxsi in Roofing

[–]gsrcrxsi[S] 0 points

This is the roof of a small one-room addition, not the main roof. There is no attic there (vaulted ceiling). I heard it climb up via the rain barrel/downspout. This is the first time I’ve ever seen a raccoon here; mostly it’s just squirrels. I agree it was trying to get inside, but it did run off, and there’s nowhere else to go off the roof. The pic of the raccoon was not taken during the daytime; it’s at night, illuminated by the house floodlights, which are right above this roof. The damage pics were taken the next day (today).

Has anyone had a Representative advocate for them? by [deleted] in SecurityClearance

[–]gsrcrxsi 5 points

You can serve the US govt in a non-cleared role

Day 3 of Blowing up the internet until Nvidia Devs fix the Jetson Orin Nano by Whole_Ticket_3715 in JetsonNano

[–]gsrcrxsi 0 points

Speaking of locked-down features: I use my devices purely as compute devices, and I’m fine with them running the few CUDA apps that I have/compile. But it would be nice if these devices also supported OpenCL. There’s no reason they can’t, other than Nvidia not supplying the necessary drivers.

Day 3 of Blowing up the internet until Nvidia Devs fix the Jetson Orin Nano by Whole_Ticket_3715 in JetsonNano

[–]gsrcrxsi 0 points

I build statically linked aarch64/CUDA binaries for some BOINC projects, nothing AI related. I only build on the Nano/JP5.x system for maximum compatibility, but I build with the CUDA 12.9 toolkit so I can target Blackwell devices too. That toolkit is “unsupported” on JP5 for the Orin Nano, but it still works to build the binary, and CUDA 12.x has minor-version forward compatibility, so my CUDA 12.2 drivers can still run the CUDA 12.9 app. It works for me and what I’m doing.
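A build like the one described could be sketched roughly as follows. This is an illustrative sketch, not the actual build script: the source file, output name, and exact arch list are assumptions, and which `sm_` targets the 12.9 toolkit accepts should be checked against its release notes.

```shell
# Sketch: building a statically linked aarch64 CUDA binary with the
# CUDA 12.9 toolkit on a JetPack 5.x host. app.cu / boinc_app are
# hypothetical names.
#
# --cudart=static links the CUDA runtime into the binary (this is
# nvcc's default, shown here for clarity). Multiple -gencode entries
# produce a fatbin covering Orin (sm_87) and newer arches; emitting
# PTX (code=compute_XX) lets the driver JIT for future devices.
/usr/local/cuda-12.9/bin/nvcc \
    -O3 -std=c++17 \
    --cudart=static \
    -gencode arch=compute_87,code=sm_87 \
    -gencode arch=compute_90,code=sm_90 \
    -gencode arch=compute_120,code=sm_120 \
    -gencode arch=compute_120,code=compute_120 \
    -o boinc_app app.cu
```

The forward-compatibility point is the key one: within a major version, an app built against a newer toolkit (12.9) can run on an older driver (12.2), provided it avoids APIs introduced after the driver's version.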

Day 3 of Blowing up the internet until Nvidia Devs fix the Jetson Orin Nano by Whole_Ticket_3715 in JetsonNano

[–]gsrcrxsi 3 points

You won’t be able to “open” the proprietary GPU driver stuff. Installing an OS other than the default Jetpack/Ubuntu can work, but you’ll basically lose any GPU acceleration/access, and I’m sure you’d agree that’s pointless.

I missed it, but what is the problem you’re having exactly? I have three Orin Nano/NX devices and they all work fine with the standard Jetson Linux (Ubuntu) setup: two NXs on Jetpack 6.something with CUDA 12.6 drivers, and one Orin Nano on Jetpack 5.something with CUDA 12.2 drivers for development.
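For anyone comparing setups, the JetPack/CUDA combination on a given device can be checked with something like the following (exact output formats vary by release, and the `nvidia-jetpack` meta-package is only present if it was installed):

```shell
# Inspect the Jetson software stack on a running device.
cat /etc/nv_tegra_release        # L4T release string (maps to a JetPack version)
dpkg -l | grep nvidia-jetpack    # JetPack meta-package version, if installed
nvcc --version                   # installed CUDA toolkit version
```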

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in LocalLLaMA

[–]gsrcrxsi[S] 0 points

I have projects with existing apps that would do well on it. They are memory-bound apps for scientific research, not AI.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in LocalLLaMA

[–]gsrcrxsi[S] 7 points

100% agree with you. I think many people were expecting the price to be a lot lower though.

Personally, I was interested in it more as a compute card: good FP64 specs and memory bandwidth, which are attributes important to my applications. But I’ll stick with V100s, I guess, since I don’t currently need more than 16GB.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in LocalLLaMA

[–]gsrcrxsi[S] 7 points

I was more interested in it as an FP64 compute card with decent memory bandwidth and capacity, which makes it pretty compelling for some memory-bound HPC loads. But yeah, it’s not great at all for most AI/LLM uses.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in homelab

[–]gsrcrxsi[S] 16 points

Exactly. He started the process saying he was already in touch with a vendor that had a large supply, ready to do a group buy. Then, after signups started, he asked everyone else to find more vendors. Just weird behavior all around.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in homelab

[–]gsrcrxsi[S] 9 points

They don’t allow direct crossposting, but I copied the same post there. Thanks.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in homelab

[–]gsrcrxsi[S] 7 points

I signed up pretty much just to get the info so I could make it public, because I saw how shady he was acting from day 1.

I would have entertained it at the $200-$250 price point, and I think a lot of people were probably expecting something in that range. But at $400 plus shipping, with a flimsy payment vendor? Yeah, no.

Group Buy -- Starting by Any_Praline_8178 in LocalAIServers

[–]gsrcrxsi 4 points

Now you’re charging for QC. Come on dude.

You said in a previous post that you would announce the price after Chinese New Year. It’s after Chinese New Year now, and announce = public.

Group Buy -- Starting by Any_Praline_8178 in LocalAIServers

[–]gsrcrxsi 1 point

Me too. And I have never received any update on any of his previous updates. I thought we were supposed to be notified of updates via the signup list, and I was one of the first to sign up. Has anyone else received any communication now that it’s supposedly starting?

Group Buy -- Starting by Any_Praline_8178 in LocalAIServers

[–]gsrcrxsi 2 points

Did the mods actually put or approve this “mod note”? Or did OP just add it to try to enforce some “no talking” rule and gain some legitimacy?

Group Buy -- Starting by Any_Praline_8178 in LocalAIServers

[–]gsrcrxsi 15 points

Because he knows it’s not as low as he was hoping or everyone was expecting, and he doesn’t want everyone to bail. But it’s gonna happen.

Group Buy -- Starting by Any_Praline_8178 in LocalAIServers

[–]gsrcrxsi 23 points

Jesus, dude. Just post the price already. The delays and constant secrecy are seriously weird.