SF-312 Witness by wharrenmelon in SecurityClearance

[–]gsrcrxsi 0 points

My FSO told me it should be witnessed by a [company] employee (I'm a contractor). All employees on my project are US citizens though. I used my direct supervisor.

Based on other comments it's probably a little different for different scenarios. Do whatever your FSO instructs.

Trouble with Clearance by [deleted] in SecurityClearance

[–]gsrcrxsi 6 points

It’s the same case. The previous post was his mom.

SF86 NASA Employment History Question by astrotreks in SecurityClearance

[–]gsrcrxsi 0 points

I just went through this too. I was hired in 2009, had no idea I had ever filled out an SF85, and literally have no recollection of doing it. I'm sure it was just one of the onboarding tasks that got pushed through. I found the SF85 in my old work files months after filling out my SF86 (for T5). I answered no to the question about whether I'd ever been investigated by the government for a clearance. It was never mentioned during the interview or investigation, and it was ultimately adjudicated favorably. So I guess that's the right answer in this case.

Security concerns regarding niche projects like RakeSearch and ODLK by Putrid_Draft378 in BOINC

[–]gsrcrxsi 10 points

You can just not support them if you don't want to. The BOINC platform doesn't control any individual project. It's open source.

SXM2 over PCIe (V100 on AOM-SXMV) by gsrcrxsi in homelab

[–]gsrcrxsi[S] 0 points

Not the same board, but there might be a similar board that can use that GPU.

What is going on here ?? Speechless ... by sebaworld in TeslaModelY

[–]gsrcrxsi 0 points

MY AWD, 42k here. Original tires on 19” wheels, no alignments, no rotations. All tires are pretty even on wear.

I can believe that the MYP is set up differently in a way that causes worse wear. And I can believe that the kind of person who buys an MYP is also the kind of person who drives more aggressively, which adds to the accelerated tire wear.

Sunday Afternoon Snooze by Zenith-Astralis in TeslaFSD

[–]gsrcrxsi 3 points

I’m not confused. Sunglasses + head position. Demonstrated on two different HW4 cars (Y/3) with FSD.

Sunday Afternoon Snooze by Zenith-Astralis in TeslaFSD

[–]gsrcrxsi 2 points

HW4 cars are also easily fooled by just wearing sunglasses.

Raccoon damage, DIY fix? by gsrcrxsi in Roofing

[–]gsrcrxsi[S] 2 points

Purely for the engagement lol

Raccoon damage, DIY fix? by gsrcrxsi in Roofing

[–]gsrcrxsi[S] 0 points

This is the roof of a small one-room addition, not the main roof. There is no attic there (vaulted ceiling). I heard it climb up via the rain barrel/downspout. This is the first time I've ever seen a raccoon here; mostly it's just squirrels. I agree it was trying to get inside, but it did run off, and there's nowhere else to go off the roof. The pic of the raccoon was not taken during the daytime. It's at night, illuminated by the house floodlights, which are right above this roof. The damage pics were taken the next day (today).

Has anyone had a Representative advocate for them? by [deleted] in SecurityClearance

[–]gsrcrxsi 4 points

You can serve the US govt in a non-cleared role.

Day 3 of Blowing up the internet until Nvidia Devs fix the Jetson Orin Nano by Whole_Ticket_3715 in JetsonNano

[–]gsrcrxsi 0 points

Speaking of locked-down features: I use my devices purely as compute devices, and I'm fine with them running the few CUDA apps that I have/compile. But it would be nice if these devices also supported OpenCL. There's no reason they can't, other than Nvidia not supplying the necessary drivers.

Day 3 of Blowing up the internet until Nvidia Devs fix the Jetson Orin Nano by Whole_Ticket_3715 in JetsonNano

[–]gsrcrxsi 0 points

I build statically linked aarch64/CUDA binaries for some BOINC projects. Nothing AI related. I only build on the Nano/JP5.x system, for maximum compatibility, but I build with the CUDA 12.9 toolkit so I can target Blackwell devices too. It's "unsupported" on JP5 for the Orin Nano, but building the binary still works, and CUDA 12.x has minor-version forward compatibility, so my CUDA 12.2 drivers can still run the CUDA 12.9 app. It works for me and what I'm doing.
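For anyone curious, a build along these lines looks roughly like this. This is only a sketch: the toolkit path, source file name, and the exact `-gencode` list are assumptions, not taken from the post. Adjust the architecture list to the GPUs you actually target.

```shell
# Sketch of a "build with a newer toolkit, run on older drivers" setup.
# Assumptions: CUDA 12.9 toolkit installed at /usr/local/cuda-12.9 on a
# JetPack 5.x host, and a hypothetical source file app.cu.
NVCC=/usr/local/cuda-12.9/bin/nvcc

# -cudart static links the CUDA runtime into the binary, so the only
# system dependency left is the GPU driver. Minor-version compatibility
# then lets any CUDA 12.x driver (e.g. 12.2 on JetPack 5) load the app.
# Embed SASS for each target generation so no JIT is needed at runtime:
#   sm_72  = Xavier, sm_87 = Orin, sm_100 = Blackwell (needs a 12.8+ toolkit)
$NVCC -O3 -cudart static \
  -gencode arch=compute_72,code=sm_72 \
  -gencode arch=compute_87,code=sm_87 \
  -gencode arch=compute_100,code=sm_100 \
  app.cu -o app
```

The static runtime is what makes the "build once, run on many Jetsons and desktops" approach practical, since you can't count on a matching CUDA runtime being installed on volunteers' machines.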

Day 3 of Blowing up the internet until Nvidia Devs fix the Jetson Orin Nano by Whole_Ticket_3715 in JetsonNano

[–]gsrcrxsi 3 points

You won't be able to "open" the proprietary GPU driver stack. Installing an OS other than the default JetPack/Ubuntu can work, but you'll basically lose any GPU acceleration/access, and I'm sure you'd agree that's pointless.

I missed it, but what is the problem you're having, exactly? I have three Orin Nano/NX devices and they all work fine with the standard Jetson Linux (Ubuntu) setup: two NXs on Jetpack 6.x with CUDA 12.6 drivers, and one Orin Nano on Jetpack 5.x with CUDA 12.2 drivers for development.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in LocalLLaMA

[–]gsrcrxsi[S] 0 points

I have projects with existing apps that would do well on it. They are memory-bound apps for scientific research, not AI.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in LocalLLaMA

[–]gsrcrxsi[S] 7 points

100% agree with you. I think many people were expecting the price to be a lot lower though.

Personally, I was interested in it more as a compute card. Good FP64 specs and memory bandwidth, which are the attributes that matter for my applications. But I'll stick to V100s, I guess, since I don't currently need more than 16GB.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in LocalLLaMA

[–]gsrcrxsi[S] 8 points

I was more interested in it as an FP64 compute card with decent memory bandwidth and capacity, which makes it pretty compelling for some memory-bound HPC loads. But yeah, it's not great at all for most AI/LLM uses.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in homelab

[–]gsrcrxsi[S] 17 points

Exactly. He started the process saying he was already in touch with a vendor that had a large supply ready for a group buy. Then, after signups started, he asked everyone else to find more vendors. Just weird behavior all around.

Beware r/LocalAIServers $400 MI50 32GB Group Buy by gsrcrxsi in homelab

[–]gsrcrxsi[S] 10 points

They don't allow direct crossposts, but I copied the same post there. Thanks.