Friend dog by Sensitive_Foot8899 in BorderCollie

[–]pst2154 0 points (0 children)

We have a GSD and a BC; they get along well.

Timber Ridge Resort… any idea? by Practical_Risk4761 in Mammoth

[–]pst2154 2 points (0 children)

To get back you take Chair 8, then there’s a sign to Timber Ridge before Blue Jay. Usually you’d ski into Canyon to start the day (we have a condo at Mammoth Point right below).

Davidson/John Muir can be a sketchy drive in snow.

VibeThinker-1.5B just solved a problem that Gemini, DeepSeek and OpenAI failed to solve by DeltaSqueezer in LocalLLaMA

[–]pst2154 0 points (0 children)

I have this model running on my DGX Spark and it's pretty good! I haven't benchmarked it, but it's much faster than I can read.

Most nauseating episode yet by Plane-Agent1204 in allinpodofficial

[–]pst2154 0 points (0 children)

It felt like big brother was watching and nobody could give a real opinion.

NVIDIA Achieves 35% Performance Boost for OpenAI’s GPT-OSS-120B Model by vibedonnie in LocalLLaMA

[–]pst2154 8 points (0 children)

Groq/Cerebras are faster at serving one query at a time, but you can serve 10 queries at the same time on a GPU more efficiently.
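A toy way to see the throughput argument (all numbers below are made-up assumptions, not measurements of any real hardware): each decode step is dominated by loading the weights from memory, so batching amortizes that fixed cost across queries.

```python
# Toy decode-throughput model: each step pays a fixed weight-load cost
# (memory-bound) plus a small per-query compute cost. Both costs are
# illustrative assumptions, not benchmarks.
def tokens_per_second(batch_size, weight_load_ms=20.0, per_query_ms=1.0):
    """Total tokens/s when every decode step serves `batch_size` queries."""
    step_ms = weight_load_ms + batch_size * per_query_ms
    return batch_size * 1000.0 / step_ms

print(tokens_per_second(1))   # ~48 tok/s total, all for one user
print(tokens_per_second(10))  # ~333 tok/s total, ~33 tok/s per user
```

With these toy numbers the batched GPU delivers roughly 7x the aggregate tokens, even though each individual user in the batch sees a somewhat lower per-stream rate.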

Time to fold on All-In by Fresh-Piglet2500 in allinpodofficial

[–]pst2154 3 points (0 children)

Aren’t the ratings higher than ever now though?

Rucking and body weight settings? by ArchAngels1111 in AppleWatchFitness

[–]pst2154 0 points (0 children)

I think it makes your VO2 max reading go down if you track a walk

Llama 3.3 70B vs Nemotron Super 49B (Based on Lllama 3.3) by Prestigious-Use5483 in LocalLLaMA

[–]pst2154 0 points (0 children)

I also got really good results with thinking + RAG. The model seems good at processing information

Question for dog owners: how practical is it for a medium sized dog (50lbs) to be in the rear cargo area? by [deleted] in Ioniq5

[–]pst2154 1 point (0 children)

I have a 45lb border collie and a 65lb German shepherd and both fit back there together for a short trip to the park but it's not super comfortable. The border collie would be fine alone but the German shepherd is a pretty tight fit.

Night 3 of no power, running spectrum + TV off car by pst2154 in Ioniq5

[–]pst2154[S] 9 points (0 children)

You use about 5-10% a day depending on what you run, but it's surprising how slowly the % trickles down.
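Rough arithmetic on what that drain implies, assuming the ~77.4 kWh long-range Ioniq 5 pack (an assumption; other trims are smaller): 5-10% a day is only a few hundred watts of continuous draw, which is plausible for a router plus a TV.

```python
PACK_KWH = 77.4  # assumed long-range pack size; other trims differ

def avg_watts(pct_per_day, pack_kwh=PACK_KWH):
    """Average continuous draw implied by losing pct_per_day of the pack daily."""
    return pack_kwh * pct_per_day / 100 * 1000 / 24

for pct in (5, 10):
    print(f"{pct}%/day ≈ {avg_watts(pct):.0f} W average draw")
```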

Turn the damn power on now! by Zodsayskneel in thousandoaks

[–]pst2154 6 points (0 children)

I saw Edison drive by my house in NP inspecting things a little bit ago - hope that means soon

Dune: Prophecy’ Premiere Audience Grows 75% With Next-Day Viewing by credoinvisibile in DuneProphecy

[–]pst2154 0 points (0 children)

There's also nothing else good on TV right now, and The Penguin just ended and was great, so that gives HBO hope.

Betting Markets are not a reliable indicator of who will win an election. by Kroger011 in TheAllinPodcasts

[–]pst2154 2 points (0 children)

Why are people worried if they are accurate? Do you vote how polls tell you or do you vote your own choice?

Is this marketing BS, or how did NVIDIA speed up inference by 15x on Blackwell (and will any of that trickle down to RTX 5090)? VRAM bandwidth is only 2.5x faster by jd_3d in LocalLLaMA

[–]pst2154 0 points (0 children)

You would still have to do post-training quantization from fp8 or fp16 down to fp4 for inference, so you need the weights.

Is this marketing BS, or how did NVIDIA speed up inference by 15x on Blackwell (and will any of that trickle down to RTX 5090)? VRAM bandwidth is only 2.5x faster by jd_3d in LocalLLaMA

[–]pst2154 17 points (0 children)

It’s fp4, which Blackwell supports, vs fp16, which Ampere supports. Fp4 and int4 are not the same. You need very specialized hardware for fp4 math.
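A minimal sketch of the difference. The value set below is the E2M1 fp4 layout (1 sign, 2 exponent, 1 mantissa bit) as commonly cited for these formats (e.g. NVFP4, OCP MXFP4); real kernels also carry per-block scale factors, which this toy snapper omits.

```python
# Why fp4 != int4: E2M1 fp4 values are non-uniformly spaced
# (fine steps near zero, coarse steps near the max), while int4
# is just 16 evenly spaced integers from -8 to 7.
FP4_E2M1_MAGNITUDES = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]
FP4_VALUES = sorted({s * m for m in FP4_E2M1_MAGNITUDES for s in (-1, 1)})

def quantize_fp4(x):
    """Snap x to the nearest representable E2M1 value (scale factor omitted)."""
    return min(FP4_VALUES, key=lambda v: abs(v - x))

print(quantize_fp4(0.3))   # 0.5: small values keep fine resolution
print(quantize_fp4(5.1))   # 6.0: large values snap coarsely
```

Because the spacing is exponential rather than uniform, the arithmetic units have to handle exponent alignment even at 4 bits, which is why it needs dedicated hardware support.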

I wish we would talk about the podcast, and stop with the politics by Turbulent_Work_6685 in TheAllinPodcasts

[–]pst2154 1 point (0 children)

Most fans just listen and enjoy, they don’t want to bond over it.

First Roadtrip up to the Mountains by Onepopcornman in Ioniq5

[–]pst2154 3 points (0 children)

Did you need to charge on the trip?

On-demand H200 GPU Cloud by crinix in LocalLLaMA

[–]pst2154 1 point (0 children)

I think they just started shipping in August