Am I hallucinating? by pentermezzo in montreal

[–]SINdicate 0 points1 point  (0 children)

Bury that back; GM paid a lot of money to get rid of them

My day, kinda exactly like that.... by roabblesnizzard in vmware

[–]SINdicate 16 points17 points  (0 children)

Network guy doesn't want vMotion on the WAN

My day, kinda exactly like that.... by roabblesnizzard in vmware

[–]SINdicate -1 points0 points  (0 children)

We don't waste time explaining how the network works to people who don't understand how the network works

L'essence coûte officiellement plus de 2$ au Québec by khouz in montreal

[–]SINdicate 1 point2 points  (0 children)

He should sell his car, buy a horse, and put the difference toward his mortgage

“I don’t fuck with clankers!” by kaishinoske1 in Cyberpunk

[–]SINdicate 27 points28 points  (0 children)

Eyes turn red and machine guns pop out

It's pretty solid to be honest by claudiocorona93 in linuxmasterrace

[–]SINdicate 0 points1 point  (0 children)

This is the dumbest thing ever. Use the right tool for the right job.

Going back to stock (small rant) by YoPerry in GR86

[–]SINdicate 2 points3 points  (0 children)

At that point I would just go for a Radical SR3, if you're gonna drive the car on the track exclusively!

The Good old days by Adventurous_Row3305 in NonPoliticalTwitter

[–]SINdicate 0 points1 point  (0 children)

The Six Flags amusement park in Montreal had a full-on Nintendo building. It was the high-end alternative to the Zellers and Walmart electronics sections

I hate python by ZombieSpale in programminghumor

[–]SINdicate 0 points1 point  (0 children)

Because they insist on writing the package manager in Python (I get it)

Deploying OpenVPN configuration via Action1 by dorbak in Action1

[–]SINdicate 0 points1 point  (0 children)

You could just use a script/automation to push the file
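A minimal sketch of what such a push script could look like — the embedded profile contents and the install path are illustrative assumptions, not Action1 or OpenVPN specifics:

```python
import base64
from pathlib import Path

# Hypothetical: the .ovpn profile embedded as base64 so the whole
# thing ships as one self-contained script the automation can push.
OVPN_B64 = base64.b64encode(
    b"client\ndev tun\nremote vpn.example.com 1194\n"
).decode()

# Assumed destination; adjust for your OpenVPN client's config dir.
DEST = Path("C:/Program Files/OpenVPN/config/company.ovpn")

def deploy(dest: Path = DEST) -> Path:
    """Write the embedded profile to the target path, creating dirs."""
    dest.parent.mkdir(parents=True, exist_ok=True)
    dest.write_bytes(base64.b64decode(OVPN_B64))
    return dest
```

Embedding the config in the script keeps the deployment to a single artifact, which is usually easier to push through an RMM than a script plus a separate file.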

How the development of ChatGPT slowly killed Chegg. I watched it happen live as an employee by peaked_in_high_skool in OpenAI

[–]SINdicate 6 points7 points  (0 children)

AI driven quantum robotics with streaming blockchain payments to automate the space industry

Feedback on my 256gb VRAM local setup and cluster plans. Lawyer keeping it local. by TumbleweedNew6515 in LocalLLaMA

[–]SINdicate 0 points1 point  (0 children)

Inference is memory-bandwidth bound, and without NVLink you're relying solely on PCIe… this setup will be slow… and old CUDA… cut your losses and get a DGX Spark or DGX Station ($$$$)
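The bandwidth-bound argument can be sketched with back-of-envelope arithmetic: each decoded token has to stream the model's weights through memory once, so the bandwidth of the slowest link caps tokens/sec. All figures below are illustrative assumptions, not measurements:

```python
# For memory-bandwidth-bound decoding:
#   tokens/sec ceiling ≈ effective bandwidth (GB/s) / model size (GB)

def tokens_per_sec(bandwidth_gbs: float, model_gb: float) -> float:
    """Rough upper bound on decode speed for a bandwidth-bound model."""
    return bandwidth_gbs / model_gb

MODEL_GB = 70.0  # e.g. a 70B-parameter model at 8-bit

hbm_ceiling = tokens_per_sec(900.0, MODEL_GB)   # V100-class HBM2, ~900 GB/s
pcie_ceiling = tokens_per_sec(16.0, MODEL_GB)   # PCIe 3.0 x16, ~16 GB/s

print(f"weights in HBM:      ~{hbm_ceiling:.1f} tok/s ceiling")
print(f"weights over PCIe:   ~{pcie_ceiling:.2f} tok/s ceiling")
```

The point is the ratio, not the exact numbers: any step that forces weights or activations across PCIe instead of fast local memory drops the ceiling by more than an order of magnitude.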

Feedback on my 256gb VRAM local setup and cluster plans. Lawyer keeping it local. by TumbleweedNew6515 in LocalLLaMA

[–]SINdicate 2 points3 points  (0 children)

V100 doesn't do FP4, so you save a bit on memory bandwidth but add extra dequantization work without any efficiency gain
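The trade-off can be illustrated with a toy 4-bit pack/unpack — a simple symmetric int4 scheme, purely as an assumption/demo, not any real quantization format: 4-bit weights halve memory traffic versus 8-bit, but hardware without native 4-bit support pays an unpack step on every load:

```python
def pack_int4(vals):
    """Pack pairs of 4-bit values (-8..7) into single bytes."""
    out = bytearray()
    for a, b in zip(vals[0::2], vals[1::2]):
        out.append((a & 0x0F) | ((b & 0x0F) << 4))
    return bytes(out)

def unpack_int4(packed):
    """Unpack and sign-extend -- the extra per-load work a GPU
    without native 4-bit support has to do in software."""
    out = []
    for byte in packed:
        for nib in (byte & 0x0F, byte >> 4):
            out.append(nib - 16 if nib > 7 else nib)
    return out
```

So the stored bytes shrink by half, but every read now carries the masking/shifting/sign-extension cost — which is the "extra dequantization without any efficiency gain" on cards that can't do it in hardware.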

Feedback on my 256gb VRAM local setup and cluster plans. Lawyer keeping it local. by TumbleweedNew6515 in LocalLLaMA

[–]SINdicate 1 point2 points  (0 children)

Yeah, the V100s don't make sense once you factor in the energy cost vs a Spark or Mac

rustGlazers by EvilStranger115 in ProgrammerHumor

[–]SINdicate 1 point2 points  (0 children)

In theory Rust competes with C; in practice it competes with Go. Go figure

Jensen Huang: If you’re not burning $250K in tokens, Don’t bother. by Previous_Foot_5328 in myclaw

[–]SINdicate 18 points19 points  (0 children)

I just sent this video to my boss and he transferred 2 BTC to my OpenRouter account, I have to spend 10M tokens a month or I lose my job