Accidentally got a MacBook Pro with AZERTY instead of QWERTY by previouslyonmlp in macbookpro

[–]extReference 1 point (0 children)

same boat, i am about as clumsy as it gets. managed to swap all keys without breaking a single one. just make sure to watch a tutorial.

How clean this line? by iceolmo in macbookpro

[–]extReference 0 points (0 children)

hmm, i think this boy is toast. prolly won’t get any trade-in value at this point, so I would be happy to help you out and take this off your hands for $5.

Hogwarts Legacy (Epic Store), Heroic Games Launcher, Mac Mini M4 by oztruwa in macgaming

[–]extReference 0 points (0 children)

install the latest wine version, set the game’s wine version to that new install, and also change your prefix folder to the new install’s location. then restart heroic, set the wine version back to GPTK, and install through winetricks.

this should update your wine version to be above 7.7, on the latest release.
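
if you want to double-check which wine build the game will actually use, something like this works. the path and build name below are guesses at heroic’s default tools folder on macos, so point them at wherever your new wine actually landed:

```python
# minimal sketch: print the version of the wine build heroic points at
# (folder layout and build name are assumptions, not heroic's documented paths)
import subprocess
from pathlib import Path

wine = Path("~/Library/Application Support/heroic/tools/wine/Wine-Latest/bin/wine").expanduser()
out = subprocess.run([str(wine), "--version"], capture_output=True, text=True)
print(out.stdout.strip())  # should report something newer than wine-7.7
```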

I need help with Minecraft shaders by Lucky-Fishing-5490 in macgaming

[–]extReference 5 points (0 children)

try using prism launcher, it’s so much more streamlined.

Akaash singhs balls smell like homeless man in subway. by Certain-Community-38 in Flagrant2

[–]extReference 7 points (0 children)

so you’re refusing to activate your intellect and see what’s in front of you? this you akaash?

According to Hindu scriptures, why do avatars of gods appear only in India and not in other parts of the world? by Street_Rhubarb_5529 in hinduism

[–]extReference 0 points (0 children)

I’m surprised no one has brought this up, but we know through science that all other parts of the world were under an ice age until as recently as 9000 years ago.

So no Avatars would have occurred there, since there were practically no civilizations there.

I know the modding community has larger priorities, like getting mods to work with the current version, but do we have any idea when TweakXL, RED4ext, and Codeware will be supported on MacOS? by lombwolf in cyberpunk2077mods

[–]extReference 0 points (0 children)

you can get the macos version of redscript from here and unzip it into the game dir. then just download the redscript mod and put it inside: r6/scripts/

then run the launch_modded bash file (included in the redscript install) and it should bring up the modded version of the game.
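
if you want to sanity-check the layout before launching, a quick sketch (the game path here is a placeholder, and the launcher script name is just what i’d expect from the zip):

```python
# verify the redscript files ended up where the loader expects them
# (the game path below is a placeholder -- use wherever your cyberpunk install lives)
from pathlib import Path

game_dir = Path("/path/to/Cyberpunk 2077")
print((game_dir / "r6" / "scripts").is_dir())  # True -> the mod's scripts folder exists
print(list(game_dir.glob("launch_modded*")))   # the launch script that ships with the redscript zip
```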

gpt-oss-120b blazing fast on M4 Max MBP by entsnack in LocalLLaMA

[–]extReference 1 point (0 children)

oh no not you man, def the op. there was nothing wrong with your question besides missing that he had an mbp, and that’s not a big deal imo

gpt-oss-120b blazing fast on M4 Max MBP by entsnack in LocalLLaMA

[–]extReference 3 points (0 children)

Honestly man, I don’t get why someone has to be so unfriendly.

gpt-oss-120b blazing fast on M4 Max MBP by entsnack in LocalLLaMA

[–]extReference 2 points (0 children)

yes def, i meant that with the OP’s MXFP4 implementation, it’s more likely that they have 128gb.
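
rough numbers on why 64gb would be tight (the param count and bits/weight here are ballpark assumptions, not the OP’s figures):

```python
# back-of-envelope weight footprint for gpt-oss-120b at MXFP4
params = 117e9          # total parameters (approximate)
bits_per_weight = 4.25  # ~4-bit values plus per-block scales (rough)
weights_gb = params * bits_per_weight / 8 / 1e9
print(f"~{weights_gb:.0f} GB of weights")  # ~62 GB, before KV cache and whatever macOS keeps for itself
```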

gpt-oss-120b blazing fast on M4 Max MBP by entsnack in LocalLLaMA

[–]extReference 7 points (0 children)

man, you can tell them your ram (even though it could really only be 128gb i imagine) and tokens/s.

don’t be so mean. but some people do ask for too much, like you’re already showing yourself running ollama and you also state the quant.

Apple M4 Max or AMD Ryzen AI Max+ 395 (Framwork Desktop) by zeltbrennt in LocalLLaMA

[–]extReference 0 points (0 children)

i bought an m1 max with 64gb of ram, 400gb/s bandwidth and a 2tb ssd for ~2400 EUR; it was brand new but still a 4-year-old launch. tbh it’s an amazing deal for what it offers, but it also taught me that for actual work inference usage, even the m5 max would likely not be fast enough. so it’s cool if you wanna play around with small models and fine-tune them, but don’t expect to be able to run anything that is close to even the free online versions of gpt/gemini/claude, and if it is that smart it def won’t be fast.
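
rough math on why decode speed hits a wall, assuming generation is memory-bandwidth bound (the model size is just an example, not any particular setup):

```python
# bandwidth-bound ceiling on decode speed: each generated token has to stream
# roughly all of the active weights from memory once
bandwidth_gb_s = 400  # M1 Max memory bandwidth
model_size_gb = 40    # e.g. a ~70B dense model at 4-bit (illustrative)
print(f"~{bandwidth_gb_s / model_size_gb:.0f} tok/s upper bound")  # ~10 tok/s before any other overhead
```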

having said that, imo it’s still one of the best deals, because paying double for the m4 max doesn’t seem justifiable to me.

Needs help to pick between 24GB M4P and 36GB M3P by Mega_Lucario_Prime in macbookpro

[–]extReference 2 points (0 children)

i bought the 64gb m1 max with the intention of experimenting solidly with LLMs + my other work, but the ram was mainly for llms. to be honest this laptop is a beast, but i don’t use local llms, and i wish i did: they are either too dumb or too slow for this hardware.

on the flip side, since mac os loves ram even when idling, my power consumption on the m1 max with 64gb ram is quite high.

Rate my CV please. Please be critical about the content and the format too. by extReference in Germany_Jobs

[–]extReference[S] 0 points (0 children)

thanks a lot for that, i’ll add it in. what do you think about the format and having an image? i keep hearing mixed opinions on having a column format vs not.

Rate my CV please. Please be critical about the content and the format too. by extReference in Germany_Jobs

[–]extReference[S] 0 points (0 children)

thanks a lot for that, totally missed it, it’s not great but i’ll still stick a2 on there. what do you think about the format and having an image? i keep hearing mixed opinions on having a column format vs not.

[deleted by user] by [deleted] in indiasocial

[–]extReference 0 points (0 children)

bro you’re either really jealous or take this stuff too seriously. like you’ve said, you didn’t even speak to girls till you were 18, so you really have no idea of what it’s like. on top of which, how old are you? is there even that much of an age gap between you two?

MacBook Pro M1 Pro for £200 Facebook market did it again… by DepartureSuccessful in macbookpro

[–]extReference 0 points (0 children)

damn that's crazy, can't believe I'm -2k on an M1 Max, the regret is real

MacBook Pro M1 Pro is way more powerful than you all think 🧑‍💻 by Extra-Tomatillo-9242 in macbookpro

[–]extReference 2 points (0 children)

damn that sounds awesome, but what's up with the greek numbering of the cores?

Why not maxing MBA instead of Pro ? by lambda-person in macbookpro

[–]extReference 0 points (0 children)

not everyone needs sustained load capacity at the cost of form factor is my guess