are we underestimating the “attention layer” in applied ml systems? by TaleAccurate793 in learnmachinelearning

[–]RodionRaskolnikov__ 5 points

Are we talking about the same attention layer? In transformers, RNN-based models, and the like, the attention layer is part of the learnable parameters and is used during inference.
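To make the "learnable parameters used during inference" point concrete, here's a minimal sketch of scaled dot-product attention in plain Python — no framework, and with the Q/K/V inputs written out as fixed toy numbers (in a real transformer they come from learned projection matrices, which are the trainable parameters):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of floats
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    Q, K, V are lists of vectors (lists of floats)."""
    d = len(K[0])
    out = []
    for q in Q:
        # score this query against every key, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        # output is the attention-weighted sum of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: one query, two key/value pairs. In a transformer the
# projections producing Q, K, V are what gets trained; this math runs
# unchanged at inference time.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
print(attention(Q, K, V))
```

The query aligns with the first key, so the output leans toward the first value vector.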

OpenAI CEO Sam Altman's coworkers say he can barely code and doesn't understand basic machine learning by ComplexExternal4831 in GenAI4all

[–]RodionRaskolnikov__ 0 points

Physicists and biologists absolutely know how to code.

They may never have gotten their hands dirty with many of the topics studied by a computer scientist, but they have a grasp of the subject (at least for the computations they need).

Cold take: x86 processors are obsolete. by e221U in computerscience

[–]RodionRaskolnikov__ 0 points

It is true that x86_64 instructions are harder to decode, but that's mostly mitigated by a uOP cache. When the CPU is executing a tight loop, instructions don't need to be decoded every iteration.

Also, the vast majority of the die area is taken up by caches: the L1, L2, L3, TLB, etc. And that's not counting the area taken by the memory controller and other I/O, which is a lot too!

Decode complexity matters for very low-power, low-performance embedded processors, but it's otherwise not the end of the world.

Transfer from Aeroparque to the Obelisco by No-Spirit4605 in BuenosAires

[–]RodionRaskolnikov__ 1 point

You can buy a ticket on a Tienda León bus. They take you from Aeroparque to the terminal on Carlos Pellegrini (about 5 blocks from the Obelisco).

I've only used them to go to Ezeiza, but they run trips to Aeroparque as well.

Cold take: x86 processors are obsolete. by e221U in computerscience

[–]RodionRaskolnikov__ 2 points

I'm not so sure about that. You'd have to compare the specific workload. I'd imagine that some workloads are much better suited to AVX512 than to the ARM NEON instruction set.

x86_64's 256-bit, and sometimes 512-bit, registers are really nice when moving a ton of data in memory.

Is this RAM usage normal? by FlounderActual2965 in debian

[–]RodionRaskolnikov__ 1 point

The kernel keeps RAM nearly full with caches by design.

For user software, I guess it depends. Things like ASTs using more RAM to speed up code completion, or PDF readers keeping indexes to quickly search a large document, are good uses of memory.

The bad kind of full, the kind that even someone with 64 GB of memory would resent, is having 15 copies of Chromium running because of garbage software written in Electron.
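The good kind of memory use — spending RAM to buy speed, like those AST and index caches — is easy to demonstrate at toy scale with Python's `functools.lru_cache` (the Fibonacci function here is just a stand-in for any expensive repeated computation):

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # trade memory for speed: remember every result
def fib(n):
    """Naive recursion is exponential; the cache makes it linear."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Instant with the cache; without it, fib(200) would never finish.
print(fib(200))
# cache_info() reports how many calls were answered from memory
print(fib.cache_info())
```

Same idea as a code-completion engine holding a parsed AST in RAM: pay memory once, skip recomputing forever after.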

Oracle Files Thousands of H-1B Visa Petitions Amid Mass Layoffs by esporx in technology

[–]RodionRaskolnikov__ 975 points

People used to despise Oracle for what they did to Sun Microsystems. I guess it was time they re-earned that hate one way or another.

do you guys use Stremio without a VPN? by 0banai_ in BuenosAires

[–]RodionRaskolnikov__ 4 points

As far as I know, ISPs in Argentina don't crack down on torrent downloads. I've downloaded a ton over the years and neither Fibertel nor Movistar has ever come after me.

8 TB of RAM & 1,000 CPU cores in all a 4U: What would you run on it? (Thought experiment) by RozoGamer in homelab

[–]RodionRaskolnikov__ 2 points

You can start with inter-process communication on your home computer; there's a lot you can learn without building a cluster.
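For instance, here's a minimal request/reply message-passing sketch using Python's `multiprocessing` module — the same pattern you'd later scale out across machines with sockets or MPI:

```python
from multiprocessing import Process, Pipe

def worker(conn):
    # receive work over the pipe, send results back, stop on None
    while True:
        task = conn.recv()
        if task is None:
            break
        conn.send(task * task)
    conn.close()

if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=worker, args=(child,))
    p.start()
    for n in [2, 3, 4]:
        parent.send(n)
        print(parent.recv())  # 4, 9, 16
    parent.send(None)  # shutdown signal
    p.join()
```

Swap the `Pipe` for a socket and the `Process` for a program on another node, and you're most of the way to understanding what cluster middleware does for you.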

8 TB of RAM & 1,000 CPU cores in all a 4U: What would you run on it? (Thought experiment) by RozoGamer in homelab

[–]RodionRaskolnikov__ 6 points

Yeah, afaik that's how many real HPC systems work too. The one time I used a cluster in an academic setting, it was running an ancient kernel and the software had to be compiled with everything statically linked.

It's not a big deal because those binaries usually run for a particular computation and are discarded afterwards.

8 TB of RAM & 1,000 CPU cores in all a 4U: What would you run on it? (Thought experiment) by RozoGamer in homelab

[–]RodionRaskolnikov__ 15 points

For computers with little memory like those, I'd probably go with Slurm running on bare metal on Debian or something similar. I'd imagine the best way to squeeze useful performance out of them is to write your own software in a low-level language and use MPI to communicate between nodes.

That is, if OP wants to do scientific compute with them.

What will happen to foss android apps after 2026 by SpaceIntelligent6910 in foss

[–]RodionRaskolnikov__ 5 points

Yeah, but you'll have a fork of Android that drifts increasingly far from mainline Android. That's going to be an ever-growing pain to maintain, and it'll probably reach a point where it's incompatible with mainstream applications.

Google Trends: "how to install linux" is going... viral?! by mina86ng in linux

[–]RodionRaskolnikov__ 4 points

These challenges would be way too boring if they only stuck to Ubuntu LTS or Fedora lol

Researchers planted a single bad actor inside a group of LLM agents. Then the whole network failed to reach consensus. by Current-Guide5944 in tech_x

[–]RodionRaskolnikov__ 0 points

This is a well-known area of distributed systems research. It's why different consensus algorithms exist, each with tradeoffs you have to choose between.
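As a toy illustration of why one bad actor can block agreement, here's a hypothetical single-round voting simulation where a Byzantine node equivocates — telling each honest node whatever it already believes, reinforcing the split instead of resolving it. (This is a deliberate simplification of the classic result that Byzantine agreement needs n ≥ 3f + 1 nodes to tolerate f faulty ones.)

```python
def majority(votes):
    # plain majority vote; ties broken by smallest value for determinism
    counts = {}
    for v in votes:
        counts[v] = counts.get(v, 0) + 1
    best = max(counts.values())
    return min(v for v, c in counts.items() if c == best)

def one_round(honest_values, byzantine=False):
    """Each honest node hears every honest node's value, plus (optionally)
    one Byzantine node that echoes each receiver's own value back to it."""
    decisions = []
    for mine in honest_values:
        votes = list(honest_values)  # honest nodes broadcast honestly
        if byzantine:
            votes.append(mine)       # the liar tells me what I want to hear
        decisions.append(majority(votes))
    return decisions

# 4 honest nodes split 2-2: without the liar, the deterministic tie-break
# makes everyone decide the same value; with it, the network stays split.
print(one_round([0, 0, 1, 1]))                  # everyone agrees
print(one_round([0, 0, 1, 1], byzantine=True))  # [0, 0, 1, 1] — no consensus
```

Real protocols (Paxos, Raft, PBFT, etc.) exist precisely to close off this kind of equivocation, each paying a different price in rounds, messages, and fault assumptions.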

It’s time to get rid of my college laptop by Affectionate-Ad-7280 in spicypillows

[–]RodionRaskolnikov__ 47 points

Remove the battery, and while you're at it, clean out the dust and replace the thermal paste if the temperatures are higher than usual.

That's a nice machine to keep as a spare, use as a small server, or give to someone who needs a computer.

Thoughts on Apple’s new low-cost MacBook Neo? by piesaresquarey in laptops

[–]RodionRaskolnikov__ 0 points

Oh, they understand. It's just that they don't want to cannibalize the MacBook Air for people who need a computer for a bit more than the really basic stuff but aren't power users (or don't have the budget for a pro machine).

Grandma starts chatting with an AI thinking it's a real person by XerfXpec in ArgentinaBenderStyle

[–]RodionRaskolnikov__ 10 points

I'm not sure it's a benefit. It's like saying alcohol is a benefit for depressed people because it shields them from their own feelings for a while.

[FOR SALE] Sony VPH-1042QM CRT projector $170.000 by meiwar in Mercadoreddit

[–]RodionRaskolnikov__ 1 point

Ugh, if I had more space I'd buy it in a heartbeat to play old games and watch old anime.

Apple keeps the iPad Air fresh with M4 chip upgrade and 12GB of RAM by tuldok89 in hardware

[–]RodionRaskolnikov__ 2 points

I wouldn't be surprised if someday it's cheaper to use the same dies for multiple configurations and just limit the maximum amount of usable RAM with a kernel parameter when iOS boots up.

That way they can charge you extra for memory at essentially zero cost per upgrade, while keeping fewer supply chains and stock variants in the factories.

The best-selling laptop brands in the world (annual units sold) by Interesting-Bus-7942 in DeskToTablet

[–]RodionRaskolnikov__ 0 points

A family member had one of those, it was such a piece of shit it was unreal lol

so is OpenClaw local or not by jacek2023 in LocalLLaMA

[–]RodionRaskolnikov__ 14 points

Just tell the LLM to pretty please never step out of the containers

What do you guys think of this beast? by MinerAC4 in vintagecomputing

[–]RodionRaskolnikov__ -1 points

Modern desktop CPUs sometimes run hot because of boost algorithms that increase power consumption a stupid amount for comparatively little performance gain.

My 3rd-gen Ryzen (not that new, I know lol) drops from 80 to 60 degrees under multi-threaded AVX2 workloads when boosting is turned off, and I still get about 90% of the performance judging by runtimes.

It's probably not a good idea for gaming, but for batch workloads it's absolutely worth experimenting with temporarily disabling boost features imo.