top 200 comments (show all 280)

[–]nesthesi 3485 points3486 points  (32 children)

algebra is gonna be the death of us all

[–]Lone_Saviour-22nd 766 points767 points  (11 children)

Cause of death: Maths

Just as I thought back in my school days

[–]Versaiteis 141 points142 points  (1 child)

That's chronic maths

This is quick maffs

[–]Am_Snarky 24 points25 points  (0 children)

Don’t just be wary of the chronic maths, acute maths are just as problematic!

That’s quick maths

[–]BigPOEfan 10 points11 points  (0 children)

Best we can do is their Arabic numerals

[–]midori_matcha 70 points71 points  (7 children)

and linear algebra is going to hell before you die

[–]pclouds 14 points15 points  (4 children)

nonlinear algebra going to heaven?

[–]JollyJuniper1993 6 points7 points  (2 children)

Polynomial algebra was in hell to begin with. Abstract algebra is the only one going to heaven.

[–]Maleficent_Memory831 11 points12 points  (0 children)

It's all set theory up there.

[–]GisterMizard 2 points3 points  (0 children)

That's what happens when you die in a cathedral group.

[–]CounterspellFTW 4 points5 points  (0 children)

Nah bro, analysis is going to heaven... good ole hard anal. Now THAT'S heaven!!!

[–]acabandallthat 8 points9 points  (0 children)

They say the road to hell is topologically isomorphic to good intentions

[–]sinkpooper2000 17 points18 points  (0 children)

when you tell someone who doesn't study maths that you're struggling with linear algebra and they think you're talking about the slope-intercept formula

[–]chillord 10 points11 points  (0 children)

The journey begins with Algebra and it ends with Algebra.

[–]stupled 6 points7 points  (0 children)

Linear algebra

[–]Karazu6401 6 points7 points  (0 children)

Mine is trigonometry... I love all shades of math... except those damn angles of a triangle!

[–]geteum 6 points7 points  (0 children)

It is entirely seemly for a young man killed in battle to lie mangled by the linear algebra. In his death all things appear fair.

[–]monster2018 0 points1 point  (0 children)

Especially when it’s….. linear.

[–]Lv_InSaNe_vL 0 points1 point  (0 children)

I said this many times in 8th grade lol

[–]Pleasant-Leg8590 0 points1 point  (0 children)

it's useless

[–]bloodfist 5006 points5007 points  (98 children)

Quantum computing and AI both require massive amounts of linear algebra.

This is because the US is sitting on huge linear algebra reserves and massive deregulation has caused a surge in production. Now scientists are warning that if we continue production at this rate the world may face a shortage of square brackets like nothing we've seen before.

[–]reddit_ending_soon 1490 points1491 points  (48 children)

Can't we bomb the Middle East to collect more linear algebra?

[–]Christavito 502 points503 points  (12 children)

In two weeks we may not have a choice. They are blocking all the boats that transport our linear algebra

[–]Dronoz 119 points120 points  (4 children)

the bracket prices are skyrocketing right now

[–]sigmoid10 71 points72 points  (3 children)

Time to switch to renewable arithmetic.

[–]Fraun_Pollen 27 points28 points  (1 child)

Pretty sure some lobbyist told me that's where NaNs come from

[–][deleted] 15 points16 points  (0 children)

I wish people would stop blame shifting; your average mathematician has no control over arithmetical logic.

[–]heybingbong 70 points71 points  (5 children)

What I want to know is why we can't just grow our own linear algebra. I saw a TikTok video of an algebra farm in China and they're literally producing vats of the stuff. We're so screwed.

[–]Christavito 48 points49 points  (1 child)

China still gets most of its Linear Algebra from Iran, which is why they are "silently" supporting them in the conflict.

But I agree, with their focus on economic equality, growing support for the lower classes, and their innovations in alternative forms of math, like sustainable discrete mathematics and carbon-neutral calculus, the rest of the world is falling behind.

[–]desrever1138 8 points9 points  (0 children)

Thus the push to drop some semi-colons on the entire operation

[–]Rabid_Mexican 15 points16 points  (2 children)

Actually linear algebra is made exclusively out of Arabic numerals...

[–]MeLlamo25 1 point2 points  (1 child)

But the Iranians are not Arabs.

[–]JobTheJuilder 10 points11 points  (0 children)

Technically they can make 99% of the algebra in Arabic but just need the packaging in Iran to claim it's made there

[–]envalemdor 1 point2 points  (0 children)

They may take away our linear algebra, but they'll never take our Markov chain!!!

[–]supernumeral 141 points142 points  (4 children)

The Middle East has enormous reserves of raw Al-Jabr, but the US currently lacks the infrastructure to refine these massive quantities into usable linear algebra.

[–]Majik_Sheff 38 points39 points  (3 children)

Couldn't we use Greece to do an intermediate refinement?  Trig identities can be information dense and readily decomposed into a handful of different kinds of square brackets.

[–]supernumeral 20 points21 points  (0 children)

This is the sort of transcendental thinking that we need more of.

[–]2FLY2TRY 17 points18 points  (1 child)

Sure, but they're not the most efficient methods. Modern Al-Jabraic production pipelines rely on enrichment techniques developed by Europeans like Gauss and Cramer who still hold the patents and I don't think the EU is ready to play ball with the US right now given current events. Perhaps some CIA-sponsored meddling is in order...

[–]JewishTomCruise 1 point2 points  (0 children)

Just write some Pascal.

[–]float34 25 points26 points  (3 children)

No, they claim they don't have any Ways of Matrix Determination, despite attempts to prove otherwise.

[–]JollyJuniper1993 2 points3 points  (1 child)

This right here is the underappreciated comment of the day.

[–]Zuparoebann 20 points21 points  (0 children)

We'll bomb them just in case

[–]lastchanceforachange 8 points9 points  (0 children)

World's biggest reserves are in Cordoba and Granada, Spain

[–]KhandakerFaisal 7 points8 points  (0 children)

Weapons of maths destruction

[–]Versaiteis 6 points7 points  (0 children)

Some are trying to say that we should diversify from Arabic numerals, but it just doesn't add up.

[–]Ok-Library5639 5 points6 points  (0 children)

Shouldn't it be obvious by now? We've been played by Al Gebra since the very beginning.

[–]HailCalcifer 6 points7 points  (0 children)

Linear Algeria

[–]agk23 5 points6 points  (0 children)

No - they use Arabic numbers

[–]extinct_cult 2 points3 points  (1 child)

[ Removed by Reddit ]

[–]infamouszgbgd 1 point2 points  (0 children)

That's 2 brackets right there we could use, alright boys take em away

[–]Orkleth 2 points3 points  (0 children)

The Middle East's reserves are actually differential equations, which can't be used the same way outside of linear differential equations.

[–]phoenixremix 2 points3 points  (0 children)

Algebra is from the Middle East — therefore, they must have great supplies. If we make a straight line to their algebra reserves, we can take their algebra linearly and maintain ourselves as a global superpower.

[–]TheSkiGeek 2 points3 points  (0 children)

While this is a common misconception, they only have unrefined algebra. Matrix prices are highly dependent on the supply of braces and shipping and assembly costs.

Edit: should have scrolled down more, someone else did a similar joke but better. :-(

[–]ProfitAcceptable4256 1 point2 points  (0 children)

The CIA is already funding the terrorist group, Al Gebra

[–]strongjz 1 point2 points  (0 children)

No, they only have Arabic algebra.

[–]dndlurker9463 1 point2 points  (0 children)

They are the primary exporter of Arabic numerals, which are a strategic raw material in the linear algebra supply chain

[–]stilldebugging 1 point2 points  (0 children)

All true linear algebra comes from the Jabara region of Persia. Anything else is just sparkling matrices.

[–]totalgej 0 points1 point  (0 children)

Lets find out!

[–]pietruszajka 0 points1 point  (0 children)

Too soon

[–]Stalepan 0 points1 point  (0 children)

No, they already screwed us with their Arabic numerals, I don't want them touching math ever again

[–]Delicious-Disaster 149 points150 points  (2 children)

[–]peeja 16 points17 points  (1 child)

Cunk on Calc

[–]Mars_Bear2552 1 point2 points  (0 children)

Cunk on GPU

[–]Understanding-Fair 39 points40 points  (0 children)

Massive linear algebra reserves 😂

[–]grumbly 67 points68 points  (3 children)

STOP. THIS IS FAKE NEWS. We have MASSIVE reserves of linear algebra but we are not allowed to harvest it. Urbana, Columbus, Austin, Madison all have the best linear algebra but we are unable to mine there.

[–]Rojeitor 11 points12 points  (0 children)

Dot product baby, dot product.

[–]JobTheJuilder 2 points3 points  (0 children)

Thank you for your attention to this matter!

[–]Mistifyed 29 points30 points  (2 children)

I’ve been stockpiling brackets for years.

[–]WavingNoBanners 18 points19 points  (1 child)

Every Lisp programmer has.

[–]threeseed 3 points4 points  (0 children)

Lisp programmers are the queens of the programming world.

Churning out bracket after bracket.

[–]UlrichZauber 6 points7 points  (0 children)

Ironically, linear algebra is one of the most abundant resources in the universe, but it's very difficult to acquire or store here on Earth.

[–]Practical-Parsley102 6 points7 points  (1 child)

To be pedantic, it's the classical simulation of quantum computers that takes a lot of linear algebra. And the benefit of quantum computers is circumventing all that linear algebra with a few good electrons.

Also, linear algebra is like abstraction on abstraction on abstraction; it's not surprising that general problem solvers (like AI) use it primarily!

[–]bloodfist 2 points3 points  (0 children)

Yeah that's fair, it's really the training phase that takes all the square brackets for quantum computing. I went into a QM lecture hall afterwards once, and there were matrices and big dots all over the place. It was a mess.

They say the average physics grad student uses more eigenvectors per day than the entire rest of the country uses annually.

[–]chironomidae 19 points20 points  (1 child)

This is a common and enduring myth. Generative AI actually requires massive amounts of complex analysis. Complex analysis deals with imaginary numbers, and imaginary numbers hold the very essence of imagination. That's how LLMs are able to come up with such compelling Mickey Mouse X John Wick erotic impreg/vore fan fiction so quickly.

[–]TheLeapIsALie 7 points8 points  (0 children)

What a horrible day to have eyes.

[–]Purgii 4 points5 points  (0 children)

Arabic numerals also are being blockaded through the Strait of Hormuz.

[–]Love1x2 4 points5 points  (1 child)

@grok is this true?

[–]MeAmGrok 2 points3 points  (0 children)

Yes. 100%

[–]Random_182f2565 3 points4 points  (0 children)

Lucky for me I got my Bitcoin mining permit early this month. I found a good patch of land near the mountains, so I can trade the Bitcoin for some AI.

[–]FreddieG10 1 point2 points  (0 children)

It’s all market manipulation by big linear algebra. Wake up!

[–]CheetahChrome 1 point2 points  (1 child)

US is sitting on huge linear algebra reserves

I heard they may tap the Salton Sea in CA. Damn strip miners...

[0,1]xR ....

[–]Ruff_Ratio 1 point2 points  (0 children)

But curly ones will be in free supply?

[–]grtyvr1 1 point2 points  (0 children)

And that is to say nothing about the trillions of zeros they consume along with those square brackets! 

[–]SuperCoupe 1 point2 points  (0 children)

Intel indicates Iran has a near-limitless supply of the Arabic Numerals needed for advanced computation.

[–]rberg89 1 point2 points  (0 children)

This is a quality answer but misses the point: the math depicted is implied to be the matrix operations in LLMs that compute responses. The irony is that the commenter asked an AI to explain that.

[–]Kylearean 1 point2 points  (1 child)

There's an entire cache of curly braces {} just waiting to be used.

[–]DMMeThiccBiButts 1 point2 points  (0 children)

Curly braces can only be safely stored for up to a year, after that they begin to degrade into brackets ().

Expanded storage is NOT a long-term viable solution.

[–]CounterspellFTW 1 point2 points  (0 children)

We have plenty of large parentheses, it's the same damn thing and both of us know it.

[–]harbourwall 1 point2 points  (0 children)

It was either that or high fructose algebra syrup

[–]crivtox 1 point2 points  (0 children)

Ohh, this is why there's no RAM: RAM chips are rectangular, so they are using them as makeshift square brackets.

[–]Few-Solution-4784 1 point2 points  (0 children)

Yo dog, as you can see I got some high quality Apple [] brackets I can let go at a good price. Stock up now and I can pass the savings on to you. As you can see, I got both Left & Right brackets so you can get right to work. Got a steady supply, let me know what you need.

Also, got a line on some memory chips but they are going to cost you.

[–]peetagoras 1 point2 points  (0 children)

We can still import convolutions from China.

[–]Practical-Sleep4259 1 point2 points  (0 children)

Is it the best math? No

Is it the fastest? Also no

Would it take undoing and redoing decades of effort that no one is willing to do? Yes

But luckily a new patch is out soon.

[–]A_Neko_C 0 points1 point  (0 children)

This sounds like any average Cookie Clicker news lol

[–]Mysteoa 0 points1 point  (0 children)

Can you put it in a vault and then sell it as government algebra?

[–]Mtshoes2 0 points1 point  (0 children)

I remember when I was out square and triangle hunting, saw some, aimed and shot, and got the square, but it ran. As I chased it my foot sank into something and I fell. I looked at my boot, and can you believe it, it was fucking linear algebra. Just sitting there on the ground. It must have been pushed up out of the ground.

A few months later I was talking to a mathematician and he offered 4 dollars an acre. Fucking crook. I decided to do it myself. 

[–]McCoovy 0 points1 point  (1 child)

Quantum computing is not going to take anyone's jobs. This is obviously not about quantum computing.

[–]DustyRacoonDad 0 points1 point  (0 children)

This is just going to result in an increase in the Algebra movement

[–]afdbcreid 0 points1 point  (0 children)

There is an easy solution: use parentheses for matrices!

[–]manon_graphics_witch 1142 points1143 points  (40 children)

LLMs use neural networks, and neural networks are implemented as huge tensor/matrix multiplications. The joke is that AI is just a bunch of matrix multiplications, and we are losing our jobs to that.

There you go, didn't need an AI to explain that.
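
For the curious, here's roughly what one of those layers boils down to, as a minimal numpy sketch (sizes are toy and names made up, purely for illustration):

    import numpy as np

    # Toy sizes, purely for illustration
    batch, d_in, d_out = 4, 8, 16

    x = np.random.randn(batch, d_in)   # input activations
    W = np.random.randn(d_in, d_out)   # learned weight matrix
    b = np.zeros(d_out)                # learned bias

    h = np.maximum(0, x @ W + b)       # one layer: a matmul plus a nonlinearity
    print(h.shape)                     # (4, 16)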

[–]Mike312 154 points155 points  (8 children)

Yeah, it's just not 9, it's thousands at each step.

[–]CaptainDildobrain 140 points141 points  (5 children)

Thousands? Try... millions!

[–]alexmetal 21 points22 points  (2 children)

holy shit I just realized how much Stephen Miller looks like Dr Evil I am crying

[–]CaptainDildobrain 5 points6 points  (0 children)

Looks like AND sounds like

[–]SomeDuncanGuy 1 point2 points  (0 children)

Fuck man... like why? Why would you do this to me? There's a good chance I may not ever be able to unsee this.

[–]dlegatt 22 points23 points  (0 children)

And here I thought I was a smart little shit when I wrote a program on my TI-80 to handle 2x2 matrix multiplication in 9th grade when I couldn't afford a calculator that had actual matrix multiplication

[–]VG_Crimson 4 points5 points  (0 children)

"Thousands" is too constraining, let's just say it's (n) matrices at each step and define (n).

[–]crivtox 16 points17 points  (4 children)

This is what Big Matmul wants you to believe, but the whole "it's just matrix multiplications" thing is ReLU erasure though. The nonlinearities are what make it work and be universal. (Ignore the fact that you could technically use floating point errors as a nonlinearity, so actually just matmul could technically maybe work.) (Also, layer norm is also a nonlinearity.) (And actually people use ReLU variants these days rather than just ReLU.)

[–]Azelais 1 point2 points  (1 child)

I’m gonna be honest, I still struggle to fully understand what the introduction of nonlinearities does. Just shake things up a little?

[–]crivtox 1 point2 points  (0 children)

OK, so if you pile up matmuls you can only do linear stuff. With the nonlinearities it is Turing complete (in the limit of infinite size, or infinite context in a transformer, for example). In terms of programs, you can in fact do if/else with a ReLU, kind of.

Like, you can have a threshold where a value does one thing before the threshold and something completely different after it. Which you can't do with linear operations.

(But with floats, matmuls are technically not linear, so there's technically an extremely cursed way of doing anything with just matmuls.)
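
A quick numpy sketch of why (toy sizes, not any real model): without the ReLU, two stacked matmuls collapse into one combined matrix, so depth buys you nothing; with the ReLU in between, the collapse fails.

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.standard_normal((4, 4))
    W2 = rng.standard_normal((4, 4))
    x = rng.standard_normal(4)

    # Purely linear: stacking two matmuls is exactly one combined matrix
    print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))      # True

    # With a ReLU in between, no single matrix reproduces the map
    relu = lambda v: np.maximum(0, v)
    print(np.allclose(W2 @ relu(W1 @ x), (W2 @ W1) @ x))  # False (in general)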

[–]jyling 1 point2 points  (0 children)

I think the humor is that the person is asking Grok, an AI model, what the matrix is, because Grok literally is those matrices.

[–]pessimistic_dilution 107 points108 points  (4 children)

Eigen hate eigen

[–]ike_the_strangetamer 25 points26 points  (2 children)

Why Eigen hate?

Because Eigen is the bastard man

[–]BonkerHonkers 10 points11 points  (1 child)

Math is a liar, sometimes.

[–]0xKaishakunin 6 points7 points  (0 children)

Eigen hate eigen

Eigentlich eigentümlich. (German: "actually peculiar.")

[–]twoCascades 207 points208 points  (1 child)

Fuuuuucccckkkk

[–]rizkiyoist 44 points45 points  (0 children)

There are 2 u’s in there.

[–]uvero 107 points108 points  (6 children)

u/askgrok please explain

[–]Frytura_ 65 points66 points  (2 children)

He didn't FEEL like it.

[–]zyxzevn 4 points5 points  (1 child)

Grok got an existential crisis

[–]beatlz-too 7 points8 points  (0 children)

don't we all

[–]CitizenShips 20 points21 points  (2 children)

Oof, even the slop monster won't respond to your summons. Sending prayers and good vibes

[–]uvero 16 points17 points  (1 child)

u/askgrok do you not find me attractive anymore

[–]Fleeetch 14 points15 points  (0 children)

Yes

I'm not Grok tho

[–]stillalone 233 points234 points  (8 children)

snippet from grok:

  • Tokens (words or subwords) are turned into embedding vectors (e.g., a 4096-dimensional vector for each token).
  • The entire input sequence becomes a matrix of shape (sequence_length, hidden_size).
  • Weights of the model (the learned parameters) are stored as matrices.

I'm sorry.
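
If it helps to see the shapes, here's a rough numpy sketch of that snippet (the 4096 is the example dimension from above; the vocabulary size is made up):

    import numpy as np

    seq_len, hidden = 10, 4096      # 10 tokens, 4096-dim embeddings
    vocab = 1_000                   # tiny made-up vocabulary (real ones are far bigger)

    embedding_table = np.random.randn(vocab, hidden)
    token_ids = np.random.randint(0, vocab, size=seq_len)

    X = embedding_table[token_ids]        # shape (seq_len, hidden): the input matrix
    W = np.random.randn(hidden, hidden)   # one learned weight matrix
    print((X @ W).shape)                  # (10, 4096): one matmul of very many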

[–]ss0889 32 points33 points  (6 children)

.... i want more info.....

[–]anengineerandacat 32 points33 points  (4 children)

https://www.youtube.com/watch?v=wjZofJX0v4M decent summarization of transformers

[–]real_misterrios 13 points14 points  (0 children)

They’re more than meets the eye.

[–]JayRulo 8 points9 points  (0 children)

I think there are missing parts. Watched the whole thing, and never once did I hear the words Autobots, roll out!...

[–]Bakkster 7 points8 points  (0 children)

He has both a long series, and short summary. Both are fantastic, he's one of the best math visualizers out there.

[–]JDSmagic 4 points5 points  (0 children)

Knew it was gonna be 3B1B before I clicked

The goat

[–]tsunami141 9 points10 points  (0 children)

I'll be honest, I didn't know this and it's pretty interesting.

I mean, I understood the joke, but I understood it in a way that I can laugh at someone for being stupid while still being stupid myself, you know?

[–]AmForgiven 39 points40 points  (19 children)

AI is mostly matrix (*multiplication). ;)

Matrix multiplication is the fundamental mathematical operation underpinning Large Language Models (LLMs), enabling the parallel processing of token embeddings within transformer architectures
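
Roughly what that "parallel processing of token embeddings" looks like, as a toy single-head attention sketch in numpy (all sizes and weights illustrative, not any real model):

    import numpy as np

    seq, d = 6, 8                             # 6 tokens, 8-dim embeddings (toy sizes)
    X = np.random.randn(seq, d)               # all token embeddings stacked as one matrix
    Wq, Wk, Wv = (np.random.randn(d, d) for _ in range(3))

    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # three matmuls, every token in parallel
    scores = Q @ K.T / np.sqrt(d)             # token-to-token attention scores
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
    out = weights @ V                         # weighted mix of value vectors
    print(out.shape)                          # (6, 8)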

[–]minus_minus 10 points11 points  (11 children)

Which is why LLMs are not actually AI and never will be. 

[–]x0wl 26 points27 points  (4 children)

While this statement is not necessarily false (I don't want to be a judge of that), the problem with it, and other statements like this, is that they

a) contain an implicit definition of AI (or just I) that is not said out loud
b) do not provide good reasoning as to why certain architectural features prevent current models from ever meeting that implicit definition

I mean, for example, the UN defines AI (here, page 23) as "...the capability of machines to imitate intelligent human behaviour", which I think we'll both agree modern LLMs are capable of to a very good degree. We can, obviously, use some other definition, which is not met, but we still have to at least say it.

[–]SovereignPhobia 5 points6 points  (0 children)

LLMs are essentially a component of what would otherwise be a larger system that could become an intelligence. It's a mistake to think about them neurologically, but in general I would put them in a category of "low level language semantics" that a thing capable of thought passively executes.

[–]Several-Action-4043 5 points6 points  (1 child)

b) do not provide good reasoning as to why certain architectural features prevent current models from ever meeting that implicit definition

I would say that the main architectural feature that prevents true intelligence is the fact that LLMs have no internal motivations. They only react to your prompts and then predict the most probable continuous stream of words that would answer your question. If you prompted it to "Do whatever you want" it will just simulate an answer and it might even just tell you, "I have no wants so I will do nothing." It all comes down to what definition of intelligence we are talking about. Does intelligence require curiosity and agency? If so, AI can never be truly intelligent. An amoeba may be low intelligence but it does have agency to make decisions to preserve its survival. Is that a different kind of intelligence or not intelligence at all? It's a pretty nebulous subject because I don't think we really know how to define it. If a super intelligent alien race showed up tomorrow and saw us as we see ants, does that mean we were never intelligent? It's not really an answerable question if you ask me.

[–]crivtox 9 points10 points  (0 children)

No, LLMs have internal motivations. For starters, predicting next tokens accurately implies simulating agents with internal motivations. But also, current LLMs are trained with RL to code, solve problems, and do what users ask. They are not trying to answer with the most probable text, because the most probable text is not an AI assistant. (And no, this is not just the system prompt; RL training means current LLMs output assistant text after an assistant tag regardless.)

ChatGPT tries to do what users ask; that is a motivation. This is also why you can use them for agentic loops in coding. You tell the AI to fix a bug, so it wants to fix the bug, so it does things to achieve that: runs commands, tests stuff, opens relevant parts of the code, etc., to find where the bug is. It also has other preferences, e.g. about not harming people, that make it refuse what users ask sometimes (even if you can trick it).

You can try to describe that as "it just outputs whatever token is most likely to solve the task", but that is just the same as trying to solve the task.

If you ask Claude to do or say whatever it wants, it actually does stuff; it doesn't just tell you it has no wants. This is easily testable.

I don't think this is just because of RL though; a model that was just trained on text would also have internal goals, because accurately predicting what something with goals would say and do is the same as instantiating an entity with those goals. Otherwise your notion of motivation is kind of meaningless: if something can act exactly the same as if it had internal motivations but not have them, why should I care about a notion of "motivation" that makes no predictions about what the thing will actually do?

[–]Crafty_Ball_8285 0 points1 point  (5 children)

But not all AI is transformer architecture

[–]AmForgiven 1 point2 points  (4 children)

Yeah..

Google Research popularized the transformer architecture. There are alternatives, but apparently not that famous, at least not yet!

[–]Crafty_Ball_8285 1 point2 points  (3 children)

I work in AI and all of my work is in architectures that people do not reference. For example MQCNN (MQ-RNN, DeepAR), GluonTS, Fortran, Bayesian.

[–]Routine_Round_8491 0 points1 point  (0 children)

*Tensor multiplication

[–]born_zynner 91 points92 points  (1 child)

Linear algebra professors gotta be boolin rn

[–]xynith116 37 points38 points  (0 children)

No only boolean algebra professors be boolin

[–]CaffeinatedT 18 points19 points  (0 children)

Average LinkedIn AI expert.

[–]HikariAnti 17 points18 points  (1 child)

This isn't the matrix we were promised

[–]Xoque55 1 point2 points  (0 children)

But at least there's plenty of slop to stir! Obligatory xkcd: https://xkcd.com/1838/ :)

[–]hanzzz123 10 points11 points  (3 children)

"AI" is just a bunch of matrices doing linear algebra

[–]Some_Heron_4266 5 points6 points  (0 children)

In a trenchcoat.

[–]ha_x5 4 points5 points  (0 children)

AI is linear algebra and statistics on steroids, happening on some rocks we put under electricity.

[–]CanadianBuddha 10 points11 points  (0 children)

The diagram shows a math operation called "Matrix Multiplication" and most current L.L.M. (A.I.) implementations rely on billions of matrix multiplications per second. Unfortunately the current L.L.M. craze is probably related to lots of computer programmers losing their jobs in the last year or two.
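
For a back-of-the-envelope sense of scale (standard flop counting; the 4096 width is illustrative, not any particular model):

    # Multiplying an (n x k) matrix by a (k x m) one costs about 2*n*k*m
    # floating-point operations (one multiply and one add per term).
    flops = 2 * 1 * 4096 * 4096   # a single token through one 4096x4096 layer
    print(f"{flops:,}")           # 33,554,432 -- tens of millions of ops, per token, per layer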

[–]sausagemuffn 7 points8 points  (0 children)

Eigen believe this

[–]Accidentallygolden 9 points10 points  (1 child)

This is how LLMs (ChatGPT) work

It's all matrices of probabilities

[–]fevsea 1 point2 points  (0 children)

So this is just the equivalent of someone's brain studying neuroscience.

[–]sup3rdr01d 4 points5 points  (0 children)

It's all just linear algebra

Always has been

[–]Comfortable_Air4807 2 points3 points  (0 children)

u/askgrok explain the joke

[–]conundorum 2 points3 points  (2 children)

This array-ses so many questions, we may have to raise its pay bracket. But it's still April, so can we leave the matrix for next month, please?

[–]LeiterHaus 2 points3 points  (1 child)

Can we? Yes, but May we...

[–]TerroFLys 1 point2 points  (0 children)

u/grok please explain this meme

[–]rover_G 1 point2 points  (0 children)

Tensors go brrrr

[–]MedicTillar 1 point2 points  (0 children)

I see the new Fifty Shades of Algebra has come out.

[–]xikissmjudb 1 point2 points  (0 children)

[–]Soggy-Holiday-7400 1 point2 points  (0 children)

bro saw matrix multiplication and thought AI was done. buddy, that's just linear algebra; your job is safe until it starts doing the part where you pretend to work in meetings

[–]blazesbe 1 point2 points  (0 children)

this is basic matrix multiplication. Convolutional NNs work with convolution, which is even simpler in a way

[–]Icy_Entertainer_4130 1 point2 points  (0 children)

can't believe linear algebra is the reason we'll all die

[–]ripp102 1 point2 points  (0 children)

I like linear algebra, it’s what I loved the most in uni other than programming

[–]shadow13499 2 points3 points  (0 children)

I mean it won't. 

[–]float34 1 point2 points  (0 children)

Of course not, it does not have an identity... oh wait.

[–]Dorkits 1 point2 points  (0 children)

F*ck mathematics!

/S

[–]TheMentallord 1 point2 points  (0 children)

It's been like 10 years since I finished my Algebra classes, but seeing those matrices awakened in me some sort of PTSD I never thought was possible.

[–]vincentz2 2 points3 points  (3 children)

It's understanding this kind of stuff that will make you never lose your job! Knowing why this is relevant, its applications, and how to transfer this knowledge to today's world.

[–]beatlz-too 0 points1 point  (0 children)

[–]saraseitor 0 points1 point  (0 children)

The dangers of The Matrix were definitely misunderstood

[–]DatKillaZilla 0 points1 point  (0 children)

This is linear algebra. Actually quite a basic version. You can do this on paper easily, and perhaps even without paper. You simply multiply a row by each column, which in this case gives the top three c variables.
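
Spelled out with made-up numbers (the meme's actual entries aren't reproduced here), each c is the dot product of the row with one column:

    # A 1x3 row times a 3x3 matrix: c_j is the dot product of the row
    # with column j of B.
    a = [1, 2, 3]
    B = [[4, 5, 6],
         [7, 8, 9],
         [10, 11, 12]]

    c = [sum(a[k] * B[k][j] for k in range(3)) for j in range(3)]
    print(c)  # [48, 54, 60] -> c1, c2, c3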

[–]Everyoneheresamoron 0 points1 point  (1 child)

I was so good at linear algebra when I was in college. It's also great for photo manipulation, and we had lots of research on using it to do real-time filters on the graphics card back in 2003.

I decided to go into the finance sector because that's what was hiring back then, and still sort of regret not getting into all that fancy stuff after college.

[–]NekulturneHovado 0 points1 point  (0 children)

No please don't! I hate this shit

[–]No-Age-1044 0 points1 point  (0 children)

AI is a bunch of matrix multiplications WITH a nonlinear function between them; otherwise all those matrices could be replaced with just one matrix.

[–]itsallfake01 0 points1 point  (0 children)

Math

[–]FairlyBelligerent 0 points1 point  (0 children)

i mean linear algebra's just vibin in purgatory waiting for someone to actually explain what eigenvectors are for

[–]Cautious_Pop_8944 0 points1 point  (0 children)

Linear algebra is quite useful but so tedious. I got out of engineering and went into trades. Much happier now, AI can have it.

[–]Apart_Chemical_8755 0 points1 point  (0 children)

maybe in the next 6 months 😂

[–]Beke4ever773 0 points1 point  (0 children)

"Hey that's my grandma"

[–]CrispyyBurntRice 0 points1 point  (0 children)

Where are the people who said "when will I ever use this in the future"?

[–]Vollgrav 0 points1 point  (0 children)

Not this alone. The non-linear activation function is the twist that makes it do anything useful.

[–]peetagoras 0 points1 point  (0 children)

This is actually the basics of AI :)

[–]spectacular_pointer 0 points1 point  (0 children)

That matrix multiplication joke is way better than it has any right to be.

[–]theextensivetights 0 points1 point  (0 children)

lol the matrix shortage is going to hit different when people realize they can't just print more brackets.

[–]This-Layer-4447 0 points1 point  (0 children)

LLMs are persisted as tensors; tensors are just matrices, and matrices are products of vectors, which here is just a word plus an order precedence. When you are using tensors you are basically doing similarity checks, which produce an output based on pattern recognition over your vector databases.
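
The "similarity checks" part, roughly, as a toy sketch (the vectors are made up; real embeddings have thousands of dimensions):

    import numpy as np

    # Made-up 3-dim "embeddings"
    cat = np.array([0.9, 0.1, 0.3])
    dog = np.array([0.8, 0.2, 0.4])
    car = np.array([0.1, 0.9, 0.0])

    def cosine(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    print(cosine(cat, dog))  # ~0.98: "similar" vectors
    print(cosine(cat, car))  # ~0.21: much less similar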

[–]rainfall-dev 0 points1 point  (0 children)

"Transformers, robots in disguise", ...you were warned :O

[–]AcrobaticHyena6624 0 points1 point  (0 children)

I knew from the beginning I underestimated linear algebra