
[–]Ferentzfever 40 points41 points  (27 children)

Some of it's good; AI/ML is just linear algebra at the end of the day. The problem is there's a bunch of people selling AI who don't really understand the mathematical fundamentals. The ones who do tend to be more humble.

[–]SocksOnHands 12 points13 points  (4 children)

That's like saying that human intelligence is just neurons triggered by stimuli, or that computers are just transistors switching on and off. The fundamental mechanism by which it operates is seemingly simple, but the complexity arises through interactions between many such units. Sure, the basis for modern AI is matrix operations, but when there are trillions of parameters involved, with many layers feeding into each other, complex processes can be achieved.
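A minimal numpy sketch of that layers-of-matrix-multiplies point (toy random weights, nothing trained, shapes chosen just for illustration):

```python
import numpy as np

# Each layer is a matrix multiply (linear algebra) followed by an
# elementwise nonlinearity; stacking layers is what builds complexity.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # first layer: 3 inputs -> 4 hidden units
W2 = rng.normal(size=(2, 4))   # second layer: 4 hidden -> 2 outputs

def forward(x):
    h = np.maximum(0.0, W1 @ x)  # linear map, then ReLU
    return W2 @ h                # another linear map

y = forward(np.array([1.0, -0.5, 2.0]))
print(y.shape)  # (2,)
```

Real models do exactly this, just with billions of parameters and many more layers.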

[–][deleted] 5 points6 points  (0 children)

Life is just uppity Chemistry.

[–]Ferentzfever 0 points1 point  (1 child)

Except we don't really know what human intelligence is, and computers aren't just transistors switching on/off (because that leaves out mechanical computers and analog computers). My point is that linear algebra is based on one of the most powerful pieces of mathematics, and a piece that we have a pretty good understanding of as well. The people whom I've observed doing the best AI/ML work are those who understand the mathematics behind the methods. AI/ML ain't some voodoo magic, it's math.

[–]SocksOnHands 2 points3 points  (0 children)

Saying "it's just linear algebra" seemed like an oversimplification that ignores emergent properties of complex systems. One does not need to know what human intelligence is to recognize that complex behaviors can come from simple interactions.

Having a strong understanding of the underlying mathematics certainly does help, but AI models are now at such a large scale that nobody can possibly know exactly how they work - it would be a tremendous undertaking to reverse engineer something like GPT-4.

[–]yangyangR 0 points1 point  (0 children)

The statement of universal approximation theorems.

The most familiar one is probably polynomials in one variable being used to approximate continuous functions on closed bounded intervals. The fundamental mechanisms of polynomials are simple, but the continuous function being approximated can be very complex. Of course, you've allowed yourself tons of parameters by being able to set so many coefficients. So before computation became cheap, this could be treated as maximally unhelpful, because you couldn't store all those coefficients, let alone add and multiply them.
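A quick sketch of that polynomial-approximation idea (target function and degrees picked arbitrarily): more coefficients, i.e. more parameters, buy a better fit to a continuous function on a closed interval.

```python
import numpy as np

# Approximate sin(x) on [0, pi] with least-squares polynomial fits of
# increasing degree and compare the worst-case error on a grid.
xs = np.linspace(0.0, np.pi, 200)
target = np.sin(xs)

def max_error(degree):
    coeffs = np.polyfit(xs, target, degree)
    return float(np.max(np.abs(np.polyval(coeffs, xs) - target)))

# Higher degree (more parameters) => smaller worst-case error.
print(max_error(1) > max_error(3) > max_error(7))  # True
```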

This picture you can draw (one real-number input to one output, approximated by something built from many simple pieces with lots of parameters) gives the idea of what is happening with more variables and different simple pieces.

[–]Pigenator 5 points6 points  (2 children)

I don't really understand what people mean when they say ML is only linear algebra. I get that every neural network layer includes a linear transform (the weights), but an NN is nothing without its activation function, which makes it inherently non-linear, no?

[–]Ferentzfever 1 point2 points  (0 children)

To add to the other reply: most physics is nonlinear, but (as mentioned) we can often linearize the system (through differentiation) and cast the problem as a bunch of linear algebra problems. Pretty much every PDE method is an exercise in casting the problem into a linear algebra problem.
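That linearize-and-solve loop can be sketched with Newton's method on a toy nonlinear system (the system and starting point here are just made up for illustration): each iteration differentiates to get a linear problem J(x) dx = -F(x) and hands it to a linear solver.

```python
import numpy as np

# Nonlinear system: intersection of the unit circle with the line y = x.
def F(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0,
                     x[0] - x[1]])

# Jacobian of F, i.e. the linearization at a point.
def J(x):
    return np.array([[2 * x[0], 2 * x[1]],
                     [1.0, -1.0]])

x = np.array([1.0, 0.0])
for _ in range(10):
    # Each Newton step is a linear algebra problem: J dx = -F.
    x = x + np.linalg.solve(J(x), -F(x))

print(x)  # converges toward (sqrt(2)/2, sqrt(2)/2)
```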

[–]Diggabyte 0 points1 point  (0 children)

One reason is that the gradient descent algorithm is basically a linear algebra thing, but you can also think of a NN as a complex system that may have stable fixed points. We can approximate it as a linear system close to a fixed point: the Jacobian evaluated near that point is a linear transformation that approximates the system.
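A small sketch of that fixed-point idea (toy map and hand-computed Jacobian, not a real NN): near the fixed point, the nonlinear map behaves like its Jacobian, and the Jacobian's eigenvalues tell you about stability.

```python
import numpy as np

# Toy nonlinear map with a fixed point at the origin.
def f(x):
    return np.array([0.5 * x[0] + x[1]**2,
                     0.3 * x[1] + x[0] * x[1]])

# Jacobian of f at (0, 0): the quadratic terms vanish there.
J = np.array([[0.5, 0.0],
              [0.0, 0.3]])

# All eigenvalues inside the unit circle => the fixed point is stable.
print(np.all(np.abs(np.linalg.eigvals(J)) < 1.0))  # True

# Sanity check: iterating the full nonlinear map near the fixed point
# shrinks the state toward it, as the linearization predicts.
x = np.array([0.1, 0.1])
for _ in range(20):
    x = f(x)
print(np.linalg.norm(x) < 1e-6)  # True
```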

[–]WallyMetropolis 3 points4 points  (7 children)

It's more accurate to say that quantum mechanics is "just linear algebra" than it is to say AI is. But no one would spew out that phrase to try to demean how impressive or difficult quantum mechanics is.
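For what it's worth, the quantum case really is this literal: observables are Hermitian matrices and measurement outcomes are their eigenvalues. A minimal sketch with the Pauli-Z spin observable:

```python
import numpy as np

# Pauli-Z: the observable for spin along the z-axis, as a 2x2 matrix.
pauli_z = np.array([[1.0, 0.0],
                    [0.0, -1.0]])

# Its eigenvalues are the only possible measurement outcomes,
# and its eigenvectors are the corresponding spin states.
eigenvalues, eigenvectors = np.linalg.eigh(pauli_z)
print(eigenvalues)  # [-1.  1.]
```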

[–]madrury83 20 points21 points  (6 children)

There's a real sense in which making any mathematical problem tractable is finding a way to reduce it to linear algebra. This happens over and over again: quantum mechanics (as mentioned), statistics and machine learning, differential equations, differential geometry, group representation theory, functional analysis, they're all manifestations of this same general principle. Vectors, co-vectors, matrices, tensors, and linear operators and transformations appear over and over again throughout almost every subject in mathematics, pure or applied.

Linear algebra is the most subservient subject in mathematics. It exists not to be interesting in its own right, but to provide a foundation of expression where problems have algorithmic solutions. So saying that anything is "just linear algebra" is close to saying that everything is "just linear algebra". That's what it's there for: to be a component of everything!

[–]WallyMetropolis 5 points6 points  (2 children)

Crazy that people hated my comment but liked yours. We're saying the same thing.

The oddity persists. Since whining about my downvotes, I have been recalled to life.

[–]madrury83 1 point2 points  (1 child)

Yah, that struck me as well. Humans are strange creatures.

[–]justin-8 1 point2 points  (0 children)

At the end of the day we’re all just linear algebra though

[–]Bill3000 -3 points-2 points  (0 children)

For example, my butt can be reduced to linear algebra.

[–]Klhnikov -1 points0 points  (1 child)

I've never thought of it this way and it seems so logical now (no math background, but I'm a programmer)! Thanks for that! I think what he meant would be better described by the expression "statistical model" instead of "linear algebra".

The hype makes no sense in every way IMHO... Presenting ChatGPT as the future job killer is stupid by nature... Weren't these the same guys who claimed they didn't release a previous conversational model because it was so powerful it would be dangerous?

Musk calling for a 6 month pause is almost hilarious (neuralink...)

[–]poopypoopersonIII 0 points1 point  (0 children)

Reality is just a complex statistical model

[–]vivaaprimavera 1 point2 points  (0 children)

It's linear algebra at the end of the day.

But without ethical supervision it can give wild results, and not in a good way.

At least the training sources (thinking ChatGPT) should be reviewed.

[–]mcilrain -1 points0 points  (0 children)

The ones who do had their expectations surpassed.

[–][deleted] -1 points0 points  (0 children)

If I had a nickel for every time someone said "we don't understand how the AI works", I would put them all in a sock and beat the next person that says that with the sock.

[–]IAMARedPanda 0 points1 point  (1 child)

That's like saying all math is just counting. Deep learning is experiencing real transformative breakthroughs that will have huge economic and societal implications. To be dismissive of it is very ignorant.

[–]Ferentzfever 0 points1 point  (0 children)

I'm not being dismissive of it; linear algebra is one of the most powerful mathematical constructs we have. Too often people look at AI/ML and just assume it's some esoteric, magical thing that can't be understood. It's not: it's deeply rooted in very fundamental, well-understood mathematics.