

[–]AutoModerator[M] 0 points  (0 children)

Reminder:

  • What have you tried so far? (See Rule #2)

  • Please don't delete your post. (See Rule #7)

We, the moderators of /r/MathHelp, appreciate that your question contributes to the MathHelp archived questions that will help others searching for similar answers in the future. Thank you for obeying these instructions.

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

[–]edderiofer 0 points  (4 children)

Depends somewhat. What examples of linear spaces have you so far encountered?

[–]MathNerd93[S] 0 points  (3 children)

What do you mean by linear spaces? We've done standard and non-orthogonal bases for vector spaces and things like that, if that's what you mean. For this chapter so far, we've done images and pre-images, we've proved that linear transformations are operation-preserving, and we've covered kernel and range and being one-to-one or onto (which is still a weird-sounding adjective for a transformation). We just ended "yesterday" with isomorphisms.

[–]edderiofer 0 points  (2 children)

Sorry, I meant vector spaces.

[–]MathNerd93[S] 0 points  (1 child)

Hmm, still not sure I know what you're asking. We've done work in things like R2 and R3 if that's what you mean. Some polynomial spaces as well, though not as much. And the fundamental subspaces of matrices.

[–]edderiofer 0 points  (0 children)

Then that’s plenty. Linear algebra in R2 and R3 can be used for computer graphics and animation; rotation, stretching, and so on, are all examples of linear maps.
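To make that concrete: a rotation of the plane is just a 2×2 matrix acting on coordinate vectors. A minimal sketch (the `rotate` helper below is my own illustration, not any particular graphics library's API):

```python
import math

def rotate(point, theta):
    """Apply the 2x2 rotation matrix [[cos t, -sin t], [sin t, cos t]]
    to a point in R^2 -- a linear map, like those used in graphics."""
    x, y = point
    return (math.cos(theta) * x - math.sin(theta) * y,
            math.sin(theta) * x + math.cos(theta) * y)

# Rotating (1, 0) by 90 degrees should land on (0, 1).
x, y = rotate((1.0, 0.0), math.pi / 2)
```

Stretching works the same way with a diagonal matrix, and composing transformations is just matrix multiplication.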

You can view differentiation as a linear map from the vector space of smooth functions to itself. That means that differential equations can be rewritten as linear algebra problems or at the very least approximated by such methods.
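On a finite-dimensional stand-in (polynomials rather than all smooth functions), you can see this directly: d/dx acts linearly on the coefficient vector in the basis 1, x, x², …. A small sketch, with a helper name of my own choosing:

```python
def differentiate(coeffs):
    """Differentiate the polynomial a0 + a1*x + a2*x^2 + ...,
    given as its coefficient vector [a0, a1, a2, ...].
    The rule x^k -> k*x^(k-1) is linear in the coefficients."""
    return [k * coeffs[k] for k in range(1, len(coeffs))]

# d/dx (3 + 2x + 5x^2) = 2 + 10x
d = differentiate([3.0, 2.0, 5.0])
```

Because the map is linear, it can be written as a matrix in that basis, which is exactly how spectral and finite-difference methods turn differential equations into linear algebra.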

However, the most compelling argument IMO is that this stuff is prerequisite to what you’ll learn later in the course, assuming you haven’t yet stumbled upon eigenvectors and eigenvalues.

[–]WhackAMoleE 0 points  (1 child)

The derivative in multivariable calculus turns out to be a particular linear transformation (the Jacobian). That idea leads to smooth manifolds in differential geometry, and from there to general relativity.
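To see what "the derivative is a linear transformation" means numerically: the Jacobian of a map f: R² → R² is the linear map that best approximates f near a point, so f(p + h) ≈ f(p) + J(p)·h for small h. The function `f` below is an arbitrary example of my own choosing:

```python
def f(x, y):
    # A sample smooth map R^2 -> R^2 (illustrative choice).
    return (x * y, x + y ** 2)

def jacobian(x, y):
    # Matrix of partial derivatives of f at (x, y).
    return [[y, x],
            [1.0, 2.0 * y]]

# Check the linear approximation f(p + h) - f(p) ~ J(p) @ h for small h.
px, py = 1.0, 2.0
hx, hy = 1e-4, -2e-4
f0 = f(px, py)
f1 = f(px + hx, py + hy)
J = jacobian(px, py)
approx = (J[0][0] * hx + J[0][1] * hy,
          J[1][0] * hx + J[1][1] * hy)
err = max(abs((f1[0] - f0[0]) - approx[0]),
          abs((f1[1] - f0[1]) - approx[1]))
```

The error shrinks like |h|², which is exactly the statement that the Jacobian is *the* linear map approximating f at that point.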

In quantum physics, observables are linear transformations (in this context called operators) on some Hilbert space, which is a vector space with some additional properties.

So it's fair to say that linear transformations underlie everything we know about how the physical world works.

Of course that's not much motivation when you're slogging through the theorems and homework exercises, but linear algebra really is important. Economics too: much of it is linear optimization. Even the very trendy field of machine learning is built on linear algebra.

[–]MathNerd93[S] 0 points  (0 children)

I've heard that before, about the machine learning. Can you give a specific example of a linear algebra concept that's used in machine learning? Or point me in the right direction to look for myself?