What unique contributions to math did linear algebra make?

  • #1
NoahsArk
Gold Member
243
22
TL;DR Summary
What unique contributions to math did linear algebra make?
I've been struggling to understand the key insight or insights that linear algebra brought to math, or what problems it made solvable that couldn't be solved before. To make a comparison with calculus, I understand that calculus' two key insights were finding a method to determine the slope at a point on a curved line and finding a way to calculate the area under a curve. With linear algebra it's much vaguer to me what the contributions to math knowledge were. Also, calculus has two founders, Newton and Leibniz, whose work in discovering calculus is preserved, and it's clear what they discovered (derivatives and integrals as described above). I don't know who, if anyone, is the founder of linear algebra, or what their key ideas were in developing it.

First, you have the ancient Chinese manuscript from around 2,000 years ago describing how to solve a system of linear equations; this is used as an early example of linear algebra. Solving systems of linear equations, though, is a subject taught in regular algebra books and doesn't seem to be a unique contribution of linear algebra. Then there's the idea of matrices and vectors and how to multiply and perform other operations with them. While matrices and vectors are interesting to me as a unique way to symbolize groups of numbers, nothing jumps out at me about how or why this was the key insight of linear algebra, if it was the key insight at all.

For example, @Mark44 , I previously was asking for a practical example of how dot product multiplication was useful, and you gave an example:

[Image: Mark44's dot product example: a vector of five product quantities dotted with a vector of unit prices.]


I've been meaning to ask about this for a while: When you multiply these two vectors, you get $15,575 (25×100 + 30×120 + 15×85 + 20×140 + 40×135). This is a problem, though, that can easily be figured out without linear algebra and without vectors. It seems the only unique thing here mathematically is the use of vectors as symbols (assuming those are even unique to linear algebra and weren't already used in other areas of math) to represent repeated multiplication and addition. I don't see where the teeth are, though, in using the vector symbols in the operation as opposed to just multiplying each of the store's products by its price and adding all the dollar amounts together.
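For concreteness, here is that same computation as a minimal Python sketch (the quantities and prices are the ones quoted above; the variable names are just illustrative):

```python
# Dot product from the example: quantity of each product times its unit price.
quantities = [25, 30, 15, 20, 40]
prices = [100, 120, 85, 140, 135]

total = sum(q * p for q, p in zip(quantities, prices))
print(total)  # 15575, the same number as multiplying and adding by hand
```

The arithmetic is indeed nothing new; my question is what the vector packaging buys us.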

Similarly, with matrix multiplication, here we are also just multiplying and adding groups of numbers which we could've done without the language of matrices. Although the idea of a linear transformation, which can be done through matrix multiplication, is also interesting, it seems like this is again just repeated multiplying and adding of numbers.

Thanks
 
  • #2
Linear algebraic groups aka classical Lie groups, linear differential equation systems, tangent vector bundles, differentiation of multivariate functions, analytical geometry, mechanics, tensor algebras, Graßmann algebras, Lie algebras, Kolmogorov’s Theorem, and so on and so on. Linear algebra is basic to so many fields that there are countless examples.
 
  • Like
Likes pinball1970 and NoahsArk
  • #3
There are many areas of linear algebra that tell you something you can't get otherwise: things like determinants, eigenvalues, eigenvectors, bases, vector spaces, etc. Linear algebra is probably just as important and widespread in its uses as calculus, if not more so, given its use in almost every area of math, science, and computing.

I think the issue is that much of linear algebra that you're used to dealing with doesn't look very 'new' or 'different' compared to how Calculus looks when you learn it. But I think that's just partly an artifact of most people learning some basic concepts of linear algebra before they learn calculus.
 
  • Like
Likes pinball1970 and NoahsArk
  • #4
NoahsArk said:
I don't see where the teeth, are, though, in using the vector symbols in the operation as opposed to just multiplying each of the store's products by its price and adding all the dollar amounts together.
You asked for a real-world application of the dot product, and that's what I gave you.
 
  • Like
Likes NoahsArk
  • #5
A recent example of the effectiveness of linear algebra in solving mathematical problems is here.
There are a few linear algebra solutions in that thread. This one is the most elegant, short, and general.
 
  • Like
Likes NoahsArk
  • #6
@Mark44 Your example was helpful, and I didn't mean to imply that it wasn't. Your example was in the context of a different post, though. In this post I am just trying to understand what's unique about linear algebra compared to other areas of math. I used your example here only because it is basic (which is what I wanted) and helps me frame where I'm struggling, i.e. that linear algebra operations appear to be just repeated multiplication and addition, and therefore I'm not sure why linear algebra is its own subject. I'm trying to get to the heart of what LA is all about and what it can do that can't be done elsewhere. I do have a feeling that it's subtle, and I do believe it's helpful; I just haven't been able to really define how or why, and I think it would give me a deeper understanding of LA if I could do that.
The idea of being able to describe a line in any number of dimensions was neat for me, but I'm not sure if that's LA's main theme or if it's one of them. Thanks
 
  • #7
NoahsArk said:
TL;DR Summary: What unique contributions to math did linear algebra make?
What is your math knowledge/level? The question is labeled "B", which means high school level, but you seem to be asking questions and expecting answers at level "I".
 
  • Like
Likes NoahsArk
  • #8
martinbn said:
What is your math knowledge/level? The question is labeled "B", which means high school level, but you seem to be asking questions and expecting answers at level "I".
I'm pretty much self taught. I didn't study math in college and only started teaching myself afterwards. I got up to calculus and at one point could solve basic derivative and integral problems but am rusty in a bunch of areas. My focus has been LA for the last few months. I'm considering taking courses at a local college to help me stay committed.
 
Last edited:
  • #9
NoahsArk said:
TL;DR Summary: What unique contributions to math did linear algebra make?

I've been struggling to understand what was the key insight or insights that linear algebra brought to math, or what problems it allowed the solving of that couldn't be solved before.
I think the question suffers from a weakness. It appears to me as if you had asked what counting was good for. There is no list of occasions where counting is an improvement over comparing objects with fingers and toes. Any single item on such a list, e.g. "counting improves shopping", would sound almost artificial. We are used to counting, and we use it whenever necessary without even thinking that we do. The better question would be: why is linear algebra so successful that it shows up in basically every single branch of science and engineering?

There is one main reason for this, in my opinion: we live in a world that is large enough that linear approximations to real-life quantities are almost always sufficient! This concerns our direct neighborhood, which is best described by a real vector space in space and time, and we were furthermore lucky once again in that all the smooth functions we otherwise need can locally be approximated by linear tangent spaces. That linearity in space and time isn't global took us 5,000 years to become aware of, and the concept of linear approximations to non-linear functions was the birth of modern science and the reason why we can compute even non-linear equations by linear iterations.

The origin of linear algebra lies historically in the early eighteenth century. People already knew Cartesian coordinates, but they were inconvenient because the axes had often been chosen badly. Even Leibniz had already dreamt of a Geometric Analysis. The first attempt to avoid coordinates was when people began to use complex numbers for the Euclidean plane.

J. Dieudonné said:
The addition of complex numbers as an operation had already been widely used in kinematics and dynamics for the composition of forces and velocities. It is actually quite surprising that no earlier attempt was made to use this operation in algebra.
Dieudonné continues to speculate that perhaps the absence of the concept of a unique origin, the zero of a vector space, might have been a reason. People used so-called free vectors with a variable starting point, as is necessary in mechanics. Bellavitis, not knowing Gauß's notes (which weren't published during Gauß's lifetime), first introduced equivalence classes of "equipollent" vectors in 1832. Their representatives with respect to a unique origin are what we nowadays call vectors. Möbius had already used what he called a "barycentric calculus" in 1823 to perform what we would call vector calculations. As the name suggests, he considered the barycenter of geometric objects as his origin, inspired by mechanics.


Thus, linearity is what surrounds us. It is therefore very likely that we started to investigate it. However, as so often, it began with necessity in physics (forces and velocities) and was then adopted in mathematics to do analytical geometry, complex algebra, and complex calculus. The list Leibniz, Gauß, Bellavitis, and Möbius once more demonstrates that scientific achievements are most often the convergent efforts of many.
 
Last edited:
  • Like
Likes NoahsArk
  • #10
@fresh_42

Do you have any examples of "linear approximations to non-linear functions"? That's a new concept for me, I think.

Much of what I've seen being done in linear algebra doesn't even resemble algebra to me, which is about solving for unknowns, so I'm not even sure why it's called linear algebra. You can do linear transformations, for example, with no algebra.

The concept of using different bases is something that I haven't seen in any other math subject. Is the concept of choosing a suitable basis the key idea behind LA, or one of them?

Not sure I understood the counting example. Thx
 
  • #11
NoahsArk said:
@fresh_42

Do you have any examples of "linear approximations to non-linear functions"? That's a new concept for me, I think.

It is the concept of differentiation, basically the origin of modern sciences. E.g. let ##y=x^2-7x+3.## Then
\begin{align*}
y'&=\dfrac{dy}{dx}=\lim_{\Delta x \to 0}\dfrac{\Delta y}{\Delta x}\\[12pt]&=\lim_{\Delta x \to 0}\dfrac{y(x+\Delta x)-y(x)}{\Delta x}\\[12pt]
&=\lim_{\Delta x \to 0}\dfrac{(x+\Delta x)^2-7(x+\Delta x)+3 - (x^2-7x+3)}{\Delta x}\\[12pt]
&=\lim_{\Delta x \to 0}\dfrac{x^2+2\cdot x\cdot \Delta x +(\Delta x)^2-7x-7\Delta x +3-x^2+7x-3}{\Delta x}\\[12pt]
&=\lim_{\Delta x \to 0}(2\cdot x +\Delta x -7)=2x-7
\end{align*}
For a given point ##p=(x_0,y_0)## we get ##y'(x_0)=2x_0-7##, which is the slope of the tangent at ##p## that approximates the quadratic function ##y.## The equation of the tangent is ##y=y_0+\underbrace{(2x_0-7)\cdot}_{\text{lin. fct.}} (x-x_0).## For ##p=(2,-7)## this looks like


[Image: the graph of ##y=x^2-7x+3## (red) with its tangent line at ##p=(2,-7)## (blue).]

The blue straight line is the tangent to the red graph of ##y## at the point ##p.## It is the beginning of calculus.
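As a quick numerical check of how good this linear approximation is near ##p##, here is a minimal Python sketch (the function and the point are the ones from the derivation above):

```python
# Compare y = x^2 - 7x + 3 with its tangent line at x0 = 2.
def f(x):
    return x**2 - 7*x + 3

def tangent(x, x0=2.0):
    # y = y0 + (2*x0 - 7) * (x - x0), the linear approximation at x0
    return f(x0) + (2*x0 - 7) * (x - x0)

for dx in (1.0, 0.1, 0.01):
    x = 2.0 + dx
    print(dx, f(x), tangent(x), f(x) - tangent(x))
# The error f(x) - tangent(x) equals dx^2 here, shrinking quadratically as dx -> 0.
```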

A more pragmatic example of how linear concepts are used to solve non-linear problems is the Newton-Raphson algorithm to find roots numerically. Note the following remark:
Wikipedia said:
A special case of Newton's method for calculating square roots was known since ancient times and is often called the Babylonian method.
It seems the idea is far older than I thought!
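For illustration, here is the Babylonian method as a minimal Python sketch (the starting value and tolerance are arbitrary choices):

```python
# Babylonian method: Newton's iteration x_{n+1} = (x_n + a/x_n) / 2 for sqrt(a).
# Each step replaces the curve y = x^2 - a by its tangent line and solves that
# linear equation instead: a linear tool applied to a non-linear problem.
def babylonian_sqrt(a, x=1.0, tol=1e-12):
    while abs(x * x - a) > tol:
        x = (x + a / x) / 2
    return x

print(babylonian_sqrt(2))  # 1.4142135623730951
```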

NoahsArk said:
Much of what I've seen being done in linear algebra to me doesn't even resemble algebra, which is about solving for unknowns, so I'm not even sure why it's called linear algebra? You can do linear transformations, for example, with no algebra.

That solely depends on what you mean by algebra. Linear algebra solves (sets of) linear equations. Solving equations of higher order immediately leads deep into abstract algebra in general, and algebraic geometry in particular. Things become complicated quickly with more than one variable of degree two or higher.

Linearity is what we can compute. For non-linearity, we have two possibilities: either it is really difficult (and ##NP\neq P##, see https://www.physicsforums.com/insights/p-vs-np-conjecture-calculations-and-meaning/), or we are still too stupid to deal with it. People think the first is the case, but I'm not 100% convinced.

Linearity plays a crucial role in physics. It is the framework for mechanics and what we can handle in all other cases. The basic concept of the example above (generalized in many, many ways) can be found in general relativity as well as in quantum field theory.

NoahsArk said:
The concept of using different bases is something that I haven't seen in any other math subject.

Well, it's more the domain of physics. Physicists always deal with coordinates, often several of them simultaneously, because they need to measure things. Mathematicians usually try to avoid coordinates, which is why your perspective on linear algebra right now might be a bit theoretical and narrow. Different bases are important. Imagine a rotation. Whether you sit in a carousel or stand nearby watching makes a difference, although it is the same phenomenon. It is still the old problem that we had with the "free vectors". Many problems in mathematics (and physics) amount to finding suitable bases. However, the examples I could tell you are too complicated, so I leave you with the image of the carousel.


NoahsArk said:
Is the concept of choosing a suitable basis the key idea behind LA, or one of them?

No. The key is the linearity of the objects, i.e. vectors, and of the functions, i.e. linear transformations. A vector space has a basis, but we do not necessarily calculate with a basis. E.g. functions form a vector space via
\begin{align*}
(f+g)(x)&=f(x)+g(x)\\
(c\cdot f)(x)&= c\cdot f(x)
\end{align*}
with the zero function ##x\mapsto 0## as origin, but nobody (with exceptions) cares about a basis. Yes, this is an infinite-dimensional example. Polynomials of degree less than a certain number would be a finite-dimensional example.
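As a minimal Python sketch of those pointwise rules (the function names are just illustrative):

```python
import math

# Functions form a vector space under pointwise addition and scaling.
def add(f, g):
    return lambda x: f(x) + g(x)       # (f + g)(x) = f(x) + g(x)

def scale(c, f):
    return lambda x: c * f(x)          # (c * f)(x) = c * f(x)

zero = lambda x: 0                     # the zero function, the origin of the space

h = add(math.sin, scale(3, math.cos))  # the "vector" sin + 3*cos
print(h(0.0))                          # 3.0, since sin(0) + 3*cos(0) = 3
```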

NoahsArk said:
Not sure I understood the counting example. Thx
Counting is something we do without thinking about it. Linear algebra is a toolset that scientists use without thinking about it.
 
  • Like
Likes NoahsArk
  • #12
NoahsArk said:
Solving systems of linear equations, though, is a subject taught in regular algebra books
OK, go ahead and solve 12 equations in 12 unknowns without linear algebra. Good luck!
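Meanwhile a computer, armed with linear algebra, dispatches such a system instantly. A minimal NumPy sketch (the coefficients are random, hence almost surely non-singular):

```python
import numpy as np

# Solve A x = b for a random 12x12 system.
rng = np.random.default_rng(0)
A = rng.random((12, 12))      # random coefficients: almost surely invertible
b = rng.random(12)

x = np.linalg.solve(A, b)     # LAPACK does the elimination for us
print(np.allclose(A @ x, b))  # True: the solution checks out
```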
 
  • Like
Likes Mark44 and NoahsArk
  • #13
@Vanadium 50

This is not a rhetorical question and is something I’ve been wondering about: Can you (or anyone) please explain what a linear algebra technique for solving systems of equations does that we can’t already do with the regular elimination method taught in regular algebra? I can do Gaussian elimination (i.e. by creating an augmented matrix in echelon form). While I do see that it’s a neat method, and that it’s convenient to be able to do this with a matrix without needing to write all the variables, I don’t see the elimination part as conceptually much different from what’s being done in regular algebra.

I’m still struggling to be able to express in a sentence or two what linear algebra is about. I can do that with algebra, trig, geometry, and calculus but not with LA, and I haven’t seen the big picture idea of LA expressed in a way I can fully understand. All I’m really looking to do with my question is to see the bigger picture and not miss the forest for the trees. I could see myself getting a good grade in an LA course (which I might be taking soon at a local university) by just learning the rules without deeply understanding them, but I want to really understand what’s going on.
 
Last edited:
  • #14
Gaussian elimination is a product of linear algebra. The elimination method is justified because it does not change the linear dependence/independence relations of the given system; in other words, the system remains equivalent to the initial system after every step of the elimination process. Gaussian elimination is "a way to solve" but does not immediately answer whether something is solvable at all. The Kronecker-Capelli theorem gives a readily verifiable condition to check for solvability; again, linear algebra.
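A quick NumPy sketch of the Kronecker-Capelli criterion (the example matrix is just an illustration): ##Ax=b## is solvable iff ##\operatorname{rank}(A)=\operatorname{rank}([A\mid b])##.

```python
import numpy as np

# Kronecker-Capelli: A x = b is solvable iff rank(A) == rank([A | b]).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])              # second row is twice the first: rank 1
b_bad  = np.array([1.0, 3.0])           # inconsistent right-hand side
b_good = np.array([1.0, 2.0])           # consistent right-hand side

def solvable(A, b):
    aug = np.column_stack([A, b])       # the augmented matrix [A | b]
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(aug)

print(solvable(A, b_bad))   # False: no solution exists
print(solvable(A, b_good))  # True: a whole line of solutions (rank 1 < 2 unknowns)
```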

Linear algebra is like a basic building block. Virtually anything we encounter either applies LA or is generalised from LA (category theory mostly).
 
  • Like
Likes NoahsArk
  • #15
NoahsArk said:
@Vanadium 50

This is not a rhetorical question and is something I’ve been wondering about: Can you (or anyone) please explain what a linear algebra technique for solving systems of equations does that we can’t already do with the regular elimination method taught in regular algebra? I can do Gaussian elimination (i.e. by creating an augmented matrix in echelon form). While I do see that it’s a neat method, and that it’s convenient to be able to do this with a matrix without needing to write all the variables, I don’t see the elimination part as conceptually much different from what’s being done in regular algebra.
This is linear algebra!
NoahsArk said:
I’m still struggling to be able to express in a sentence or two what linear algebra is about.
Why do you want that?
NoahsArk said:
I can do that with algebra, trig, geometry, and calculus...
I doubt it. Can you do it with algebra?
NoahsArk said:
but not with LA, and I haven’t seen the big picture idea of LA expressed in a way I can fully understand. All I’m really looking to do with my question is to see the bigger picture and not miss the forest for the trees.
Study linear algebra first. You will see the big picture later.
NoahsArk said:
I could see myself getting a good grade in an LA course (which I might be taking soon at a local university) by just learning the rules without deeply understanding them, but I want to really understand what’s going on.
 
  • Like
Likes NoahsArk
  • #16
As pointed out, "regular" elimination is a consequence of linear algebra.

But as a practical matter, as I said, try solving very large systems using only freshman algebra. It will take you a very long time.
 
  • Like
Likes NoahsArk
  • #17
@martinbn

"Study linear algebra first. You will see the big picture later."

That makes sense.
 
  • #18
NoahsArk said:
I’m still struggling to be able to express in a sentence or two what linear algebra is about.
Solving systems of linear equations and working with matrices, vectors, vector spaces, and linear transformations.
 
  • Like
Likes NoahsArk
  • #19
NoahsArk said:
This is not a rhetorical question and is something I’ve been wondering about: Can you (or anyone) please explain what a linear algebra technique for solving systems of equations does that we can’t already do with the regular elimination method taught in regular algebra? I can do Gaussian elimination (i.e. by creating an augmented matrix in echelon form). While I do see that it’s a neat method, and that it’s convenient to be able to do this with a matrix without needing to write all the variables, I don’t see the elimination part as conceptually much different from what’s being done in regular algebra.
It is the same method written a bit differently. The only major difference is that doing it by hand can be shorter, thanks to individual inspection of the resulting equations. However, Gauß elimination is easy to implement on a computer, which then does all the work for you, e.g. https://www.symbolab.com/solver/matrix-calculator.

If I had to deal with a large matrix, i.e. with many equations and many variables, I think I would prefer the machine over my scribblings.
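In fact, the core of the algorithm fits in a few lines. A minimal Python sketch (partial pivoting, no error handling; production code would call a library routine instead):

```python
import numpy as np

def gauss_solve(A, b):
    """Gaussian elimination with partial pivoting and back substitution."""
    A, b = A.astype(float), b.astype(float)   # work on copies
    n = len(b)
    for k in range(n):
        p = k + np.argmax(np.abs(A[k:, k]))   # pivot: largest entry in column k
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):             # eliminate entries below the pivot
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):            # back substitution, bottom row up
        x[i] = (b[i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    return x

print(gauss_solve(np.array([[2.0, 1.0], [1.0, 3.0]]),
                  np.array([3.0, 5.0])))      # [0.8 1.4]
```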

NoahsArk said:
I’m still struggling to be able to express in a sentence or two what linear algebra is about.
It is all about vector spaces, e.g. the one we all live in: height, width, length. It is about forces and velocities. It is about the direction you fly off the road if your car loses traction for whatever reason. Again, it is about linearity and linear equations.

Try to understand why a rotation
$$
\begin{pmatrix}x\\y\end{pmatrix}\longmapsto \begin{pmatrix}\cos \varphi &-\sin \varphi \\ \sin \varphi & \cos \varphi \end{pmatrix}\cdot \begin{pmatrix}x\\y\end{pmatrix}
$$
is called a linear transformation! Since when is a rotation linear?! Or why is the derivative a linear approximation? ##(x^3)' =3x^2## is a derivative, but hardly a linear function. Try to resolve this apparent contradiction. Both examples show you that even a non-linear subject can be approached by methods of linear algebra. Hint: look closer at the example I gave in post #11.
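A quick numerical illustration of that linearity, as a minimal NumPy sketch (the angle and the vectors are arbitrary choices):

```python
import numpy as np

# A rotation is linear: R(u + v) = R u + R v and R(c u) = c (R u).
phi = 0.7
R = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])

print(np.allclose(R @ (u + v), R @ u + R @ v))    # True
print(np.allclose(R @ (2.5 * u), 2.5 * (R @ u)))  # True
```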

NoahsArk said:
I can do that with algebra, trig, geometry, and calculus but not with LA, and I haven’t seen the big picture idea of LA expressed in a way I can fully understand.

Yes, but that is because you haven't tried to resolve the problems I mentioned, which point to contradictions that aren't really contradictions.

NoahsArk said:
All I’m really looking to do with my question is to see the bigger picture and not miss the forest for the trees.
Forces and velocities. Approach linear algebra the same way as people did historically. You haven't seen enough trees to recognize it's a forest! E.g. this entire article is about such trees you haven't seen yet: https://www.physicsforums.com/insights/introduction-to-the-world-of-algebras/
It is entirely based on linear algebra since every definition of an algebra ##A## starts with the sentence "let ##A## be a vector space".


NoahsArk said:
I could see myself getting a good grade in an LA course (which I might be taking soon at a local university) by just learning the rules without deeply understanding them, but I want to really understand what’s going on.

Very good! And that is not meant ironically! It is important to have such a goal. I recall an oral exam for which I had to keep the record. The student had been asked what a linear transformation is. She could recite the definition without any mistakes, but when asked what a linear transformation actually does, or to name an example, all she could do was repeat the definition. I'm a bit curious whether she ever recognized why she didn't receive an A.

Linear algebra is all about arrows and the transformation of arrows. Simple as that. However, if you allow additional properties (see the second link above) then those arrows can become quite complicated.
 
Last edited:
  • #20
The three classical geometry problems, doubling the cube, trisecting the angle, and squaring the circle, have no compass-and-straightedge solutions. The proofs of that use field extensions, which rely on linear algebra. So linear algebra did contribute to the resolution of these famous problems. Of course, the ancient Greeks had no idea of these methods.
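Sketched in one line for doubling the cube: every constructible length lies in a tower of quadratic field extensions of ##\mathbb{Q}##, so its degree over ##\mathbb{Q}## is a power of two, whereas
$$
[\mathbb{Q}(\sqrt[3]{2}):\mathbb{Q}] = 3,
$$
which is not a power of two. The "degree" of a field extension is precisely its dimension as a vector space over the base field, and that is exactly where linear algebra enters the proof.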
 
