Linear algebra Definition and 999 Threads

Linear algebra is the branch of mathematics concerning linear equations such as:

##a_{1}x_{1}+\cdots +a_{n}x_{n}=b,##
linear maps such as:

##(x_{1},\ldots ,x_{n})\mapsto a_{1}x_{1}+\cdots +a_{n}x_{n},##
and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, linear algebra is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to spaces of functions.
Linear algebra is also used in most sciences and fields of engineering, because it allows modeling many natural phenomena and computing efficiently with such models. Nonlinear systems, which cannot be modeled directly with linear algebra, are often handled through first-order approximations, using the fact that the differential of a multivariate function at a point is the linear map that best approximates the function near that point.
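The last sentence can be made concrete with a small numerical sketch. The function, the expansion point, and the forward-difference Jacobian below are illustrative assumptions (they do not come from the text above); they only show that the differential gives the best linear approximation of a nonlinear map near a point.

```python
# Minimal sketch: the Jacobian of a nonlinear map at a point is the linear map
# that best approximates it nearby. f and the step size are illustrative choices.
import numpy as np

def f(v):
    x, y = v
    return np.array([x * y, np.sin(x) + y ** 2])

def numerical_jacobian(func, v0, eps=1e-6):
    """Forward-difference Jacobian of func at v0."""
    v0 = np.asarray(v0, dtype=float)
    f0 = func(v0)
    J = np.zeros((f0.size, v0.size))
    for j in range(v0.size):
        step = np.zeros_like(v0)
        step[j] = eps
        J[:, j] = (func(v0 + step) - f0) / eps
    return J

v0 = np.array([1.0, 2.0])
J = numerical_jacobian(f, v0)
dv = np.array([1e-3, -2e-3])
print(f(v0 + dv))        # exact value at the displaced point
print(f(v0) + J @ dv)    # first-order approximation; differs only at second order in dv
```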

View More On Wikipedia.org
  1. M

    Understanding the solution to this subspace problem in linear algebra

    For this problem, The solution for (a) is I am slightly confused for ##p \in W## since I get ##a_3 = 2a_1## and ##a_2 = 2a_0##. Since ##a_3 = 2b##, ##a_2 = 2a##, ##a_1 = b##, ##a_0 = a##. Anybody have this doubt too? Kind wishes
  2. R

    I The Orthogonality of the Eigenvectors of a 2x2 Hermitian Matrix

    The eigenvectors of a hermitian matrix corresponding to unique eigenvalues are orthogonal. This is not too difficult of a statement to prove using mathematical induction. However, this case is seriously bothering me. Why is the dot product of the vectors not rightly zero? Is there something more...
  3. R

    A Why are the eigenvectors of this hermitian matrix not orthogonal?

    Why are the eigenvectors of this hermitian matrix not checking out as orthogonal? The eigenvalues are certainly distinct. ChatGPT also is miscalculating repeatedly. I have checked my work many times and cannot find the error. Kindly assist.
  4. G

    I How does one find the dual of a matrix?

    How does one find the dual of a matrix? Thanks.
  5. S

    I Passive Transformation and Rotation Matrix

    I'm reading Group Theory by A. Zee , specifically, chapter I.3 on rotations. He used the passive transformation in analyzing a point ##P## in space. There are two observers, one labeled with unprimed coordinates and the other with primed coordinates. From the figure below, he deduced the...
  6. cianfa72

    I About the definition of vector space of infinite dimension

    Hi, a doubt about the definition of vector space. Take for instance the set of polynomials defined over a field ##\mathbb R ## or ##\mathbb C##. One can define the sum of them and the product by a scalar, and check that the axioms of a vector space are actually fulfilled. Now the point is: if one...
  7. Z

    Direct Proof that every zero of p(T) is an eigenvalue of T

    I was stuck on this problem so I looked for a solution online. I was able to reproduce the following proof after looking at the proof on the internet. By this I mean, when I wrote it below I understood every step. However, it is not a very insightful proof. At this point I did not really...
  8. N

    I Is there always a matrix corresponding to eigenvectors?

    I tried to find the answer to this but so far no luck. I have been thinking of the following: I generate two random vectors of the same length and assign one of them as the right eigenvector and the other as the left eigenvector. Can I be sure a matrix exists that has those eigenvectors?
  9. TGV320

    Linear Algebra How hard is this Linear Algebra textbook?

    Hello, I am currently self studying Linear Algebra using MIT lectures and the textbook Introduction to Linear Algebra by professor Gilbert Strang. I'm at the 16th lecture on Projection Matrices and Least squares approximation. The lectures are very informative, but I struggle a lot with...
  10. Z

    Prove 9 is eigenvalue of ##T^2\iff## 3 or -3 eigenvalue of ##T##.

    Suppose ##9## is an eigenvalue of ##T^2##. Then ##T^2v=9v## for certain vectors in ##V##, namely the eigenvectors of eigenvalue ##9##. Then ##(T^2-9I)v=0## ##(T+3I)(T-3I)v=0## There seem to be different ways to go about continuing the reasoning here. My question will be about the...
  11. Z

    Do these two statements imply an underlying induction proof?

    Here is one proof $$\forall u\in U\implies Tu\in U\subset V\implies T^2u\in U\implies \forall m\in\mathbb{N}, T^m\in U\tag{1}$$ Is the statement above actually a proof that ##\forall m\in\mathbb{N}, T^m\in U## or is it just shorthand for "this can be proved by induction"? In other words, for...
  12. Z

    Operator T, ##T^2=I##, -1 not an eigenvalue of T, prove ##T=I##.

    Now, for ##v\in V##, ##(T+I)v=0\implies Tv=-v##. That is, the null space of ##T+I## is formed by eigenvectors of ##T## of eigenvalue ##-1##. By assumption, there are no such eigenvectors (since ##-1## is not an eigenvalue of ##T##). Hence, if ##(T-I)v \neq 0## then ##(T+I)(T-I)v\neq 0##...
  13. A

    I About permutation acting on the Identity matrix

    Question: Let ##\sigma\in S_n## be a permutation and ##T_{\sigma}## be the matrix we obtain from ##I## by applying ##\sigma## to the rows of ##I## (i.e. ##\sigma## acts on the rows of ##I##). Then: 1. ##\det(T_{\sigma}) = sgn(\sigma) ## and 2. ##T_{\sigma} T_{\tau} =T_{\sigma\circ \tau}##, for...
  14. P

    Linear homogenous system with repeated eigenvalues

    I've solved this problem using a fairly involved technique, where I compute the matrix ##e^{tA}## (the fundamental matrix of the system) with a method derived from the Cayley-Hamilton theorem. It is a cool method that I believe always works, but it can be a lot of work sometimes. It involves... (a related numerical sketch appears after this thread list)
  15. H

    A gardener collected 17 apples...

    In fact, it WAS a homework couple of years ago, and I've solved it, kind of (below). I still would like to find a cleaner solution. Here is what I did. Let's say, the apples are labeled, and their weights are ##x_1, x_2, ...##. He takes out the apple #1 and finds that, e.g., ##x_2+x_5+x_9+... =...
  16. TGV320

    Calculating an n X n determinant

    Hello, I need some advice because I just can't figure out how to solve the problem. I could try to make the determinant triangular by adding all the b together, but that doesn't seem a good way of solving the problem. Is there any direction I should be thinking of? Thanks
  17. Z

    Tricky Problem: Prove range T = null ##\phi## when null T' has dim 1

    This is problem 28 from chapter 3F "Duality" of Axler's Linear Algebra Done Right, third edition. I spent quite a long time on this problem, like a few hours. Since there is no available solution, I am wondering if my solution is correct. One assumption in this problem is that...
  18. Z

    Prove that range ##T'## = ##(\text{null}\ T)^0##

    My question is about item (b). (b) Here is what I drew up to try to visualize the result to be proved. The general idea, I think, is that 1) ##(\text{null}\ T)^0## and ##\text{range}\ T'## are both subspaces of ##V'=L(V,\mathbb{F})##. 2) We can show that they have the same dimension. 3) We...
  19. Z

    Given surjective ##T:V\to W##, find isomorphism ##T|_U## of U onto W.

    I will use a proof by cases. Case 1: dim V = dim W Then ##T=T|_V## is an isomorphism of ##V## onto ##W##. The reason for this is that it is possible to prove that if ##T## is surjective, which it is, then it is also injective and so it is invertible (hence an isomorphism). Case 2: dim V < dim...
  20. Z

    Given specific v, dimension of subspace of L(V, W) where Tv=0?

    I was stuck when I started writing this question. I think I solved the problem in the course of writing this post. My solution is as follows: Consider any basis ##B## of ##V## that includes ##v##: ##(v, v_2, ..., v_n)##. ##L(V,W)##, where ##\dim{(V)}=n## and ##\dim{(W)}=m## is isomorphic with...
  21. Infrared

    Challenge Math Challenge Thread (October 2023)

    The Math challenge threads have returned! Rules: 1. You may use google to look for anything except the actual problems themselves (or very close relatives). 2. Do not cite theorems that trivialize the problem you're solving. 3. Do not solve problems that are way below your level. Some problems...
  22. G

    I Transfer rank2 tensor to a new basis

    The trace of sigma should be the same in both the new and the old basis, but I get a different one. I'd really appreciate the help. I'll put the screenshot in the comments.
  23. D

    I Dimension and solution to matrix-vector product

    Let $$ X \in \mathbb{R}^{m\times n} $$ where m=n with rank(X)<m; then there is at least one equation which can be written as a linear combination of the other equations. Let $$ \beta \in \mathbb{R}^{n} $$. $$ X\beta=y $$ Suppose we have x<m independent equations (the equations are consistent) formed by taking the dot...
  24. Z

    Decompose 4x4 determinant into 24 determinants -- How many are zero?

    Here is an example of the decomposition for a 2 x 2 matrix. We have ##2^2=4## determinants, each with only ##n=2## non-automatically-zero entries. By "non-automatically-zero" I just mean that they aren't zero by default. Of course, any of ##a,b,c##, or ##d## can be zero, but that depends on the...
  25. crememars

    Identifying matrices as REF, RREF, or neither

    TL;DR Summary: we are given a set of coefficient matrices (shown below) and we need to determine whether they are in REF, RREF, or neither. Hello! I am having a lot of trouble identifying these matrices, and using the criteria checklist is not helping very much. Here is what I am working with...
  26. D

    I Row space, Column space, Null space & Left null space

    $$\begin{pmatrix} 2 & 4 & 6 \\ 3 & 5 & 8 \\ 1 & 2 & 3 \end{pmatrix}$$ Using the row operations R2 ← R2 − 3R1 and R3 ← R3 − R1, we find the row echelon form of the matrix: $$\begin{pmatrix} 1 & 2 & 3 \\ 0 & -1 & -1 \\ 0 & 0 & 0 \end{pmatrix}$$ Based on the definition of row space in the book I am... (a worked sketch of this row reduction appears after this thread list)
  27. T

    B Question on basic linear algebra (new to the subject)

    It would be nice if someone could find the history of why we use the letters i and j or m and n for the basics when working with matrices (##A = [a_{ij}]_{m\times n}##). I tried looking up the information and I was not successful. I understand what they represent in the context of the matter, but not why they...
  28. giodude

    I Using Linear Algebra to discover unknown Forces

    In classical mechanics, it seems like solving force equations is a question of finding a solvable system of equations that accounts for all existing forces and masses in question. Therefore, I'm curious whether this can be mixed with reinforcement learning to create a game and reward function...
  29. chwala

    I Understanding the operation in ##(\mathbb{z_6})^{*}##

    ...Out of interest I am trying to go through the attached notes. My interest is in the highlighted part. I know that in ##\mathbb{Z}/6\mathbb{Z}## under multiplication we shall have ##1*1=1## and ##5*5=1##; I am assuming that is how they get ##(\mathbb{Z}/6\mathbb{Z})^{*}=\{1,5\}##. Is that correct...
  30. chwala

    I Zero Element in a Ring: The 0 Ring Has Only One Element

    ...this element ##r## can only be ##0## correct? The zero ring has only one element which is ##0##.
  31. S

    B Can the Diagram in the Article Be Interpreted as Commutative?

    I'm used to seeing commutative diagrams where the vertices are mathematical objects and the edges (arrows) are mappings between them. Can the diagram ( from the interesting article https://people.reed.edu/~jerry/332/25jordan.pdf ) in the attached photo be interpreted that way? In the...
  32. S

    I Question from a proof in Axler 2nd Ed, 'Linear Algebra Done Right'

    My question is motivated by the proof of TH 5.13 on p 84 in the 2nd edition of Linear Algebra Done Right. (This proof differs from that in the 4th ed - online at: https://linear.axler.net/index.html chapter 5 ) In the proof we arrive at the following situation: ##T## is a linear operator on a...
  33. rajsekharnath

    Classical Source recommendation on Differential Geometry

    I am intending to join an undergrad course in physics(actually it is an integrated masters course equivalent to bs+ms) in 1-1.5 months. The thing is, in order to take a dive into more advanced stuff during my course, I am currently studying some of the stuff that will be taught in the first...
  34. Infrared

    Challenge Math Challenge - June 2023

    Welcome to the reinstatement of the monthly math challenge threads! Rules: 1. You may use google to look for anything except the actual problems themselves (or very close relatives). 2. Do not cite theorems that trivialize the problem you're solving. 3. Have fun! 1. (solved by...
  35. M

    Find Eigenvalues & Eigenvectors for Exercise 3 (2), Explained!

    For exercise 3 (2), the solution for finding the eigenvector is shown in the attached image. However, I am very confused about how they got from the first matrix on the left to the one below, and what allows them to do that. Can someone please explain in simple terms what happened here? Many thanks!
  36. M

    Find the solutions of the system, for all λ

    Thread moved from technical math section, so there is no homework template. For all λ ∈ ℝ: -x + y - z = 1; -2x + 10y + (2λ + 6)z = 6; 3x + 11y + (λ² + 6)z = 5λ - 1. After Gaussian elimination I have this: [-1 4 -2 | 1], [0 1 λ | 2], [0 0 λ(λ-1) | 5λ]. So, for λ=0 ⇒ ∞ solutions, for λ=1...
  37. M

    I Find the Eigenvalues and eigenvectors of 3x3 matrix

    Assume a 3x3 matrix A with the following: A [ 1 2 1 ]^T = 6 [ 1 2 1 ]^T, A [ 1 -1 1 ]^T = 3 [ 1 -1 1 ]^T, A [ 2 -1 0 ]^T = 3 [ 1 -1 1 ]^T. Find the eigenvalues and eigenvectors. I have in mind to start with Av=λv or det(A-λI)v=0... Also, the first 2 equations seem to have the form Av=λv... (a numerical sketch appears after this thread list)
  38. M

    Proving eigenvalues of a 2 x 2 square matrix

    For this, does someone please know why the highlighted equation would not be true if ##(A - 2I_2)## does not have an inverse? Many thanks!
  39. M

    Using inverse to find eigenvalues

    For this, I don't understand how if ##(A - 2I_2)^{-1}## has an inverse then the next line is true. Many thanks!
  40. M

    Understanding Eigenvalues of a Matrix

    For this, I am confused by the second line. Does someone please know how it can be true, since the matrix does not have an inverse? Many thanks!
  41. A

    I Proving SL_2(C) Homeomorphic to SU(2)xT & Simple Connectedness

    Using the QR decomposition (the complex version) I want to prove that ##SL_2(C)## is homeomorphic to the product ##SU(2) × T## where ##T## is the set of upper-triangular 2×2 complex matrices with real positive entries on the diagonal. Deduce that ##SL(2, C)## is simply connected. So, I can define...
  42. A

    Calculus Does Apostol Calculus Volume 2 cover sufficient multivariate calculus?

    Hello. I am currently doing a high school univariate calculus book, but I would like to go through Apostol's two volumes to get a strong foundation in calculus. His first volume seems great, and I've heard great things about his series, but I am not sure if his second volume contains sufficient...
  43. F

    Intro to Linear Algebra - Nullspace of Rank 1 Matrix

    The published solutions indicate that the nullspace is a plane in R^n. Why isn't the nullspace an (n-1)-dimensional subspace within R^n? For example, if I understand things correctly, the 1x2 matrix [1 2] would have a nullspace represented by any linear combination of the vector (-2,1), which... (a numerical check appears after this thread list)
  44. mattTch

    I Proof of Column Extraction Theorem for Finding a Basis for Col(A)

    Theorem: The columns of A which correspond to leading ones in the reduced row echelon form of A form a basis for Col(A). Moreover, dimCol(A)=rank(A).
  45. C

    I If T is diagonalizable then is restriction operator diagonalizable?

    The usual theorem is talking about the linear operator being restricted to an invariant subspace: I had no problem understanding its proof, it appears here for example: https://math.stackexchange.com/questions/3386595/restriction-operator-t-w-is-diagonalizable-if-t-is-diagonalizable However, I...
  46. hilbert2

    A The eigenvalue power method for quantum problems

    The classical "power method" for solving one special eigenvalue of an operator works, in a finite-dimensional vector space, as follows: suppose an operator ##\hat{A}## can be written as an ##n\times n## matrix, and its unknown eigenvectors are (in Dirac bra-ket notation) ##\left|\psi_1...
  47. H

    If |a> is an eigenvector of A, is f(B)|a> an eigenvector of A?

    Hi, if ##|a\rangle## is an eigenvector of the operator ##A##, I know that for any scalar ##c \neq 0##, ##c|a\rangle## is also an eigenvector of ##A##. Now, is the ket ##F(B)|a\rangle## an eigenvector of ##A##, where ##B## is an operator and ##F(B)## a function of ##B##? Is there a way to show...
  48. Vanilla Gorilla

    B Array Representation Of A General Tensor Question

    So, I've been watching eigenchris's video series "Tensors for Beginners" on YouTube. I am currently on video 14. I, in the position of a complete beginner, am taking notes on it, and I just wanted to make sure I wasn't misinterpreting anything. At about 5:50, he states that "The array for Q is...
  49. Vanilla Gorilla

    B Transformation Rules For A General Tensor M

    So, I've been watching eigenchris's video series "Tensors for Beginners" on YouTube. I am currently on video 14. I am a complete beginner and just want some clarification on if I'm truly understanding the material. Basically, is everything below this correct? In summary of the derivation of the...
  50. S

    I Can time be another basis vector under Galilean relativity?

    I refer to the video on this page, where there is a description of Galilean relativity that is meant to be an introduction to SR, making the comprehension of the latter easier as a smooth evolution from the former. The whole series is, in my opinion, excellent, but I think that this aspect is...
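Thread 14 above mentions computing the fundamental matrix ##e^{tA}## with a method derived from the Cayley-Hamilton theorem. The poster's exact technique is not shown; the sketch below is only an assumed illustration of the simplest repeated-eigenvalue case, with the test matrix and the value of t chosen for the example.

```python
# Illustrative sketch (thread 14): for a 2x2 matrix A with a repeated eigenvalue
# lam, the Cayley-Hamilton theorem gives (A - lam*I)^2 = 0, so
# e^{tA} = e^{lam*t} * (I + t*(A - lam*I)).
import numpy as np
from scipy.linalg import expm

A = np.array([[3.0, 1.0],
              [0.0, 3.0]])      # repeated eigenvalue lam = 3
lam = 3.0
t = 0.7
N = A - lam * np.eye(2)         # nilpotent part: N @ N is the zero matrix

closed_form = np.exp(lam * t) * (np.eye(2) + t * N)
print(closed_form)
print(expm(t * A))              # SciPy's matrix exponential agrees with the closed form
```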
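Thread 26 above row-reduces a small matrix by hand to read off its fundamental subspaces. A minimal sketch of the same computation, assuming SymPy is available; the matrix is the one quoted in that thread.

```python
# Sketch for thread 26: row-reduce the quoted matrix and read off bases for the
# row space, column space, null space and left null space. Uses SymPy.
from sympy import Matrix

A = Matrix([[2, 4, 6],
            [3, 5, 8],
            [1, 2, 3]])

R, pivot_cols = A.rref()      # reduced row echelon form and pivot column indices
print(R, pivot_cols)
print(A.rowspace())           # basis for the row space
print(A.columnspace())        # basis for the column space (pivot columns of A)
print(A.nullspace())          # basis for the null space
print(A.T.nullspace())        # basis for the left null space (null space of A^T)
```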
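Thread 37 above gives the action of a 3x3 matrix A on three vectors and asks for its eigenvalues and eigenvectors. The reconstruction below is only an assumed numerical cross-check, not the intended pen-and-paper route: since the three given vectors are linearly independent, they determine A completely.

```python
# Illustrative cross-check for thread 37: recover A from its action on a basis,
# then inspect its eigenvalues numerically.
import numpy as np

v1 = np.array([1.0, 2.0, 1.0])
v2 = np.array([1.0, -1.0, 1.0])
v3 = np.array([2.0, -1.0, 0.0])

B = np.column_stack([v1, v2, v3])               # columns are the given vectors
C = np.column_stack([6 * v1, 3 * v2, 3 * v2])   # prescribed images A v1, A v2, A v3

A = C @ np.linalg.inv(B)                        # unique A with A @ B == C

print(np.allclose(A @ v1, 6 * v1))              # True: (6, v1) is an eigenpair
print(np.allclose(A @ v2, 3 * v2))              # True: (3, v2) is an eigenpair
print(np.round(np.linalg.eigvals(A), 6))        # all eigenvalues of the reconstructed A
```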
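Thread 43 above asks whether the null space of a rank-1 matrix is a plane or an (n-1)-dimensional subspace of R^n. A minimal numerical check, assuming SciPy; the 1x2 matrix is the one quoted in the thread, and the 1x4 matrix is an added illustration.

```python
# Numerical check for thread 43: the null space of a rank-1 1 x n matrix has
# dimension n - 1, so it is a plane only when n = 3. Uses SciPy.
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0]])             # the 1x2 example from the thread
print(null_space(A))                   # one basis vector, parallel to (-2, 1)

B = np.array([[1.0, 2.0, 3.0, 4.0]])   # illustrative 1x4 rank-1 matrix
print(null_space(B).shape)             # (4, 3): a 3-dimensional null space inside R^4
```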
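Thread 46 above describes the classical power method for extracting one eigenvalue in a finite-dimensional space. A minimal numerical sketch of that method, assuming NumPy; the test matrix, tolerance, and iteration cap are illustrative choices, not taken from the thread.

```python
# Sketch for thread 46: the power method iterates v -> A v / ||A v|| and converges
# to the eigenvector of the dominant (largest-magnitude) eigenvalue when that
# eigenvalue is unique in magnitude.
import numpy as np

def power_method(A, num_iters=1000, tol=1e-12, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(num_iters):
        w = A @ v
        lam_prev, lam = lam, v @ w       # Rayleigh quotient estimate of the eigenvalue
        v = w / np.linalg.norm(w)
        if abs(lam - lam_prev) < tol:
            break
    return lam, v

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])               # symmetric (Hermitian) test matrix
lam, v = power_method(A)
print(lam)                               # dominant eigenvalue, (5 + sqrt(5)) / 2
print(np.linalg.eigvalsh(A))             # cross-check against NumPy's eigenvalues
```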