A couple questions for linear algebra final review

  • #1
BiGyElLoWhAt
Gold Member

Homework Statement



2: Some proofs:
a) If ##\{ v_1 , v_2...v_n \} ## are linearly independent in a real vector space, then so is any subset of them.

b) If any subset of the vectors ##\{ v_1 , v_2...v_n \} ## in a real vector space is linearly dependent, then the whole set of vectors is linearly dependent.

c) If ##A## is invertible, so is ##A^2##.

d) If ##A## is not invertible, neither is ##A^3##.

I think I either know everything else, or I can't really ask without asking to be walked through it, so I'll just have to spend some time on Google for a couple of them.

Any help is appreciated; my final's in 18 hours.

Homework Equations





The Attempt at a Solution



2:
a) I was thinking I could use closure under scalar multiplication and addition to prove that, but is that solid enough?

If ##\{ v_1 , v_2...v_n \} ## are linearly independent then by definition

##c_1v_1 + ... + c_{n-1}v_{n-1} ≠ c_nv_n##

(is subset ≈ subspace?)

then ##\{ v_1 , v_2...v_m \} ##, where ##m≤n## (not really sure how to notate that all the vectors of this set are contained within the first set) and all of these vectors are members of the first set, is also linearly independent.

This feels weak though... It seems like common sense that if I have some linearly independent vectors and I throw some of them out, then what I have is still linearly independent. I'm also sure that I need to start at the definition of linear independence, which basically involves the demonstration of closure under addition and scalar multiplication. I'm just not really sure how to put it technically. I think this is one of those "prove 2+2=4" things and I'm just not seeing how to put it mathematically.

2:
b) Very similar work; I could copy and paste everything I have from up there and put it here, but once I figure out a), I'll get b).

2:
c) ##A^2 = AA## multiply by ##A^{-1}##
... ##AAA^{-1} = AI = A##
Proved.
I only put this in here because I'm not sure what to do about d. I know it's very similar, but is it simply
##A^3 = AAA##, and since ##A## is not invertible, ##A^3## cannot be reduced? How do I word that?
 
  • #2

For a) ##\{ v_1 , v_2...v_n \} ## are linearly independent when the vectors satisfy ##c_1v_1 + ... + c_nv_n = 0 \iff c_1=...=c_n=0##.

Suppose you take a smaller subset of the vectors, ##\{ v_1 , v_2...v_{n-1} \}##. Then these vectors are linearly independent when ##c_1v_1 + ... + c_{n-1}v_{n-1} = 0 \iff c_1 = ... = c_{n-1} = 0##.

Using the fact that ##c_1v_1 + ... + c_{n-1}v_{n-1} = c_nv_n##, you can say that the smaller subset of vectors is linearly independent if ##c_nv_n = 0 \iff c_n = 0##.

It must be the case that ##c_n = 0##, otherwise you would have a contradiction. Why?
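
In case it helps after you've tried it yourself, here is a rough sketch of how the whole argument for a) could be written up (just one way to word it; after relabelling, take the subset to be ##\{ v_1 , v_2...v_m \}## with ##m≤n##):

Suppose ##c_1v_1 + ... + c_mv_m = 0## for some scalars ##c_1 , ... , c_m##. Setting ##c_{m+1} = ... = c_n = 0##, this is the same relation as ##c_1v_1 + ... + c_mv_m + 0\cdot v_{m+1} + ... + 0\cdot v_n = 0##. Since the full set ##\{ v_1 , v_2...v_n \}## is linearly independent, every coefficient in that relation must be zero, so in particular ##c_1 = ... = c_m = 0##, which is exactly the statement that ##\{ v_1 , v_2...v_m \}## is linearly independent. Note that b) is then just the contrapositive of a).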

For parts c) and d), what about the determinant?
 
  • #3
Aha! Thank you, I didn't even think about the determinant. But is the only situation where ##A## is not invertible when ##\det(A) = 0##? If it is, that makes c) and d) very simple, as ##\det(A^n) = \det(A)^n##.
 
  • #4

If ##A## is invertible (non-singular), then ##\det(A) ≠ 0##.

If ##A## is not invertible (singular), then ##\det(A) = 0##.

It makes c) and d) quite straightforward.
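
For the write-up, a rough sketch along those lines (assuming ##A## is square and that you may use ##\det(AB) = \det(A)\det(B)## together with "invertible ##\iff \det(A) ≠ 0##"):

c) If ##A## is invertible, then ##\det(A) ≠ 0##, so ##\det(A^2) = \det(A)^2 ≠ 0##, and hence ##A^2## is invertible.

d) If ##A## is not invertible, then ##\det(A) = 0##, so ##\det(A^3) = \det(A)^3 = 0##, and hence ##A^3## is not invertible.

(Alternatively, for c) you can check directly that ##(A^{-1})^2 A^2 = A^{-1}(A^{-1}A)A = A^{-1}A = I##, so ##A^2## is invertible with ##(A^2)^{-1} = (A^{-1})^2##.)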
 
  • #5
Awesome, thanks a million.
 
