Linear algebra multiple choice questions

In summary, the conversation discusses properties of symmetric matrices, rotations of the Euclidean plane, and similar matrices. It also covers eigenvalues and eigenvectors and their relationship to linear operators, and it concludes with a true-or-false answer to each of the ten statements discussed.
  • #1
underacheiver

Homework Statement


1. If A is a real symmetric matrix, then there is a diagonal matrix D and an orthogonal matrix P so that D = PᵀAP.
a. True
b. False

2. Given that λi and λj are distinct eigenvalues of the real symmetric matrix A and that v1 and v2 are the respective eigenvectors associated with these values, then v1 and v2 are orthogonal.
a. True
b. False

3. If T(θ) is a rotation of the Euclidean plane ℝ² counterclockwise through an angle θ, then T can be represented by an orthogonal matrix P whose eigenvalues are λ1 = 1 and λ2 = -1.
a. True
b. False

4. If A and B represent the same linear operator T: U → U, then they have the same eigenvalues.
a. True
b. False

5. If A and B represent the same linear operator T: U → U, then they have the same eigenvectors.
a. True
b. False

6. If A and B have the same eigenvalues, then they are similar matrices.
a. True
b. False

7. Which of the following statements is not true?
a. Similar matrices A and B have exactly the same determinant.
b. Similar matrices A and B have exactly the same eigenvalues.
c. Similar matrices A and B have the same characteristic polynomial.
d. Similar matrices A and B have exactly the same eigenvectors.
e. none of the above

8. Let the n × n matrix A have eigenvalues λ1, λ2 ... λn (not necessarily distinct). Then det(A) = λ1λ2 ... λn.
a. True
b. False

9. Every real matrix A with eigenvalues as in problem 8 is similar to the diagonal matrix D = diag [λ1, λ2, ... λn].
a. True
b. False

10. Eigenvectors corresponding to distinct eigenvalues for any n × n matrix A are always linearly independent.
a. True
b. False

Homework Equations


The Attempt at a Solution


1. b
2. a
3. a
4. a
5. b
6. b
7. d
8. a
9. b
10. a
 
  • #2
[I know this is 14 years old, but it's sitting on the unanswered questions list, so I felt it worth responding to. The OP has not, of course, shown any justification for their answers, some of which are incorrect (and mutually inconsistent).]

underacheiver said:

Homework Statement


1. If A is a real symmetric matrix, then there is a diagonal matrix D and an orthogonal matrix P so that D = PᵀAP.
a. True
b. False

OP's answer: False

This is in fact true. The eigenvalues of a real symmetric matrix are real, and the corresponding eigenspaces are orthogonal with respect to the Euclidean inner product (see #2 below). It follows that we can find an orthonormal basis for each eigenspace, and putting these together gives an orthonormal basis for the entire space with respect to which the matrix representation of the linear map is diagonal. Furthermore, the change of basis matrix from the standard basis to the new basis is orthogonal.
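
A quick NumPy sanity check of this, on an arbitrary (randomly generated, purely for illustration) symmetric matrix:

[code]
import numpy as np

# Check the spectral theorem numerically: eigh returns an orthogonal P
# whose columns are orthonormal eigenvectors, so P^T A P is diagonal.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                            # symmetrize

w, P = np.linalg.eigh(A)                     # eigenvalues w, eigenvector matrix P
assert np.allclose(P.T @ P, np.eye(4))       # P is orthogonal
assert np.allclose(P.T @ A @ P, np.diag(w))  # D = P^T A P is diagonal
[/code]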

2. Given that λi and λj are distinct eigenvalues of the real symmetric matrix A and that v1 and v2 are the respective eigenvectors associated with these values, then v1 and v2 are orthogonal.
a. True
b. False

OP's answer: True

Not sure why the indices on the eigenvalues are i and j while those on the eigenvectors are 1 and 2; I will use the latter throughout.

This is true. As [itex]A[/itex] is symmetric, [tex]
\lambda_1 \langle v_1, v_2 \rangle = \langle Av_1, v_2 \rangle = \langle v_1, Av_2 \rangle = \lambda_2 \langle v_1, v_2 \rangle[/tex] and hence [tex]
(\lambda_1 - \lambda_2)\langle v_1, v_2 \rangle = 0.[/tex] Since [itex]\lambda_1 \neq \lambda_2[/itex], we have that [itex]v_1[/itex] and [itex]v_2[/itex] are orthogonal.
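
For a concrete check, take a small symmetric matrix with distinct eigenvalues (the matrix below is an arbitrary example):

[code]
import numpy as np

# Eigenvectors of a symmetric matrix belonging to distinct eigenvalues
# should have zero inner product.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
w, V = np.linalg.eig(A)                 # columns of V are eigenvectors
assert abs(w[0] - w[1]) > 1e-9          # eigenvalues are distinct
assert abs(V[:, 0] @ V[:, 1]) < 1e-12   # eigenvectors are orthogonal
[/code]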

3. If T(θ) is a rotation of the Euclidean plane ℝ² counterclockwise through an angle θ, then T can be represented by an orthogonal matrix P whose eigenvalues are λ1 = 1 and λ2 = -1.
a. True
b. False

OP's answer: True

This is false. The eigenvalues of [tex]T(\theta) = \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix}[/tex] are [itex]e^{\pm i\theta}[/itex].
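
This is easy to verify numerically for a sample angle (θ = π/3 below is an arbitrary choice):

[code]
import numpy as np

# The eigenvalues of a plane rotation are e^{±iθ}: not real
# unless θ is a multiple of π.
theta = np.pi / 3
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
w = np.sort_complex(np.linalg.eigvals(T))
expected = np.sort_complex([np.exp(1j * theta), np.exp(-1j * theta)])
assert np.allclose(w, expected)
[/code]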

4. If A and B represent the same linear operator T: U → U, then they have the same eigenvalues.
a. True
b. False

OP's answer: True

This is true: a linear map is a zero of its minimal polynomial (viewed as a function on the space of linear maps), and the scalar roots of that minimal polynomial are precisely the eigenvalues. Different matrix representations of the same linear map are similar, and similar matrices have the same minimal polynomial.
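
A sketch of this in NumPy, conjugating by an (assumed invertible) random change-of-basis matrix P:

[code]
import numpy as np

# B = P^{-1} A P represents the same map as A in a different basis;
# the two matrices share their eigenvalues.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))          # generically invertible
B = np.linalg.inv(P) @ A @ P
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   np.sort_complex(np.linalg.eigvals(B)))
[/code]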

5. If A and B represent the same linear operator T: U → U, then they have the same eigenvectors.
a. True
b. False

OP's answer: False

This is a little confused. Both matrices represent the same linear map, which has one set of eigenvectors. However, if they represent the same map with respect to different bases then the components of the eigenvectors with respect to the different bases will not necessarily be identical.
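
A small sketch of the distinction, with an assumed change-of-basis matrix P:

[code]
import numpy as np

# A and B = P^{-1} A P represent the same map. If v is an eigenvector
# of A, the *same* eigenvector has coordinate column P^{-1} v with
# respect to the new basis -- different components, same vector.
A = np.diag([2.0, 5.0])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.linalg.inv(P) @ A @ P

v = np.array([0.0, 1.0])                # eigenvector of A for eigenvalue 5
u = np.linalg.inv(P) @ v                # its components in the new basis: (-1, 1)
assert np.allclose(B @ u, 5.0 * u)      # still an eigenvector of the map
assert not np.allclose(u, v)            # but the components differ
[/code]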

6. If A and B have the same eigenvalues, then they are similar matrices.
a. True
b. False

OP's answer: False

This is indeed false: consider [itex]\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}[/itex] and [itex]\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}[/itex], both of which have 1 as their only eigenvalue with algebraic multiplicity 2, but geometric multiplicities 2 and 1 respectively.
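
One way to see numerically that these two are not similar is to compare the geometric multiplicities (the dimensions of the eigenspaces), which are similarity invariants:

[code]
import numpy as np

# Both matrices have eigenvalue 1 with algebraic multiplicity 2, but the
# eigenspace dimension is 2 - rank(A - I): 2 for the identity, 1 for the
# Jordan block, so they cannot be similar.
I2 = np.eye(2)
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
geo_I = 2 - np.linalg.matrix_rank(I2 - np.eye(2))
geo_J = 2 - np.linalg.matrix_rank(J - np.eye(2))
assert (geo_I, geo_J) == (2, 1)
[/code]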

7. Which of the following statements is not true?
a. Similar matrices A and B have exactly the same determinant.
b. Similar matrices A and B have exactly the same eigenvalues.
c. Similar matrices A and B have the same characteristic polynomial.
d. Similar matrices A and B have exactly the same eigenvectors.
e. none of the above

OP's answer: d.

(d) is indeed the statement that is not true; hence (e) is also ruled out, since "none of the above" would require all of (a) to (d) to be true.

8. Let the n × n matrix A have eigenvalues λ1, λ2 ... λn (not necessarily distinct). Then det(A) = λ1λ2 ... λn.
a. True
b. False

OP's answer: True

This is true; A is similar (over ℂ) to its Jordan normal form, similar matrices have the same determinant, and the determinant of a Jordan normal form, being upper triangular, is the product of its diagonal entries, which are the eigenvalues counted with algebraic multiplicity.
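
A numerical spot check on an arbitrary random matrix (the eigenvalues are taken over ℂ, counted with multiplicity):

[code]
import numpy as np

# det(A) should equal the product of the eigenvalues; for a real
# matrix the complex eigenvalues pair up, so the product is real.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
w = np.linalg.eigvals(A)
assert np.isclose(np.prod(w).real, np.linalg.det(A))
assert abs(np.prod(w).imag) < 1e-10
[/code]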

9. Every real matrix A with eigenvalues as in problem 8 is similar to the diagonal matrix D = diag [λ1, λ2, ... λn].
a. True
b. False

OP's answer: False

This is false; the answer to #6 again provides a counterexample.

10. Eigenvectors corresponding to distinct eigenvalues for any n × n matrix A are always linearly independent.
a. True
b. False

OP's answer: True

This is true. If a nonzero vector [itex]v[/itex] in the eigenspace of [itex]\lambda_1[/itex] were a linear combination of eigenvectors for [itex]\lambda_2[/itex], then [itex]v[/itex] would lie in both eigenspaces, giving [itex](\lambda_1 - \lambda_2)v = 0[/itex]; since [itex]\lambda_1 \neq \lambda_2[/itex], this forces [itex]v = 0[/itex], a contradiction. The case of more than two distinct eigenvalues follows by induction.
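
A quick check on an upper triangular example (chosen so the eigenvalues 1, 3, 6 are visibly distinct):

[code]
import numpy as np

# With n distinct eigenvalues, one eigenvector per eigenvalue gives a
# full-rank set of columns, i.e. a linearly independent set.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 6.0]])
w, V = np.linalg.eig(A)
assert len(set(np.round(w, 8))) == 3     # eigenvalues are distinct
assert np.linalg.matrix_rank(V) == 3     # eigenvectors are independent
[/code]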
 
  • #3
Re #3: a rotation through an angle that is not a multiple of π neither fixes nor merely scales any point other than the origin. Thus it has no real eigenvalues.
 

FAQ: Linear algebra multiple choice questions

What is linear algebra?

Linear algebra is a branch of mathematics that deals with linear equations and their representations in vector spaces. It involves studying the properties and operations of vectors and matrices.

What are the applications of linear algebra?

Linear algebra has numerous applications in fields such as physics, engineering, computer science, and economics. It is used to solve systems of equations, analyze data, and model real-world problems.

What is a vector in linear algebra?

A vector in linear algebra is a mathematical object that has both magnitude and direction. It is represented by an array of numbers and can be used to represent physical quantities such as velocity and force.

What is a matrix in linear algebra?

A matrix in linear algebra is a rectangular array of numbers arranged in rows and columns. It is used to represent systems of linear equations and perform operations such as addition, subtraction, and multiplication.

What are eigenvalues and eigenvectors in linear algebra?

Eigenvalues and eigenvectors are important concepts in linear algebra. Eigenvalues are scalars giving the factor by which an eigenvector is stretched or compressed by a linear transformation. Eigenvectors are non-zero vectors that are only scaled, not rotated, by the transformation.
