How Do You Determine the Orthogonal Complement in a Linear Algebra Problem?

In summary, for a self-adjoint endomorphism with eigenvalues λ_1 = 1 and λ_2 = 2 and kernel L((1, 2, 1)), the eigenvectors for the nonzero eigenvalues must be taken from (ker(f))^⊥ = {(α, β, −α − 2β) | α, β ∈ R} = L((1, 0, −1), (a, b, c)), the subspace orthogonal to (1, 2, 1). This follows from self-adjointness: eigenvectors of a self-adjoint operator corresponding to different eigenvalues are orthogonal.
  • #1
Felafel

Homework Statement


Write a self-adjoint endomorphism ##f : E^3 → E^3## such that ##ker(f) = L((1, 2, 1))## and ##λ_1 = 1, λ_2 = 2## are eigenvalues of ##f##.


The Attempt at a Solution



I know ##λ_3 = 0## because ##ker(f) ≠ {(0, 0, 0)}##, and ##(ker(f))^⊥ = (V_0)^⊥ = V_1 ⊕ V_2## because f is self-adjoint.
Then my book gives the solution:
##(ker(f))^⊥ = (L((1, 2, 1)))^⊥ = {(α, β, −α − 2β) | α, β ∈ R} = L((1, 0, −1), (a, b, c))##
but I don't understand where it got that (α, β, −α − 2β) from.
Could you please help me? Thanks in advance :)
 
  • #2
Felafel said:

Homework Statement


Write a self-adjoint endomorphism ##f : E^3 → E^3## such that ##ker(f) = L((1, 2, 1))## and ##λ_1 = 1, λ_2 = 2## are eigenvalues of ##f##.

The Attempt at a Solution



I know ##λ_3 = 0## because ##ker(f) ≠ {(0, 0, 0)}##, and ##(ker(f))^⊥ = (V_0)^⊥ = V_1 ⊕ V_2## because f is self-adjoint.
Then my book gives the solution:
##(ker(f))^⊥ = (L((1, 2, 1)))^⊥ = {(α, β, −α − 2β) | α, β ∈ R} = L((1, 0, −1), (a, b, c))##
but I don't understand where it got that (α, β, −α − 2β) from.
Could you please help me? Thanks in advance :)

The vectors (α, β, −α − 2β) make up the subspace orthogonal to (1, 2, 1): a vector (x, y, z) is orthogonal to (1, 2, 1) exactly when ##x + 2y + z = 0##, i.e. ##z = −x − 2y##, so writing x = α and y = β gives (α, β, −α − 2β). You know that eigenvectors of a self-adjoint operator corresponding to different eigenvalues are orthogonal, yes? So you'll have to pick the other two eigenvectors from that subspace.
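If it helps to see one concrete construction, here is a minimal numpy sketch of a possible answer (the choice of (1, 0, −1) for λ = 1 and of the cross product for the λ = 2 direction is my own, not taken from the book): build f from orthogonal projections onto the chosen eigendirections.

```python
import numpy as np

# One possible f (a sketch, not the book's unique answer):
# eigenvalue 0 on the kernel, 1 and 2 on two orthogonal directions
# inside (ker f)^⊥ = {(a, b, -a - 2b)}.
v0 = np.array([1.0, 2.0, 1.0])   # spans ker(f), eigenvalue 0
v1 = np.array([1.0, 0.0, -1.0])  # eigenvalue 1, lies in the complement
v2 = np.cross(v0, v1)            # eigenvalue 2, orthogonal to both v0 and v1

def proj(v):
    """Matrix of the orthogonal projection onto span{v}."""
    return np.outer(v, v) / np.dot(v, v)

# Spectral form: f = 0*proj(v0) + 1*proj(v1) + 2*proj(v2)
A = 1.0 * proj(v1) + 2.0 * proj(v2)

print(A)               # symmetric, hence self-adjoint in the standard inner product
print(A @ v0)          # ~ (0, 0, 0): v0 spans the kernel
print(A @ v1, A @ v2)  # equal to 1*v1 and 2*v2
```

Any other orthogonal pair spanning that plane works just as well; that freedom is exactly the (a, b, c) in your book's notation.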
 

FAQ: How Do You Determine the Orthogonal Complement in a Linear Algebra Problem?

What is linear algebra?

Linear algebra is a branch of mathematics that deals with linear equations and their representations in vector spaces. It involves the study of linear transformations, matrices, and systems of linear equations.

What are some real-world applications of linear algebra?

Linear algebra has many practical applications in fields such as engineering, computer graphics, physics, economics, and statistics. It is used to solve problems related to optimization, data analysis, and modeling of systems.

What is a matrix?

A matrix is a rectangular array of numbers, symbols, or expressions arranged in rows and columns. It is used to represent linear transformations and systems of linear equations, and it can be manipulated using various operations such as addition, multiplication, and inversion.
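For instance, a minimal numpy sketch of those operations (the matrix entries are illustrative, not tied to the thread):

```python
import numpy as np

# A 2x2 matrix and the basic operations mentioned above.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.eye(2)            # the 2x2 identity matrix

print(A + B)             # addition (entrywise)
print(A @ B)             # matrix multiplication
print(np.linalg.inv(A))  # inversion (defined here since det(A) = -2 is nonzero)
```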

How do I solve a linear algebra problem?

To solve a linear algebra problem, you can use various methods such as Gaussian elimination, matrix inversion, and eigenvalue decomposition. It is important to understand the properties of matrices and how different operations affect them in order to choose the most appropriate method for solving a specific problem.
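As a small illustration, here is how one might solve a 2×2 system numerically with numpy (the numbers are made up for the example):

```python
import numpy as np

# Solve the system  x + 2y = 5,  3x + 4y = 6.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])

x = np.linalg.solve(A, b)      # elimination-based (LU) solve
print(x)                       # [-4.   4.5]
print(np.allclose(A @ x, b))   # True: the result satisfies A x = b
```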

What are eigenvectors and eigenvalues?

Eigenvectors and eigenvalues are important concepts in linear algebra. An eigenvector of a matrix is a non-zero vector that, when multiplied by the matrix, results in a scalar multiple of itself. The corresponding scalar is known as the eigenvalue. Eigenvectors and eigenvalues are used to analyze the behavior of linear transformations and systems of differential equations.
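A short numpy check of the defining relation A v = λ v, using an arbitrary small symmetric matrix as the example:

```python
import numpy as np

# Eigenvalues and eigenvectors of a small symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eigh(A)          # eigh is intended for symmetric matrices
print(vals)                             # [1. 3.]
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))  # True for each eigenpair
```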
