Linear transformations and orthogonal basis

In summary, we have a linear transformation Pk that takes a linear combination of orthogonal basis vectors and keeps only its kth term, with the coefficient rk unchanged. This is exactly the orthogonal projection of a vector onto the subspace U spanned by Ek. Equivalently, Pk(v) is the unique vector u in U such that the difference v - u is orthogonal to U.
  • #1
stunner5000pt
Let {E1, E2, ..., En} be an orthogonal basis of Rn. Given k, 1 <= k <= n, define Pk: Rn -> Rn by [itex] P_{k} (r_{1} E_{1} + ... + r_{n} E_{n}) = r_{k} E_{k}. [/itex] Show that [itex] P_{k} = \mbox{proj}_{U} [/itex] where U = span {Ek}

well [tex] \mbox{proj}_{U} \vec{m}= \sum_{i} \frac{ \vec{m} \bullet \vec{u_{i}}}{||\vec{u_{i}}||^2} \vec{u_{i}} [/tex]
right?
here we have Pk transforming a linear combination of the orthogonal basis into rk Ek, where k matches the subscript of P

would it turn into
[tex] \mbox{proj}_{U} \vec{m}= \frac{ \vec{m} \bullet \vec{E_{1}}}{||\vec{E_{1}}||^2}\vec{E_{1}} + ... + \frac{ \vec{m} \bullet \vec{E_{n}}}{||\vec{E_{n}}||^2}\vec{E_{n}} [/tex]
and the whole coefficient in front of each Ei can be interpreted as the ri, a scalar multiple, yes?
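A quick numerical sketch of the claim in post #1 (assuming numpy, and using a made-up orthogonal basis of R^3 for illustration): build v = r1 E1 + r2 E2 + r3 E3, project it onto U = span{Ek} with the projection formula, and check that the result is rk Ek.

```python
import numpy as np

# Hypothetical orthogonal (not necessarily orthonormal) basis of R^3.
E = [np.array([1.0, 1.0, 0.0]),
     np.array([1.0, -1.0, 0.0]),
     np.array([0.0, 0.0, 2.0])]

r = [3.0, -2.0, 0.5]                        # coefficients r_1, r_2, r_3
v = sum(ri * Ei for ri, Ei in zip(r, E))    # v = r_1 E_1 + r_2 E_2 + r_3 E_3

k = 1                                       # project onto U = span{E_2} (0-based index)
proj = (v @ E[k]) / (E[k] @ E[k]) * E[k]    # proj_U(v) = (v . E_k / ||E_k||^2) E_k

print(np.allclose(proj, r[k] * E[k]))       # True: proj_U(v) = r_k E_k
```

Because the basis is orthogonal, the dot products v . Ek with the other basis vectors vanish, which is exactly why only the rk Ek term survives.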
 
  • #2
You can approach it with fewer symbols by simply noting that the orthogonal projection of a vector is the unique vector in the subspace that is orthogonal to (the vector minus the projection of the vector). That is, if v is a vector and u is its orthogonal projection onto some subspace U, then (v - u) . u = 0, and u is in U.
 
  • #3
0rthodontist said:
You can approach it with fewer symbols by simply noting that the orthogonal projection of a vector is the unique vector in the subspace that is orthogonal to (the vector minus the projection of the vector). That is, if v is a vector and u is its orthogonal projection onto some subspace U, then (v - u) . u = 0, and u is in U.

why is u = 0?
 
  • #4
u is not 0
(v-u).u is 0 (i.e., v-u is orthogonal to u)
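The point in post #2 can be checked numerically (a sketch assuming numpy, with an arbitrary one-dimensional subspace U): compute the projection u of v onto U and verify that the residual v - u is orthogonal to u.

```python
import numpy as np

v = np.array([1.0, 5.0, 1.0])
e = np.array([1.0, -1.0, 0.0])             # a vector spanning U
u = (v @ e) / (e @ e) * e                  # orthogonal projection of v onto U

print(np.isclose((v - u) @ u, 0.0))        # True: v - u is orthogonal to u
```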
 
  • #5
ok so (v - u) . u = 0
but how does this show that u equals Pk(v)?
 

FAQ: Linear transformations and orthogonal basis

What is a linear transformation?

A linear transformation is a function that maps one vector space to another while preserving the operations of vector addition and scalar multiplication. In other words, it respects linear combinations: T(au + bv) = aT(u) + bT(v) for all vectors u, v and scalars a, b.

What is an orthogonal basis?

An orthogonal basis is a set of basis vectors that are mutually perpendicular (orthogonal) to each other. Any two vectors in the basis are at a 90-degree angle to each other, and every vector in the space can be written as a linear combination of them. If, in addition, each basis vector has a magnitude of 1, the basis is called orthonormal.

How do you determine if a linear transformation is orthogonal?

A linear transformation is orthogonal if the dot product of any two transformed vectors is equal to the dot product of the original vectors. In other words, if T(u) and T(v) are the transformed vectors from the original vectors u and v, then T(u) • T(v) = u • v.
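This dot-product-preserving property can be checked with a rotation, the standard example of an orthogonal transformation (a sketch assuming numpy; the angle is arbitrary):

```python
import numpy as np

theta = 0.7
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation: an orthogonal map

u = np.array([2.0, 1.0])
v = np.array([-1.0, 3.0])
print(np.isclose((T @ u) @ (T @ v), u @ v))       # True: dot product preserved
```

Equivalently, an orthogonal matrix satisfies T^T T = I, which is why (Tu) . (Tv) = u^T T^T T v = u . v.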

What is the significance of an orthogonal basis in linear algebra?

An orthogonal basis is significant in linear algebra because it simplifies many calculations and makes it easier to represent vectors and perform operations on them. It also allows for easier calculation of projections, distances, and angles between vectors.

How is the Gram-Schmidt process used to find an orthogonal basis?

The Gram-Schmidt process is a method used to construct an orthogonal basis from a given set of linearly independent vectors. It takes the first vector as the first basis vector, then subtracts from the second vector its projection onto the first basis vector, leaving an orthogonal vector. Each subsequent vector has its projections onto all previously constructed basis vectors subtracted in turn, until an orthogonal basis is obtained.
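The steps above can be sketched in code (assuming numpy; the input vectors are arbitrary examples, and this is the classical rather than the numerically stabler modified variant):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: subtract from each vector its projections
    onto the previously accepted basis vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        for b in basis:
            w -= (w @ b) / (b @ b) * b   # remove the component along b
        if not np.allclose(w, 0):        # skip linearly dependent inputs
            basis.append(w)
    return basis

B = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])
# Every pair of output vectors is orthogonal:
print(all(np.isclose(B[i] @ B[j], 0.0)
          for i in range(3) for j in range(i + 1, 3)))   # True
```

Dividing each output vector by its norm would turn the orthogonal basis into an orthonormal one.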
