What is the Orthogonal Property of Vectors in Span?

In summary, to show that a vector w is orthogonal to the subspace span(u,v), it suffices to show that w is orthogonal to every vector in that subspace. Every vector in span(u,v) has the form au+bv, and w being orthogonal to u and v means that w.u=0 and w.v=0. Distributivity of the dot product then gives w.(au+bv) = a(w.u) + b(w.v) = 0 for all scalars a and b, showing that w is indeed orthogonal to the subspace span(u,v).
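As a quick numerical sanity check of this result, here is a minimal NumPy sketch; the specific vectors and scalars are made-up examples, not part of the original problem. It takes a w orthogonal to both u and v and verifies that w is orthogonal to several random linear combinations au + bv.

```python
import numpy as np

# Illustrative vectors in R^3: w is orthogonal to both u and v.
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = np.array([0.0, 0.0, 5.0])

assert np.isclose(w @ u, 0.0) and np.isclose(w @ v, 0.0)

# w should then be orthogonal to every vector a*u + b*v in span(u, v).
rng = np.random.default_rng(0)
for a, b in rng.uniform(-10, 10, size=(5, 2)):
    x = a * u + b * v          # an arbitrary vector in span(u, v)
    assert np.isclose(w @ x, 0.0)

print("w is orthogonal to every sampled vector in span(u, v)")
```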
  • #1
war485

Homework Statement



If w is orthogonal to u and v, then show that w is also orthogonal to span(u, v).

Homework Equations



Two orthogonal vectors have a dot product equal to zero.

The Attempt at a Solution



I can see this geometrically in my mind, and I know that w . u = 0 and w . v = 0,
but I don't understand how to show this for the span in writing.
 
  • #2
A vector in span(u,v) is of the form au+bv. So show that w . (au+bv) = 0, using the distributivity of the dot product.
 
  • #3
you said that span(u,v) is in this form au+bv

so
w . (au+bv) = 0
w . au + w . bv = 0

where a and b are any scalar numbers
and that's all? There's no more to it?

[edit]
Thanks, Grief. That one little bit helped a lot!
 
  • #4
war485 said:
you said that span(u,v) is in this form au+bv
No, Grief said a vector in the subspace span(u,v) is of the form au+bv for scalars a and b. To say that a vector is orthogonal to a subspace means that the vector is orthogonal to each vector in that subspace.

war485 said:
so
w . (au+bv) = 0
w . au + w . bv = 0

where a and b are any scalar numbers
and that's all? There's no more to it?
You need to show that given a vector x in span(u,v), we have w.x=0 . From above, x is of the form au+bv, so you want to show that w.(au+bv)=0. This means beginning with w.(au+bv) and showing it equals 0. As Grief already said, to do so just requires distributivity and recognising that w being orthogonal to u and to v means that w.u=0 and w.v=0.
 
  • #5
To make sure I got this right one more time:
a vector in span(u,v) is of the form au+bv

showing that w . (au+bv) = 0 means any vector in span(u,v) is orthogonal to w
expanding: w . (au+bv) = w . au + w . bv
then w . au = a(w.u) = 0 since w . u = 0 (orthogonal)
and w . bv = b(w.v) = 0 since w . v = 0 (orthogonal)
so w . (au+bv) = 0 + 0 = 0

right?
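For readers who want to see the distributivity step checked symbolically, here is a small SymPy sketch; the component symbols are arbitrary placeholders. It confirms that w . (au + bv) expands to a(w.u) + b(w.v) for generic 3-component vectors, which is exactly the step used above.

```python
import sympy as sp

# Generic symbols: scalars a, b and components of u, v, w in R^3.
a, b = sp.symbols('a b')
u = sp.Matrix(sp.symbols('u1 u2 u3'))
v = sp.Matrix(sp.symbols('v1 v2 v3'))
w = sp.Matrix(sp.symbols('w1 w2 w3'))

# Distributivity/linearity of the dot product:
# w . (a*u + b*v) == a*(w . u) + b*(w . v)
lhs = w.dot(a*u + b*v)
rhs = a*w.dot(u) + b*w.dot(v)
print(sp.simplify(lhs - rhs))  # prints 0

# So if w.u = 0 and w.v = 0, the right-hand side is 0,
# i.e. w is orthogonal to every vector a*u + b*v in span(u, v).
```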
 

FAQ: What is the Orthogonal Property of Vectors in Span?

What is an orthogonal matrix?

An orthogonal matrix is a square matrix whose columns (and rows) are orthonormal vectors. This means that the dot product of any two distinct columns (or rows) is zero, and the length of each column (or row) is 1. In other words, an orthogonal matrix is a special type of matrix that preserves the lengths of vectors and the angles between them.
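As an illustrative sketch of that preservation property (using a 2-D rotation matrix as the example, with made-up test vectors):

```python
import numpy as np

theta = np.pi / 3
# A 2-D rotation matrix is a standard example of an orthogonal matrix.
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])

# Orthogonal matrices preserve lengths and dot products (hence angles).
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
assert np.isclose((Q @ x) @ (Q @ y), x @ y)
```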

How is an orthogonal matrix related to the concept of span?

An orthogonal matrix is closely related to the concept of span because its columns are linearly independent unit vectors that span the whole space: the columns of an n×n orthogonal matrix form an orthonormal basis of R^n, and any vector in that space can be expressed as a linear combination of these columns.
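A small sketch of this point, with example numbers chosen purely for illustration: because the columns of an orthogonal matrix Q form an orthonormal basis, the coefficients of any vector x in that basis are simply Q.T @ x.

```python
import numpy as np

# Columns of this orthogonal matrix form an orthonormal basis of R^2.
Q = np.array([[np.sqrt(0.5), -np.sqrt(0.5)],
              [np.sqrt(0.5),  np.sqrt(0.5)]])

x = np.array([2.0, 1.0])

# Because the columns are orthonormal, the coefficients of x in this
# basis are simply Q.T @ x, and Q @ coeffs reconstructs x exactly.
coeffs = Q.T @ x
assert np.allclose(Q @ coeffs, x)
```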

Can an orthogonal matrix have negative values?

Yes, an orthogonal matrix can have negative entries. Orthogonality only constrains the dot products between the columns (or rows), not the signs of individual entries; rotation matrices, for example, typically contain negative entries. Note, however, that because every column is a unit vector, each entry must lie between -1 and 1.

How do you determine if a matrix is orthogonal?

To determine if a matrix Q is orthogonal, you can use the following criteria:

  1. The matrix must be square (same number of rows and columns)
  2. Any two distinct columns (or rows) must be orthogonal (their dot product is zero)
  3. Each column (or row) must have length equal to 1

If all three criteria are met, the matrix is orthogonal. Equivalently, Q is orthogonal exactly when Q^T Q equals the identity matrix.
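A minimal check along these lines (the helper function name is my own, not a library API), using the Q^T Q = I characterization:

```python
import numpy as np

def is_orthogonal(Q, tol=1e-10):
    """Return True if Q is (numerically) an orthogonal matrix."""
    Q = np.asarray(Q, dtype=float)
    if Q.ndim != 2 or Q.shape[0] != Q.shape[1]:
        return False                       # criterion 1: square
    # Criteria 2 and 3 together are equivalent to Q.T @ Q == I.
    return np.allclose(Q.T @ Q, np.eye(Q.shape[0]), atol=tol)

print(is_orthogonal([[0, -1], [1, 0]]))    # True: a 90-degree rotation
print(is_orthogonal([[1, 1], [0, 1]]))     # False: columns not orthonormal
```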

What are some real-world applications of orthogonal matrices?

Orthogonal matrices have various applications in fields such as computer graphics, signal processing, and statistics. For example, in computer graphics, rotation matrices (which are orthogonal) are used to rotate objects in 3D space without distorting lengths or angles. In signal processing, orthogonal transforms such as the discrete cosine transform are used to compress and decompress signals such as images and audio. In statistics, orthogonal matrices are used to transform data and analyze multivariate data, for example in principal component analysis.
