- #1
mathmari
Gold Member
MHB
Hey!
Let $1\leq n\in \mathbb{N}$, $V=\mathbb{R}^n$ and let $\cdot$ be the standard scalar product. Let $b_1, \ldots , b_k\in V$ be such that $$b_i\cdot b_j=\delta_{ij}$$
- Let $\lambda_1, \ldots , \lambda_k\in \mathbb{R}$. Determine $\displaystyle{\left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j}$ for $1\leq j\leq k$.
- Show that $b_1, \ldots , b_k$ are linearly independent and that $k\leq n$.
- Let $k=n$. Show that $B=(b_1, \ldots , b_n)$ is a basis of $V$ and that $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$ holds for all $v\in V$.
- Let $k=n$. Show that $a=(b_1\mid \ldots \mid b_n)\in O_n$.
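Just to make the condition $b_i\cdot b_j=\delta_{ij}$ concrete before my attempts, here is a small numerical illustration of my own (not part of the exercise): the first $k$ standard basis vectors of $\mathbb{R}^n$ are such a family.

```python
import numpy as np

# The first k standard basis vectors e_1,...,e_k of R^n satisfy
# e_i . e_j = delta_ij: their Gram matrix of dot products is I_k.
n, k = 5, 3
b = np.eye(n)[:, :k]  # columns are e_1, e_2, e_3

print(b.T @ b)  # -> the 3x3 identity matrix
```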
I have done the following:
For 1:
We have that $$\left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j=\sum_{i=1}^k\lambda_i \left (b_i\cdot b_j\right )=\sum_{i=1}^k\lambda_i \delta_{ij}=\lambda_j$$ since only the term with $i=j$ survives. Or not? :unsure:
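As a quick numerical sanity check of this (not a proof), I generated an orthonormal family via a QR factorization and compared both sides:

```python
import numpy as np

# Check question 1 numerically: for orthonormal b_1,...,b_k,
# (sum_i lambda_i b_i) . b_j should equal lambda_j.
rng = np.random.default_rng(0)
n, k = 5, 3
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))  # columns: b_1,...,b_k
lam = rng.standard_normal(k)

v = Q @ lam  # v = sum_i lambda_i b_i
for j in range(k):
    print(v @ Q[:, j], lam[j])  # the two numbers agree (up to rounding)
```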
For 2:
We have that $$\sum_{i=1}^k\lambda_i b_i=0 \ \overset{\cdot b_j}{\longrightarrow} \ \left (\sum_{i=1}^k\lambda_i b_i\right )\cdot b_j=0\cdot b_j \ \overset{\text{Question }1}{\longrightarrow} \ \lambda_j=0$$ for all $1\leq j\leq k$, and so $b_1, \ldots , b_k$ are linearly independent.
Is this correct?
How can we show that $k\leq n$? :unsure:
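Numerically at least the claim looks right: the $n\times k$ matrix with columns $b_1,\ldots,b_k$ has full column rank $k$, and the rank of an $n\times k$ matrix is at most $\min(n,k)$. This is only an illustration, not the proof I am looking for:

```python
import numpy as np

# Check question 2 numerically: k orthonormal vectors in R^n form
# an n x k matrix of full column rank k; the rank of an n x k
# matrix can never exceed min(n, k).
rng = np.random.default_rng(1)
n, k = 5, 3
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))  # columns: b_1,...,b_k

print(np.linalg.matrix_rank(Q))  # -> 3, i.e. full column rank
```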
For 3:
We have that the vectors of $B$ are linearly independent, by question 2, and the number of vectors equals the dimension of $V$. This implies that $B$ is a basis of $V$, right?
Since $B$ is a basis of $V$, every element of $V$ can be written as a linear combination of the elements of $B$. But why is this linear combination $\displaystyle{v=\sum_{i=1}^n(v\cdot b_i)b_i}$? Is this because of the definition of the $b_i$, i.e. that $b_i\cdot b_i=1$? :unsure:
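Again a numerical sanity check of the formula (with a random orthonormal basis obtained from a QR factorization):

```python
import numpy as np

# Check question 3 numerically: for an orthonormal basis b_1,...,b_n
# of R^n, every v satisfies v = sum_i (v . b_i) b_i.
rng = np.random.default_rng(2)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # columns: b_1,...,b_n

v = rng.standard_normal(n)
w = sum((v @ Q[:, i]) * Q[:, i] for i in range(n))
print(np.allclose(v, w))  # -> True
```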
For 4:
To show that the matrix $a$ is orthogonal, we have to show that $a^Ta=I=aa^T$ using the definition of the vectors $b_i$, i.e. that $b_i\cdot b_i=1$ and $b_i\cdot b_j=0$ for $i\neq j$, right? :unsure:
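And the corresponding check for question 4, again only numerical:

```python
import numpy as np

# Check question 4 numerically: a square matrix a whose columns are
# orthonormal satisfies a^T a = I = a a^T, i.e. a is orthogonal.
rng = np.random.default_rng(3)
n = 4
a, _ = np.linalg.qr(rng.standard_normal((n, n)))  # columns orthonormal

print(np.allclose(a.T @ a, np.eye(n)))  # -> True
print(np.allclose(a @ a.T, np.eye(n)))  # -> True
```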