Identities of the optimal approximation

  • #1
mathmari
Hey! :eek:

I am looking at the identities of the optimal approximation.

In the case where the basis consists of orthogonal unit vectors, the optimal approximation $y \in \widetilde{H} \subset H$, where $H$ is a Euclidean space, of $x \in H$ from $\widetilde{H}$ can be written as $y=(x,e_1) e_1 + (x,e_2) e_2 + \dots + (x,e_n)e_n$.

$||y||^2=(y,y)=\left(\sum_{i=1}^n{(x,e_i)e_i}, \sum_{j=1}^n{(x,e_j)e_j}\right)=\sum_{i,j=1}^n{(x,e_i)(x,e_j)(e_i,e_j)}=\sum_{i,j=1}^n{(x,e_i)(x,e_j) \delta_{ij}}=\sum_{i=1}^n{(x,e_i)^2}$

So $||y||^2=\sum_{i=1}^n{(x,e_i)^2}$, which is the generalized Pythagorean theorem.

My question is: why do we use $i$ in one sum and $j$ in the other? Why do we not use $i$ in both sums?
 
  • #2
mathmari said:
Hey! :eek:

I am looking at the identities of the optimal approximation.

In the case where the basis consists of orthogonal unit vectors, the optimal approximation $y \in \widetilde{H} \subset H$, where $H$ is a Euclidean space, of $x \in H$ from $\widetilde{H}$ can be written as $y=(x,e_1) e_1 + (x,e_2) e_2 + \dots + (x,e_n)e_n$.

$||y||^2=(y,y)=\left(\sum_{i=1}^n{(x,e_i)e_i}, \sum_{j=1}^n{(x,e_j)e_j}\right)=\sum_{i,j=1}^n{(x,e_i)(x,e_j)(e_i,e_j)}=\sum_{i,j=1}^n{(x,e_i)(x,e_j) \delta_{ij}}=\sum_{i=1}^n{(x,e_i)^2}$

So $||y||^2=\sum_{i=1}^n{(x,e_i)^2}$, which is the generalized Pythagorean theorem.

My question is: why do we use $i$ in one sum and $j$ in the other? Why do we not use $i$ in both sums?

Hi! :)

That's in preparation for the next step.

Each summation contains $n$ terms.
If we multiply the two summations, we get a new summation with $n^2$ terms, where each term of the first summation (the $i$-th term) is multiplied by each term of the second summation (the $j$-th term).
In particular, we are not merely multiplying each $i$-th term by the corresponding $i$-th term.
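As a quick numerical illustration (a sketch, not part of the thread; the lists `a` and `b` are made-up term values), the product of two $n$-term sums is the double sum over all $n^2$ index pairs, not just the $n$ diagonal pairs:

```python
# Multiplying two sums of n terms gives a double sum with n^2 terms,
# indexed independently by i and j.
a = [1.0, 2.0, 3.0]  # terms of the first summation
b = [4.0, 5.0, 6.0]  # terms of the second summation

product_of_sums = sum(a) * sum(b)
double_sum = sum(a[i] * b[j] for i in range(3) for j in range(3))  # n^2 = 9 terms
diagonal_only = sum(a[i] * b[i] for i in range(3))                 # only n = 3 terms

print(product_of_sums == double_sum)    # True
print(product_of_sums == diagonal_only) # False in general
```

This is exactly why the two summation indices must be kept distinct before orthonormality ($\delta_{ij}$) collapses the double sum back to a single one.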
 
  • #3
I like Serena said:
Hi! :)

That's in preparation for the next step.

Each summation contains $n$ terms.
If we multiply the two summations, we get a new summation with $n^2$ terms, where each term of the first summation (the $i$-th term) is multiplied by each term of the second summation (the $j$-th term).
In particular, we are not merely multiplying each $i$-th term by the corresponding $i$-th term.

Aha! I understand it now! Thank you for the explanation! (Smirk)
 
  • #4
mathmari said:
$||y||^2=(y,y)=\left(\sum_{i=1}^n{(x,e_i)e_i}, \sum_{j=1}^n{(x,e_j)e_j}\right)=\sum_{i,j=1}^n{(x,e_i)(x,e_j)(e_i,e_j)}=\sum_{i,j=1}^n{(x,e_i)(x,e_j) \delta_{ij}}=\sum_{i=1}^n{(x,e_i)^2}$

I got stuck right now... Could you explain to me how we get from the relation:

$$\left(\sum_{i=1}^n{(x,e_i)e_i}, \sum_{j=1}^n{(x,e_j)e_j}\right)$$

to the relation:

$$\sum_{i,j=1}^n{(x,e_i)(x,e_j)(e_i,e_j)}$$

?? (Wondering)
 
  • #5
mathmari said:
I got stuck right now... Could you explain to me how we get from the relation:

$$\left(\sum_{i=1}^n{(x,e_i)e_i}, \sum_{j=1}^n{(x,e_j)e_j}\right)$$

to the relation:

$$\sum_{i,j=1}^n{(x,e_i)(x,e_j)(e_i,e_j)}$$

?? (Wondering)

A real-valued dot product has the property that $(a+b,c)=(a,c)+(b,c)$ and $(a,b+c)=(a,b)+(a,c)$.

And also that $(\lambda a, b) = \lambda (a,b)$ respectively $(a, \lambda b) = \lambda (a,b)$.

Therefore:
$$\left(\sum_{i=1}^n{(x,e_i)e_i}, \sum_{j=1}^n{(x,e_j)e_j}\right) = \sum_{i=1}^n \sum_{j=1}^n ({(x,e_i)e_i},{(x,e_j)e_j})
= \sum_{i=1}^n \sum_{j=1}^n(x,e_i)(x,e_j) (e_i,e_j)$$
(Mmm)
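The bilinearity rules above can be checked numerically. This is an illustrative sketch (not from the thread; the helper names `dot`, `scale`, and `add` are made up) using the standard dot product on $\mathbb{R}^2$ with the orthonormal basis $e_1, e_2$:

```python
# Bilinearity check: (sum_i a_i u_i, sum_j b_j v_j) = sum_{i,j} a_i b_j (u_i, v_j)
def dot(u, v):
    return sum(p * q for p, q in zip(u, v))

def scale(c, u):
    return [c * p for p in u]

def add(u, v):
    return [p + q for p, q in zip(u, v)]

x = [3.0, 4.0]
e = [[1.0, 0.0], [0.0, 1.0]]  # orthonormal basis of R^2

# left-hand side: ( sum_i (x,e_i) e_i , sum_j (x,e_j) e_j )
y = [0.0, 0.0]
for ei in e:
    y = add(y, scale(dot(x, ei), ei))
lhs = dot(y, y)

# right-hand side: sum_{i,j} (x,e_i)(x,e_j)(e_i,e_j)
rhs = sum(dot(x, ei) * dot(x, ej) * dot(ei, ej) for ei in e for ej in e)

print(lhs, rhs)  # both 25.0
```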
 
  • #6
I like Serena said:
A real-valued dot product has the property that $(a+b,c)=(a,c)+(b,c)$ and $(a,b+c)=(a,b)+(a,c)$.

And also that $(\lambda a, b) = \lambda (a,b)$ respectively $(a, \lambda b) = \lambda (a,b)$.

Therefore:
$$\left(\sum_{i=1}^n{(x,e_i)e_i}, \sum_{j=1}^n{(x,e_j)e_j}\right) = \sum_{i=1}^n \sum_{j=1}^n ({(x,e_i)e_i},{(x,e_j)e_j})
= \sum_{i=1}^n \sum_{j=1}^n(x,e_i)(x,e_j) (e_i,e_j)$$
(Mmm)

(Thinking)

We get the following relation from the identities $(a+b,c)=(a,c)+(b,c)$ and $(a,b+c)=(a,b)+(a,c)$, right??
$$\left(\sum_{i=1}^n{(x,e_i)e_i}, \sum_{j=1}^n{(x,e_j)e_j}\right) = \sum_{i=1}^n \sum_{j=1}^n ({(x,e_i)e_i},{(x,e_j)e_j}) $$
But I still don't understand why this is equal to
$$\sum_{i=1}^n \sum_{j=1}^n(x,e_i)(x,e_j) (e_i,e_j)$$

(Worried)(Doh)

Do we use here the other identity, $(\lambda a, b) = \lambda (a,b)$ , $(a, \lambda b) = \lambda (a,b)$??

Is $\lambda$, in this case, $e_i$ and $e_j$ ??

(Wondering)

But why isn't it then as follows??

$\displaystyle{\sum_{i=1}^n \sum_{j=1}^n ({(x,e_i)e_i},{(x,e_j)e_j})=\sum_{i=1}^n \sum_{j=1}^n e_i e_j ({(x,e_i)},{(x,e_j)})}$
 
  • #7
It seems there is some confusion about vectors versus scalars. (Wasntme)

The symbols $e_i$ and $e_j$ represent vectors, so it would be better to write them as $\overrightarrow{e_i}$ and $\overrightarrow{e_j}$.
Their dot product $(\overrightarrow{e_i}, \overrightarrow{e_j})$ is a scalar. When you write:
$$\sum_{i=1}^n \sum_{j=1}^n e_i e_j ({(x,e_i)},{(x,e_j)})$$
that is actually:
$$\sum_{i=1}^n \sum_{j=1}^n \overrightarrow{e_i} \overrightarrow{e_j} \left({(\overrightarrow x,\overrightarrow{e_i})},{(\overrightarrow{x},\overrightarrow{e_j})}\right)$$
However, the expression $\overrightarrow{e_i} \overrightarrow{e_j}$ is not defined. :eek:
Only their dot product $(\overrightarrow{e_i}, \overrightarrow{e_j})$ is properly defined. Similarly, $\left({(\overrightarrow x,\overrightarrow{e_i})},{(\overrightarrow{x},\overrightarrow{e_j})}\right)$ is not defined.
We are not supposed to take the dot product of 2 scalars. :eek:
 
  • #8
I like Serena said:
It seems there is some confusion about vectors versus scalars. (Wasntme)

The symbols $e_i$ and $e_j$ represent vectors, so it would be better to write them as $\overrightarrow{e_i}$ and $\overrightarrow{e_j}$.
Their dot product $(\overrightarrow{e_i}, \overrightarrow{e_j})$ is a scalar. When you write:
$$\sum_{i=1}^n \sum_{j=1}^n e_i e_j ({(x,e_i)},{(x,e_j)})$$
that is actually:
$$\sum_{i=1}^n \sum_{j=1}^n \overrightarrow{e_i} \overrightarrow{e_j} \left({(\overrightarrow x,\overrightarrow{e_i})},{(\overrightarrow{x},\overrightarrow{e_j})}\right)$$
However, the expression $\overrightarrow{e_i} \overrightarrow{e_j}$ is not defined. :eek:
Only their dot product $(\overrightarrow{e_i}, \overrightarrow{e_j})$ is properly defined. Similarly, $\left({(\overrightarrow x,\overrightarrow{e_i})},{(\overrightarrow{x},\overrightarrow{e_j})}\right)$ is not defined.
We are not supposed to take the dot product of 2 scalars. :eek:

Ahaa... (Wasntme)

So is it as follows?? (Thinking)

$$\left(\sum_{i=1}^n{(x,\overrightarrow{e_i})\overrightarrow{e_i}}, \sum_{j=1}^n{(x,\overrightarrow{e_j})\overrightarrow{e_j}}\right) =

\sum_{i=1}^n \sum_{j=1}^n ({(x,\overrightarrow{e_i})\overrightarrow{e_i}},{(x,\overrightarrow{e_j})\overrightarrow{e_j}}) =

\sum_{i=1}^n \sum_{j=1}^n (x,\overrightarrow{e_i}) (\overrightarrow{e_i},{(x,\overrightarrow{e_j})\overrightarrow{e_j}}) = \\

\sum_{i=1}^n \sum_{j=1}^n (x,\overrightarrow{e_i}) (x,\overrightarrow{e_j})(\overrightarrow{e_i},{\overrightarrow{e_j}}) $$
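As a final numerical check of the resulting identity $||y||^2=\sum_{i=1}^n (x,e_i)^2$ (an illustrative sketch, not part of the thread; the rotation angle `t` and the vector `x` are made-up values), a rotated orthonormal basis of $\mathbb{R}^2$ works just as well as the standard one:

```python
import math

# Generalized Pythagorean theorem: with an orthonormal basis,
# ||y||^2 equals the sum of the squared coefficients (x, e_i).
def dot(u, v):
    return sum(p * q for p, q in zip(u, v))

t = 0.7  # any rotation angle gives an orthonormal basis of R^2
e1 = [math.cos(t), math.sin(t)]
e2 = [-math.sin(t), math.cos(t)]

x = [3.0, 4.0]
coeffs = [dot(x, e1), dot(x, e2)]
y = [coeffs[0] * e1[k] + coeffs[1] * e2[k] for k in range(2)]

norm_y_sq = dot(y, y)
sum_sq = sum(c * c for c in coeffs)
print(abs(norm_y_sq - sum_sq) < 1e-12)  # True
```

Since this basis spans all of $\mathbb{R}^2$, $y$ coincides with $x$ and both sides equal $||x||^2 = 25$.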
 
  • #9
mathmari said:
Ahaa... (Wasntme)

So is it as follows?? (Thinking)

$$\left(\sum_{i=1}^n{(x,\overrightarrow{e_i})\overrightarrow{e_i}}, \sum_{j=1}^n{(x,\overrightarrow{e_j})\overrightarrow{e_j}}\right) =

\sum_{i=1}^n \sum_{j=1}^n ({(x,\overrightarrow{e_i})\overrightarrow{e_i}},{(x,\overrightarrow{e_j})\overrightarrow{e_j}}) =

\sum_{i=1}^n \sum_{j=1}^n (x,\overrightarrow{e_i}) (\overrightarrow{e_i},{(x,\overrightarrow{e_j})\overrightarrow{e_j}}) = \\

\sum_{i=1}^n \sum_{j=1}^n (x,\overrightarrow{e_i}) (x,\overrightarrow{e_j})(\overrightarrow{e_i},{\overrightarrow{e_j}}) $$

Yep! (Sun)

Erm... except that $x$ is also a vector... so that should be $\overrightarrow x$. (Lipssealed)
 
  • #10
I like Serena said:
Yep! (Sun)

Erm... except that $x$ is also a vector... so that should be $\overrightarrow x$. (Lipssealed)

Oh yes, you're right! (Wasntme)(Blush)

Thank you very much! (Clapping)
 

FAQ: Identities of the optimal approximation

What is an "identity" in the context of optimal approximation?

An identity in the context of optimal approximation is an equation that the best approximation satisfies, such as the expansion $y=\sum_{i=1}^n (x,e_i)e_i$ in an orthonormal basis or the norm identity $||y||^2=\sum_{i=1}^n (x,e_i)^2$ discussed above. Such identities characterize the best approximation for a given problem.

How is the "optimal approximation" determined?

The optimal approximation is determined through a process of minimizing the error between the actual data and the approximated data. This is done by finding the values of the input parameters that will result in the smallest error or difference between the desired output and the actual output.

What factors influence the identity of the optimal approximation?

The identity of the optimal approximation is influenced by various factors such as the complexity of the problem, the quality and quantity of the input data, and the chosen method of approximation. Other factors can include the chosen error metric and the constraints of the problem.

Can the identity of the optimal approximation change for different data sets?

Yes, the identity of the optimal approximation can change for different data sets. This is because the input data and the desired output may change, resulting in a different optimal solution. It is important to carefully consider the characteristics of the data set when determining the optimal approximation.

Are there limitations to the use of optimal approximation identities?

Yes, there are limitations to the use of optimal approximation identities. These limitations can include the accuracy of the approximation, the complexity of the problem, and the computational resources and time required to find the optimal solution. Additionally, the use of optimal approximation may not be suitable for all types of problems, and other methods may be more appropriate.
