Show that equality holds in Cauchy-Schwarz inequality if and only if....

In summary, the book asks you to show that equality holds in the Cauchy-Schwarz inequality if and only if there exists a real number x such that a_kx+b_k=0 for every k=1,2,\ldots,n. The thread below works this out without introducing vectors or linear algebra, which the book has not yet covered.
  • #1
Ragnarok7
This is from section I 4.9 of Apostol's Calculus Volume 1. The book states the Cauchy-Schwarz inequality as follows:

\(\displaystyle \left(\sum_{k=1}^na_kb_k\right)^2\leq\left(\sum_{k=1}^na_k^2\right)\left(\sum_{k=1}^nb_k^2\right)\)

Then it asks you to show that equality holds in the above if and only if there is a real number \(\displaystyle x\) such that \(\displaystyle a_kx+b_k=0\) for every \(\displaystyle k=1,2,\ldots,n\).

The "if" direction is easy, but I'm not sure how to get the "only if" - i.e., that such an \(\displaystyle x\) implies equality. There are proofs of this online involving vectors and linear algebra, but I am wondering if it can be done without that, as the book has not introduced such things yet. For the record, the way the Cauchy-Schwarz inequality was proved in the book itself is as follows:

We have \(\displaystyle \sum_{k=1}^n(a_kx+b_k)^2\geq0\) for every real \(\displaystyle x\) because a sum of squares can never be negative. This may be written in the form \(\displaystyle Ax^2+2Bx+C\geq0\), where \(\displaystyle A=\sum_{k=1}^na_k^2\), \(\displaystyle B=\sum_{k=1}^na_kb_k\), and \(\displaystyle C=\sum_{k=1}^nb_k^2\). We wish to prove that \(\displaystyle B^2\leq AC\). If \(\displaystyle A=0\), then each \(\displaystyle a_k=0\), so \(\displaystyle B=0\) and the result is trivial. If \(\displaystyle A\neq 0\), we may complete the square and write

\(\displaystyle Ax^2+2Bx+C=A\left(x+\frac{B}{A}\right)^2+\frac{AC-B^2}{A}\).

The right side has its smallest value when \(\displaystyle x=-B/A\). Putting \(\displaystyle x=-B/A\) in the above, the left side is nonnegative while the right side reduces to \(\displaystyle (AC-B^2)/A\); since \(\displaystyle A>0\), we obtain \(\displaystyle B^2\leq AC\).
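
If you want to see the quantities $A$, $B$, $C$ in action, here is a minimal Python sketch (a numerical sanity check, not part of Apostol's text) that confirms $B^2 \leq AC$ on random sequences:

```python
import random

# Minimal sketch: check B^2 <= A*C on random data, where
# A = sum(a_k^2), B = sum(a_k * b_k), C = sum(b_k^2).
def check_cauchy_schwarz(a, b):
    A = sum(x * x for x in a)
    B = sum(x * y for x, y in zip(a, b))
    C = sum(y * y for y in b)
    # A tiny tolerance guards against floating-point rounding near equality.
    return B * B <= A * C + 1e-9

for _ in range(1000):
    n = random.randint(1, 10)
    a = [random.uniform(-5.0, 5.0) for _ in range(n)]
    b = [random.uniform(-5.0, 5.0) for _ in range(n)]
    assert check_cauchy_schwarz(a, b)
```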
 
  • #2
Suppose there is an $x$ such that $a_kx + b_k = 0$ for each $k$.

Thus $b_k = -a_kx$, and:

$\displaystyle \left( \sum_{k = 1}^n a_kb_k \right)^2 = x^2\left(\sum_{k=1}^n (a_k)^2 \right)^2$

(really we are squaring the sum of the terms $-x(a_k)^2$, but when we pull out the factor of $(-1)^2$ it goes away).

On the other hand:

$\displaystyle \left( \sum_{k=1}^n (a_k)^2 \right)\left( \sum_{k=1}^n (b_k)^2 \right)$

$\displaystyle = \left( \sum_{k=1}^n (a_k)^2 \right)\left( \sum_{k=1}^n (-a_kx)^2 \right)$

$\displaystyle = \left( \sum_{k=1}^n (a_k)^2 \right)\left( x^2\sum_{k=1}^n (a_k)^2 \right)$

$\displaystyle = x^2\left( \sum_{k=1}^n (a_k)^2 \right)^2$.

This is actually the "if" part; the "only if" part would mean showing there is such an $x$ if equality holds (I get these mixed up sometimes, too).
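
If a numerical check helps, here is a short Python sketch (an illustration only) confirming that both sides agree whenever $b_k = -a_kx$:

```python
import random

# Sketch: if b_k = -a_k * x for every k, both sides of Cauchy-Schwarz
# equal x^2 * (sum of a_k^2)^2, so equality holds.
x = random.uniform(-5.0, 5.0)
a = [random.uniform(-5.0, 5.0) for _ in range(8)]
b = [-ak * x for ak in a]

lhs = sum(ak * bk for ak, bk in zip(a, b)) ** 2
rhs = sum(ak * ak for ak in a) * sum(bk * bk for bk in b)
assert abs(lhs - rhs) < 1e-6 * max(1.0, abs(rhs))  # equal up to rounding
```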
 
  • #3

Hi Ragnarok,

Note that for all real $x$,

$\displaystyle (*) \sum_{k = 1}^n (a_k x + b_k)^2 = Ax^2 + 2Bx + C$.

Suppose $B^2 = AC$. If $A = 0$, we must have $B = 0$. So $a_k = b_k = 0$ for all $k$. Consequently, for any real $x$, $a_k x + b_k = 0$ for all $k$. Now assume $A \neq 0$. Since $B^2 = AC$, completing the square gives $Ax^2 + 2Bx + C = A\left(x + \frac{B}{A}\right)^2$, so the left hand side of $(*)$ is zero for $x = -\frac{B}{A}$. A sum of squares vanishes only when every term vanishes, so this forces $a_kx + b_k = 0$ for all $k$.
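
As a concrete sanity check (a sketch with made-up numbers), when equality holds and $A \neq 0$, the choice $x = -B/A$ really does annihilate every term:

```python
# Sketch with made-up numbers: b = -0.5 * a, so equality holds and A != 0.
a = [1.0, -2.0, 3.0]
b = [-0.5, 1.0, -1.5]

A = sum(ak * ak for ak in a)              # 14
B = sum(ak * bk for ak, bk in zip(a, b))  # -7
x = -B / A                                # 0.5
assert all(abs(ak * x + bk) < 1e-12 for ak, bk in zip(a, b))
```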
 
  • #4
Thank you both so much! Deveno, I just realized that I asked the wrong question (I shouldn't post when tired). What I meant to ask was how equality implies the given condition. That's the harder part, for me at least. So I got the "if" and "only if" right, I just asked it the other way around. Euge, you seem to have realized what I meant anyway. Thanks!
 
  • #5
Euge said:
Suppose $B^2 = AC$. If $A = 0$, we must have $B = 0$. So $a_k = b_k = 0$ for all $k$. Consequently, for any real $x$, $a_k x + b_k = 0$ for all $k$.
The fact that $A=\sum_{k=1}^n a_k^2=0$ and $B=\sum_{k=1}^n a_kb_k=0$ does not imply that all $b_k=0$, but it does imply that all $a_k=0$. In fact, the required statement is not entirely correct: $AC=B^2$ holds iff the vectors $\vec{a}=(a_1,\dots,a_n)$ and $\vec{b}=(b_1,\dots,b_n)$ are linearly dependent, which means that one of them equals the other multiplied by a scalar (possibly 0). So either there exists an $x$ such that $a_kx+b_k=0$ for all $k$ (i.e., $\vec{b}=-x\vec{a}$), or there exists a $y$ such that $a_k+b_ky=0$ for all $k$ (i.e., $\vec{a}=-y\vec{b}$).
 
  • #6

Yes, I mixed up the definitions of $B$ and $C$. (First time for everything, right? :) )

Since Ragnarok wants the "only if" part without the use of vectors, start with the identity

$\displaystyle AC - B^2 = \sum_{1 \le j < k \le n} (a_j b_k - a_k b_j)^2$.

If $AC - B^2 = 0$, then $a_jb_k = a_kb_j$ for all $j < k$. Hence $a_1, a_2, \ldots, a_n$ is in proportion with $b_1, b_2, \ldots, b_n$: if some $a_j \neq 0$, take $x = -b_j/a_j$; then $a_jb_k = a_kb_j$ gives $a_kx + b_k = 0$ for every $k$ (and if every $a_j = 0$, any $x$ works).
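
For anyone who wants to confirm Lagrange's identity itself, here is a brief Python sketch (a numerical check on random data, nothing more):

```python
import random
from itertools import combinations

# Sketch: numerically confirm Lagrange's identity
#   A*C - B^2 = sum over j < k of (a_j*b_k - a_k*b_j)^2.
n = 6
a = [random.uniform(-3.0, 3.0) for _ in range(n)]
b = [random.uniform(-3.0, 3.0) for _ in range(n)]

A = sum(x * x for x in a)
B = sum(x * y for x, y in zip(a, b))
C = sum(y * y for y in b)
rhs = sum((a[j] * b[k] - a[k] * b[j]) ** 2
          for j, k in combinations(range(n), 2))
assert abs((A * C - B * B) - rhs) < 1e-9
```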
 

FAQ: Show that equality holds in Cauchy-Schwarz inequality if and only if....

What is the Cauchy-Schwarz inequality?

The Cauchy-Schwarz inequality, also known as the Cauchy-Bunyakovsky-Schwarz inequality, is a fundamental inequality in mathematics that states the following:
For any two vectors, the absolute value of their dot product is less than or equal to the product of their magnitudes. In other words, for two vectors a and b, the inequality can be written as |a · b| ≤ |a| · |b|.

How is the Cauchy-Schwarz inequality used?

The Cauchy-Schwarz inequality is used in various fields of mathematics, such as linear algebra, functional analysis, and probability theory. It is an important tool for proving other mathematical theorems and is also used in optimization problems and in the study of inner product spaces.

What does it mean for equality to hold in the Cauchy-Schwarz inequality?

When equality holds in the Cauchy-Schwarz inequality, it means that the two vectors involved are linearly dependent; in other words, one vector is a scalar multiple of the other. In this case, the absolute value of the dot product of the vectors is equal to the product of their magnitudes.

What is the significance of proving equality in the Cauchy-Schwarz inequality?

Proving equality in the Cauchy-Schwarz inequality can lead to the discovery of important mathematical relationships and can also be used to find optimal solutions in optimization problems. It also helps in understanding the geometric interpretation of the inequality and its applications in various fields of mathematics.

Can you provide an example of when equality holds in the Cauchy-Schwarz inequality?

Equality holds when one vector is a scalar multiple of the other. For example, take a = (1, 2) and b = (2, 4) = 2a: the dot product is 1·2 + 2·4 = 10, and the product of the magnitudes is √5 · √20 = 10, so the inequality holds with equality. (By contrast, for two nonzero orthogonal vectors the inequality is strict: the dot product is zero while the product of the magnitudes is positive.)
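
A quick numerical version of this example, as a minimal Python sketch using the vectors above:

```python
import math

# Sketch: equality case with b = 2 * a.
a = [1.0, 2.0]
b = [2.0, 4.0]

dot = sum(x * y for x, y in zip(a, b))  # 1*2 + 2*4 = 10
mags = math.hypot(*a) * math.hypot(*b)  # sqrt(5) * sqrt(20) = 10
assert abs(abs(dot) - mags) < 1e-9
```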
