Is Squaring an Equation Adequate for Proof in Analysis?

In summary, the thread discusses whether squaring an equation is an acceptable method of proof in analysis courses. It also works through the proof that ||A + B|| = ||A - B|| if and only if A · B = 0, and how to show the equality fails when A and B are not orthogonal. The conclusion is that squaring can be an adequate method of proof when both sides are known to be non-negative, though it tends to appear less frequently in analysis courses than in algebra courses.
  • #1
sponsoredwalk
Hello, I'm reading Lang's Intro to Linear Algebra & I've noticed that he proves a theorem by squaring an equation,
working out the algebra and then taking a square root.
I'm trying to get used to writing proofs for analysis &
I'd like to know whether squaring is considered an adequate method of proof?

A quick example:

||xA|| = |x| ||A||

1/ ||xA||² = {√[(xA) · (xA)]}² = xA · xA

(Using the definition ||A|| = √(A · A) = √(a_1² + a_2² + ... + a_n²))

2/ xA · xA = (xa_1, xa_2, ..., xa_n) · (xa_1, xa_2, ..., xa_n)

(Using the definition of vector A : (a_1, a_2, ..., a_n) )

3/ x²a_1² + x²a_2² + ... + x²a_n² = x²(a_1² + a_2² + ... + a_n²)

(Using the Dot Product property for components)

4/ x² A · A = |x|² ||A||²

(Rewriting the sum of squared components in 3/ as A · A,
writing x² as |x|², which is valid for positive or negative x,
and recognising A · A as ||A||², straight from the definition of ||A||.)

5/ Take square roots of both sides, giving ||xA|| = |x| ||A||, and voila!

Here he set out to prove his theorem by squaring out one side,
working out the algebra and achieving the other side i.e. proving an equality.
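(Side note: a numerical spot check is no substitute for the algebraic proof, but it can make the identity concrete. Below is a minimal Python sketch, where dot and norm are just hypothetical helper names implementing the definitions above.)

[code]
import math
import random

def dot(a, b):
    # A · B = a_1*b_1 + a_2*b_2 + ... + a_n*b_n
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    # ||A|| = sqrt(A · A)
    return math.sqrt(dot(a, a))

# check ||xA|| = |x| ||A|| for a few random vectors and scalars (including negative x)
for _ in range(5):
    A = [random.uniform(-10, 10) for _ in range(4)]
    x = random.uniform(-10, 10)
    xA = [x * a for a in A]
    assert math.isclose(norm(xA), abs(x) * norm(A))
[/code]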

I'm just concerned as to whether this would constitute a rigorous proof
in an analysis book, since the proofs in the ones I've read
(albeit I was lacking the mathematical maturity I have now, which is still in its infancy!)
would seemingly come out of nowhere :p

Another question is the proof of orthogonality.

||A + B|| = ||A - B|| iff A · B = 0

1/ ||A + B||² = ||A - B||² <==> {√[(A + B) · (A + B)]}² = {√[(A - B) · (A - B)]}²

2/ (A + B) · (A + B) = (A - B) · (A - B) <==> A · A + 2A · B + B · B = A · A - 2A · B + B · B

3/ 2A · B = - 2A · B

4/ 4A · B = 0

5/ A · B = 0

If A · B = 0 then the above is true, but if A · B ≠ 0
how are you supposed to show ||A + B|| ≠ ||A - B|| ?

I'm thinking you're supposed to find a contradiction.
You assume all of the above & do the proof like I did,
then when you get to the end, 5/,
you find A · B = 0 but you know that A & B are not orthogonal
so we see that the assumption cannot be true.

Is that considered a proof or just a small exercise?

Note: I'm supposed to be proving all of this using just 4 properties of the dot product,
as this is a way to achieve generality:
1/ A · B = B · A
2/ A · (B + C) = A · B + A · C
3/ (xA) · B = x(A · B)
4/ A · A > 0 iff A ≠ 0
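(Purely as an illustration, not part of the proof: these four properties can be spot-checked numerically for the component-wise dot product. A minimal Python sketch, with dot as a hypothetical helper:)

[code]
import math
import random

def dot(a, b):
    # component-wise dot product
    return sum(x * y for x, y in zip(a, b))

A = [random.uniform(-5, 5) for _ in range(3)]
B = [random.uniform(-5, 5) for _ in range(3)]
C = [random.uniform(-5, 5) for _ in range(3)]
x = random.uniform(-5, 5)

B_plus_C = [b + c for b, c in zip(B, C)]
xA = [x * a for a in A]

assert math.isclose(dot(A, B), dot(B, A), abs_tol=1e-9)                     # 1/ A · B = B · A
assert math.isclose(dot(A, B_plus_C), dot(A, B) + dot(A, C), abs_tol=1e-9)  # 2/ A · (B + C) = A · B + A · C
assert math.isclose(dot(xA, B), x * dot(A, B), abs_tol=1e-9)                # 3/ (xA) · B = x(A · B)
assert dot(A, A) > 0 or all(a == 0 for a in A)                              # 4/ A · A > 0 iff A ≠ 0
[/code]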

So, basically to sum up, I'm asking about
a) Is squaring an equation an adequate method of proof, especially in analysis courses, or is it just a trivial device used in college algebra at most (and intro linear algebra!)?

b) In the orthogonality part above, I'm asking what happens when vectors A & B are not orthogonal but you're given the equation ||A + B|| = ||A - B|| and asked to test whether this equality is true. Is the contradiction achieved solely by arriving at A · B = 0, given that we already know A · B = 0 holds only when A & B are orthogonal?
 
  • #2
a. Sometimes squaring is adequate, given the proper assumptions, though I do believe it happens less frequently in analysis than in algebra courses.

b. Suppose [tex]A^TB=0[/tex].

Then
[tex]||A+B||=\sqrt{(A+B)^T(A+B)}=\sqrt{A^TA+B^TA+A^TB+B^TB},[/tex]
but since
[tex]B^TA=A^TB[/tex] by definition of the real inner product, we have
[tex]||A+B||=\sqrt{A^TA+2A^TB+B^TB}=\sqrt{A^TA+B^TB}.[/tex]
Also, we have
[tex]||A-B||=\sqrt{A^TA-A^TB-B^TA+B^TB}=\sqrt{A^TA-2A^TB+B^TB}=\sqrt{A^TA+B^TB}=||A+B||[/tex]

And that satisfies the "if" part of the "if and only if" to go along with your "only if" proof.
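(For a concrete illustration of this "if" direction: a small Python check, reusing the same hypothetical dot/norm helpers as in the earlier sketch, with A = (1, 0) and B = (0, 2) orthogonal by inspection.)

[code]
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

A, B = [1.0, 0.0], [0.0, 2.0]           # A · B = 0, so A and B are orthogonal
A_plus_B = [a + b for a, b in zip(A, B)]
A_minus_B = [a - b for a, b in zip(A, B)]

print(dot(A, B))                        # 0.0
print(norm(A_plus_B), norm(A_minus_B))  # both sqrt(5) ≈ 2.236, so ||A + B|| = ||A - B||
[/code]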
 
  • #3
It's very simple. In general (i.e. for arbitrary real numbers x and y) it does NOT hold that
[tex]x=y\Leftrightarrow x^2=y^2.[/tex]
Obviously [itex]\Rightarrow[/itex] does hold, but the converse does not (take x=1 and y=-1).

But, given that x and y are non-negative, the iff statement is true. Since we know that the norm is always non-negative, we can square or take square roots as we please:

[tex]\|a\|=\|b\|\Leftrightarrow \|a\|^2=\|b\|^2.[/tex]
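(A tiny Python illustration of the point, just as a sanity check: squaring is only reversible when both quantities are known to be non-negative.)

[code]
x, y = 1, -1
print(x**2 == y**2)   # True: the squares agree...
print(x == y)         # False: ...but the numbers do not, so x² = y² does not imply x = y in general

a, b = 3.0, 3.0       # for non-negative a, b we do have a = b <=> a² = b², since t -> t² is injective on [0, ∞)
print((a == b) == (a**2 == b**2))   # True
[/code]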
 
  • #4
sponsoredwalk said:
Another question is the proof of orthogonality.

||A + B|| = ||A - B|| iff A · B = 0

1/ ||A + B||² = ||A - B||² <==> {√[(A + B) · (A + B)]}² = {√[(A - B) · (A - B)]}²

2/ (A + B) · (A + B) = (A - B) · (A - B) <==> A · A + 2A · B + B · B = A · A - 2A · B + B · B

3/ 2A · B = - 2A · B

4/ 4A · B = 0

5/ A · B = 0

If A · B = 0 then the above is true, but if A · B ≠ 0
how are you supposed to show ||A + B|| ≠ ||A - B|| ?
This has just been done!
||A + B||² = ||A - B||²
iff
(A + B) · (A + B) = (A - B) · (A - B)
iff
A · A + 2A · B + B · B = A · A - 2A · B + B · B
iff
2A · B = - 2A · B
iff
A · B = 0.

If you don't understand that this is the complete proof, you don't understand what "iff" means.
 
  • #5
Thanks! Well, at least I know that I can use the squaring method when it appears to be the obvious thing to do & feel confident that's what was expected in the proof.
Lang uses it a lot in this first chapter of the book, but I haven't really seen it used much in a calculus course - though that could be because I haven't been proving much - yet!

Landau said:
If you don't understand that this is the complete proof, you don't understand what "iff" means.

Lol, of course I do. I don't think you understood what I was asking.

I gave the proof that ||A + B|| = ||A - B|| if & only if A · B = 0 to show that I understand this fine.

My question was, and I should have been clearer, if you're asked to find out whether the equality ||A + B|| = ||A - B|| is a true statement when A · B ≠ 0,
how would I go about it?

I asked this only to get used to proving something to be false really.

I'm thinking that you assume ||A + B|| = ||A - B|| to be a true statement even though A and B are not orthogonal and you do the proof in the exact same way as the correct proof is done.

Then, because we've assumed ||A + B|| = ||A - B|| to be a true statement we arrive at a contradiction because our end result is A · B = 0 but we know that A and B are not orthogonal.

Because A and B are not orthogonal, A · B = 0 is a false statement, ergo ||A + B|| = ||A - B|| is a false statement too.

Is that the correct procedure?



Also, [tex]A^TB=0[/tex] is another way to write A · B = 0, I'm assuming, but it also looks like the notation for the transpose - which I don't understand, though I'm now assuming it's related to the inner product - cool?
 
  • #6
To prove something false, ordinarily you would find a counterexample.
So suppose we have the claim "If A and B are not orthogonal, then ||A+B|| = ||A-B||". To show this is false, we come up with a counterexample. Consider, in R², A = (1,1) and B = (-1,2):
<A,B> = -1 + 2 = 1 ≠ 0
||A+B|| = ||(0,3)|| = 3
||A-B|| = ||(2,-1)|| = √5
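(The same counterexample, checked numerically; dot and norm are the hypothetical helpers from the earlier sketches.)

[code]
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

A, B = [1.0, 1.0], [-1.0, 2.0]
print(dot(A, B))                            # 1.0, so A and B are not orthogonal
print(norm([a + b for a, b in zip(A, B)]))  # ||A + B|| = ||(0, 3)|| = 3.0
print(norm([a - b for a, b in zip(A, B)]))  # ||A - B|| = ||(2, -1)|| = sqrt(5) ≈ 2.236
[/code]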

And yes, when dealing with the real dot product it is normally written in matrix-multiplication form as [tex]x^Ty=\langle x,y\rangle=x\cdot y[/tex].
 
  • #7
sponsoredwalk said:
Lol, of course I do. I don't think you understood what I was asking.

I gave the proof that ||A + B|| = ||A - B|| if & only if A · B = 0 to show that I understand this fine.

My question was, and I should have been clearer, if you're asked to find out whether the equality ||A + B|| = ||A - B|| is a true statement when A · B ≠ 0,
how would I go about it?

I asked this only to get used to proving something to be false really.

I'm thinking that you assume ||A + B|| = ||A - B|| to be a true statement even though A and B are not orthogonal and you do the proof in the exact same way as the correct proof is done.
I think I am still not quite following you, because we have just proven that ||A + B|| = ||A - B|| iff A · B = 0. Part of this statement is that if A · B ≠ 0, then the equality ||A + B|| = ||A - B|| is false. Or do you want to forget about the fact that we just proved this, and prove this result separately? Well, then you just use that part of the proof which shows one of the two implications...

I mean, you just proved a statement of the form "P iff Q", and then you're asking how to show that (not Q) implies (not P), which is precisely the contrapositive of "P implies Q".
Also, [tex] A^TB=0[/tex] is another way to write A · B = 0 I'm assuming, but it also looks like the notation for the transpose - which I don't understand but I'm now assuming it's related to the inner product - cool?
Yes, ^T means transpose. It is used here because we can regard the inner product of two vectors as the matrix product of a row vector and a column vector (by the definition of matrix multiplication and inner product).
 
  • #8
Landau said:
Or do you want to forget about the fact that we just proved this, and prove this result seperately? Well, then you just use that part of the proof which shows one of the two implications...

Yes, if we forget that we just proved the theorem, had been given ||A + B|| = ||A - B|| on a blank sheet of paper, were told that A and B are not perpendicular & were asked to find out whether ||A + B|| = ||A - B|| was a true statement, would what I had written be correct?
I'm trying to do this without assuming I know that I'll arrive at A · B = 0 at the end.
If I arrived at A · B = 0, it would be satisfactory to state that there is no equality, i.e.
||A + B|| = ||A - B|| is false, as A · B = 0 can only be true if A & B are orthogonal!

It's kind of stupid I know, I'm trying to put myself in the place where I'd stumble upon A · B = 0 and be able to recognise I can go no further instead of stupidly assuming ah, well that's the answer...
 

FAQ: Is Squaring an Equation Adequate for Proof in Analysis?

1. What is linear algebra?

Linear algebra is a branch of mathematics that deals with the study of linear equations and their representations in vector spaces.

2. What is a proof in linear algebra?

A proof in linear algebra is a logical argument that uses mathematical properties and theorems to show the validity of a statement or equation related to linear algebra.

3. How do I approach solving a linear algebra proof?

The key to solving a linear algebra proof is to start with the given information and use mathematical properties and theorems to manipulate the equations and reach the desired outcome.

4. Is it necessary to show all steps in a linear algebra proof?

Yes, it is important to show all steps in a linear algebra proof to demonstrate the logical progression and to ensure the validity of the final solution.

5. Are there any tips for effectively solving linear algebra proofs?

Some tips for solving linear algebra proofs include understanding the definitions and properties of linear algebra, practicing with different types of problems, and breaking the proof into smaller, more manageable steps.
