siddharthmishra19
1. The Problem
For all positive real values of a, b, and c such that a² + b² + c² = 1, prove that
a(b+c) < 1/√2
2. My Attempt...
There are probably numerous ways to solve this, but I tried the following.
let a = sin(x)
b = cos(x)sin(y)
c = cos(x)cos(y)
a²+b²+c² = sin²(x)+cos²(x)[sin²(y)+cos²(y)] = 1
so the constraint is satisfied.
so a(b+c) = sin(x)cos(x)[sin(y)+cos(y)]
= (1/2)sin(2x)[sin(y)+cos(y)]...
I am sure one can somehow prove that the maximum value of this expression is no more than the value given in the question, i.e. 1/√2, but I don't see how to finish.
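One possible way to finish from here (a sketch, using only the identity sin(y) + cos(y) = √2·sin(y + π/4) together with the substitution above):

% Sketch: bound each factor of (1/2) sin(2x) [sin(y) + cos(y)] separately.
\[
\sin(y)+\cos(y) = \sqrt{2}\,\sin\!\left(y+\tfrac{\pi}{4}\right) \le \sqrt{2},
\qquad \sin(2x) \le 1,
\]
\[
\text{so}\quad a(b+c) = \tfrac{1}{2}\sin(2x)\,[\sin(y)+\cos(y)]
\le \tfrac{1}{2}\cdot 1\cdot \sqrt{2} = \tfrac{1}{\sqrt{2}}.
\]
% Note: both bounds are met simultaneously at x = y = pi/4,
% i.e. a = 1/sqrt(2), b = c = 1/2, so the value 1/sqrt(2) is attained.

Note that equality holds at x = y = π/4 (i.e. a = 1/√2, b = c = 1/2), so the bound is sharp, and the inequality arguably should read ≤ rather than <.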
I would appreciate any help, or any other methods.
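For reference, here is one other possible route (a sketch, separate from the attempt above, using (b+c)² ≤ 2(b²+c²) and AM-GM):

% Sketch: Cauchy-Schwarz on b + c, then AM-GM on a*sqrt(1 - a^2).
\[
a(b+c) \le a\sqrt{2(b^2+c^2)} = \sqrt{2}\,a\sqrt{1-a^2}
\le \sqrt{2}\cdot\frac{a^2+(1-a^2)}{2} = \frac{1}{\sqrt{2}},
\]
% where b^2 + c^2 = 1 - a^2 by the constraint, and the last step is
% AM-GM: a * sqrt(1 - a^2) <= (a^2 + (1 - a^2)) / 2 = 1/2.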