hamsterman
I'm reading a math book and found a couple of proofs I can't do.
1. Given [itex]x \in R^n, a \in R, \sum\limits_{i=1}^n{x_i}=na[/itex], prove that
[itex]\sum\limits_{i \in A}\prod\limits_{j = 1}^k {x_{i_j}} \leq \binom{n}{k}a^k[/itex] where
[itex]A = \{i \in \{1, 2, \dots, n\}^k : i_1 < i_2 < \dots < i_k\}[/itex]
In words: if the average of all the [itex]x_i[/itex] is [itex]a[/itex], then the average of the products [itex]x_{i_1} \cdots x_{i_k}[/itex] over all [itex]\binom{n}{k}[/itex] increasing index tuples is at most [itex]a^k[/itex].
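Not a proof, but here is a quick numerical sanity check of (1) in Python, restricted to nonnegative [itex]x_i[/itex] (the bound [itex]e_k \leq \binom{n}{k}a^k[/itex] is Maclaurin's inequality in disguise, which needs nonnegativity):

```python
# Numerical sanity check of inequality (1) for nonnegative x_i.
import itertools
import math
import random

def lhs(x, k):
    """Sum of products x_{i_1}...x_{i_k} over all increasing index tuples."""
    return sum(math.prod(c) for c in itertools.combinations(x, k))

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 6)
    k = random.randint(1, n)
    x = [random.uniform(0, 10) for _ in range(n)]
    a = sum(x) / n                      # so that sum(x) = n*a
    # small tolerance for floating-point roundoff
    assert lhs(x, k) <= math.comb(n, k) * a**k + 1e-9
print("inequality (1) held in all random trials")
```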
2. Given [itex]A = (a_{ij}) \in L(R^n)[/itex], prove that
[itex]\det^2 A \leq \prod\limits^n_{i=1}\sum\limits^n_{j=1}{a_{ij}^2}[/itex]
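For what it's worth, (2) is Hadamard's inequality in its row-norm form: [itex]|\det A| \leq \prod_i \|\text{row}_i\|[/itex]. A quick numerical check (determinant via the Leibniz permutation formula, fine for small [itex]n[/itex]):

```python
# Numerical sanity check of inequality (2): det(A)^2 <= prod_i sum_j a_ij^2.
import itertools
import math
import random

def det(a):
    """Determinant via the Leibniz permutation formula (fine for small n)."""
    n = len(a)
    total = 0.0
    for perm in itertools.permutations(range(n)):
        sign = 1
        # count inversions to get the sign of the permutation
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        total += sign * math.prod(a[i][perm[i]] for i in range(n))
    return total

random.seed(0)
for _ in range(200):
    n = random.randint(1, 4)
    a = [[random.uniform(-5, 5) for _ in range(n)] for _ in range(n)]
    rhs = math.prod(sum(v * v for v in row) for row in a)
    assert det(a) ** 2 <= rhs + 1e-9   # tolerance for roundoff
print("inequality (2) held in all random trials")
```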
The problems appear in a section about extremum problems, so I expect they can be proved by minimizing (right side − left side). I know how to set this up: Lagrange multipliers in the first case, plain differentiation in the second. The trouble is that the derivatives turn out very ugly, and I can't push the computation through.
One idea I had was that there might exist matrices whose determinants (or some other matrix function) equal the left side of (1) and the right side of (2), so that the whole problem could be lifted to linear algebra. But my linear algebra is really poor.
I'd love to hear some suggestions about this.