If a^2 + b^2 = 11ab and a > b > 0, prove that {(a-b)/3} = 1/2 (log a + log b)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The statement in the post IS NOT CORRECT as written.
You will see the correct statement from my solution.
If a^2 + b^2 = 11ab, then
(a - b)^2 = a^2 - 2ab + b^2 = 11ab - 2ab = 9ab, which implies
(a - b)^2 = 9ab, or
(a - b)^2/9 = ab, or
{(a - b)/3}^2 = ab.
Take the logarithm of both sides. Note that a > b > 0 by the given condition, so (a - b)/3 > 0 and ab > 0, and the logarithms are defined. You will get
2*log{(a - b)/3} = log(a) + log(b),
which implies
log{(a - b)/3} = (1/2)*(log(a) + log(b)). <<<---=== It is the correct statement.
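If you want a quick numerical sanity check of the corrected identity, here is a short Python sketch (not part of the original solution; the specific values are my own choice). It sets b = 1 and a = b*(11 + sqrt(117))/2, the larger root of t^2 - 11t + 1 = 0 with t = a/b, so that a^2 + b^2 = 11ab and a > b > 0 hold by construction. Natural logarithms are used, but any base works.

```python
import math

# Choose b = 1 and a so that a^2 + b^2 = 11ab holds:
# with t = a/b this reads t^2 - 11t + 1 = 0, whose larger root is (11 + sqrt(117))/2.
b = 1.0
a = b * (11 + math.sqrt(117)) / 2.0

# The hypothesis of the problem (up to floating-point rounding).
assert math.isclose(a**2 + b**2, 11 * a * b)

# The corrected identity: log((a - b)/3) = (1/2) * (log a + log b).
lhs = math.log((a - b) / 3)
rhs = 0.5 * (math.log(a) + math.log(b))

print(lhs, rhs)                 # both are approximately 1.195
assert math.isclose(lhs, rhs)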