Question 1119107


If a^2 + b^2 = 11ab and a>b >0, prove that {(a-b)/3}=1/2  (log a + log b)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~



&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;The statement in the post &nbsp;<U>IS &nbsp;NOT &nbsp;CORRECT</U>: &nbsp;the left-hand side should be &nbsp;log((a-b)/3), &nbsp;not &nbsp;(a-b)/3.


&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;You will see the correct statement from my solution below.



<pre>
If a^2 + b^2 = 11ab,  then subtracting 2ab from both sides gives


{{{a^2 - 2ab + b^2}}} = 11ab - 2ab = 9ab,   which implies


{{{(a-b)^2}}} = 9ab,   or


{{{(a-b)^2/9}}} = ab,   or


{{{((a-b)/3)^2}}} = ab


Take the logarithm of both sides. Since a > b > 0 by the condition, (a-b)/3 is positive, so the logarithm is defined. You will get


{{{2*log (((a-b)/3))}}} = log(a) + log(b),


which implies


{{{log (((a-b)/3))}}} = {{{(1/2)*(log((a)) + log((b)))}}}.    <<<---=== <U>This is the correct statement</U>.
</pre>
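As a sanity check of the corrected identity, here is a short numeric sketch (the choice b = 1 is an arbitrary assumption): fixing b and solving a^2 - 11ab + b^2 = 0 for the larger root produces a pair with a > b > 0 satisfying the hypothesis, and the two sides of the identity can then be compared directly.

```python
import math

# Hypothetical test values: fix b = 1 and take the larger root of
# a^2 - 11a + 1 = 0, so that a^2 + b^2 = 11ab with a > b > 0.
b = 1.0
a = (11 + math.sqrt(11**2 - 4)) / 2

# Confirm the hypothesis a^2 + b^2 = 11ab holds for this pair.
assert math.isclose(a**2 + b**2, 11 * a * b)

# Compare both sides of the corrected statement:
# log((a-b)/3) = (1/2)(log a + log b)
lhs = math.log((a - b) / 3)
rhs = 0.5 * (math.log(a) + math.log(b))
assert math.isclose(lhs, rhs)
print("identity verified")
```

The same check works for any b > 0, since scaling (a, b) by a common factor preserves both the hypothesis and the identity.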