Question 1183930: Let ax^2 + bx + c = 0 be a quadratic equation with no real roots and a,b > 0.
Prove that -b/(2a) > H, where H is the harmonic mean of the roots of this quadratic equation.

Answer by ikleyn(52777):


Let p and q be the roots of the given equation. Since the equation has no real roots, p and q are complex conjugates, but Vieta's formulas still apply to them.


The harmonic mean of the roots is


    H = 2/((1/p) + (1/q)) = (2pq)/(p+q).


According to Vieta's theorem,  pq = c/a,  p+q = -b/a,  so


    H = (2pq)/(p+q) = (2(c/a))/(-b/a) = -2c/b.
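
As a quick symbolic check (my own addition, not part of the original post), the formula H = -2c/b can be verified with Python's sympy library; this is a minimal sketch, assuming sympy is installed:

    import sympy as sp

    # a, b > 0 as given; c > 0 follows from b^2 - 4ac < 0 with a > 0
    a, b, c = sp.symbols('a b c', positive=True)
    x = sp.symbols('x')

    # The two roots of a*x^2 + b*x + c = 0 (complex when b^2 - 4ac < 0)
    p, q = sp.solve(a*x**2 + b*x + c, x)

    # Harmonic mean 2pq/(p + q); sympy reduces it to -2*c/b
    H = sp.simplify(2*p*q / (p + q))
    print(H)   # expected output: -2*c/b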


Thus the inequality we need to prove takes the form


    -b/(2a) > -2c/b.


Multiplying both sides by -1 (which reverses the inequality), it is equivalent to


    b/(2a) < 2c/b,


which, after multiplying both sides by 2ab (positive, since a and b are positive), is equivalent to


    b^2 < 4ac,   or   b^2 - 4ac < 0.


The last inequality is equivalent to the condition that the given quadratic equation has no real roots.

Proved and solved.
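
For a concrete illustration (an example of my own, not from the original solution), take x^2 + 2x + 5 = 0: here a = 1, b = 2, c = 5, and b^2 - 4ac = -16 < 0, so there are no real roots. Then H = -2c/b = -5 while -b/(2a) = -1, and indeed -1 > -5. The same check in plain Python:

    import cmath

    a, b, c = 1.0, 2.0, 5.0                # b^2 - 4ac = -16 < 0: no real roots
    d = cmath.sqrt(b*b - 4*a*c)            # purely imaginary here (4j)
    p, q = (-b + d)/(2*a), (-b - d)/(2*a)  # complex conjugate roots -1 +/- 2j

    H = (2*p*q / (p + q)).real             # harmonic mean; imaginary part is 0
    vertex = -b/(2*a)
    print(H, vertex, vertex > H)           # -5.0 -1.0 True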


