
Algebra.Com
Question 1176146: (a) Prove that if the roots of
x^3 + ax^2 + bx + c = 0 form an arithmetic sequence, then 2a^3 + 27c = 9ab.
(b) Prove that if 2a^3 + 27c = 9ab, then the roots of
x^3 + ax^2 + bx + c = 0 form an arithmetic sequence.

Answer by Edwin McCravy(20056)
 

Suppose the roots (which form an arithmetic sequence) are p-d, p, and p+d.
Then

The sum of the roots is -a:

(p-d) + p + (p+d) = -a

3p = -a

a = -3p

The sum of the products of pairs of roots is b:

(p-d)p + (p-d)(p+d) + p(p+d) = b

p^2 - pd + p^2 - d^2 + p^2 + pd = b

3p^2 - d^2 = b

b = 3p^2 - d^2

The product of the roots is -c:

(p-d)(p)(p+d) = -c

p(p^2 - d^2) = -c

p^3 - pd^2 = -c

c = pd^2 - p^3
So we have solved for a, b, and c in terms of p and d.
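The three Vieta computations above can be double-checked mechanically. The helper below (the function name is mine, not part of the original solution) multiplies out (x-(p-d))(x-p)(x-(p+d)) and confirms the coefficients match a = -3p, b = 3p^2 - d^2, c = pd^2 - p^3 for several sample values of p and d:

```python
from fractions import Fraction

def expand_monic(roots):
    """Multiply out (x - r1)(x - r2)...; return coefficients, highest degree first."""
    poly = [Fraction(1)]
    for r in roots:
        new = poly + [Fraction(0)]       # multiply current polynomial by x
        for i, coef in enumerate(poly):  # ...then subtract r times it
            new[i + 1] -= r * coef
        poly = new
    return poly

# Check a = -3p, b = 3p^2 - d^2, c = pd^2 - p^3 for a few sample p, d.
for p in (Fraction(2), Fraction(-1, 3), Fraction(5)):
    for d in (Fraction(1), Fraction(7, 2), Fraction(0)):
        _, a, b, c = expand_monic([p - d, p, p + d])
        assert (a, b, c) == (-3*p, 3*p**2 - d**2, p*d**2 - p**3)
```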

Substitute these into the two sides of the identity we want:

2a^3 + 27c = 2(-3p)^3 + 27(pd^2 - p^3) = -54p^3 + 27pd^2 - 27p^3 = -81p^3 + 27pd^2

9ab = 9(-3p)(3p^2 - d^2) = -81p^3 + 27pd^2

The two sides agree for every p and d, so 2a^3 + 27c = 9ab whenever the roots form an arithmetic sequence. No case split is actually needed, but it is instructive to check the two special cases p=0 and d=0 directly.
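As a quick numeric sanity check of the identity just derived (the function name is mine), the condition 2a^3 + 27c = 9ab holds for arbitrary p and d, not only in the special cases:

```python
from fractions import Fraction

def abc_from_progression(p, d):
    """Coefficients found above for the cubic whose roots are p-d, p, p+d."""
    return -3*p, 3*p**2 - d**2, p*d**2 - p**3

# 2a^3 + 27c = 9ab should hold for EVERY p and d, not only p = 0 or d = 0.
for p in (Fraction(0), Fraction(1), Fraction(-4, 3), Fraction(10)):
    for d in (Fraction(0), Fraction(2), Fraction(9, 5)):
        a, b, c = abc_from_progression(p, d)
        assert 2*a**3 + 27*c == 9*a*b
```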

If p=0, then the roots are -d, 0, and d

Then the sum of the roots = -d+0+d = 0 = -a, so a=0

The sum of the products of pairs of roots = (-d)(0) + (-d)(d) + (0)(d) = -d^2, so b = -d^2

Then the product of the roots is (-d)(0)(d) = 0, so c = 0

That means the original equation, in this case, was really:

x^3 + 0x^2 - d^2x + 0 = 0, or

x^3 - d^2x = 0

So we see if 2a^3 + 27c = 9ab holds true in this case:

2a^3 + 27c = 2(0)^3 + 27(0) = 0

9ab = 9(0)(-d^2) = 0

So yes it does hold when p=0

Now we see what happens when d=0.

Then the roots are p-0, p, and p+0, or p, p, and p. 

So the three roots are all equal.

The sum of the roots is 3p, so a=-3p

The sum of the products of pairs of roots = (p)(p) + (p)(p) + (p)(p) = 3p^2, so b = 3p^2

Then the product of the roots is (p)(p)(p) = p^3, so c = -p^3

That means the original equation, in this case, was really:

x^3 - 3px^2 + 3p^2x - p^3 = 0, or

(x - p)^3 = 0, which is just a triple root at x = p.

So we see if 2a^3 + 27c = 9ab holds true in this case as well:

2a^3 + 27c = 2(-3p)^3 + 27(-p^3) = -54p^3 - 27p^3 = -81p^3

9ab = 9(-3p)(3p^2) = -81p^3

So yes it does hold true in this case too.

The (a) part of the problem is proved.

If I find time I'll do (b) as well.
 
Edwin
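Edwin left part (b) for later. For completeness, here is one way the reverse direction can go (a sketch, not his original argument). Substituting x = -a/3 into x^3 + ax^2 + bx + c gives (-a/3)^3 + a(-a/3)^2 + b(-a/3) + c = (2a^3 - 9ab + 27c)/27, so the hypothesis 2a^3 + 27c = 9ab says exactly that x = -a/3 is a root. The remaining two roots then sum to -a - (-a/3) = -2a/3, so their average is -a/3, and the three roots can be written (-a/3) - d, -a/3, (-a/3) + d for some d: an arithmetic sequence. A small Python check of those two facts (the function name is mine):

```python
from fractions import Fraction

def check_part_b(a, b):
    """Choose c so that 2a^3 + 27c = 9ab, then verify x = -a/3 is a root
    and the other two roots are centered on -a/3."""
    a, b = Fraction(a), Fraction(b)
    c = (9*a*b - 2*a**3) / 27
    r = -a / 3
    assert r**3 + a*r**2 + b*r + c == 0  # x = -a/3 really is a root
    # Synthetic division by (x - r): quotient x^2 + q1*x + q0, remainder 0.
    q1 = a + r
    q0 = b + r*q1
    assert c + r*q0 == 0                 # remainder is zero
    assert -q1 == 2*r                    # the other two roots sum to 2r,
    return r, q1, q0                     # so their average is r = -a/3

# Example: a = 6, b = 11 forces c = 6; the cubic x^3 + 6x^2 + 11x + 6
# has roots -3, -2, -1, an arithmetic sequence with middle term -a/3 = -2.
check_part_b(6, 11)
```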

