Question 1168783
<font color=black size=3>
The roots of {{{ax^2+bx+c = 0}}} are p and q, where, by the quadratic formula,


{{{p = (-b+sqrt(b^2-4ac))/(2a)}}}


{{{q = (-b-sqrt(b^2-4ac))/(2a)}}}
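As a quick numeric sanity check of the formula (the sample quadratic x^2-5x+6 = 0, with roots 2 and 3, is my own choice):

```python
import math

# Example quadratic (my own choice): x^2 - 5x + 6 = 0, roots 3 and 2
a, b, c = 1, -5, 6

disc = b**2 - 4*a*c                    # discriminant b^2 - 4ac
p = (-b + math.sqrt(disc)) / (2*a)     # root with the "+" sign
q = (-b - math.sqrt(disc)) / (2*a)     # root with the "-" sign

print(p, q)  # 3.0 2.0
```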


When we add p and q, the square root terms cancel: both roots contain {{{sqrt(b^2-4ac)}}}, but p has the positive version and q has the negative version. Effectively we're adding {{{R+(-R) = 0}}} where {{{R = sqrt(b^2-4ac)}}}


So after those root terms go away, we have
{{{p+q = (-b+(-b))/(2a)}}}


{{{p+q = (-2b)/(2a)}}}


{{{p+q = -b/a}}}
Therefore, the sum of the roots of {{{ax^2+bx+c = 0}}} is {{{-b/a}}}
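We can spot-check the p+q = -b/a identity numerically (the sample coefficients here are my own choice):

```python
import math

# Sample quadratic (my choice): 2x^2 - 7x + 3 = 0
a, b, c = 2, -7, 3
disc = b**2 - 4*a*c
p = (-b + math.sqrt(disc)) / (2*a)
q = (-b - math.sqrt(disc)) / (2*a)

print(p + q)   # 3.5
print(-b / a)  # 3.5  -- the sum of the roots equals -b/a
```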


We're told that the roots add to 1, so we know further that,
{{{p+q = 1}}}


{{{-b/a = 1}}}


{{{-b = a}}}


{{{a = -b}}}


Let's plug that into {{{b^2 = 2ac+a^2}}} to see what happens


{{{b^2 = 2ac+a^2}}}


{{{b^2 = 2(-b)c+(-b)^2}}} Replace 'a' with -b


{{{b^2 = -2bc+b^2}}}


{{{0 = -2bc}}} Subtract b^2 from both sides


From that last equation, we see that either b = 0 or c = 0.


If b = 0, then a = -b = 0, but a = 0 means {{{ax^2+bx+c}}} isn't quadratic. Also, a = 0 causes division by zero in the quadratic formula. So 'a' and b must both be nonzero, which forces c to be zero.


Therefore, given that the roots sum to 1, the equation {{{b^2 = 2ac+a^2}}} holds only if c = 0.
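A quick spot-check that the identity holds whenever a = -b and c = 0 (the tested b values are my own choice):

```python
# With a = -b and c = 0, b^2 = 2ac + a^2 reduces to b^2 = b^2,
# so it should hold for every b. Spot-check a few values:
checks = []
for b in [1, -3, 7.5]:
    a, c = -b, 0
    checks.append(b**2 == 2*a*c + a**2)

print(checks)  # [True, True, True]
```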


If we instead pick a nonzero c value such as c = 1, then,
{{{b^2 = 2ac+a^2}}}


{{{b^2 = 2(-b)*1+(-b)^2}}}


{{{b^2 = -2b+b^2}}}


{{{0 = -2b}}}


{{{b = 0}}}
But we showed above that b must be nonzero (otherwise a = -b = 0 and the equation isn't quadratic), so a nonzero c leads to a contradiction.
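The contradiction can be confirmed numerically as well (the tested nonzero b values are my own choice):

```python
# With a = -b and c = 1, the right side of b^2 = 2ac + a^2 becomes
# b^2 - 2b, which differs from b^2 by -2b. So the equation should
# fail for every nonzero b:
mismatches = []
for b in [1, 2, -4]:
    a, c = -b, 1
    mismatches.append(b**2 != 2*a*c + a**2)

print(mismatches)  # [True, True, True]
```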
</font>