Nate's solution is incorrect.
Let α and β be the two roots of a quadratic equation
ax^2 + 2bx + c = 0, where a, b and c are constants.
Obtain the quadratic equation whose two roots are λ and δ
such that λ = α^2 + β^2 and δ = α^2 - β^2.
Make use of this
(x - r1)(x - r2) = 0 has roots r1 and r2
and if you multiply that out by FOIL
x² - r1x - r2x + r1r2 = 0
and factor x out of the middle two terms:
x² - (r1 + r2)x + r1r2 = 0
This quadratic equation has roots r1 and r2.
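If you want to sanity-check that expansion, here is a minimal sketch using Python's sympy library (the symbol names x, r1, r2 are just placeholders):

# Quick sympy check that (x - r1)(x - r2) expands to
# x**2 - (r1 + r2)*x + r1*r2  (symbol names are placeholders)
from sympy import symbols, expand

x, r1, r2 = symbols('x r1 r2')

factored = (x - r1) * (x - r2)
standard = x**2 - (r1 + r2)*x + r1*r2

print(expand(factored - standard))   # prints 0, so the two forms agree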
Since we want the roots to be λ and δ, we set
r1 = λ, r2 = δ. Substituting:
x² - (λ + δ)x + λδ = 0
Since λ = α² + β² and δ = α² - β²,
λ + δ = (α² + β²) + (α² - β²) = α² + β² + α² - β² = 2α²
λδ = (α² + β²)(α² - β²) = α⁴ - β⁴
So
x² - (λ + δ)x + λδ = 0
becomes
x² - 2α²x + (α⁴ - β⁴) = 0
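You can confirm the whole construction with another short sympy sketch (alpha and beta stand in for α and β):

# Check that x² - (λ + δ)x + λδ really has λ and δ as roots, with
# λ = α² + β² and δ = α² - β²  (alpha, beta are placeholder symbols)
from sympy import symbols, expand, simplify

x, alpha, beta = symbols('x alpha beta')

lam = alpha**2 + beta**2          # λ
delta = alpha**2 - beta**2        # δ

poly = x**2 - (lam + delta)*x + lam*delta

print(expand(poly))                   # x**2 - 2*alpha**2*x + alpha**4 - beta**4 (up to term order)
print(simplify(poly.subs(x, lam)))    # 0, so λ is a root
print(simplify(poly.subs(x, delta)))  # 0, so δ is a root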
Edwin