Question 498157
An athlete runs 10 miles and then walks home.
The trip home takes 1 hour longer than it took to run that distance.
If the athlete runs 5 miles per hour faster than he walks, what are his average running and walking speeds?
:
let s = his walking speed
then
(s+5) = his running speed
:
Write a time equation; time = dist/speed
:
Walk time = run time + 1 hr
{{{10/s}}} = {{{10/((s+5))}}} + 1
:
multiply by s(s+5), results:
10(s+5) = 10s + s(s+5)
:
10s + 50 = 10s + s^2 + 5s
subtract 10s from both sides and arrange as a quadratic equation:
s^2 + 5s - 50 = 0
Factors to:
(s+10)(s-5) = 0
positive solution
s = 5 mph is his walking speed
then 5 + 5 = 10 mph is his running speed
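Not part of the original solution, but a quick Python sketch of the same roots using the quadratic formula, so you can see both solutions of s^2 + 5s - 50 = 0 before discarding the negative one:

```python
import math

# coefficients of s^2 + 5s - 50 = 0
a, b, c = 1, 5, -50
disc = b*b - 4*a*c                      # discriminant: 25 + 200 = 225
roots = sorted([(-b - math.sqrt(disc)) / (2*a),
                (-b + math.sqrt(disc)) / (2*a)])
print(roots)                            # [-10.0, 5.0]
```

Only the positive root, s = 5, makes sense as a speed.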
:
:
Confirm this; find the actual times:
walking: 10/5 = 2 hrs
running: 10/10 = 1 hr
-------------
diff: 1 hr, as stated
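The same confirmation can be done in code by plugging s = 5 back into the original time equation:

```python
# check that at s = 5 mph the walk home takes exactly 1 hour longer
s = 5                          # walking speed found above, mph
walk_time = 10 / s             # 10/5  = 2 hr
run_time = 10 / (s + 5)        # 10/10 = 1 hr
print(walk_time - run_time)    # 1.0
```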