Question 883299
Johnathan drove to the airport to pick up his friend.  
A rainstorm forced him to drive at an average speed of 45 mph, reaching the airport in 3 hours.
Note: distance to the airport = rate*time = 45 mph * 3 hr = 135 miles
-------------------------------------------------------------------------

He drove back home at an average speed of 55 mph. How long, to the nearest tenth of an hour, did the trip home take him?
Equation:
time = distance/rate = 135 mi / 55 mph = 2.4545... hr, which rounds to 2.5 hrs (about 2 hr 27 minutes)
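A quick Python sketch to double-check the arithmetic (variable names are just for illustration):

```python
# Leg 1: drive to the airport at 45 mph for 3 hours
rate_out = 45            # mph
hours_out = 3            # hr
distance = rate_out * hours_out   # 135 miles

# Leg 2: drive home at 55 mph over the same distance
rate_back = 55                    # mph
time_back = distance / rate_back  # 2.4545... hours

print(distance)                   # 135
print(round(time_back, 1))        # 2.5  (nearest tenth of an hour)

# Convert the fractional hours to minutes
minutes = (time_back - int(time_back)) * 60
print(int(time_back), round(minutes))   # 2 27  -> about 2 hr 27 min
```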
----------------
Cheers,
Stan H.
------------------