Question 601435


The first part of the trip is 20 miles and the last part is 70 miles, so the whole trip is 90 miles.  Time equals distance divided by rate, so if you want to average 40 mph over 90 miles, the whole trip must take *[tex \LARGE \frac{90}{40}\ =\ 2.25] hours.  You have already used *[tex \LARGE \frac{20}{50}] hours on the first part of the trip.  Calculate that amount of time and subtract it from 2.25 hours to find the time remaining.  Then divide the remaining time into 70 to see how fast you need to go to cover 70 miles in that amount of time.  Since all of the given values are whole numbers, round your answer to the nearest whole mile per hour, but don't do any rounding until AFTER the final calculation.
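
Carrying the arithmetic through as a check (the fraction above presumes the first 20 miles were driven at 50 mph): the time remaining is *[tex \LARGE 2.25\ -\ \frac{20}{50}\ =\ 2.25\ -\ 0.4\ =\ 1.85] hours, so the required speed for the last part is *[tex \LARGE \frac{70}{1.85}\ \approx\ 37.84], which rounds to 38 mph.  Note that it comes out slower than 40 mph because the first leg was driven faster than the target average.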


John
*[tex \LARGE e^{i\pi} + 1 = 0]
My calculator said it, I believe it, that settles it