Question 366836


Remember, distance equals rate times time, so time equals distance divided by rate.  In order to average 45 miles per hour over 100 miles, you must complete the trip in *[tex \Large \frac{100}{45}\ =\ \frac{20}{9}] hours.


If you averaged 36 miles per hour for the first 50 miles, then the first half of the trip must have taken *[tex \Large \frac{50}{36}\ =\ \frac{25}{18}] hours.


That means you must complete the last half of the trip in *[tex \Large \frac{20}{9}\ -\ \frac{25}{18}\ =\ \frac{40}{18}\ -\ \frac{25}{18}\ =\ \frac{15}{18}\ =\ \frac{5}{6}] hours.


Simply divide 50 miles by *[tex \Large \frac{5}{6}] hours to get the required speed.
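

Carrying out that division (dividing by a fraction is the same as multiplying by its reciprocal):


*[tex \Large \frac{50}{\frac{5}{6}}\ =\ 50\ \cdot\ \frac{6}{5}\ =\ 60]


so the last 50 miles must be driven at 60 miles per hour.  As a check, *[tex \Large \frac{25}{18}\ +\ \frac{5}{6}\ =\ \frac{25}{18}\ +\ \frac{15}{18}\ =\ \frac{40}{18}\ =\ \frac{20}{9}] hours, which matches the required total time.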


John
*[tex \LARGE e^{i\pi} + 1 = 0]
My calculator said it, I believe it, that settles it