SOLUTION: A driver sets out on a 20-mile trip. When he has gone halfway he finds he has averaged 25 mph. At what speed must he travel the rest of the way to make his overall average speed for the trip 40 mph?

Question 121178: A driver sets out on a 20-mile trip. When he has gone halfway he finds he has averaged 25 mph. At what speed must he travel the rest of the way to make his overall average speed for the trip 40 mph?
Found 2 solutions by checkley71, bucky:
Answer by checkley71:
10 = 25*t1 for the first half of the trip
10 = x*t2 for the second half of the trip
The total time must be 20/40 = 0.5 hours.
10/25 + 10/x = 0.5
0.4 + 10/x = 0.5
10/x = 0.5 - 0.4
10/x = 0.1
0.1x = 10
x = 10/0.1
x = 100 mph for the second half of the trip.
Proof:
10/25 + 10/100 = 0.5
0.4 + 0.1 = 0.5
0.5 = 0.5
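
As a quick numerical check of the algebra above (an addition here, not part of the original answer; the variable names are made up), a few lines of Python reproduce the same steps:

total_time = 20 / 40                              # 0.5 hours needed to average 40 mph over 20 miles
first_half_time = 10 / 25                         # 0.4 hours already used on the first 10 miles
remaining_time = total_time - first_half_time     # 0.1 hours left
x = 10 / remaining_time                           # speed needed for the last 10 miles
print(x)                                          # prints 100.0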

Answer by bucky:
For this problem you use the formula that relates distance (D), rate of travel (R), and time (T).
The formula is:
.
D = R*T
.
[Think of it this way: the distance I travel equals my rate (or speed) times the amount of time
I travel at that speed. For example, the distance I cover at 50 miles per hour for 2 hours is
100 miles (50*2).] You must use consistent units. If your speed is in miles per hour, your time
must be in hours and the distance covered is in miles.
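.
If it helps, here is a tiny Python sketch of the same relationship (an illustration added here, not part of the original answer; the function name is made up):

def distance(rate_mph, time_hours):
    # distance in miles = rate (mph) * time (hours); units must be consistent
    return rate_mph * time_hours

print(distance(50, 2))   # 100 miles, matching the example above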
.
Now to the problem. The problem tells you that for the first half of the trip ... which is 10
miles ... the average rate is 25 mph. You know the distance (10) and the rate (25). The
question is how much time did this half of the trip take. Plug the information into the
distance equation and you have:
.
10 = 25*T
.
Solve for T by dividing both sides by 25 and you get:
.
T = 10/25 = 0.4 hours
.
So the driver has spent four-tenths of an hour (24 minutes) going the first 10 miles.
.
To average 40 mph for the whole 20 miles, how much time can the trip take? Plug
these numbers into the distance formula and you have:
.
20 = 40*T
.
Solve for T by dividing both sides of this equation by 40 and you get:
.
T = 20/40 = 0.5 hours
.
So to average 40 mph the driver must cover the 20 miles in 0.5 hours (a half hour). But
he has already used 0.4 hours to go the first 10 miles. That means that he will have to drive
the next 10 miles in the 0.1 hours remaining ... 0.5 minus 0.4 = 0.1 hours. Plug these two
values into the distance equation ... 10 miles and 0.1 hours and you have:
.
10 = R*(0.1)
.
Solve for the rate by dividing both sides of the equation by 0.1 and you get:
.
R = 10/0.1 = 100 miles per hour
.
So after going at 25 mph for the first half of a 20 mile trip, the driver must cover the
next half of the trip at 100 mph if the average speed for the trip is to be 40 mph.
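.
To tie the three steps together, here is a short Python sketch that follows the same procedure (an addition here, not part of the original answer; variable names are made up):

time_first_half = 10 / 25                         # T = D/R = 0.4 hours for 10 miles at 25 mph
time_needed_total = 20 / 40                       # T = D/R = 0.5 hours to average 40 mph over 20 miles
time_left = time_needed_total - time_first_half   # 0.1 hours remain for the last 10 miles
rate_second_half = 10 / time_left                 # R = D/T = 100 mph
print(rate_second_half)                           # prints 100.0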
.
Hope this helps you to understand the problem.
.
