Question 329593: Let’s say that you are driving on a straight route to a set destination, and you can drive at any speed you like. You stop for a few minutes but when you arrive at the halfway point, you discover that you have averaged only 20 miles per hour. So you decide to forego any more stops and drive fast enough to average 40 miles per hour for the entire trip. If you keep a steady speed, how fast should you drive?
Answer by Theo(13342):
The general formula to use is:

D = R * T

Where:

D = Distance in miles.
R = Rate in miles per hour.
T = Time in hours.

From this formula, you can derive:

T = D / R
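
As a quick illustration, here is a minimal Python sketch of these two formulas (the helper name travel_time is my own, used only for this example):

def travel_time(distance_miles, rate_mph):
    # T = D / R: time in hours to cover distance_miles at rate_mph
    return distance_miles / rate_mph

print(travel_time(100, 20))  # 100 miles at 20 mph -> 5.0 hours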

We will let:

T1 = the time required for the first half of the trip.
T2 = the time required for the last half of the trip.
T = the time required for the whole trip.
20 = the rate of speed for the first half of the trip.
x = the rate of speed for the last half of the trip.
40 = the rate of speed for the whole trip.
D = the distance for the whole trip.
D/2 = the distance for each half of the trip.

For the whole trip, we get T = D/40

For the first half of the trip, we get T1 = (D/2)/20 = D/40
For the second half of the trip, we get T2 = (D/2)/x = D/(2*x)
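
If you want to double-check those simplifications symbolically, here is a small sympy sketch (my own choice of tool, not part of the original working):

import sympy as sp

D, x = sp.symbols('D x', positive=True)
T1 = (D / 2) / 20   # first half of the distance at 20 mph
T2 = (D / 2) / x    # second half of the distance at x mph
print(T1)           # D/40
print(T2)           # D/(2*x)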

Since the time required for the whole trip is equal to the time required for the first half of the trip plus the time required for the second half of the trip, we get:

T = T1 + T2

Since T = D/40 and T1 = D/40 and T2 = D/(2*x), then we can substitute for T and T1 and T2 to get:

D/40 = D/40 + D/(2*x)

In order for this equation to be true, D/(2*x) must be equal to 0.

Since there is no value of x that will make D/(2*x) = 0, we have to conclude that it would be impossible to average 40 miles per hour over the whole distance if we traveled the first half of the distance at 20 miles per hour.
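
Another way to see this is numerically: no matter how large x gets, the overall average only creeps toward 40 miles per hour without ever reaching it. A short Python sketch (the distance and trial speeds are my own choices):

def overall_average(x, first_half_mph=20.0, D=1.0):
    # overall average = total distance / total time; the value of D cancels out
    t1 = (D / 2) / first_half_mph
    t2 = (D / 2) / x
    return D / (t1 + t2)

for x in (40, 100, 1000, 1000000):
    print(x, round(overall_average(x), 3))
# 40 -> 26.667, 100 -> 33.333, 1000 -> 39.216, 1000000 -> 39.999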

As an example:

Assume the total distance was 1000 miles.

Half the distance is therefore equal to 500 miles.

500 miles at 20 miles per hour would take 25 hours.

1000 miles at 40 miles per hour would take 25 hours.

There is no time left for the last half of the distance, so it is impossible to make an overall average of 40 miles per hour if we traveled the first half of the distance at 20 miles per hour.
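
The same arithmetic, sketched in a few lines of Python:

D = 1000.0
first_half_time = (D / 2) / 20   # 500 miles at 20 mph -> 25.0 hours
whole_trip_time = D / 40         # 1000 miles at 40 mph -> 25.0 hours
print(whole_trip_time - first_half_time)   # 0.0 hours left for the second half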

The formula will give you a speed if the desired overall average is something less than 40 miles per hour.

Assume the desired overall average speed is 35 miles per hour.

The formula then would become:

T = T1 + T2

Since T is now equal to D/35, and T1 is still equal to D/40 and T2 is still equal to D/(2*x), then we can substitute for T and T1 and T2 to get:

D/35 = D/40 + D/(2*x)

Multiply both sides of this equation by 35 * 40 * (2*x) to get:

(40 * 2 * x * D) = (35 * 2 * x * D) + (40 * 35 * D)

Subtract (35 * 2 * x * D) from both sides of the equation to get:

(40 * 2 * x * D) - (35 * 2 * x * D) = (40 * 35 * D)

Simplify to get:

(40 - 35) * (2 * x * D) = (40 * 35 * D)

Simplify further to get:

5 * (2 * x * D) = (40 * 35 * D)

Simplify further to get:

10 * x * D = 1400 * D

Divide both sides of this equation by 10 * D to get:

x = 140
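
You can let a computer algebra system carry out the same manipulation; here is a sympy sketch (my own tooling choice, not part of the original answer):

import sympy as sp

D, x = sp.symbols('D x', positive=True)
# D/35 = D/40 + D/(2*x), solved for x
print(sp.solve(sp.Eq(D / 35, D / 40 + D / (2 * x)), x))   # [140]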

In order to average 35 miles per hour over the whole distance, you would have to travel the last half of the distance at 140 miles per hour, given that you traveled the first half of the distance at 20 miles per hour.

The time required for the overall distance would be 1000 / 35 = 28.571428571 hours.

The time required for the first half of the distance would be 500 / 20 = 25 hours.

The time required for the last half of the distance would be 500 / 140 = 3.571428571 hours.

The total time required for the first half of the distance and the last half of the distance would be 25 + 3.571428571 = 28.571428571 hours.

Since the total time for the first and second halves is the same as the total time required to travel the whole distance at 35 miles per hour, the formula checks out, and the answer of 140 miles per hour for the second half of the trip is confirmed.
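
Carrying out the same algebra with the target average left as a variable gives a general formula for the second-half speed, x = (target * first) / (2 * first - target). This is my own generalization of the working above; it only gives a positive speed when the target average is less than twice the first-half speed. A short sketch:

def second_half_speed(target_avg_mph, first_half_mph):
    # Derived by solving D/target = D/(2*first) + D/(2*x) for x.
    denom = 2 * first_half_mph - target_avg_mph
    if denom <= 0:
        return None   # the first half already used up all the available time
    return target_avg_mph * first_half_mph / denom

print(second_half_speed(35, 20))   # 140.0, matching the worked example above
print(second_half_speed(40, 20))   # None: a 40 mph overall average is unreachable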

The answer to your question is:

If you traveled the first half of the distance at 20 miles per hour, you could not possibly travel the second half of the distance fast enough to make the overall average speed for the whole distance equal to 40 miles per hour.