Question 1058945
At first a runner jogs at 5 mph and then jogs at 8 mph, traveling 7 miles in 1.1 hours. How long does the runner jog at each speed?
(Hint: Let t represent the amount of time the runner jogs at 5 mph. Then 1.1 - t represents the amount of time the runner jogs at 8 mph.) 
Set up an equation: (distance at 5 mph) + (distance at 8 mph) = 7 miles.
~~~~~~~~~~~~~~~~~~~~~


They just explained to you how to start your solution, so I will start as they recommend:


<pre>
"Let t represent the amount of time the runner jogs at 5 mph."
"Then 1.1 - t represents the amount of time the runner jogs at 8 mph."

During time "t" hours, the runner jogs a distance of 5*t miles.
During time 1.1-t hours, the runner jogs a distance of 8*(1.1-t) miles.

In all, the runner will jog 5t + 8*(1.1-t) miles.
Since this total is 7 miles, you have this equation:

5t + 8*(1.1-t) = 7.

Simplify and solve for "t":

5t + 8.8 - 8t = 7,

-3t = 7 - 8.8,

-3t = -1.8,

t = {{{(-1.8)/(-3)}}} = 0.6.

Hence, the runner jogs 0.6 hour = 36 minutes at 5 mph.

The rest of the time, 1.1 hours minus 0.6 hour = 0.5 hour = 30 minutes, the runner jogs at 8 mph.
</pre>

Solved.
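
If you want to double-check the arithmetic, here is a minimal Python sketch (my addition, not part of the original working) that solves the same equation 5t + 8*(1.1 - t) = 7 and confirms both times and the total distance:

<pre>
# Check of the solution above: total time 1.1 hours, total distance 7 miles,
# speeds 5 mph and 8 mph.
total_time = 1.1      # total jogging time in hours
total_dist = 7.0      # total distance in miles
v1, v2 = 5.0, 8.0     # the two speeds in mph

# 5t + 8(1.1 - t) = 7  =>  t = (7 - 8*1.1) / (5 - 8)
t = (total_dist - v2 * total_time) / (v1 - v2)

print(round(t, 10))                                 # 0.6 hours at 5 mph
print(round(total_time - t, 10))                    # 0.5 hours at 8 mph
print(round(v1 * t + v2 * (total_time - t), 10))    # 7.0 miles, as required
</pre>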


And <U>forget about using tables</U> (as "josgarithmetic" recommends) if you want to learn how to solve such problems.