Question 551399
Starting with:
{{{d = rt}}}, where d = distance, r = rate/speed, and t = time of travel. Solving for t gives {{{t = d/r}}}.
For this problem, we want to find the difference in travel times between r = 20 mph and r = 30 mph over a distance of 0.3 miles.
At 30 mph we have:
{{{t[1] = 0.3/30}}}
{{{t[1] = 0.01}}} hours.
At 20 mph we have:
{{{t[2] = 0.3/20}}}
{{{t[2] = 0.015}}} hours. Subtract to find the difference: {{{t[2]-t[1]}}}
{{{0.015-0.01 = 0.005}}} hours. To convert to seconds, multiply by 3600 seconds/hour:
{{{0.005(3600) = 18}}} seconds.
It takes 18 seconds longer at 20 mph than it would at 30 mph.
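
As a quick check, here is a minimal Python sketch of the same calculation (the variable names are my own, not part of the original problem):

distance_miles = 0.3
t_fast = distance_miles / 30   # hours at 30 mph, from t = d/r
t_slow = distance_miles / 20   # hours at 20 mph

# Difference in hours, converted to seconds (3600 seconds per hour)
extra_seconds = (t_slow - t_fast) * 3600
print(round(extra_seconds))    # 18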