Question 878598
Hello,

So for this problem we need to change the units in the question (miles per hour) to miles per minute, so we can solve for minutes. 

So first we need to figure out how many minutes are in an hour.

There are 60 minutes in an hour.

Now we have to use a unit multiplier (shown below):

25 miles/hour x 1 hour/60 minutes = (25 miles x 1 hour) / (1 hour x 60 minutes).

Since 1 hour appears in both the numerator and the denominator, the hours cancel out and you are left with:

25 miles/60 minutes.

Now we divide to find the rate for 1 minute:

25/60 = 0.41667 (rounded to five decimal places)

Therefore,

25 miles per hour is about 0.41667 miles per minute (exactly 25/60 of a mile per minute).
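
If you like to check this kind of conversion with a quick script, here is a minimal Python sketch of this step. The function name is just illustrative, not part of the problem:

```python
# Minimal sketch: convert a speed in miles per hour to miles per minute.
# mph_to_miles_per_minute is an illustrative name, not from the problem.
def mph_to_miles_per_minute(mph):
    minutes_per_hour = 60
    return mph / minutes_per_hour  # same as multiplying by 1 hour / 60 minutes

print(mph_to_miles_per_minute(25))  # about 0.41667 miles per minute
```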

Now we use another unit multiplier to figure out how many minutes it takes to go 40 miles:

40 miles x 1 minute/0.41667 miles = (40 miles x 1 minute) / (0.41667 miles).

The miles on the top and bottom cancel each other out so we are left with:

(40 x 1 minute) / 0.41667, which is 40/0.41667 minutes.

Now we divide:

40/0.41667 = 95.99923, which only misses 96 because 0.41667 was rounded; keeping the exact fraction, 40 ÷ (25/60) = 40 x 60/25 = 96.

So it takes 96 minutes (1 hour and 36 minutes) to go 40 miles at a constant speed of 25 miles per hour.
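
Here is a minimal Python sketch of the whole calculation, again with illustrative names. It keeps the exact per-minute rate instead of the rounded 0.41667, so it lands on 96 (up to floating-point rounding):

```python
# Minimal sketch: minutes needed to cover a distance at a constant speed.
# minutes_to_travel is an illustrative name, not from the problem.
def minutes_to_travel(distance_miles, speed_mph):
    miles_per_minute = speed_mph / 60         # 25 mph -> about 0.41667 miles per minute
    return distance_miles / miles_per_minute  # divide the distance by the per-minute rate

print(minutes_to_travel(40, 25))  # 96.0 (up to floating-point rounding)
```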

I hope this helps!