Question 100258:  An airplane flies 1000 miles due east in 2 hours and 1000 miles due south in 3 hours.  What is the average speed of the airplane?  
Answer by bucky(2189):
The equation that you need to use is: 
. 
D = R*T 
. 
in which D is the total distance flown, R is the rate (average speed), and T is the total time 
required for the trip. 
. 
The plane flies 1000 miles on the first leg and another 1000 miles on the second leg of the flight. 
Therefore, the total distance is 1000 + 1000 = 2000 miles. 
. 
The first leg of the flight takes 2 hours and the second leg of the flight takes 3 hours. 
Therefore, the total time of flight is 2 + 3 = 5 hours. 
. 
Substitute the total distance (2000 miles) and the total time (5 hours) into the equation 
and you get: 
. 
2000 = R*5 
. 
or in more standard form: 
. 
5*R = 2000 
. 
Solve for R by dividing both sides of this equation by 5, the multiplier of R, to get: 
. 
5*R/5 = 2000/5 
. 
and after doing the division the equation simplifies to: 
. 
R = 2000/5 = 400 miles/hour 
. 
So the answer to this problem is that the average speed of the plane for this 5-hour flight 
is 400 miles per hour. 
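. 
If you want to double-check the arithmetic, here is a minimal Python sketch of the same D = R*T calculation. The variable names (leg_distances, leg_times, average_speed) are just illustrative choices, not part of the original problem. 
. 
```python
# Leg data from the problem: 1000 miles in 2 hours, then 1000 miles in 3 hours
leg_distances = [1000, 1000]   # miles
leg_times = [2, 3]             # hours

total_distance = sum(leg_distances)   # D = 2000 miles
total_time = sum(leg_times)           # T = 5 hours

# D = R*T rearranged to R = D / T
average_speed = total_distance / total_time
print(average_speed)   # prints 400.0 (miles per hour)
```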
. 
Hope this helps you to understand the problem a little better. 
. 