Question 230594: Two airplanes begin 1000 miles apart and fly along lines that intersect at a right angle. One plane flies an average of 100 miles per hour faster than the other. If the two planes meet after two hours, how fast do the airplanes fly?
I have set up the problem, and I just don't know how the times are to be used.
Answer by Alan3354(69443):
-----------------
The speeds can be set up as vectors, which form a right triangle similar to the triangle of flight paths.
Together the planes cover 1000 miles in 2 hours, so their combined closing speed is 500 mph. That 500 is the hypotenuse of the speed triangle.
The 2 legs differ by 100 mph.
a^2 + b^2 = c^2
a^2 + (a+100)^2 = 500^2
Expanding: 2a^2 + 200a + 10000 = 250000, so a^2 + 100a - 120000 = 0, which factors as (a - 300)(a + 400) = 0. Taking the positive root gives a = 300. This is the 3, 4, 5 triangle scaled by 100, so the speeds are 300 mph and 400 mph.
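The quadratic above can be checked numerically; here is a minimal sketch that solves a^2 + (a + 100)^2 = 500^2 with the quadratic formula (the variable names are my own, not from the original post):

```python
import math

# Expanding a^2 + (a + 100)^2 = 500^2 gives a^2 + 100a - 120000 = 0.
# Quadratic formula with b = 100, c = -120000; keep the positive root.
discriminant = 100**2 + 4 * 120000          # 490000, a perfect square
slower = (-100 + math.sqrt(discriminant)) / 2
faster = slower + 100

print(slower, faster)                        # 300.0 400.0
# Verify the Pythagorean relation with the 500 mph hypotenuse:
print(slower**2 + faster**2 == 500**2)       # True
```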