Question 867796
Let's say that the distance from New York to Atlanta is 420 miles. That isn't the true distance, but assume it for the sake of the problem.



The speed going from New York to Atlanta is 420 mph. So it will take t = D/r = 420/420 = 1 hour for the plane to get there.



Coming back over the same distance, the plane is now going 630 mph. So it will take t = D/r = 420/630 = 2/3 of an hour to get back.



Overall, the plane has traveled for 1 + 2/3 = 3/3 + 2/3 = 5/3 hours.



Also, the plane has gone 420 + 420 = 840 miles total.



Divide the total distance by the total time to get the overall average speed:



r = D/t
r = 840/(5/3)
r = 840*(3/5)
r = 2520/5
r = 504



So the average speed is <font color="red">504 mph</font>.



Note: this works for <i>any</i> distance you pick from NY to Atlanta, because the distance cancels out of the calculation (the result is the harmonic mean of the two speeds).
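The steps above can be checked with a short script. This is just a sketch of the same total-distance-over-total-time calculation; the function name and the use of exact fractions are my own choices, not part of the original problem.

```python
from fractions import Fraction

def average_speed(distance, speed_out, speed_back):
    """Average speed for a round trip: total distance / total time.
    Uses Fraction so the arithmetic is exact, mirroring 840 / (5/3)."""
    d = Fraction(distance)
    time_out = d / speed_out      # e.g. 420/420 = 1 hour
    time_back = d / speed_back    # e.g. 420/630 = 2/3 hour
    total_distance = 2 * d        # e.g. 420 + 420 = 840 miles
    total_time = time_out + time_back  # e.g. 5/3 hours
    return total_distance / total_time

print(average_speed(420, 420, 630))   # → 504
print(average_speed(1000, 420, 630))  # → 504 (distance doesn't matter)
```

Trying a second distance, as in the last line, confirms the note: the answer is 504 mph regardless of the mileage assumed.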