Question 406648
Hi! I would like to ask for help with this question: Suppose an outfielder catches the ball on the third base line, about 40 ft behind third base. About how far would the outfielder have to throw the ball to first base? (Round to the nearest tenth.) Thank you very much in advance! :)
Draw a diagram, it'll help you see the solution.
You also have to know that the bases are 90 feet apart, so home plate is 90 feet from both 1st and 3rd base, and the base lines meet at a right angle at home plate.
The outfielder is 40 feet beyond 3rd base along the third base line, so one leg of the right triangle is 90 + 40 = 130 feet and the other leg (home to 1st) is 90 feet. Then, applying the Pythagorean theorem:
Let x = distance to 1st base
then
x^2 = (90+40)^2 + 90^2
x^2 = (130)^2 + 90^2
x^2 = 16900 + 8100
x^2 = 25000
x = sqrt(25000)
x ≈ 158.1 feet
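If you want to double-check the arithmetic, here's a quick sketch in Python using the same two legs (130 ft and 90 ft):

```python
import math

# Legs of the right triangle:
# home plate -> outfielder along the 3rd base line: 90 + 40 = 130 ft
# home plate -> 1st base: 90 ft
leg_a = 90 + 40
leg_b = 90

# hypot computes sqrt(leg_a**2 + leg_b**2), the throwing distance
distance = math.hypot(leg_a, leg_b)
print(round(distance, 1))  # 158.1
```

Rounding to the nearest tenth gives the same 158.1 feet as the hand calculation above.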