Question 153613: This question is from the textbook Algebra 1
Hi,
My name is Jay, and I am struggling with this question. The question is:
If a pitcher throws a baseball 90 miles per hour, how many seconds does it take for the ball to reach home plate?
Here's what I've tried:
Time equals distance divided by speed.
I divided 60.5 feet by 90 mph and got 0.6722 seconds.
Then I thought, can I really solve the equation this way? Can I use a distance in feet as the numerator and a speed in miles per hour as the divisor? I'm not sure how to solve this.
Thank you so much for your time. Jay
Found 3 solutions by stanbon, scott8148, Earlsdon:
Answer by stanbon(75887):
If a pitcher throws a baseball 90 miles per hour, how many seconds does it take for the ball to reach home plate?
-------------------------
Convert miles per hour to feet per second:
[90 miles / 1 hour]
= [90 * 5280 ft] / [3600 sec]
= 132 ft/sec
----------------------------
To cover the 60.5 ft to home plate:
(60.5 ft) / (132 ft/sec) = 0.4583... seconds
==================================
Cheers,
Stan H.
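For anyone who wants to check Stan's arithmetic in code, here is a minimal Python sketch (not part of the original answer; the variable names are just illustrative):

FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_mph = 90       # pitch speed
distance_ft = 60.5   # pitcher's mound to home plate

# 90 mi/hr * 5280 ft/mi / 3600 sec/hr = 132 ft/sec
speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR

# time = distance / speed
time_sec = distance_ft / speed_fps

print(speed_fps)           # 132.0 ft/sec
print(round(time_sec, 4))  # 0.4583 seconds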
Answer by scott8148(6628):
You are right, Jay: you need to keep the units consistent.
Converting miles per hour to feet per second:
There are 60 seconds per minute and 60 minutes per hour, so 60*60 = 3600 seconds per hour.
There are 5280 feet in a mile.
So feet per second equals miles per hour times 5280/3600.
90 mph equals 132 fps, so the time to the plate is 60.5/132, which is about 0.458 seconds.
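To see concretely why the units have to agree before you divide, here is a short Python sketch (not from the original answers), comparing the mixed-unit division from the question with the consistent one:

distance_ft = 60.5   # feet to home plate
speed_mph = 90       # miles per hour

# Mixed units: feet divided by miles-per-hour yields 0.6722...,
# but the result is in ft/(mi/hr), not seconds.
mixed = distance_ft / speed_mph
print(mixed)

# Consistent units: convert mph to ft/sec first (factor 5280/3600).
speed_fps = speed_mph * 5280 / 3600   # 132.0 ft/sec
time_sec = distance_ft / speed_fps
print(round(time_sec, 4))             # 0.4583 seconds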
Answer by Earlsdon(6294):