Question 1128306
To find how long it takes for the ball to reach the point above home plate,
you divide distance by speed,
which is the same as multiplying distance by the reciprocal of speed:
{{{speed=distance/time}}} <--> {{{time=distance/speed}}} <--> {{{time=distance(1/speed)}}} 
 
We just have to deal with unit conversions.
One mile is {{{5280}}} feet,
so {{{1 mile/"5280 feet"}}} and {{{5280 feet/"1 mile"}}}
are our "reduction-of-unit-multipliers" for distance.
One hour is 60 minutes, which is {{{3600}}} seconds,
so {{{1 hour/"3600 seconds"}}} and {{{3600 seconds/"1 hour"}}}
are our "reduction-of-unit-multipliers" for time.
Beyond that, we just need to remember that 60 feet 6 inches = 60.5 feet.
 
The speed of the ball in feet per second is
{{{(100 miles/"1 hour")(1 hour/"3600 seconds")(5280 feet/"1 mile")}}}{{{"="}}}{{{(100*5280 feet)/"3600 seconds"}}}{{{"="}}}{{{5280 feet/"36 seconds"}}}{{{"="}}}{{{"about 146.7 feet / second"}}}
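 
(If you like to double-check arithmetic like this with a short script, here is a minimal Python sketch of the same conversion; the variable names are just illustrative.)
 
FEET_PER_MILE = 5280      # 1 mile = 5280 feet
SECONDS_PER_HOUR = 3600   # 1 hour = 60 minutes = 3600 seconds
speed_mph = 100
# Multiply by feet per mile and divide by seconds per hour,
# just like the unit multipliers above.
speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR
print(speed_fps)          # about 146.7 feet per second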
 
The time for the ball to reach just above home plate is
{{{(60.5 feet)(36 seconds/"5280 feet")=0.4125 seconds}}}
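 
(The same quick check works for the time, keeping everything in feet and seconds; again, the names are just illustrative.)
 
distance_feet = 60.5                      # 60 feet 6 inches
speed_fps = 100 * 5280 / 3600             # about 146.7 feet per second
time_seconds = distance_feet / speed_fps  # distance divided by speed
print(time_seconds)                       # 0.4125 seconds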
 
Maybe that is the expected answer.
It seems like a very short time in which to decide how to move and get the bat into the right position to meet the ball.