Question 1006752
A ball is thrown upward with an initial velocity of 28 meters per second from a cliff that is ___ meters high. The height of the ball is given by the quadratic equation ___, where h is in meters and t is the time in seconds since the ball was thrown. Find the time it takes the ball to hit the ground. Round your answer to the nearest tenth of a second.
:
Using a gravitational acceleration term of 4.9 m/sec^2 (half of g = 9.8 m/sec^2), and assuming the cliff is 210 meters high
:
Gravity acts downward (negative); the initial velocity is upward (positive)
h = -4.9t^2 + 28t + 210
when the ball hits the ground, h = 0
-4.9t^2 + 28t + 210 = 0
Solve this equation using the quadratic formula with a = -4.9, b = 28, c = 210:
t = (-28 ± sqrt(28^2 - 4(-4.9)(210))) / (2(-4.9))
t = (-28 ± sqrt(784 + 4116)) / (-9.8)
t = (-28 ± 70) / (-9.8)
The positive solution is t = (-28 - 70)/(-9.8) = 10, so the ball hits the ground after t = 10.0 seconds (the root is exact, so rounding to the nearest tenth gives 10.0).
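As a quick check, the positive root can be computed numerically. This is a minimal sketch assuming the 210-meter cliff height and the height model h = -4.9t^2 + 28t + 210 used above:

```python
import math

# Height model assumed above: h(t) = -4.9*t^2 + 28*t + 210
a, b, c = -4.9, 28.0, 210.0

# Quadratic formula; keep only the nonnegative root, since time t >= 0
disc = b * b - 4 * a * c
roots = [(-b + math.sqrt(disc)) / (2 * a),
         (-b - math.sqrt(disc)) / (2 * a)]
t_ground = max(r for r in roots if r >= 0)

print(round(t_ground, 1))  # 10.0
```

The discriminant works out to 4900 = 70^2, which is why the answer comes out as an exact 10 seconds rather than a rounded decimal.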