Question 150998: An object is dropped from a height of x feet. Its height h (in feet) at any time y (in seconds) after it is dropped is given by the following equation:
x - 15y^2 = h
How long does it take an object dropped from 500 feet to hit the ground?
Answer by tweevy(2):
In this problem, you're given the initial height, x = 500 ft. You're trying to find the amount of time it takes the object to reach the ground, which is at a final height of h = 0.
Plug these into the equation:
500 - 15y^2 = 0
15y^2 = 500
y^2 = 33.333
y = ±sqrt(33.333) ≈ ±5.77
Time must be positive, so the final answer is y ≈ 5.77 seconds.
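If you want to double-check the arithmetic, here is a minimal Python sketch (not part of the original problem) that computes the fall time under the assumed model h = x - 15y^2; the function name fall_time is just an illustrative choice.

import math

def fall_time(initial_height_ft, coefficient=15.0):
    # Solve x - coefficient*y^2 = 0 for y, assuming the model h = x - 15y^2.
    return math.sqrt(initial_height_ft / coefficient)

print(round(fall_time(500), 2))  # prints 5.77

Running this prints 5.77, which matches the answer worked out above.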