Question 150998
In this problem, you're given the initial height, x = 500 ft. You're trying to find the time it takes the object to fall to the ground, where the final height is h = 0.

Substitute these values into the height equation from the problem, with h = 0:

0 = -15y^2 + 500
15y^2 = 500
y^2 = 33.333
y = ±sqrt(33.333) ≈ ±5.77

Time must be positive, so we take the positive root, and the final answer is y ≈ 5.77 seconds.
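
As a quick numerical check, here is a minimal Python sketch, assuming the problem's height equation is h = -15y^2 + x (the coefficient 15 and the variable names are taken from the work above):

import math

x = 500                  # initial height in feet (given)
# Assumed height equation: h = -15*y**2 + x
# Set h = 0 and solve for y; take the positive root since time > 0
y = math.sqrt(x / 15)
print(round(y, 2))       # prints 5.77 (seconds)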