Question 796771


The height at time *[tex \Large t] of a projectile launched near the surface of the earth is given by:


*[tex \LARGE \ \ \ \ \ \ \ \ \ \ h(t)\ =\ -\frac{1}{2}gt^2\ +\ v_ot\ +\ h_o]


where *[tex \Large g\ =\ 32\ \text{ft/sec}^2] is the acceleration due to gravity near the surface of the earth in foot-pound-second (fps) units, *[tex \Large v_o] is the initial velocity (positive is UP), and *[tex \Large h_o] is the initial height.


For the first part of your problem, *[tex \Large v_o\ =\ 40\text{ ft/sec}] and *[tex \Large h_o\ =\ 1200\text{ ft}].  Solve for *[tex \Large t] when *[tex \Large h(t)\ =\ 0] (because the height of the ground is zero, right?).  Discard the negative root because you don't care what happened before you threw the ball.
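

In case the setup isn't clear: with *[tex \Large g\ =\ 32], the coefficient *[tex \Large -\frac{1}{2}g] is just *[tex \Large -16], so you are solving


*[tex \LARGE \ \ \ \ \ \ \ \ \ \ -16t^2\ +\ 40t\ +\ 1200\ =\ 0]


which the quadratic formula handles directly (this is just the general formula applied to *[tex \Large a\ =\ -\frac{1}{2}g], *[tex \Large b\ =\ v_o], *[tex \Large c\ =\ h_o]):


*[tex \LARGE \ \ \ \ \ \ \ \ \ \ t\ =\ \frac{-v_o\ \pm\ \sqrt{v_o^2\ +\ 2gh_o}}{-g}]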


For the second part, recalculate with *[tex \Large v_o\ =\ 0], then calculate the difference between your first answer and your second answer.
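

If you want to check your arithmetic afterward, here is a minimal Python sketch (the function name landing_time is mine, not part of the problem) that applies the quadratic formula to both parts and prints the difference:

<pre>
import math

def landing_time(v0, h0, g=32.0):
    # Positive root of -(g/2)t^2 + v0*t + h0 = 0
    a, b, c = -g / 2.0, v0, h0
    disc = b * b - 4.0 * a * c            # discriminant
    t_plus = (-b + math.sqrt(disc)) / (2.0 * a)
    t_minus = (-b - math.sqrt(disc)) / (2.0 * a)
    return max(t_plus, t_minus)           # discard the negative root

t_thrown = landing_time(v0=40, h0=1200)   # part 1: thrown upward at 40 ft/sec
t_dropped = landing_time(v0=0, h0=1200)   # part 2: released from rest
print(t_thrown, t_dropped, t_thrown - t_dropped)
</pre>

Run it and you should see the two landing times and their difference, all in seconds.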


John
*[tex \LARGE e^{i\pi}\ +\ 1\ =\ 0]
I am the Beta and the Sigma
My calculator said it, I believe it, that settles it