Question 828174
At liftoff the rocket is at ground level, 0 miles above earth's surface.
After 30 seconds, it is 1 mile above earth's surface.
 
a) The average upward speed of the rocket during the first 30 seconds (0.5 minutes) is
{{{"1 mile"/"0.5 minutes"=2}}} miles per minute.
The average upward speed of the rocket during the next 2 minutes (between 30 seconds and 2.5 minutes after liftoff) is
{{{(5-1)/(2.5-0.5)=4/2=2}}} miles per minute.
It may not be scientifically sound to assume that the speed was 2 miles per minute at all times, but my guess is that your teacher intended exactly that.
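If you want to check the arithmetic, here is a minimal sketch in Python of the average-speed calculation (change in height divided by change in time); the function name `average_speed` is just for illustration:

```python
def average_speed(h_start, h_end, t_start, t_end):
    """Average upward speed in miles per minute:
    (change in height) / (change in time)."""
    return (h_end - h_start) / (t_end - t_start)

# First 30 seconds (0.5 minutes): from 0 miles up to 1 mile.
first = average_speed(0, 1, 0, 0.5)

# Next 2 minutes: from 1 mile at t = 0.5 to 5 miles at t = 2.5.
second = average_speed(1, 5, 0.5, 2.5)

print(first, second)  # both come out to 2.0 miles per minute
```

Both intervals give the same 2 miles per minute, which is why a constant-speed model is a reasonable guess here.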
So the height {{{h}}} of the rocket (in miles), as a function of the time {{{t}}} (in minutes) since liftoff, is
{{{h(t)=2t}}}
If you called the height in miles {{{y}}} and the time in minutes {{{x}}} ,
then the equation would be {{{y=2x}}} .
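A quick sketch of that linear model, checked against the two given data points (assuming the constant-speed model above):

```python
def h(t):
    """Height in miles, t minutes after liftoff,
    assuming a constant speed of 2 miles per minute."""
    return 2 * t

print(h(0))    # 0.0 miles at liftoff
print(h(0.5))  # 1.0 mile after 30 seconds
print(h(2.5))  # 5.0 miles after 2.5 minutes
```

The model reproduces the heights at 0, 0.5, and 2.5 minutes exactly.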
Either way, the function is a linear function.
Its graph is a straight line through the origin.
I would graph it only for {{{t>=0}}} because we do not know where that rocket was before liftoff (for {{{t<0}}} ).
{{{drawing(300,300,-0.5,4.5,-2,8,
grid(1),
line(0,0,4,8) )}}}
 
b) The slope in {{{y=2x+0}}} is {{{2}}} miles per minute and represents the speed of the rocket.
The y-intercept is 0 miles, because the graph crosses the y-axis at {{{y=0}}} .
The y-intercept represents the height of the rocket at liftoff: 0 miles above earth.