Question 1190625
<font color=black size=3>
Part A


The change in cost is 300 thousand dollars (since $1.3 million - $1.0 million = $0.3 million = $300 thousand)


The change in time is 3 years (1998 - 1995 = 3)


The slope is the ratio of the two results:
(300 thousand)/(3 years) = 100 thousand per year


The slope is 100,000.
Interpretation: the average cost of a 30-second ad goes up by $100,000 per year.
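The slope calculation above can be sketched in a few lines of Python (the variable names are my own, and cost is kept in dollars):

```python
# Slope of the line through (1995, $1.0 million) and (1998, $1.3 million).
cost_1995 = 1_000_000
cost_1998 = 1_300_000

change_in_cost = cost_1998 - cost_1995   # 300,000 dollars
change_in_time = 1998 - 1995             # 3 years

slope = change_in_cost / change_in_time  # dollars per year
print(slope)  # 100000.0
```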


---------------------------------------

Part B


I'll let you do this part. The instructions are pretty straightforward.
Make an xy grid and plot the two given points, then draw a straight line through them.


I recommend making the y scale increment by units of 100,000.


---------------------------------------
Part C


The time span from 1995 to 2022 is 2022 - 1995 = 27 years.


If the average cost has gone up by $100,000 per year, then it has gone up by 27*(100,000) = 2.7 million dollars over the course of those 27 years.


Add this to the original price of $1,000,000 in 1995:
1,000,000 + 2,700,000 = 3,700,000


Answer: 3.7 million dollars
Is it realistic? I'd assume so, but I'm not entirely familiar with actual advertising costs.
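The extrapolation in Part C can be checked with a quick sketch, assuming the same $100,000-per-year slope from Part A holds all the way to 2022:

```python
# Extrapolate the 1995 price forward to 2022 at $100,000 per year.
slope = 100_000        # dollars per year (from Part A)
base_year = 1995
base_cost = 1_000_000  # dollars (price in 1995)

years_elapsed = 2022 - base_year               # 27 years
predicted_cost = base_cost + slope * years_elapsed
print(predicted_cost)  # 3700000
```

This is just a point-slope model, y = base_cost + slope * (x - base_year), evaluated at x = 2022.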
</font>