Question 1010593: In a recent study, it was found that in one town the number of hours that a typical 10-year-old child watches television per week is normally distributed with a mean of 12 hours and a standard deviation of 1.5 hours. If Gary is a typical 10-year-old child in this town, what is the probability that he watches between 9 and 14 hours of television per week?
Answer by stanbon(75887):
In a recent study, it was found that in one town the number of hours that a typical 10-year-old child watches television per week is normally distributed with a mean of 12 hours and a standard deviation of 1.5 hours.
-----------------------------
If Gary is a typical 10-year-old child in this town, what is the probability that he watches between 9 and 14 hours of television per week?
--------
z(9) = (9-12)/1.5 = -2
z(14) = (14-12)/1.5 = 4/3 ≈ 1.3333
-------
P(9 < x < 14) = P(-2 < z < 4/3) = normalcdf(-2, 4/3) ≈ 0.8860, where normalcdf is the TI calculator's standard normal cumulative probability function.
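-------
For readers without a graphing calculator, here is a minimal Python sketch of the same computation (assuming SciPy is available); it standardizes the two endpoints and subtracts the cumulative probabilities:

from scipy.stats import norm

mu, sigma = 12, 1.5            # mean and standard deviation of weekly TV hours

z_low  = (9  - mu) / sigma     # (9 - 12)/1.5 = -2
z_high = (14 - mu) / sigma     # (14 - 12)/1.5 = 4/3

# P(9 < X < 14) = Phi(z_high) - Phi(z_low)
p = norm.cdf(z_high) - norm.cdf(z_low)
print(round(p, 4))             # 0.886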
-------
Cheers,
Stan H.
----------