SOLUTION: Suppose you have a normal distribution of test scores with a mean of 50 and a standard deviation of 5. What percentage falls between 40 and 60?
Question 896238: Suppose you have a normal distribution of test scores with a mean of 50 and a standard deviation of 5. What percentage falls between 40 and 60?
Answer by stanbon(75887):
40 is 2 standard deviations below the mean: (40 - 50)/5 = -2.
60 is 2 standard deviations above the mean: (60 - 50)/5 = +2.
-----
Chebyshev's inequality (which holds for any distribution) says the answer is at least 1 - 1/2^2 = 1 - 1/4 = 75%.
-----
Since the scores are normally distributed, the standard normal distribution gives the exact answer:
P(-2 < z < 2) = normalcdf(-2,2) = 0.9545 = 95.45%
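If you don't have a calculator's normalcdf handy, the 95.45% figure can be checked with a short sketch using Python's standard-library error function (the normal CDF can be written as Φ(x) = ½(1 + erf((x - μ)/(σ√2)))):

```python
import math

def normal_cdf(x, mean=0.0, sd=1.0):
    # CDF of a normal distribution, expressed via the error function
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

# P(40 < X < 60) for test scores X ~ Normal(mean=50, sd=5)
p = normal_cdf(60, mean=50, sd=5) - normal_cdf(40, mean=50, sd=5)
print(round(p, 4))  # 0.9545, i.e. about 95.45%
```

This matches the empirical (68-95-99.7) rule, which says roughly 95% of a normal distribution lies within 2 standard deviations of the mean.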
=======================
Cheers,
Stan H.
===============