Question 137841: Does anyone understand this question?

The standard deviation of a distribution is 20. If a sample of 225 is selected, what is the standard error of the mean?

Answer by stanbon(75887):
The standard deviation of a distribution is 20. If a sample of 225 is selected, what is the standard error of the mean?
--------------------
The standard error of the mean is the population standard deviation divided by the square root of the sample size: SE = sigma/sqrt(n).
SE = 20/sqrt(225) = 20/15 = 4/3 ≈ 1.3333
===============
Cheers,
Stan H.
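
If you want to double-check the arithmetic, here is a minimal Python sketch (not part of the original answer) that just applies the formula SE = sigma/sqrt(n) with the numbers given in the question:

import math

sigma = 20                      # population standard deviation (given)
n = 225                         # sample size (given)

se = sigma / math.sqrt(n)       # standard error of the mean
print(se)                       # 1.3333333333333333, i.e. 4/3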
