Question 1010759



Since the standard deviation comes from the sample (the population standard deviation is unknown), you would conduct a t-test.


The degrees of freedom are n - 1 = 30 - 1 = 29.


The critical t-score at the .01 significance level in each tail, with 29 degrees of freedom, is plus or minus 2.46202, truncated to 5 decimal places.
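

As a quick check on the table value, here is a minimal Python sketch (assuming SciPy is available) that recovers this critical t-score:

from scipy.stats import t

n = 30
df = n - 1                    # degrees of freedom = 29
alpha_per_tail = 0.01         # .01 in each tail, per the problem
t_crit = t.ppf(1 - alpha_per_tail, df)
print(t_crit)                 # about 2.46202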


The sample standard deviation is .26 ounces.


The standard error of the distribution of sample means is .26 / sqrt(30) = .04746, truncated to 5 decimal places.
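

The standard error can be reproduced the same way; a minimal sketch using only the math module:

import math

s = 0.26                      # sample standard deviation, ounces
n = 30
se = s / math.sqrt(n)
print(se)                     # about 0.04746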


You can calculate the critical raw scores from the critical t-scores found above.


The formula is:


t = (x-m)/s


t is the t-score.
x is the mean of the sample.
m is the desired measurement of 12 ounces.
s is the standard error.


From this formula, solve for x to get:


x = t*s+m


When t = 2.46202, this becomes x = 2.46202 * .04746 + 12 = 12.11684, truncated to 5 decimal places.


When t = -2.46202, this becomes x = -2.46202 * .04746 + 12 = 11.88315, truncated to 5 decimal places.
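

Both critical raw scores follow from the values above; a minimal Python sketch (the numbers are the truncated values from this answer):

m = 12.0                      # desired measurement, ounces
se = 0.04746                  # standard error from above
t_crit = 2.46202              # critical t-score from above
upper = t_crit * se + m       # about 12.11684
lower = -t_crit * se + m      # about 11.88315
print(lower, upper)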


The mean of the sample is 11.92, which is within the critical limits of 11.88 to 12.12 (rounded to 2 decimal places), so the null hypothesis is not rejected.


No calibration is required.
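

Putting the steps together, here is one end-to-end sketch of the test (again assuming SciPy; the decision rule simply checks whether the sample mean falls between the critical raw scores):

import math
from scipy.stats import t

m = 12.0                      # desired measurement, ounces
x_bar = 11.92                 # sample mean
s = 0.26                      # sample standard deviation
n = 30

se = s / math.sqrt(n)
t_crit = t.ppf(1 - 0.01, n - 1)          # .01 in each tail
lower, upper = m - t_crit * se, m + t_crit * se

if lower <= x_bar <= upper:
    print("within the critical limits; no calibration required")
else:
    print("outside the critical limits; calibration required")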