SOLUTION: A sample of data has a standard deviation of 68. If you were to multiply all of the scores in the data set by a factor of 38, what would the new standard deviation be?

Question 734335: A sample of data has a standard deviation of 68. If you were to multiply all of the scores in the data set by a factor of 38, what would the new standard deviation be?
Please be kind enough to explain how you got there so I am able to figure this out by myself too. The only calculator I have is a TI-30X.

Answer by Positive_EV(69):
A calculator isn't going to help you here; this is a pure theory problem.

A rule that holds for every distribution (with finite variance) is that Var[aX] = a^2 * Var[X]. Thus, when you multiply every score in the data set by 38, you multiply its variance by 38^2.
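For reference, here is a one-line sketch of where that rule comes from, writing mu for the mean E[X] (the original answer just states the rule):

    Var[aX] = E[(aX - a*mu)^2] = a^2 * E[(X - mu)^2] = a^2 * Var[X]

Multiplying every score by a shifts the mean to a*mu as well, which is why the a factors out of the squared deviations.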

The standard deviation is the square root of the variance, so the new standard deviation is sqrt(38^2) = 38 times the old one. With an original standard deviation of 68, that gives 38 * 68 = 2584.
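If you want to convince yourself numerically, here is a quick Python sketch. The sample values are made up for illustration, since the question only gives the standard deviation, not the raw scores; the point is that the ratio of the two standard deviations comes out to 38 no matter what data you start with.

    import statistics

    # Hypothetical sample data (made up for illustration).
    data = [10, 25, 33, 47, 58, 62, 71, 84]

    # Multiply every score by 38.
    scaled = [38 * x for x in data]

    sd_original = statistics.stdev(data)    # sample standard deviation
    sd_scaled = statistics.stdev(scaled)

    print(sd_scaled / sd_original)  # -> 38.0, regardless of the data chosen
    print(38 * 68)                  # -> 2584, the answer for an SD of 68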

PS: If you can access the internet, there are calculators online that'll help you compute standard deviations and the like. Calculating stuff like a standard deviation by hand is a task I'd wish on no one :P