Question 326000: How do I use the mean and the standard deviation to calculate the odds?
For example, suppose I have a set of data representing how often a person might guess a playing card correctly, using a standard deck of 52 cards.
I want to use the mean and the standard deviation of this sample to approximate the odds of that same person guessing N cards correctly, out of 52 cards, where N is any integer.
Random chance averages 1 card correct out of a deck of 52...
Assuming that the sample size is sufficiently large, I want to calculate the odds of getting 2, 3, 4, etc. correct.
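For the pure-chance baseline, one simple sketch (an approximation: it treats the 52 guesses as independent trials, whereas a single pass through a shuffled deck is really the classical matching problem, though the numbers come out very close) models the count of exact hits as Binomial(52, 1/52):

```python
from math import comb

def binom_pmf(n, k, p):
    """P(exactly k successes in n independent trials, success prob p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Chance baseline: 52 guesses, each an exact hit with probability 1/52.
# The mean of this distribution is 52 * (1/52) = 1 hit per deck.
n, p = 52, 1 / 52
for k in range(6):
    print(f"P({k} exact hits) = {binom_pmf(n, k, p):.4f}")
```

The probabilities fall off roughly like a Poisson(1) distribution, so even 4 or 5 exact hits per deck is already very unlikely by chance alone.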
If I also keep statistics on how often the person gets just the suit correct, or just the rank correct, how do I also calculate those odds?
Random chance averages 3 rank 'hits', 12 suit 'hits', and one exact 'hit' (both rank and suit correct) in a deck of 52 cards.
I want to calculate the odds of that same person getting X rank 'hits' or Y suit 'hits'.
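Those chance averages (1 exact, 3 rank-only, 12 suit-only per deck) can be sanity-checked with a quick simulation; this is an illustrative sketch only, assuming the guesser names each of the 52 cards exactly once against an independently shuffled deck:

```python
import random

def deck():
    """A 52-card deck as (rank, suit) pairs."""
    return [(r, s) for r in range(13) for s in range(4)]

def hit_counts(trials=20000, seed=1):
    """Average exact, rank-only, and suit-only hits per deck by chance."""
    rng = random.Random(seed)
    exact = rank_only = suit_only = 0
    guess = deck()  # the guesser names every card exactly once
    for _ in range(trials):
        actual = deck()
        rng.shuffle(actual)
        for g, a in zip(guess, actual):
            if g == a:
                exact += 1
            elif g[0] == a[0]:
                rank_only += 1
            elif g[1] == a[1]:
                suit_only += 1
    return exact / trials, rank_only / trials, suit_only / trials

print(hit_counts())  # roughly (1, 3, 12) per deck
```

The per-guess match probabilities behind those averages are 1/52 for an exact hit, 4/52 for the rank, and 13/52 for the suit; subtracting the exact hit gives the 3 rank-only and 12 suit-only averages.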
Also, if I include the jokers, how does this affect the rank and suit calculations? If both jokers are considered the same "suit" and the same "rank", are the odds 1 in 5 of a suit 'hit' and 1 in 14 of a rank 'hit'? (Some decks contain different jokers, a red one and a black one; if these jokers are considered to be of different "suits" but of the same "rank", are the odds 1 in 6 of a suit 'hit' and 1 in 14 of a rank 'hit'?)
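As a quick check of those joker figures (a sketch under assumptions: the guess is taken to be a fixed ordinary card, and the drawn card is uniform over all 54 cards), note that the "suits" are no longer equally likely, so the per-guess chances come from the card counts rather than from the number of suit names:

```python
# 54-card deck: 4 ordinary suits of 13 cards, plus 2 jokers that share
# one "suit" and one "rank".
# Assumption: the guess is a fixed ordinary (non-joker) card.
DECK_SIZE = 54
p_suit = 13 / DECK_SIZE  # cards sharing the guessed suit
p_rank = 4 / DECK_SIZE   # cards sharing the guessed rank
# (Guessing a joker itself would instead give 2/54 for both.)
print(f"suit 'hit': 1 in {1 / p_suit:.2f}")
print(f"rank 'hit': 1 in {1 / p_rank:.1f}")
```

Under this model the suit odds are 1 in 54/13 ≈ 4.15 and the rank odds 1 in 13.5, slightly different from the 1-in-5 and 1-in-14 figures that counting suit and rank names suggests, because the joker "suit" and "rank" contain only 2 cards each.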
Using my statistical sample, how do I calculate these numbers?
(Assume my sample size is in the millions of decks - sufficiently large to provide meaningful results)
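Given such a sample, one common approach (sketched here with made-up placeholder statistics, not real data) is to fit a normal approximation using the sample mean and standard deviation and read the tail odds off it:

```python
from math import erf, sqrt

def prob_at_least(n_hits, mean, sd):
    """P(X >= n_hits) under a normal approximation to the hit count,
    with a continuity correction since hits are integers."""
    z = (n_hits - 0.5 - mean) / sd
    return 0.5 * (1 - erf(z / sqrt(2)))

# Hypothetical sample statistics (placeholders, not real data):
sample_mean, sample_sd = 1.7, 1.3  # exact hits per deck
for n_hits in range(2, 6):
    p = prob_at_least(n_hits, sample_mean, sample_sd)
    print(f"P(at least {n_hits} exact hits) = {p:.4f}")
```

With millions of decks the sample mean and standard deviation are very precise, but the normal curve is still only an approximation to a skewed, non-negative count, so it is least trustworthy in the far tails.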
Answer by Edwin McCravy(20065):