Question 971405: The mean time taken by all participants to run a road race was found to be 115 minutes with a standard deviation of 20 minutes.
Using Chebyshev's theorem, find the percentage of runners who ran this road race in 75 to 155 minutes.

Answer by Boreal(15235):
Chebyshev's Theorem puts a lower bound on the proportion of data that falls within k standard deviations of the mean: at least 1 - 1/k^2 of the data must lie in that range. For k = 2, at least 1 - 1/4 = 3/4 of the data fall within two standard deviations. The actual proportion may be higher, and it usually is, but it cannot be lower than 3/4.
The interval 75 to 155 minutes is exactly the mean plus or minus two standard deviations: 115 - 2(20) = 75 and 115 + 2(20) = 155. So at least 75% of the runners finished in that time interval.
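As a quick check, here is a minimal Python sketch of the calculation (the variable names are just illustrative):

mean = 115           # mean finishing time, in minutes
sd = 20              # standard deviation, in minutes
k = 2                # number of standard deviations

low = mean - k * sd          # 115 - 2*20 = 75
high = mean + k * sd         # 115 + 2*20 = 155
bound = 1 - 1 / k**2         # Chebyshev lower bound: 1 - 1/4 = 0.75

print(f"At least {bound:.0%} of runners finished between {low} and {high} minutes.")

Running this prints: At least 75% of runners finished between 75 and 155 minutes.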
