SOLUTION: A motorist goes for a drive, keeping the reading on the speedometer at a constant value of 40 kmh−1. The speedometer is assumed to be accurate to ± 2 kmh−1. At the end of the
Question 1155800: A motorist goes for a drive, keeping the reading on the speedometer at a constant value of 40 kmh−1. The speedometer is assumed to be accurate to ± 2 kmh−1. At the end of the day he wants to know how far he has travelled, but unfortunately, he forgot to look at the distance indicator when he set out. He thinks that he drove for four hours, give or take a quarter of an hour. Estimate how far he travelled and assign an error to your result.
Answer by Boreal(15235):
The minimum distance would be 38 km/h × 3.75 h = 142.5 km.
The maximum would be 42 km/h × 4.25 h = 178.5 km.
The best estimate is the midpoint of those two, 160.5 km, with an error of ±18 km.
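The min/max interval arithmetic above can be sketched in a few lines of Python. This is just an illustration of the method in the answer; the variable names are my own and not from the original.

```python
# Interval estimate of distance from speed and time, each with an uncertainty.
v, dv = 40.0, 2.0    # speed reading and its accuracy, km/h
t, dt = 4.0, 0.25    # driving time and its uncertainty, h

d_min = (v - dv) * (t - dt)    # 38 km/h * 3.75 h = 142.5 km
d_max = (v + dv) * (t + dt)    # 42 km/h * 4.25 h = 178.5 km

d_mid = (d_min + d_max) / 2    # midpoint: best estimate of distance
d_err = (d_max - d_min) / 2    # half the spread: the assigned error

print(f"distance = {d_mid} ± {d_err} km")
```

Note that the midpoint (160.5 km) is slightly larger than the naive product 40 km/h × 4 h = 160 km, because the product of the two uncertainties (2 × 0.25 = 0.5 km) shifts the interval's centre upward.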