document.write( "Question 1155800: A motorist goes for a drive, keeping the reading on the speedometer at a constant value of 40 kmh−1. The speedometer is assumed to be accurate to ± 2 kmh−1. At the end of the day he wants to know how far he has travelled, but unfortunately, he forgot to look at the distance indicator when he set out. He thinks that he drove for four hours, give or take a quarter of an hour. Estimate how far he travelled and assign an error to your result. \n" ); document.write( "
Algebra.Com's Answer #778464 by Boreal(15235):
The minimum distance would be 38 km/h × 3.75 h = 142.5 km.
The maximum would be 42 km/h × 4.25 h = 178.5 km.
The best estimate is the midpoint of those two, 160.5 km, with an error of ±18 km. Equivalently, for a product the fractional errors add: 2/40 = 5% from the speed plus 0.25/4 = 6.25% from the time gives about 11.25%, and 11.25% of 160 km ≈ 18 km.
\n" ); document.write( "
\n" ); document.write( "
\n" );