Question 1181806: A motorist goes for a drive, keeping the reading on the speedometer at a constant value of 40 km h⁻¹. The speedometer is assumed to be accurate to ± 2 km h⁻¹. At the end of the day he wants to know how far he has travelled, but unfortunately, he forgot to look at the distance indicator when he set out. He thinks that he drove for four hours, give or take a quarter of an hour. Estimate how far he travelled and assign an error to your result.
Answer by CPhill(1959):
Here's how to estimate the distance traveled and its error:
1. **Nominal Distance:**
Distance = Speed × Time = 40 km/h × 4 h = 160 km
2. **Uncertainty in Speed:**
Δv = ± 2 km/h
3. **Uncertainty in Time:**
Δt = ± 0.25 h (15 minutes = 0.25 hours)
4. **Calculating the Uncertainty in Distance:**
Since distance is the product of speed and time, the *fractional* uncertainties in speed and time add in quadrature (assuming the two errors are independent) to give the fractional uncertainty in distance:
(Δd/d)² = (Δv/v)² + (Δt/t)²
(Δd/d)² = (2/40)² + (0.25/4)²
(Δd/d)² = (0.05)² + (0.0625)²
(Δd/d)² = 0.0025 + 0.00390625 = 0.00640625
Δd/d = √0.00640625 ≈ 0.08
5. **Absolute Uncertainty in Distance:**
Δd = (Δd/d) × d = 0.08 × 160 km ≈ 12.8 km
6. **Final Result:**
The motorist traveled approximately 160 ± 13 km (the uncertainty of 12.8 km is rounded to two significant figures, which is all the precision the input measurements justify).
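For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same quadrature propagation (the variable names are purely illustrative, not from the original problem):

```python
import math

# Measured values and their absolute uncertainties (from the problem statement)
v, dv = 40.0, 2.0    # speed in km/h, +/- 2 km/h
t, dt = 4.0, 0.25    # time in h, +/- 0.25 h

# Nominal distance
d = v * t            # 160 km

# For a product, fractional uncertainties add in quadrature
frac_d = math.sqrt((dv / v) ** 2 + (dt / t) ** 2)   # ~0.080
delta_d = frac_d * d                                 # ~12.8 km

print(f"d = {d:.0f} km, fractional uncertainty = {frac_d:.3f}")
print(f"Distance = {d:.0f} +/- {delta_d:.0f} km")    # 160 +/- 13 km
```

If instead the errors are treated as worst-case limits and simply added, Δd/d = 2/40 + 0.25/4 = 0.1125, giving about 160 ± 18 km; the quadrature result above is the usual choice when the errors are independent and random.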