My dad and I had a disagreement. We were driving back from a trip and he let me take over the wheel for a while. His complaint was that I was driving too slowly, and it was driving him crazy. Me, I thought I was driving fast. My typical behavior is to drive 3 mph under the speed limit. That is just how I roll. In this case, I knew he wouldn't be able to handle that, so I went the speed limit (70 mph).
Here is the problem. The speedometer said 70 mph. The gps thingy said the average speed was 69 mph. I think my dad feels that the gps is correct and that cars have speedometers that are intentionally set too high to prevent excessive speeding. I wasn't too sure about this theory. Ok, before I go any further, let me give some definitions.
Velocity

When this term is used, it typically means "instantaneous velocity". In simple terms, the instantaneous velocity is how fast something is moving and in what direction (it's a vector). Just for completeness, you could say the instantaneous velocity is:

\vec{v} = \frac{d\vec{r}}{dt}
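To make the derivative concrete, here is a quick numerical sketch. The position function and all the numbers are made up for illustration: it just estimates the velocity from how the position changes over a tiny time step.

```python
# Estimate instantaneous velocity as the time derivative of position.
# position() is a made-up example: constant 31.3 m/s (about 70 mph) in x.
def position(t):
    return (31.3 * t, 2.0)  # (x, y) in meters

def instantaneous_velocity(t, dt=1e-6):
    # Numerical derivative: (r(t + dt) - r(t)) / dt for a tiny dt
    x1, y1 = position(t)
    x2, y2 = position(t + dt)
    return ((x2 - x1) / dt, (y2 - y1) / dt)

vx, vy = instantaneous_velocity(5.0)
print(vx, vy)  # close to (31.3, 0.0)
```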
But I really don't want to talk about this except for the next definition.
Speed

Again, this is usually used as "instantaneous speed". Basically, this is how fast something is going. Velocity is a vector, and speed usually means the magnitude of this vector. (If you forgot about vectors, here is a refresher.) There is a problem, though. Sometimes the word speed is used to mean the distance traveled over the time, or:

s_{avg} = \frac{\text{distance}}{\Delta t}
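A tiny sketch of the vector-magnitude version, with made-up velocity components:

```python
import math

# Speed as the magnitude of the velocity vector (components made up).
vx, vy = 30.0, 5.0            # m/s
speed = math.hypot(vx, vy)    # sqrt(vx**2 + vy**2)
print(round(speed, 2))        # 30.41 m/s
```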
Average velocity

This one is going to need a diagram. Basically, the average velocity is the change in position over time for some interval:

\vec{v}_{avg} = \frac{\Delta \vec{r}}{\Delta t}

I know what you are thinking - but isn't that just the distance over time? Let me draw an overly complicated diagram.
The vector r1 points from the origin to the starting position of the object. The vector Δr points from the initial position to the final position. The black curved line is the path of the object. So, in this case, the magnitude of the average velocity is not the distance divided by the time (if you call the distance the length of that black curved line).
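Here is a quick sketch of that distinction, using a made-up example of an object driving half a circle. The path length is half the circumference, but the displacement is just the diameter:

```python
import math

# An object drives half a circle of radius R in time T (numbers made up).
R = 100.0   # meters
T = 10.0    # seconds

path_length = math.pi * R   # length of the curved (black) path
displacement = 2 * R        # magnitude of delta r: straight across the circle

avg_speed = path_length / T          # distance over time
avg_velocity_mag = displacement / T  # magnitude of the average velocity

print(round(avg_speed, 1))   # 31.4 m/s
print(avg_velocity_mag)      # 20.0 m/s - smaller, as the diagram shows
```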
Determining the speed
So, what does a car do? How does it "measure" the speed that it reports? I suspect that the gps and the speedometer use different methods. The speedometer uses the rotation of the wheels to get the speed. If you know the angular velocity and the radius of the wheel, then the instantaneous speed is:

v = \omega r
Of course, if the wheels are slipping, this method does not work too well. Also, if you have the wrong size tires, this can cause a problem. The gps reports average speed as the magnitude of the average velocity (see above). So, it is possible to have these two devices report a different average speed.
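A minimal sketch of the speedometer idea, with made-up numbers, showing why the wrong tire size causes a problem. The speedometer only measures the wheel's rotation, so it has to assume a tire radius:

```python
# Speedometer model: v = omega * r. All numbers are made up.
omega = 97.8        # wheel angular speed in rad/s
r_assumed = 0.32    # tire radius the speedometer calibration assumes (m)
r_actual = 0.30     # actual radius (worn or smaller-than-stock tires) (m)

indicated = omega * r_assumed   # what the speedometer displays
true_speed = omega * r_actual   # how fast the car is really going

print(indicated > true_speed)   # True: smaller tires make the speedo read high
```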
When the gps average speed is reset, it takes about 12 seconds to give a reading. From this, I assume that it is finding the average speed by using the position every 12 seconds. If the road is reasonably straight, this should be a good measure of your average speed. However, if the road is curved, this would give a value that is too low - the straight-line distance between two position fixes is shorter than the curved path between them.
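Here is a rough sketch of that sampling effect on a curve. The true speed and curve radius are made up; the point is that the chord between two fixes is shorter than the arc actually driven:

```python
import math

# GPS-style average speed: positions sampled every 12 s, straight-line
# (chord) distance divided by the interval. On a curve this under-reads.
true_speed = 31.3   # m/s, about 70 mph (made up)
R = 300.0           # radius of the curve in meters (made up)
dt = 12.0           # seconds between position fixes

arc = true_speed * dt                # distance actually driven
theta = arc / R                      # angle swept along the circular curve
chord = 2 * R * math.sin(theta / 2)  # straight-line distance between fixes

gps_speed = chord / dt
print(gps_speed < true_speed)  # True: the chord is shorter than the arc
```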
To settle our argument, we decided to set the cruise control around 70 mph and use a stopwatch. We measured the time to go 5 miles (according to the mile markers on the road). This gave an average speed of 70.5 mph (compared to the gps report of 69 mph). The road was not too curvy, so I suspect the difference is due to something other than the magnitude of the average velocity. Also, it is possible that the mile markers were off. We would have used a longer distance than 5 miles if we had been more patient.
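For the record, the arithmetic looks like this. We didn't record the exact stopwatch time, so the time below is back-computed from the 70.5 mph result:

```python
# Average speed from the mile markers: distance over elapsed time.
distance_mi = 5.0
time_s = 255.3    # seconds to cover the 5 miles (back-computed, see above)

avg_mph = distance_mi / (time_s / 3600.0)
print(round(avg_mph, 1))   # 70.5 mph
```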
Anyway, thanks dad for helping me drive on that trip - even though you were wrong.
I noticed a similar, but slightly larger, discrepancy between my GPS and my speedometer. I plan to do the same calculation, but haven't given it a try yet.
In the Kgalagadi Park in South Africa and Botswana, you sign out of a camping area with your car. Regardless of your intentions, the park employee routinely radios your departure time to the other gates, including the exit gate.
When you reach the exit gate, you must stop and turn in your permit; if you've dropped off firearms, you pick them up; and so on. Then they let you out.
But the gate guard also checks the time you arrived at the guard house against the time you are known to have left the previous gate within the park, and calculates how much to fine you for speeding based on your average speed and/or velocity.
For this reason you see a lot of people parked just inside the exit gate watching non-existent wildlife and glancing at their watch every few minutes.
You don't need a stopwatch to calibrate your speedometer against mile markers. Just observe whether your odometer accumulates more or fewer miles than the mile markers indicate. Of course, that assumes the speedometer has a good clock, i.e., that it correctly differentiates the odometer with respect to time.
It turns out that auto speedometers are just plain not really very precise.
Great CnD link - but I think the main point is that they're not very accurate - they're generally within a couple of percent, which seems pretty precise.
(I'm told that one of my superpowers is pedantry...)
Does that mean my dad was right? I hate when that happens. Now I will have to go test some other cars.
Sounds like your dad was right!
I was going to comment on the "slip" in the speedo of older (and sometimes newer cheap) cars that use the magnetic coupling speedo as described in Moopheus' link, but no need to now. I am pretty sure my '98 Ford Festiva uses this method.
For many years I have roughly checked cars that I drive against GPS speeds etc. and have never found a car speedo that under-reads. While they are accurate to within a few percent, that still makes a real (if maybe insignificant) difference at highway speeds. For another data point, I bought a new (to me) Subaru Outback just before Christmas. I did some country driving about two weeks ago and now know that the speedo reads 114 kph at a GPS reading of 110 for that car. That error is typical of what I have found in the past (and exactly the same as the Festiva).
I have no clue about the details of how they work, but unless a manufacturer can guarantee that speedometers will be absolutely correct all the time, making sure that they err on the high side seems like a good thing to do.
Giving too high a reading just means people will drive slower than they "should", but tell them they're going slower than they really are and they'll start getting speeding tickets - and the manufacturer will start getting lawsuits.
(Well, they'll get speeding tickets anyway of course, but they won't necessarily have someone convenient to blame...)
GPS uses Doppler shift to measure speed. I thought a physicist would know that! :D
Robert is correct - speed measurements by GPS are Doppler-aided.
As far as accuracy goes, for one of their OEM boards, Garmin claims a "velocity" accuracy of 0.1 knots RMS steady-state. I don't think this is unreasonable.
UK regulations actually insist that speedos do not under-read. As a consequence, all of them over-read.
In the dim distant past, a lot of manufacturers deliberately had their speedos over-read by the maximum allowed - usually 10% - so that drivers would think their vehicles were faster than they really were. Not that many potential buyers were fooled, but it made for easier bragging in the pub.
Not surprising, since it protects the manufacturer from blame for a speeding ticket, and worn tires make it worse. And not only does the car seem faster - the odometer racks up more indicated miles per actual mile, so you appear to get more miles per gallon.
Interesting observation about the mile markers. In my experience, the survey markers along the highway are in feet from a point of beginning, measured along the center of the median, and are completely decoupled from the mile markers, which are there for convenience.