Discovery Science did it to me again. I am now officially back on the anti-metric bandwagon.
I learned that during the first Gulf War our Patriot missile systems were failing due to an inability to keep time accurately. When the errors were finally officially recognized, the word came down that the systems could not be run for long periods of time. What was not known was just what a “long period” was. They spoke of an incident where a Patriot missed badly after 100 hours of continuous operation.
They finally identified the culprit as round-off error. They were running a counter in steps of 0.1, which is an infinitely repeating fraction in binary (0.000110011001100…) and cannot be represented exactly in a binary computer. The round-off error accumulated over time, rendering the system useless.
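A minimal Python sketch (ordinary double-precision floats, not the Patriot's actual arithmetic) shows the idea: the stored value of 0.1 is slightly off, and a clock that adds 0.1 per tick drifts the longer it runs. The drift here is tiny because doubles carry 53 bits of precision, but it grows for as long as the counter keeps ticking.

```python
from decimal import Decimal

# 0.1 has no exact binary representation; this prints the value
# actually stored in a double-precision float, slightly above 0.1.
print(Decimal(0.1))

# Simulate a clock that adds 0.1 s per tick for 100 hours.
ticks = 100 * 60 * 60 * 10   # 3,600,000 ticks
elapsed = 0.0
for _ in range(ticks):
    elapsed += 0.1

exact = ticks / 10           # 360,000 s, exactly representable
print(elapsed - exact)       # non-zero drift, purely from round-off
```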
This is a very real-world example of why you should avoid using 0.1 as a basic step in any computation. An excellent alternative is a power-of-2 step like 1/16 or 1/32. While a step of that size may seem a bit strange to the human mind, the CPU loves it.
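As a quick illustration (a sketch, not anything from the Patriot code), here is the same accumulation done with a 0.1 step and a 1/16 step. The power-of-two step stays exact no matter how long it runs, because 1/16 and all of its partial sums fit exactly in a binary float:

```python
# Accumulate one million steps with a decimal step and a power-of-two step.
STEPS = 1_000_000

total_tenth = 0.0
total_sixteenth = 0.0
for _ in range(STEPS):
    total_tenth += 0.1         # 0.1 is inexact in binary; error accumulates
    total_sixteenth += 1 / 16  # 1/16 = 2**-4 is exact; no error is introduced

print(total_tenth - 100_000)     # small but non-zero drift
print(total_sixteenth - 62_500)  # exactly 0.0
```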
Unfortunately, the metric system encourages the use of 0.1 as a fundamental step. While 0.1 is a convenient multiple when you are doing arithmetic in your head, it is no benefit, and indeed a source of errors, when a computer is doing the number crunching.
It is interesting that the subdivisions of the inch are typically powers of 2 (1/2, 1/4, 1/8, 1/16), so the American system is inherently friendly to binary computers. Why in the world should we switch to a system whose fundamental step a computer cannot even represent exactly?
DOWN WITH THE METRIC SYSTEM!