Bobby Heid
bheid at appdevgrp.com
Fri Mar 17 13:54:02 CST 2006
Hey,

I always thought that the main difference between a single and a double was that the single had 7 digits and the double had 15 digits of precision AFTER the decimal point. Normally, I store decimal values as doubles, but for some reason this particular field was set to single.

Our client found an issue with that field today. They had entered 666,656,666 into the field, but when you look at the table, it shows 666,656,600.

As a test, I changed the field to double, entered 666,656,666, and saved it. The value stuck. Then I changed it back to single, got the smaller-field-size warning, and saved it. When I looked at the table, the value was 666,656,600.

So I think the issue is that I misunderstood what the precision meant in the single vs. double declaration. I wanted to be sure that this was the issue before I just blindly convert them to doubles. Am I correct?

Thanks,
Bobby
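
P.S. To double-check my reasoning, here is a quick Immediate-window sketch of the same test in VBA. This assumes the Access/VBA Single and Double types are standard IEEE 754 single- and double-precision floats, i.e. roughly 7 vs. 15 significant digits in total, not after the decimal point:

    Sub PrecisionDemo()
        Dim s As Single
        Dim d As Double

        s = 666656666        ' 9 significant digits won't fit in a Single
        d = 666656666        ' a Double carries ~15 significant digits, so this stays exact

        Debug.Print s        ' prints something like 6.666566E+08 (the low digits are gone)
        Debug.Print CDbl(s)  ' shows the value the Single actually stored, about 666656640
        Debug.Print d        ' 666656666
    End Sub

If that's right, anything with more than about 7 significant digits is going to get rounded in a single field, regardless of where the decimal point falls.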