While writing a tool to parse the data types out of a dbf for a shapefile comparison I'm working on, I ran into some rather odd behavior. As good ol' Ianko explains it in the forums:
From the help:
“Precision is the number of digits in a number.”
“Scale is the number of digits to the right of the decimal point in a number”
Precision of 10 should store numbers with at most 10 digits: 1234567890
Precision of 10 and Scale of 5 should store numbers with at most 10 digits, up to 5 of them to the right of the decimal point, e.g. 12345.12345
The interesting fact is that these rules behave differently on shapefiles (dbf) and geodatabases, and sometimes the results are strange.
– a field defined as Long with a precision of 10 in a GDB will not accept a value of 12345678901 (correct); in a shapefile (dbf) it will be stored as 1230000000.
– a field defined as Double with a precision of 10 and scale of 5 will convert 123456.12345 to 123500 in a dbf file, but the value will be accepted if the table is in a GDB. In fact, the same field in a GDB will even accept 1234567890.12345!
Another interesting fact is that a field defined in ArcCatalog as Double with Precision of 10 and Scale of 5 will be reported in ArcMap as follows:
– if the table is a DBF – correctly
– if the table is in a personal GDB – with a precision of 0 and scale of 0
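The precision/scale definitions quoted above can be checked mechanically, which makes the dbf rounding easier to understand: 12345678901 has 11 digits (one too many for precision 10), and 123456.12345 has 11 digits total (too many for precision 10, scale 5). A minimal Python sketch of such a check — the helper name `fits` is my own, and this is an illustration of the rule, not ArcGIS's actual logic:

```python
from decimal import Decimal

def fits(value, precision, scale):
    """Return True if value fits a numeric field with the given
    precision (max total digits) and scale (max digits right of the
    decimal point). The sign and the point itself are not counted."""
    sign, digits, exponent = Decimal(str(value)).normalize().as_tuple()
    frac_digits = max(0, -exponent)              # digits right of the point
    int_digits = max(0, len(digits) + exponent)  # digits left of the point
    return frac_digits <= scale and int_digits + frac_digits <= precision

# The cases from the quote above:
print(fits("1234567890", 10, 0))     # 10 digits -> fits precision 10
print(fits("12345678901", 10, 0))    # 11 digits -> does not fit precision 10
print(fits("123456.12345", 10, 5))   # 11 digits total -> does not fit (10, 5)
print(fits("12345.12345", 10, 5))    # 10 digits, 5 fractional -> fits
```

Values are passed as strings to avoid float-representation surprises when counting digits.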
Link to the complete post: http://bit.ly/s9BInv
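For what it's worth, the definitions my tool reads live right in the dbf header: after the 32-byte file header, each field is described by a fixed 32-byte descriptor holding the name, type code, field length (the "precision" for numeric fields) and decimal count (the "scale"). A stdlib-only sketch of that parsing — the function name and the synthetic one-field header are my own, built here just for illustration:

```python
import struct

def parse_dbf_fields(data: bytes):
    """Read the 32-byte field descriptors that follow the 32-byte
    dBASE file header; the descriptor array ends with a 0x0D byte."""
    fields, offset = [], 32
    while data[offset] != 0x0D:
        # name (11 bytes, NUL-padded), type (1 char), 4 reserved bytes,
        # field length ("precision"), decimal count ("scale")
        name, ftype, length, decimals = struct.unpack_from("<11sc4xBB", data, offset)
        fields.append({
            "name": name.split(b"\x00", 1)[0].decode("ascii"),
            "type": ftype.decode("ascii"),
            "length": length,
            "decimals": decimals,
        })
        offset += 32
    return fields

# Synthetic one-field table for demonstration: a numeric field VALUE
# with length 10 and 5 decimals (i.e. precision 10, scale 5).
header = bytes(32)  # stand-in for the real 32-byte file header
descriptor = (b"VALUE".ljust(11, b"\x00") + b"N" + bytes(4)
              + bytes([10, 5]) + bytes(14))
print(parse_dbf_fields(header + descriptor + b"\x0d"))
```

This is how a comparison tool can report each field's declared precision and scale regardless of what ArcMap happens to display.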