I just came across an issue, and I can't believe this is the first
time I am running into it... if it even is a problem.
When I add a Decimal column to a table, it defaults to (18,0), which has
a scale of 0.
So if you add a Decimal column and try to store the value 1.1 in it, it
gets rounded to 1.
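To reproduce (a minimal sketch; the temp table and column names are just
for illustration):

    CREATE TABLE #Demo (Amount DECIMAL);     -- becomes DECIMAL(18,0)
    INSERT INTO #Demo (Amount) VALUES (1.1);
    SELECT Amount FROM #Demo;                -- returns 1, not 1.1
    DROP TABLE #Demo;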
Is this right? Why would the default settings for a "Decimal" have no
decimals?|||Zero is the smallest permitted value for scale, so it seems like a
reasonable choice for a default. Remember that DECIMAL is synonymous
with NUMERIC.
Why do you care what the default is? If you want to store decimals then
set the scale to something other than zero.
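For example (just a sketch; pick a precision and scale that suit your
data):

    CREATE TABLE #Prices (Price DECIMAL(18,2));  -- 18 digits, 2 after the point
    INSERT INTO #Prices (Price) VALUES (1.1);
    SELECT Price FROM #Prices;                   -- returns 1.10
    DROP TABLE #Prices;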
--
David Portas
SQL Server MVP
--|||I guess it makes more sense if you consider it as a NUMERIC and not a
DECIMAL.
But it still seems odd that a default Decimal cannot be a decimal.
Now that I know this, I make sure to set the scale each time I use it.|||> Why would the default settings for a "Decimal" have no decimals?
Because the ANSI SQL standard says so:
"If a <scale> is omitted, then a <scale> of 0 is implicit."
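A quick way to see the implicit scale of 0 (illustrative only):

    SELECT CAST(1.1 AS DECIMAL)       AS NoScale,   -- 1   (scale defaults to 0)
           CAST(1.1 AS DECIMAL(18,1)) AS WithScale; -- 1.1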
Razvan