Alright, T-SQL isn't acting as I would expect. Look at this:
SELECT SUM(DEBET), SUM(CREDIT),
       CONVERT(DECIMAL(9,2),
           CASE WHEN SUM(DEBET) - SUM(CREDIT) > 0.00
                THEN CONVERT(DECIMAL(9,2), SUM(CONVERT(DECIMAL(9,2), DEBET)))
                   - CONVERT(DECIMAL(9,2), SUM(CONVERT(DECIMAL(9,2), CREDIT)))
                ELSE CONVERT(DECIMAL(9,2), 0.00)
           END
       ) AS DEBETSALDO
FROM OTHERTABLE
DEBET and CREDIT in OTHERTABLE are both of type decimal; the tooltip shows DEBET (decimal, null).
The result is:
DEBET CREDIT DEBETSALDO
6817.07 0.00 0
Why, and where, does SQL Server convert the whole expression back to an integer in this SELECT statement, if DEBET and CREDIT are decimals? Should I add even more CONVERTs?
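One way to see what is going on is to ask SQL Server what type it actually infers for each sub-expression. A diagnostic sketch, reusing OTHERTABLE and DEBET from the question (note that SUM over a decimal column is widened to precision 38, so the CASE branches end up wider than the outer CONVERT target):

SELECT
    SQL_VARIANT_PROPERTY(SUM(DEBET), 'BaseType')  AS SumBaseType,   -- expect 'decimal', not int
    SQL_VARIANT_PROPERTY(SUM(DEBET), 'Precision') AS SumPrecision,  -- SUM widens to 38
    SQL_VARIANT_PROPERTY(SUM(DEBET), 'Scale')     AS SumScale       -- scale of the column is kept
FROM OTHERTABLE;

If SumBaseType comes back as decimal with a nonzero scale, the column itself is fine and the narrowing happens somewhere in the CASE/CONVERT chain.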
EDIT
I just changed SELECT DEBET, CREDIT into SELECT SUM(DEBET), SUM(CREDIT) to emphasize where the problem lies. The results stay the same.
Found it. Apparently converting a decimal(11,2) to a decimal(9,2) leads to precision loss. Sorry, I should have double-checked that. But nevertheless, 6817.07 should fit inside a decimal(9,2), right?
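It should: DECIMAL(9,2) holds values up to 9999999.99, so 6817.07 is well within range. A minimal sanity-check sketch (the literal value is taken from the question; narrowing only fails when the value exceeds the target's range):

-- Narrowing a value that fits: no loss, returns 6817.07.
SELECT CONVERT(DECIMAL(9,2), CONVERT(DECIMAL(11,2), 6817.07)) AS Narrowed;

-- Narrowing a value that does not fit raises an arithmetic overflow error
-- rather than silently truncating:
-- SELECT CONVERT(DECIMAL(9,2), CONVERT(DECIMAL(11,2), 99999999.99));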
Thanks for the help.