What is the maximum precision for the DECIMAL data type?
In IBM Db2, the DECIMAL data type supports a maximum precision of 31 digits. Precision is the total number of digits the number can hold, counting digits on both sides of the decimal point. Note that this limit is implementation-specific rather than mandated by the SQL standard, which leaves the maximum precision implementation-defined; other systems choose different limits (SQL Server and Oracle, for example, allow up to 38). For Db2's DECIMAL, the maximum precision is 31 digits.
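As a quick illustrative sketch (the table and column names here are hypothetical, not from the question), a Db2 column declared at the maximum precision might look like this:

```sql
-- Hypothetical table: DECIMAL(31, 2) uses Db2's maximum precision of 31,
-- with 2 of those 31 digits reserved for the fractional part.
CREATE TABLE ledger (
    amount DECIMAL(31, 2) NOT NULL
);

-- Declaring DECIMAL(32, 2) would fail in Db2, since precision cannot exceed 31.
```

The scale (the second argument) must be between 0 and the precision, so DECIMAL(31, 31) is the most fractional digits a single Db2 DECIMAL column can carry.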
Ans is D (31) https://www.ibm.com/support/producthub/db2/docs/content/SSEPGG_11.5.0/com.ibm.db2.luw.sql.ref.doc/doc/r0000791.html