Overriding Default Precision and Scale for Numeric Data Types

When the SQL_DESC_TYPE field in an ARD is set to SQL_C_NUMERIC, either by calling SQLBindCol or SQLSetDescField, the SQL_DESC_SCALE field in the ARD is set to 0, and the SQL_DESC_PRECISION field is set to a driver-defined default precision. The same is true when the SQL_DESC_TYPE field in an APD is set to SQL_C_NUMERIC, either by calling SQLBindParameter or SQLSetDescField, and it applies to input, input/output, and output parameters alike.

If either of the defaults described previously is not acceptable for an application, the application should set the SQL_DESC_SCALE or SQL_DESC_PRECISION field by calling SQLSetDescField or SQLSetDescRec.
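A minimal sketch of overriding the defaults for a bound column might look like the following, assuming hstmt is an allocated statement handle and that a precision of 19 and scale of 5 are what the application needs (both values are illustrative). Note that changing most descriptor fields unbinds the column by clearing SQL_DESC_DATA_PTR, so that field is restored last.

```c
#include <sql.h>
#include <sqlext.h>

/* Bind column 1 as SQL_C_NUMERIC, then override the driver-defined
   default precision and the default scale of 0 on the ARD. */
void bind_numeric_column(SQLHSTMT hstmt, SQL_NUMERIC_STRUCT *num, SQLLEN *ind)
{
    SQLHDESC hArd;

    SQLGetStmtAttr(hstmt, SQL_ATTR_APP_ROW_DESC, &hArd, 0, NULL);

    SQLBindCol(hstmt, 1, SQL_C_NUMERIC, num, sizeof(*num), ind);

    /* Setting SQL_DESC_PRECISION / SQL_DESC_SCALE clears the record's
       SQL_DESC_DATA_PTR, so the data pointer must be set again last. */
    SQLSetDescField(hArd, 1, SQL_DESC_PRECISION, (SQLPOINTER)19, 0);
    SQLSetDescField(hArd, 1, SQL_DESC_SCALE, (SQLPOINTER)5, 0);
    SQLSetDescField(hArd, 1, SQL_DESC_DATA_PTR, num, 0);
}
```

Return codes are omitted for brevity; production code should check each call with SQL_SUCCEEDED.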

If the application calls SQLGetData to return data into an SQL_C_NUMERIC structure, the default values of the SQL_DESC_SCALE and SQL_DESC_PRECISION fields are used. If the defaults are not acceptable, the application must call SQLSetDescRec or SQLSetDescField to set the fields, and then call SQLGetData with a TargetType of SQL_ARD_TYPE so that the values in the descriptor fields are used.
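That sequence can be sketched as follows, again assuming hstmt is a statement handle positioned on a fetched row and that precision 19 and scale 5 are the application's illustrative choices:

```c
#include <sql.h>
#include <sqlext.h>

/* Retrieve column 1 into an SQL_NUMERIC_STRUCT with an explicit precision
   and scale. Passing SQL_ARD_TYPE as the TargetType makes SQLGetData use
   the ARD record's fields instead of the driver-defined defaults. */
SQLRETURN get_numeric(SQLHSTMT hstmt, SQL_NUMERIC_STRUCT *num, SQLLEN *ind)
{
    SQLHDESC hArd;

    SQLGetStmtAttr(hstmt, SQL_ATTR_APP_ROW_DESC, &hArd, 0, NULL);
    SQLSetDescField(hArd, 1, SQL_DESC_TYPE, (SQLPOINTER)SQL_C_NUMERIC, 0);
    SQLSetDescField(hArd, 1, SQL_DESC_PRECISION, (SQLPOINTER)19, 0);
    SQLSetDescField(hArd, 1, SQL_DESC_SCALE, (SQLPOINTER)5, 0);

    return SQLGetData(hstmt, 1, SQL_ARD_TYPE, num, sizeof(*num), ind);
}
```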

When SQLPutData is called, it uses the SQL_DESC_SCALE and SQL_DESC_PRECISION fields of the descriptor record that corresponds to the data-at-execution parameter or column: APD fields for calls to SQLExecute or SQLExecDirect, and ARD fields for calls to SQLBulkOperations or SQLSetPos.
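For the SQLExecute case, a hedged sketch of the data-at-execution flow might look like this. The parameter number, SQL type, precision (19), and scale (5) are illustrative; the (SQLPOINTER)1 token is an arbitrary application value that SQLParamData hands back to identify the parameter.

```c
#include <sql.h>
#include <sqlext.h>

/* Supply an SQL_C_NUMERIC value for a data-at-execution input parameter.
   The APD record's precision and scale are set before SQLExecute so that
   SQLPutData interprets the structure with those values. */
SQLRETURN exec_with_numeric(SQLHSTMT hstmt, SQL_NUMERIC_STRUCT *num)
{
    SQLHDESC   hApd;
    SQLLEN     ind = SQL_DATA_AT_EXEC;
    SQLPOINTER token;
    SQLRETURN  rc;

    SQLBindParameter(hstmt, 1, SQL_PARAM_INPUT, SQL_C_NUMERIC, SQL_NUMERIC,
                     19, 5, (SQLPOINTER)1, 0, &ind);

    /* Override the APD defaults; restore the data pointer (the token)
       last, since setting other fields clears SQL_DESC_DATA_PTR. */
    SQLGetStmtAttr(hstmt, SQL_ATTR_APP_PARAM_DESC, &hApd, 0, NULL);
    SQLSetDescField(hApd, 1, SQL_DESC_PRECISION, (SQLPOINTER)19, 0);
    SQLSetDescField(hApd, 1, SQL_DESC_SCALE, (SQLPOINTER)5, 0);
    SQLSetDescField(hApd, 1, SQL_DESC_DATA_PTR, (SQLPOINTER)1, 0);

    rc = SQLExecute(hstmt);                  /* returns SQL_NEED_DATA */
    while (rc == SQL_NEED_DATA) {
        rc = SQLParamData(hstmt, &token);    /* which parameter needs data? */
        if (rc == SQL_NEED_DATA)
            SQLPutData(hstmt, num, sizeof(*num));
    }
    return rc;
}
```

The final SQLParamData call inside the loop both signals the end of the parameter's data and completes the statement execution.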