When the SQL_DESC_TYPE field of an ARD is set to a datetime or interval C type, either by calling SQLBindCol or SQLSetDescField, the SQL_DESC_PRECISION field (which contains the interval seconds precision) is set to the following defaults:
For all interval data types, the SQL_DESC_DATETIME_INTERVAL_PRECISION descriptor field, which contains the interval leading field precision, is set to a default value of 2.
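For example, the following minimal sketch (in C, assuming an already-allocated statement handle `hstmt` whose result set has a day-to-second interval value in column 1; the handle and column number are illustrative) shows a SQLBindCol call after which the ARD record for that column carries the default precisions described above.

```cpp
#include <sql.h>
#include <sqlext.h>

/* hstmt is assumed to be a valid SQLHSTMT on an open connection whose
   result set has an interval value in column 1. */
SQL_INTERVAL_STRUCT interval;
SQLLEN indicator;

/* Binding the column to an interval C type sets SQL_DESC_TYPE in ARD record 1;
   SQL_DESC_PRECISION (interval seconds precision) and
   SQL_DESC_DATETIME_INTERVAL_PRECISION (interval leading field precision,
   default 2) are then set to their defaults. */
SQLBindCol(hstmt, 1, SQL_C_INTERVAL_DAY_TO_SECOND,
           &interval, sizeof(interval), &indicator);
```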
When the SQL_DESC_TYPE field in an APD is set to a datetime or interval C type, either by calling SQLBindParameter or SQLSetDescField, the SQL_DESC_PRECISION and SQL_DESC_DATETIME_INTERVAL_PRECISION fields in the APD are set to the defaults given previously. This applies to input parameters, but not to input/output or output parameters.
A call to SQLSetDescRec sets the interval leading precision to the default, but sets the interval seconds precision (in the SQL_DESC_PRECISION field) to the value of its Precision argument.
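The sketch below illustrates this behavior on an APD record; the handle name, record number, and the seconds precision of 3 are illustrative.

```cpp
SQLHDESC hApd;
SQL_INTERVAL_STRUCT paramValue;
SQLLEN paramInd = 0;

/* Retrieve the APD that is implicitly allocated for the statement. */
SQLGetStmtAttr(hstmt, SQL_ATTR_APP_PARAM_DESC, &hApd, 0, NULL);

/* Type/SubType set SQL_DESC_TYPE and SQL_DESC_DATETIME_INTERVAL_CODE.
   The Precision argument (3 here) is written to SQL_DESC_PRECISION, while
   SQL_DESC_DATETIME_INTERVAL_PRECISION keeps its default of 2. */
SQLSetDescRec(hApd, 1,
              SQL_INTERVAL, SQL_CODE_DAY_TO_SECOND,
              sizeof(paramValue),
              3,                 /* Precision: interval seconds precision */
              0,                 /* Scale: not used for this type */
              &paramValue, &paramInd, &paramInd);
```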
If either of the defaults given previously is not acceptable to an application, the application should set the SQL_DESC_PRECISION or SQL_DESC_DATETIME_INTERVAL_PRECISION field by calling SQLSetDescField.
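A sketch of that call sequence, continuing with the same assumed statement handle and record number: the application retrieves the ARD with SQLGetStmtAttr and then overrides one or both defaults (the precision values 3 and 5 are illustrative).

```cpp
SQLHDESC hArd;

/* Retrieve the ARD that is implicitly allocated for the statement. */
SQLGetStmtAttr(hstmt, SQL_ATTR_APP_ROW_DESC, &hArd, 0, NULL);

/* Override the default interval seconds precision for record (column) 1. */
SQLSetDescField(hArd, 1, SQL_DESC_PRECISION,
                (SQLPOINTER)3, SQL_IS_SMALLINT);

/* Override the default interval leading field precision (normally 2). */
SQLSetDescField(hArd, 1, SQL_DESC_DATETIME_INTERVAL_PRECISION,
                (SQLPOINTER)5, SQL_IS_INTEGER);
```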
If the application calls SQLGetData to return data into a datetime or interval C type, the default interval leading precision and interval seconds precision are used. If either default is not acceptable, the application must set the corresponding descriptor field by calling SQLSetDescField, or set SQL_DESC_PRECISION by calling SQLSetDescRec. The call to SQLGetData should then specify a TargetType of SQL_ARD_TYPE so that the values in the descriptor fields are used.
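A sketch of that pattern follows; the column number, the day-to-second interval type, and the seconds precision of 3 are illustrative.

```cpp
SQLHDESC hArd;
SQL_INTERVAL_STRUCT interval;
SQLLEN indicator;

SQLGetStmtAttr(hstmt, SQL_ATTR_APP_ROW_DESC, &hArd, 0, NULL);

/* Set the concise type first; setting the type resets the precision fields
   to their defaults, so the precision override must come afterward. */
SQLSetDescField(hArd, 1, SQL_DESC_CONCISE_TYPE,
                (SQLPOINTER)SQL_C_INTERVAL_DAY_TO_SECOND, SQL_IS_SMALLINT);
SQLSetDescField(hArd, 1, SQL_DESC_PRECISION,
                (SQLPOINTER)3, SQL_IS_SMALLINT);

/* SQL_ARD_TYPE makes SQLGetData take the C type and precisions from the
   ARD record instead of using the defaults. */
SQLGetData(hstmt, 1, SQL_ARD_TYPE, &interval, sizeof(interval), &indicator);
```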
When SQLPutData is called, the interval leading precision and interval seconds precision are read from the fields of the descriptor record that corresponds to the data-at-execution parameter or column. These are APD fields for calls to SQLExecute or SQLExecDirect, and ARD fields for calls to SQLBulkOperations or SQLSetPos.
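The following sketch puts these pieces together for a data-at-execution input parameter executed with SQLExecDirect; the SQL statement, table name, and parameter number are purely illustrative. The precisions applied to the data passed to SQLPutData come from APD record 1, which holds the defaults described earlier unless the application has overridden them.

```cpp
SQL_INTERVAL_STRUCT interval;
SQLLEN lenOrInd = SQL_DATA_AT_EXEC;     /* mark as data-at-execution */
SQLPOINTER token;

/* Binding the parameter sets the interval C type in APD record 1; the
   precision fields in that record receive the defaults for an input
   parameter. */
SQLBindParameter(hstmt, 1, SQL_PARAM_INPUT,
                 SQL_C_INTERVAL_DAY_TO_SECOND, SQL_INTERVAL_DAY_TO_SECOND,
                 2,                     /* ColumnSize: leading precision */
                 0,                     /* DecimalDigits: seconds precision */
                 (SQLPOINTER)1,         /* application-defined token */
                 0, &lenOrInd);

if (SQLExecDirect(hstmt, (SQLCHAR *)"INSERT INTO t (c) VALUES (?)", SQL_NTS)
        == SQL_NEED_DATA) {
    /* SQLParamData returns SQL_NEED_DATA once per data-at-execution
       parameter; SQLPutData supplies the value, and the precisions applied
       to it are read from the corresponding APD record. */
    while (SQLParamData(hstmt, &token) == SQL_NEED_DATA) {
        /* ... populate interval ... */
        SQLPutData(hstmt, &interval, sizeof(interval));
    }
}
```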