For the video compression and decompression interface, quality is indicated by an integer ranging from 0 to 10,000. A quality level of 7,500 typically indicates an acceptable image quality. A quality level of 0 typically indicates a very low quality level (possibly even a totally black image). As the quality level drops from an acceptable level toward low quality, the image might lose color as the colors in the color table are merged, or as the color resolution of each pixel decreases. If your driver supports temporal compression (that is, it needs information from the previous frame to decompress the current frame), the low and high ends of the quality range might indicate how much this type of compression is allowed to degrade image quality. For example, your driver might limit the compression of a high-quality image to preserve sharp detail and color fidelity. Conversely, your driver might sacrifice these qualities to produce highly compressed output files.
If your driver supports quality values, it maps them to the internal settings used by its compression algorithms. Thus, the definition of image quality will vary from driver to driver and, quite possibly, from compression algorithm to compression algorithm. Even though the values are not standardized across drivers, your driver should support as many individual values as possible.
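As a rough illustration only, a driver that tunes a quantizer from the quality level might map the range with something like the following sketch; the QualityToQuantizer function and MAX_QUANT constant are hypothetical, and a real driver maps quality to whatever parameters its own algorithm exposes.

```c
#include <windows.h>

/* Hypothetical mapping from the 0-10,000 interface quality range to an
   internal quantizer step, where a smaller step means higher fidelity.
   MAX_QUANT and the linear mapping are illustrative only. */
#define MAX_QUANT 31

static UINT QualityToQuantizer(DWORD dwQuality)
{
    if (dwQuality > 10000)
        dwQuality = 10000;          /* clamp out-of-range requests */

    /* 10,000 maps to the finest step (1); 0 maps to the coarsest (MAX_QUANT). */
    return (UINT)(MAX_QUANT - (dwQuality * (MAX_QUANT - 1)) / 10000);
}
```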
The client application obtains the compression-quality capabilities of your driver with the ICM_GETDEFAULTQUALITY and ICM_GETQUALITY messages. If your driver supports quality levels, it should respond to the ICM_GETDEFAULTQUALITY message by returning a value between 0 and 10,000 that corresponds to a good default quality level for your compressor. Your driver should return the current quality level in response to the ICM_GETQUALITY message.
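A sketch of how a driver might service these two query messages follows. It assumes that lParam1 holds the address of a DWORD that receives the quality value; the INSTINFO structure and the MY_DEFAULT_QUALITY constant are hypothetical.

```c
#include <vfw.h>

/* Hypothetical per-instance state; a real driver tracks whatever its
   compression code needs. */
typedef struct {
    DWORD dwCurrentQuality;     /* last value accepted from ICM_SETQUALITY */
} INSTINFO;

#define MY_DEFAULT_QUALITY 7500 /* illustrative "good default" for this codec */

/* Called from DriverProc for ICM_GETDEFAULTQUALITY and ICM_GETQUALITY.
   lParam1 is assumed to be the address of the DWORD that receives the
   quality value. */
static LRESULT HandleQualityQuery(INSTINFO *pinst, UINT msg, LPARAM lParam1)
{
    if (lParam1 == 0)
        return ICERR_BADPARAM;

    if (msg == ICM_GETDEFAULTQUALITY)
        *(LPDWORD)lParam1 = MY_DEFAULT_QUALITY;
    else                                        /* ICM_GETQUALITY */
        *(LPDWORD)lParam1 = pinst->dwCurrentQuality;

    return ICERR_OK;
}
```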
The client application sends the ICM_SETQUALITY message to set the quality level of your driver. Your driver should pass the quality value directly to its compression routine.
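Continuing the sketch above, the ICM_SETQUALITY handling might validate the range, remember the value, and forward it to the compression code. The SetEncoderQuality helper is a placeholder, and how the new value is unpacked from the message parameters should follow the ICM_SETQUALITY reference for your DDK version.

```c
/* Hypothetical handler invoked from DriverProc after the new quality value
   has been extracted from the ICM_SETQUALITY message parameters. */
static LRESULT HandleSetQuality(INSTINFO *pinst, DWORD dwNewQuality)
{
    if (dwNewQuality > 10000)
        return ICERR_BADPARAM;      /* outside the 0-10,000 interface range */

    pinst->dwCurrentQuality = dwNewQuality;

    /* Pass the value straight through to the compression code, which applies
       its own interpretation (for example, the quantizer mapping sketched
       earlier). SetEncoderQuality is a placeholder for that routine. */
    SetEncoderQuality(pinst, dwNewQuality);

    return ICERR_OK;
}
```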
If your driver supports quality levels, it should set the VIDCF_QUALITY flag when it responds to the ICM_GETINFO message.
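For example, an ICM_GETINFO handler might fill the ICINFO structure along the following lines; the FOURCC, names, version number, and the presence of VIDCF_TEMPORAL are placeholders for what your driver actually supports.

```c
#include <vfw.h>

/* Sketch of an ICM_GETINFO handler that advertises quality support by
   setting VIDCF_QUALITY in the ICINFO flags. */
static LRESULT HandleGetInfo(ICINFO *picinfo, DWORD cbSize)
{
    if (picinfo == NULL || cbSize < sizeof(ICINFO))
        return 0;                                   /* nothing copied */

    ZeroMemory(picinfo, sizeof(ICINFO));
    picinfo->dwSize       = sizeof(ICINFO);
    picinfo->fccType      = ICTYPE_VIDEO;                 /* 'vidc' */
    picinfo->fccHandler   = mmioFOURCC('S','A','M','P');  /* placeholder FOURCC */
    picinfo->dwFlags      = VIDCF_QUALITY                 /* supports quality levels */
                          | VIDCF_TEMPORAL;               /* only if temporal compression is supported */
    picinfo->dwVersion    = 0x00010000;                   /* driver-defined version */
    picinfo->dwVersionICM = ICVERSION;
    lstrcpyW(picinfo->szName, L"Sample");                 /* placeholder names */
    lstrcpyW(picinfo->szDescription, L"Sample video compressor");

    return sizeof(ICINFO);                          /* bytes written to ICINFO */
}
```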