Setting and Obtaining Video Capture Format

The video capture format globally defines the attributes of the images transferred from the frame buffer with the video in channel. Attributes include image dimensions, color depth, and the compression format of images transferred. Applications use the DVM_FORMAT message to set or retrieve the format of the digitized image.

The calling application includes flags with this message to indicate its purpose. Your driver must examine the flags sent with the message to determine the proper response. The flags are specified in lParam1.

The VIDEO_CONFIGURE_GET or VIDEO_CONFIGURE_SET flag indicates whether the DVM_FORMAT message is being used to obtain or to set the format. The DVM_FORMAT message and these flags are sent to your driver when it is opened and when it is configured with DVM_DIALOG.
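
For example, a driver's DriverProc might dispatch on these flags as sketched below. This is a minimal fragment, not a complete handler: the MyGetFormat and MySetFormat helpers are hypothetical names, and lParam2 carries a pointer to the VIDEOCONFIGPARMS structure described later in this section.

// Fragment of a video capture driver's DriverProc: dispatch DVM_FORMAT
// according to the VIDEO_CONFIGURE_* flags passed in lParam1.
case DVM_FORMAT:
{
    DWORD dwFlags = (DWORD)lParam1;
    VIDEOCONFIGPARMS FAR *lpcp = (VIDEOCONFIGPARMS FAR *)lParam2;

    if (dwFlags & VIDEO_CONFIGURE_GET)
        return MyGetFormat(dwFlags, lpcp);      // hypothetical helper
    if (dwFlags & VIDEO_CONFIGURE_SET)
        return MySetFormat(dwFlags, lpcp);      // hypothetical helper

    return DV_ERR_NOTSUPPORTED;                 // neither GET nor SET requested
}

Returning DV_ERR_NOTSUPPORTED for an unrecognized flag combination is one reasonable choice; substitute whichever error constant your driver uses for invalid flags.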

When an application opens your driver, it retrieves the initial driver format. (Video capture drivers initially default to a format that efficiently uses the capabilities of the video capture hardware or, if they have been previously configured, they restore the last user-specified configuration saved in a disk file.) If this format is acceptable, the application continues its operations. If the format is not acceptable, the application either immediately closes your driver or suggests a very limited format. If the limited format is not acceptable to your driver, the application closes it. (Typically, applications reject a format because they cannot allocate enough memory to capture video; a limited format might free enough memory for operation.)
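
From the application's side, this open-time exchange might look roughly like the following sketch. It assumes the videoOpen, videoConfigure, and videoClose entry points declared in MSVIDEO.H, a driver whose format fits in a plain BITMAPINFOHEADER, and a hypothetical FormatIsUsable check; a real application would test every return code.

HVIDEO           hVideo = NULL;
DWORD            dwSize = 0;
BITMAPINFOHEADER bmih;

// Open the VIDEO_IN channel of capture device 0.
if (videoOpen(&hVideo, 0, VIDEO_IN) != DV_ERR_OK)
    return;

// Ask how large the format description is, then retrieve it.
// (A format larger than a plain BITMAPINFOHEADER needs a larger buffer.)
videoConfigure(hVideo, DVM_FORMAT,
               VIDEO_CONFIGURE_GET | VIDEO_CONFIGURE_QUERYSIZE,
               &dwSize, NULL, 0, NULL, 0);
videoConfigure(hVideo, DVM_FORMAT, VIDEO_CONFIGURE_GET,
               NULL, &bmih, sizeof(bmih), NULL, 0);

// If the dimensions or depth demand more memory than the application can
// buffer, it can suggest a smaller format with VIDEO_CONFIGURE_SET or
// simply close the driver.
if (!FormatIsUsable(&bmih))
    videoClose(hVideo);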

Applications also get the format when the user changes it. (Users change the format with the VIDEO_IN channel dialog box displayed with the DVM_DIALOG message.) In this case, applications get and retain a copy of the current format before sending the DVM_DIALOG message. After the user exits the dialog box, applications get the new format from your driver. If the application accepts the new format, it uses the VIDEO_CONFIGURE_SET flag to send the format back to your driver. (Your driver should verify that the application has not changed the format information.) If the application does not accept the new format, it restores the format it obtained before displaying the dialog box.
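
That sequence might look like the following from the application's point of view, again assuming the videoConfigure and videoDialog entry points, an open VIDEO_IN channel handle hVideoIn, a parent window hwndParent, and a hypothetical CanCaptureWith check.

BITMAPINFOHEADER bmihOld, bmihNew;

// Save the current format before displaying the VIDEO_IN dialog box.
videoConfigure(hVideoIn, DVM_FORMAT, VIDEO_CONFIGURE_GET,
               NULL, &bmihOld, sizeof(bmihOld), NULL, 0);

// videoDialog sends DVM_DIALOG to the channel; the user may change the format.
videoDialog(hVideoIn, hwndParent, 0);

// Retrieve whatever the user selected.
videoConfigure(hVideoIn, DVM_FORMAT, VIDEO_CONFIGURE_GET,
               NULL, &bmihNew, sizeof(bmihNew), NULL, 0);

if (CanCaptureWith(&bmihNew))
    // Accept: send the unmodified format back to the driver.
    videoConfigure(hVideoIn, DVM_FORMAT, VIDEO_CONFIGURE_SET,
                   NULL, &bmihNew, sizeof(bmihNew), NULL, 0);
else
    // Reject: restore the format saved before the dialog box was displayed.
    videoConfigure(hVideoIn, DVM_FORMAT, VIDEO_CONFIGURE_SET,
                   NULL, &bmihOld, sizeof(bmihOld), NULL, 0);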

The DVM_FORMAT message uses lParam2 to pass the format information. This parameter contains a pointer to a VIDEOCONFIGPARMS structure. This structure has the following members:

typedef struct tag_video_configure_parms {
    LPDWORD  lpdwReturn;   // Points to a DWORD used for return information,
                           //   such as the size reported for
                           //   VIDEO_CONFIGURE_QUERYSIZE
    LPVOID   lpData1;      // Points to the first configuration data block
    DWORD    dwSize1;      // Size, in bytes, of the buffer at lpData1
    LPVOID   lpData2;      // Points to the second configuration data block
    DWORD    dwSize2;      // Size, in bytes, of the buffer at lpData2
} VIDEOCONFIGPARMS;

The lpData1 member points to a BITMAPINFOHEADER data structure. The size of this structure is specified in dwSize1.
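
A driver that reports its format as a plain BITMAPINFOHEADER might implement the two helpers from the earlier dispatch sketch along these lines. The g_bmihCurrent variable holding the current format is illustrative, and the equality checks in MySetFormat are a simplified stand-in for whatever validation your hardware requires.

// Current capture format maintained by the driver (illustrative).
static BITMAPINFOHEADER g_bmihCurrent;

// VIDEO_CONFIGURE_GET: copy the current format into the caller's buffer.
// (Handling of the QUERY and QUERYSIZE flags in dwFlags is shown later.)
DWORD MyGetFormat(DWORD dwFlags, VIDEOCONFIGPARMS FAR *lpcp)
{
    if (lpcp->lpData1 == NULL || lpcp->dwSize1 < sizeof(BITMAPINFOHEADER))
        return DV_ERR_BADFORMAT;        // buffer missing or too small
    *(BITMAPINFOHEADER FAR *)lpcp->lpData1 = g_bmihCurrent;
    return DV_ERR_OK;
}

// VIDEO_CONFIGURE_SET: accept only formats the hardware supports; in the
// DVM_DIALOG sequence described above, that includes rejecting a header
// the application has altered.
DWORD MySetFormat(DWORD dwFlags, VIDEOCONFIGPARMS FAR *lpcp)
{
    BITMAPINFOHEADER FAR *lpbi = (BITMAPINFOHEADER FAR *)lpcp->lpData1;

    if (lpbi == NULL || lpcp->dwSize1 < sizeof(BITMAPINFOHEADER))
        return DV_ERR_BADFORMAT;
    if (lpbi->biWidth       != g_bmihCurrent.biWidth    ||
        lpbi->biHeight      != g_bmihCurrent.biHeight   ||
        lpbi->biBitCount    != g_bmihCurrent.biBitCount ||
        lpbi->biCompression != g_bmihCurrent.biCompression)
        return DV_ERR_BADFORMAT;        // altered or unsupported format
    return DV_ERR_OK;
}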

Changing the format can affect overall dimensions of the active frame buffer as well as bit depth and color space representation. Since changing between NTSC and PAL video standards can also affect image dimensions, applications should request the current format following display of the EXTERNAL_IN channel dialog box.

If an application only wants to know whether your driver supports DVM_FORMAT, it sends the VIDEO_CONFIGURE_QUERY flag with the message. (Using the VIDEO_CONFIGURE_QUERY flag without VIDEO_CONFIGURE_GET or VIDEO_CONFIGURE_SET is invalid.) Your device driver should return DV_ERR_OK if it supports the message. Otherwise, it should return DV_ERR_NOTSUPPORTED.
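
Continuing the DriverProc fragment shown earlier, this check can sit at the top of the DVM_FORMAT case, before any GET or SET work is done; the error code returned for the invalid combination is an illustrative choice.

// VIDEO_CONFIGURE_QUERY must be combined with GET or SET. If the
// combination is valid, report support without changing anything.
if (dwFlags & VIDEO_CONFIGURE_QUERY)
{
    if (dwFlags & (VIDEO_CONFIGURE_GET | VIDEO_CONFIGURE_SET))
        return DV_ERR_OK;           // DVM_FORMAT is supported
    return DV_ERR_NOTSUPPORTED;     // QUERY alone is invalid usage
}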

If an application wants to determine the amount of memory it needs to allocate for the format information, it sends the DVM_FORMAT message with the VIDEO_CONFIGURE_GET and VIDEO_CONFIGURE_QUERYSIZE flags set. Your driver should specify the format size in the lpdwReturn member of the VIDEOCONFIGPARMS structure.
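
For a driver whose format description is a plain BITMAPINFOHEADER with no appended palette or compression data, that can be handled at the top of the MyGetFormat helper sketched earlier; drivers that return additional data after the header should report a correspondingly larger size.

// VIDEO_CONFIGURE_QUERYSIZE (combined with VIDEO_CONFIGURE_GET): report
// how many bytes the caller must allocate for the format description.
if (dwFlags & VIDEO_CONFIGURE_QUERYSIZE)
{
    *lpcp->lpdwReturn = sizeof(BITMAPINFOHEADER);
    return DV_ERR_OK;
}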