Microsoft DirectX 8.1 (C++)
The GetStreamCaps method retrieves a set of format capabilities.
Syntax
HRESULT GetStreamCaps(
    int iIndex,
    AM_MEDIA_TYPE **pmt,
    BYTE *pSCC
);
Parameters
iIndex
[in] Specifies the format capability to retrieve, indexed from zero. To determine the number of capabilities that the pin supports, call the IAMStreamConfig::GetNumberOfCapabilities method.
pmt
[out] Address of a pointer to an AM_MEDIA_TYPE structure. The method allocates the structure and fills it with the media type for the format with the specified index.
pSCC
[out] Pointer to a byte array allocated by the caller. For a video pin, this parameter receives a VIDEO_STREAM_CONFIG_CAPS structure. For an audio pin, it receives an AUDIO_STREAM_CONFIG_CAPS structure. To determine the required size, call the GetNumberOfCapabilities method.
Return Values
Returns an HRESULT value. Possible values include the following.
Return code | Description
S_FALSE | Specified index is too high.
S_OK | Success.
E_INVALIDARG | Invalid index.
E_OUTOFMEMORY | Insufficient memory.
E_POINTER | NULL pointer argument.
VFW_E_NOT_CONNECTED | The input pin is not connected.
Remarks
This method returns two pieces of information:
A media type, returned in the pmt parameter.
A set of capabilities that describe how the media type can be modified, returned in the pSCC parameter.
To configure the output pin so that it uses this format, call the IAMStreamConfig::SetFormat method and pass in the value of pmt.
Before calling SetFormat, you can modify the AM_MEDIA_TYPE structure in pmt, using the information in pSCC. For example, an audio pin might return a default media type of 44-kHz, 16-bit stereo in the pmt parameter. Based on the values returned in the AUDIO_STREAM_CONFIG_CAPS structure, you might change this format to 8-bit mono before calling SetFormat.
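The audio scenario described above can be sketched as follows. This is an illustrative fragment, not verbatim SDK sample code; pAudioConfig is an assumed IAMStreamConfig pointer on an audio output pin, and the fragment only attempts the change when the AUDIO_STREAM_CONFIG_CAPS ranges permit 8-bit mono.

```cpp
// Sketch: request 8-bit mono, if the capabilities allow it.
AM_MEDIA_TYPE *pmt = NULL;
AUDIO_STREAM_CONFIG_CAPS ascc;
HRESULT hr = pAudioConfig->GetStreamCaps(0, &pmt,
    reinterpret_cast<BYTE*>(&ascc));
if (hr == S_OK)
{
    if (pmt->formattype == FORMAT_WaveFormatEx &&
        ascc.MinimumChannels <= 1 &&
        ascc.MinimumBitsPerSample <= 8)
    {
        WAVEFORMATEX *pWfx =
            reinterpret_cast<WAVEFORMATEX*>(pmt->pbFormat);
        pWfx->nChannels = 1;
        pWfx->wBitsPerSample = 8;
        // Keep the derived fields consistent with the new format.
        pWfx->nBlockAlign = pWfx->nChannels * (pWfx->wBitsPerSample / 8);
        pWfx->nAvgBytesPerSec = pWfx->nSamplesPerSec * pWfx->nBlockAlign;
        hr = pAudioConfig->SetFormat(pmt);
    }
    DeleteMediaType(pmt);
}
```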
The method allocates the memory for the AM_MEDIA_TYPE structure that is returned in the pmt parameter. The caller must release the memory, including the format block. You can use the DeleteMediaType helper function in the base class library. The caller must allocate the memory for the pSCC parameter.
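The allocation rules above suggest the following enumeration pattern: the caller allocates the capabilities buffer once, and releases each returned media type with DeleteMediaType. This is an illustrative sketch; pConfig is an assumed IAMStreamConfig pointer.

```cpp
// Sketch: enumerate every format capability on the pin.
int iCount = 0, iSize = 0;
HRESULT hr = pConfig->GetNumberOfCapabilities(&iCount, &iSize);
if (SUCCEEDED(hr))
{
    BYTE *pSCC = new BYTE[iSize];  // Caller allocates the capabilities buffer.
    for (int i = 0; i < iCount; i++)
    {
        AM_MEDIA_TYPE *pmt = NULL;
        hr = pConfig->GetStreamCaps(i, &pmt, pSCC);
        if (hr == S_OK)
        {
            // Examine pmt and the capabilities structure in pSCC here.
            DeleteMediaType(pmt);  // Frees the structure and its format block.
        }
    }
    delete [] pSCC;
}
```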
On some compression filters, this method fails if the filter's input pin is not connected.
Filter Developers: For more information on implementing this method, see Exposing Capture and Compression Formats.
Example Code
The following example retrieves the first supported format (index zero) on a video output pin. Then it configures the pin to use the smallest output size for that format. To do so, it modifies the format block's BITMAPINFOHEADER structure with the new width and height, and sets the lSampleSize member of the AM_MEDIA_TYPE structure to the new sample size. The sample size depends partly on the bit depth, which can be determined from the media subtype. (For brevity, this example shows only the case for UYVY format. Also, it assumes that the caller already knows the pin produces video output.)
// Assumes pConfig is a valid IAMStreamConfig pointer on a video output pin.
int iCount = 0, iSize = 0;
VIDEO_STREAM_CONFIG_CAPS scc;
AM_MEDIA_TYPE *pmt = NULL;
HRESULT hr = pConfig->GetNumberOfCapabilities(&iCount, &iSize);
ASSERT(sizeof(scc) <= iSize); // Make sure the structure is big enough.
// Get the first format.
hr = pConfig->GetStreamCaps(0, &pmt, reinterpret_cast<BYTE*>(&scc));
if (hr == S_OK)
{
// Is it VIDEOINFOHEADER and UYVY?
if (pmt->formattype == FORMAT_VideoInfo &&
pmt->subtype == MEDIASUBTYPE_UYVY)
{
// Find the smallest output size.
LONG width = scc.MinOutputSize.cx;
LONG height = scc.MinOutputSize.cy;
LONG cbPixel = 2; // Bytes per pixel in UYVY
// Modify the format block.
VIDEOINFOHEADER *pVih =
reinterpret_cast<VIDEOINFOHEADER*>(pmt->pbFormat);
pVih->bmiHeader.biWidth = width;
pVih->bmiHeader.biHeight = height;
// Set the sample size and image size.
// (Be sure to round the image width up to a multiple of four.)
pmt->lSampleSize = pVih->bmiHeader.biSizeImage =
((width + 3) & ~3) * height * cbPixel;
// Now set the format.
hr = pConfig->SetFormat(pmt);
if (FAILED(hr))
{
MessageBox(NULL, TEXT("SetFormat() Failed!\n"), NULL, MB_OK);
}
DeleteMediaType(pmt);
}
}
See Also