Well, creating DirectShow filters can be a real pain at times, especially source filters. When I needed a source filter to generate my own images, I looked at the Ball source filter in the DirectShow documentation, but it was of no use to me because I needed to generate images with my own painting code. So, here it is.
This filter lets you create your own DC and do all your painting in that DC. Everything painted on the DC is delivered to the downstream filters, until finally a renderer filter renders the RGB data.
The filter is based on
CSourceStream to create the output pin and deliver the data, and
CSource to create the source filter.
I am keeping this article really short, as anyone who needs to create such a filter will already know most of the DirectShow architecture and filters.
The only new things in this filter are: creating the media type for RGB, creating the DC when the stream thread is created, and filling the media sample with a fresh buffer taken from the DC.
Create Media Type
This source filter generates 24-bit RGB video. To create the media type, GetMediaType of
CSourceStream is overridden. The filter graph calls this function while connecting the filters so that the connecting pins can agree on a media type (this is one of several steps in the connection, alongside agreement on allocators, on the media type, and the output pin's final acceptance of the connection). This function receives the
CMediaType pointer on the output pin of the source filter, supplied by the input pin of the downstream filter. The output pin fills in the information about the media type it is going to generate.
HRESULT CSnakeStream::GetMediaType(CMediaType *pMediaType)
{
    // Allocate and zero the VIDEOINFO format block on the media type.
    VIDEOINFO *pvi = (VIDEOINFO *) pMediaType->AllocFormatBuffer(sizeof(VIDEOINFO));
    if (NULL == pvi)
        return E_OUTOFMEMORY;
    ZeroMemory(pvi, sizeof(VIDEOINFO));

    pvi->bmiHeader.biCompression = BI_RGB;
    pvi->bmiHeader.biBitCount = 24;
    pvi->bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    pvi->bmiHeader.biWidth = 320;
    pvi->bmiHeader.biHeight = 240;
    pvi->bmiHeader.biPlanes = 1;
    pvi->bmiHeader.biSizeImage = GetBitmapSize(&pvi->bmiHeader);
    pvi->bmiHeader.biClrImportant = 0;

    pMediaType->SetType(&MEDIATYPE_Video);
    pMediaType->SetFormatType(&FORMAT_VideoInfo);
    const GUID SubTypeGUID = GetBitmapSubtype(&pvi->bmiHeader);
    pMediaType->SetSubtype(&SubTypeGUID);
    pMediaType->SetSampleSize(pvi->bmiHeader.biSizeImage);

    // Keep a copy of the header for creating the DIB section later.
    m_bmpInfo.bmiHeader = pvi->bmiHeader;
    return NOERROR;
}
The Stream thread
OnThreadCreate() of the source stream runs when the filter graph goes into the pause state: the base class creates the streaming thread, which calls OnThreadCreate() once and then repeatedly calls
FillBuffer to fill in the data.
We override this function to do any initialization. In this sample filter, I draw a snake (of sorts) that crawls from left to right across the middle of the screen. In this function, I initialize the height and width of the snake and the number of blocks in the snake's body.
The painting magic
Well, all the magic of painting into a DC is really nothing but creating a DIB section and selecting it into the DC used for painting. From this DIB section, we pick up the data and fill the media sample. All this is set up in the
OnThreadCreate() override of
CSourceStream. The whole magic is as follows:
HRESULT CSnakeStream::OnThreadCreate()
{
    // Create a DIB section to paint on; m_pPaintBuffer receives the pixel bits.
    HBITMAP hDibSection = CreateDIBSection(NULL,
        (BITMAPINFO *) &m_bmpInfo, DIB_RGB_COLORS,
        &m_pPaintBuffer, NULL, 0);

    // Select it into an off-screen DC compatible with the display.
    HDC hDC = GetDC(NULL);
    m_dcPaint = CreateCompatibleDC(hDC);
    ReleaseDC(NULL, hDC);
    HGDIOBJ OldObject = SelectObject(m_dcPaint, hDibSection);

    // Initialise the snake geometry.
    m_nScoreBoardHeight = m_bmpInfo.bmiHeader.biHeight/8;
    m_nSnakeBlockHeight = 4; m_nSnakeBlockWidth = 6;
    m_nNumberSnakeBlocks = 6;
    m_nSpaceBetweenBlock = 1;
    m_nLastX = 0; m_nLastY = 0;
    return NOERROR;
}
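The article doesn't show the matching cleanup. A minimal sketch of what an OnThreadDestroy override might do, assuming the member names above plus hypothetical members holding the saved GDI handles:

```cpp
// Hypothetical cleanup mirroring OnThreadCreate. m_hOldObject and
// m_hDibSection are assumed members storing the handles returned by
// SelectObject and CreateDIBSection; the original sample may do this elsewhere.
HRESULT CSnakeStream::OnThreadDestroy()
{
    if (m_dcPaint)
    {
        SelectObject(m_dcPaint, m_hOldObject);  // restore the original bitmap
        DeleteDC(m_dcPaint);
        m_dcPaint = NULL;
    }
    if (m_hDibSection)
    {
        DeleteObject(m_hDibSection);            // frees the DIB-section pixels too
        m_hDibSection = NULL;
    }
    return NOERROR;
}
```

Deleting the DIB section also invalidates m_pPaintBuffer, so nothing should touch it after this point.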
Send the picture downstream
This is straightforward. Once we have created an off-screen DC and selected a DIB section into it, all we need to do is whatever painting we want in the DC, and then copy the DIB section buffer into the media sample buffer.
All this is done in the
FillBuffer override of
CSourceStream.
FillBuffer is called repeatedly by the thread created for the stream in the pause state (the same thread that ran
OnThreadCreate()) to produce each new frame for the downstream filters. This filter runs at 26 frames per second.
The code for filling the media sample looks like this:
HRESULT CSnakeStream::FillBuffer(IMediaSample *pSample)
{
    BYTE *pBuffer = NULL;
    HRESULT hr = pSample->GetPointer(&pBuffer);
    long lSize = pSample->GetSize();

    // Advance and draw the snake (throttled by the frame counter).
    if( nFrameRate++ > 150 )
    {
        RECT rcSnake;
        rcSnake.left = m_nLastX + (m_bmpInfo.bmiHeader.biWidth/2);
        rcSnake.top = m_nLastY + (m_bmpInfo.bmiHeader.biHeight/2);
        rcSnake.right = rcSnake.left + m_nSnakeBlockWidth;
        rcSnake.bottom = rcSnake.top + m_nSnakeBlockHeight;
        m_nLastX += m_nSnakeBlockWidth;

        // Wrap round once the whole snake has walked off the right edge.
        if( m_nLastX + m_nSnakeBlockWidth*m_nNumberSnakeBlocks > m_bmpInfo.bmiHeader.biWidth )
            m_nLastX = -m_bmpInfo.bmiHeader.biWidth/2;

        TRACE("%d %d %d %d\n", rcSnake.left, rcSnake.top,
              rcSnake.right, rcSnake.bottom);
    }

    // Timestamp frame N at 26 fps (REFERENCE_TIME is in 100-ns units, UNITS per second).
    CRefTime m_rtStart; CRefTime m_rtStop;
    m_rtStart = m_llFrameCount*UNITS/26;
    m_rtStop = (m_llFrameCount+1)*UNITS/26;

    // Paint the running timecode onto the frame.
    TCHAR szText[64];
    wsprintf( szText, TEXT("%s"), TimeToTimecode(m_rtStart) );
    if( !TextOut( m_dcPaint, m_bmpInfo.bmiHeader.biWidth/2-50, 2,
                  szText, _tcslen( szText ) ) )
        return E_FAIL;
    // ... finally the DIB-section bits are copied into pBuffer.
}
In the code above, we draw the snake only about once a second, but we update the frame counter display at every frame.
CopyMemory copies the buffer from the DIB section into the output media sample.
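The end of FillBuffer might look like this: stamp the sample and copy the DIB bits over. This is a sketch; pBuffer and lSize come from the snippet above, and the exact order of calls in the original sample may differ:

```cpp
// End of FillBuffer (sketch): timestamp the sample and hand the pixels over.
// m_pPaintBuffer is the bits pointer that CreateDIBSection filled in.
REFERENCE_TIME rtStart = m_rtStart;
REFERENCE_TIME rtStop  = m_rtStop;
pSample->SetTime(&rtStart, &rtStop);
pSample->SetSyncPoint(TRUE);   // every uncompressed RGB frame is a key frame

// Never copy more than either buffer can hold.
long lDataSize = min(lSize, (long) m_bmpInfo.bmiHeader.biSizeImage);
CopyMemory(pBuffer, m_pPaintBuffer, lDataSize);
pSample->SetActualDataLength(lDataSize);

m_llFrameCount++;
return NOERROR;
```

SetSyncPoint matters if a muxer or encoder sits downstream; for a plain video renderer it is harmless.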
Well, please go through the code carefully before using it or taking any inspiration from it. There is not much code in the sample, and I have not put in many checks.
How to use it
This is one of the good things about DirectShow: there is no pain in using a filter (if it's working correctly). Just add the snake source filter from the list of DirectShow filters in GraphEdit, render the output pin, and it's done.
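If you would rather build the graph from code than in GraphEdit, a sketch might look like this. CLSID_SnakeSource is hypothetical: use whatever CLSID the filter registers, and note that all error handling and Release calls are omitted for brevity:

```cpp
#include <dshow.h>

// CLSID_SnakeSource is a placeholder for the CLSID this filter registers.
IGraphBuilder *pGraph  = NULL;
IBaseFilter   *pSource = NULL;

CoInitialize(NULL);
CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                 IID_IGraphBuilder, (void **) &pGraph);
CoCreateInstance(CLSID_SnakeSource, NULL, CLSCTX_INPROC_SERVER,
                 IID_IBaseFilter, (void **) &pSource);
pGraph->AddFilter(pSource, L"Snake Source");

// Grab the filter's only output pin and let the graph build the rest.
IEnumPins *pEnum = NULL;
IPin *pPin = NULL;
pSource->EnumPins(&pEnum);
pEnum->Next(1, &pPin, NULL);
pGraph->Render(pPin);   // inserts a colour converter/renderer as needed

IMediaControl *pControl = NULL;
pGraph->QueryInterface(IID_IMediaControl, (void **) &pControl);
pControl->Run();
```

Render does the same intelligent-connect work that "Render the output pin" does in GraphEdit.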
More to do in it
Well, as you can see above, it looks very simple to create the filter, but it is very, very basic. In a practical scenario, you may need to support seeking on the filter. You may also need to generate audio samples as well. Then you may need to support multiple time formats for seeking, and keep the generated audio and video samples in sync.
Looking at the new Ball sample in the DirectShow documentation, which supports
IMediaSeeking, doesn't solve all the problems of multiple time formats and sync between audio and video.
Apart from all this, most source filters read from a file and can be push- or pull-based, i.e., they implement
IFileSourceFilter. That can again be another complicated implementation.
I will try to post another article on how to deal with the issues mentioned above. Meanwhile, enjoy coding... cheers.