YCbCr levels and the Video Mixing Renderer 9
Blight mentions:
However, in VMR7/9 mode when not using the overlay and using a YUV output mode, both ATI and NVIDIA seem to output a luma range of 16-235, which is a pain considering the monitor works in rgb where the luma range is 0-255, requiring either a software conversion to 0-255, or to specifically adjust the hardware color controls to compensate.
The VMR7/9 being referred to are the DirectShow Video Mixing Renderers, the filters that render the final video to the screen in a player like Windows Media Player. YCbCr data, which is frequently 8 bits per channel, nominally has a range of 16-235 for the luma (Y) channel, where 16 represents black and 235 represents white. Values outside this range are reserved, and the usual way of handling them is to clamp to the valid range. This is sometimes a point of confusion because JPEG JFIF specifies a slightly different encoding in which YCbCr values use the full 0-255 range; mixing the two up causes contrast shifts in the video. Motion JPEG codecs that don't properly compensate for this difference when exchanging data between the JPEG engine and client applications can trigger exactly this problem. It sounds like a similar mixup is occurring here: 16-235 data is being displayed as if it were 0-255, resulting in a 14% loss of contrast (the 219-step video range is only about 86% of the full 255-step range).
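For reference, the software conversion to 0-255 that Blight mentions is just a clamp, rebase, and scale; a minimal sketch (ExpandLuma is my name for it):

    #include <algorithm>
    #include <cstdint>

    // Expand video-range luma (16-235) to the full 0-255 range the monitor
    // expects: clamp, rebase so 16 becomes 0, then scale by 255/219 with
    // rounding. 16 maps to 0 and 235 maps to 255.
    uint8_t ExpandLuma(uint8_t y) {
        int v = std::min(219, std::max(0, (int)y - 16));
        return (uint8_t)((v * 255 + 109) / 219);
    }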
What's odd about Blight's statement, though, is that VMR7 and VMR9 are quite different beasts — VMR7 is DirectDraw 7 based, whereas VMR9 is Direct3D 9 based. It's somewhat unusual to see the same bug in both. I was curious, so I decided to look into it.
What Video Mixing Renderer 9 does
My first attempt to gain insight into VMR9's workings was to attach NVIDIA NVPerfHUD 3 to GraphEdit, with the aid of one of my proprietary adapter tools, but it crashed. Then I tried PIX for Windows, hoping to get a D3D call stream dump, but the capture never triggered. Trying to force VMR9 to use refrast failed too. Argh. So I just resorted to good old WinDbg and VTune.
I'd expected some neat pixel shaders to be used, but what VMR9 does in the common single-stream case is rather pedestrian: it calls CreateTexture() to allocate a texture and then StretchRect() to blast it onto the backbuffer. I'm not sure why it uses a texture, since an offscreen plain surface would actually be more flexible: more formats are usually available and fewer restrictions are imposed. If either UYVY or YUY2 is available as a texture format, it uses that; otherwise it falls back to X8R8G8B8 (32-bit RGB). It also tries R8G8B8 (24-bit RGB), but good luck finding a card that supports that as a texture format. Worse, it calls CreateTexture() without checking for format availability first, which is bad since the error that comes back fires a breakpoint when the debug D3D runtime is enabled.
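Reconstructed from what I saw in the debugger, that path looks roughly like this (PickTextureFormat and PresentFrame are my names, and the CheckDeviceFormat() step is the one VMR9 skips; a sketch, not VMR9's actual code):

    #include <d3d9.h>

    // Pick a texture format in VMR9's preference order, but verify support
    // first so the debug runtime doesn't fire a breakpoint.
    D3DFORMAT PickTextureFormat(IDirect3D9 *d3d, D3DFORMAT displayFmt) {
        static const D3DFORMAT kCandidates[] = {
            D3DFMT_UYVY, D3DFMT_YUY2, D3DFMT_X8R8G8B8, D3DFMT_R8G8B8
        };
        for (size_t i = 0; i < sizeof(kCandidates)/sizeof(kCandidates[0]); ++i) {
            if (SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                    D3DDEVTYPE_HAL, displayFmt, 0, D3DRTYPE_TEXTURE,
                    kCandidates[i])))
                return kCandidates[i];
        }
        return D3DFMT_UNKNOWN;
    }

    // Blit the frame to the backbuffer, VMR9-style. When the texture is
    // UYVY/YUY2, the YCbCr-to-RGB conversion -- and the levels question --
    // happens inside StretchRect().
    void PresentFrame(IDirect3DDevice9 *dev, IDirect3DTexture9 *tex) {
        IDirect3DSurface9 *src = 0, *dst = 0;
        tex->GetSurfaceLevel(0, &src);
        dev->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &dst);
        dev->StretchRect(src, NULL, dst, NULL, D3DTEXF_LINEAR);
        dst->Release();
        src->Release();
    }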
Testing results
I was eventually able to track down the problem on an ATI X800XT with a hacked-up version of VirtualDub. On that card, if you create a UYVY or YUY2 texture and then StretchRect() it to the backbuffer, the luma values are interpreted as 0-255 and a contrast shift results. If you instead draw the texture to the screen with a quad, you get the expected 16-235 behavior. Odd.
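The quad path that behaves correctly on that card looks roughly like this; here the YCbCr-to-RGB conversion happens in the texture sampler rather than in the blit (a fixed-function sketch with pre-transformed vertices, device state setup omitted; DrawQuad is my name, not the exact test code):

    struct Vertex {
        float x, y, z, rhw;     // pre-transformed screen-space position
        float u, v;             // texture coordinates
    };

    void DrawQuad(IDirect3DDevice9 *dev, IDirect3DTexture9 *tex,
                  float w, float h) {
        // The -0.5 offsets give correct texel-to-pixel mapping in D3D9.
        const Vertex quad[4] = {
            {    -0.5f,    -0.5f, 0.5f, 1.0f, 0.0f, 0.0f },
            { w - 0.5f,    -0.5f, 0.5f, 1.0f, 1.0f, 0.0f },
            {    -0.5f, h - 0.5f, 0.5f, 1.0f, 0.0f, 1.0f },
            { w - 0.5f, h - 0.5f, 0.5f, 1.0f, 1.0f, 1.0f },
        };

        dev->SetTexture(0, tex);
        dev->SetFVF(D3DFVF_XYZRHW | D3DFVF_TEX1);
        dev->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(Vertex));
    }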
On an NVIDIA GeForce Go6800, I was also able to see the difference when comparing VMR9 against the old Video Renderer with all acceleration disabled, but I couldn't reproduce it programmatically. NVIDIA GPUs don't support UYVY or YUY2 textures, so the only way to use those formats is through an off-screen plain surface; yet I couldn't see a problem StretchRect()ing one, nor could I catch VMR9 using one. VTune doesn't show any software conversion going on in VMR9 itself, and if I stick an Infinite Pin Tee Filter into the graph to force a conversion elsewhere, the problem goes away. So I'm not really sure where the problem occurs here.
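For completeness, the off-screen plain surface path I was exercising looks roughly like this (a sketch; device setup, error handling, and the frame upload are omitted):

    // NVIDIA parts don't expose UYVY/YUY2 as texture formats, so the YCbCr
    // frame goes into an off-screen plain surface instead.
    IDirect3DSurface9 *surf = NULL;
    dev->CreateOffscreenPlainSurface(width, height, D3DFMT_YUY2,
            D3DPOOL_DEFAULT, &surf, NULL);

    // ...LockRect(), copy the frame in, UnlockRect()...

    IDirect3DSurface9 *back = NULL;
    dev->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &back);
    dev->StretchRect(surf, NULL, back, NULL, D3DTEXF_LINEAR); // looked fine here
    back->Release();
    surf->Release();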
Judging by the behavior I saw, and the total lack of any description in the DirectX SDK as to how this conversion should work, I'm inclined to say that this isn't VMR9's fault, but either a common driver problem or an unfortunate mismatch between Direct3D and DIB/DirectDraw conventions for YCbCr. Fortunately, VirtualDub always draws quads when Direct3D display mode is enabled, so it's not affected by these oddities. Whew.
I dug up the old Reference Rasterizer source code from the DirectX 7 SDK, which I think is the last version that was publicly released, and examined its texture sampler. It too decodes as if black were 16/255 and white were 235/255, although I noticed that it doesn't interpolate chroma (boo). Unfortunately, refrast isn't helpful in determining which StretchRect() behavior is correct, because it doesn't support YCbCr-to-RGB conversions along that path. So much for a reference.
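In code, the decode behavior I saw from the refrast sampler is equivalent to something like this, using the standard BT.601 constants with 16-235 luma and 16-240 chroma excursions (my paraphrase of the behavior, not refrast's actual source):

    #include <algorithm>
    #include <cstdint>

    // Decode one YCbCr sample to RGB with 16 as black and 235 as white.
    // Chroma is taken from the nearest sample (no interpolation, matching
    // the refrast sampler).
    void DecodeYCbCr601(uint8_t y, uint8_t cb, uint8_t cr,
                        uint8_t *r, uint8_t *g, uint8_t *b) {
        float fy  = (y  -  16) * (255.0f / 219.0f);
        float fcb = (cb - 128) * (255.0f / 224.0f);
        float fcr = (cr - 128) * (255.0f / 224.0f);

        float fr = fy + 1.402f * fcr;
        float fg = fy - 0.344f * fcb - 0.714f * fcr;
        float fb = fy + 1.772f * fcb;

        *r = (uint8_t)std::min(255.0f, std::max(0.0f, fr + 0.5f));
        *g = (uint8_t)std::min(255.0f, std::max(0.0f, fg + 0.5f));
        *b = (uint8_t)std::min(255.0f, std::max(0.0f, fb + 0.5f));
    }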