I got an HDR-capable monitor (LG 32GK850F), so I started learning how to use its capabilities programmatically. I still have much to learn, as there is a lot of theory to be ingested about color spaces etc., but in this blog post I’d like to go straight to the point: how do you enable HDR in your C++ DirectX program? To test this, I used 3 graphics chips from 3 different PC GPU vendors. Below you can see the results of my experiments.
You need to know that there are two different ways to query a display’s capabilities, as well as to control its operating parameters.
(1) The first is a standard, GPU-independent Windows API called DirectX Graphics Infrastructure (DXGI). It ships with the Windows SDK in the form of a COM API. It’s a common layer for Direct3D 11 as well as 12 (you can even use it with Vulkan). The functions related to HDR were added in recent updates of the API, which take the form of new interfaces with a number appended at the end. To access them, you first have to query your base object for them. Example:
IDXGIOutput* output = /* initialize output */;
IDXGIOutput6* output6;
HRESULT hr = output->QueryInterface(__uuidof(IDXGIOutput6), (void**)&output6);
if(SUCCEEDED(hr)) {
    // Use output6...
    output6->Release();
} else {
    // Error!
}
You will be able to compile this code successfully only if you have a sufficiently new version of the Windows SDK installed. The code will execute successfully (as opposed to failing with an error code) only if the user has a sufficiently new version of Windows 10.
You can then query for monitor capabilities by calling IDXGIOutput6::GetDesc1. It fills a DXGI_OUTPUT_DESC1 structure, which describes the available color space, bits per component, red/green/blue primaries, white point, and the range of luminance available on the device.
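A minimal sketch of such a query could look like this (assuming output came from IDXGIAdapter::EnumOutputs; error handling shortened):

#include <windows.h>
#include <dxgi1_6.h>
#include <cstdio>

// Print a few HDR-related fields of the monitor description.
void PrintMonitorHdrCaps(IDXGIOutput* output)
{
    IDXGIOutput6* output6 = nullptr;
    if(SUCCEEDED(output->QueryInterface(__uuidof(IDXGIOutput6), (void**)&output6)))
    {
        DXGI_OUTPUT_DESC1 desc1 = {};
        if(SUCCEEDED(output6->GetDesc1(&desc1)))
        {
            printf("BitsPerColor = %u\n", desc1.BitsPerColor);
            printf("ColorSpace = %d\n", (int)desc1.ColorSpace);
            printf("MinLuminance = %f nits\n", desc1.MinLuminance);
            printf("MaxLuminance = %f nits\n", desc1.MaxLuminance);
            printf("MaxFullFrameLuminance = %f nits\n", desc1.MaxFullFrameLuminance);
        }
        output6->Release();
    }
}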
To inform the system about intended HDR usage, you may call IDXGISwapChain4::SetHDRMetaData with a filled DXGI_HDR_METADATA_HDR10 structure, where you pass your parameters - again red/green/blue primaries, white point, and luminance ranges. There is also the IDXGISwapChain3::SetColorSpace1 function, which lets you pass a DXGI_COLOR_SPACE_TYPE enum telling which color space you use (RGB or YCbCr, linear or gamma 2.2, P709 or P2020, etc.)
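For illustration, here is a sketch of what these two calls could look like on an existing swap chain. The concrete values are placeholders (the parameters my monitor reports, as listed later in this post), and the unit conversions (chromaticity * 50000, mastering luminance * 10000) follow my reading of the DXGI documentation:

// Assumes swapChain is an IDXGISwapChain4* queried from your swap chain.
void SetHdrParameters(IDXGISwapChain4* swapChain)
{
    // Declare the color space of the values we are going to render.
    swapChain->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709);

    // Describe the content with HDR metadata. All values are placeholders.
    DXGI_HDR_METADATA_HDR10 metadata = {};
    metadata.RedPrimary[0]   = (UINT16)(0.6699 * 50000.0);
    metadata.RedPrimary[1]   = (UINT16)(0.3096 * 50000.0);
    metadata.GreenPrimary[0] = (UINT16)(0.2549 * 50000.0);
    metadata.GreenPrimary[1] = (UINT16)(0.6797 * 50000.0);
    metadata.BluePrimary[0]  = (UINT16)(0.1504 * 50000.0);
    metadata.BluePrimary[1]  = (UINT16)(0.0752 * 50000.0);
    metadata.WhitePoint[0]   = (UINT16)(0.3135 * 50000.0);
    metadata.WhitePoint[1]   = (UINT16)(0.3291 * 50000.0);
    metadata.MaxMasteringLuminance = 496 * 10000; // 0.0001-nit units
    metadata.MinMasteringLuminance = 0;
    metadata.MaxContentLightLevel = 496;          // nits
    metadata.MaxFrameAverageLightLevel = 496;     // nits
    swapChain->SetHDRMetaData(DXGI_HDR_METADATA_TYPE_HDR10, sizeof(metadata), &metadata);
}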
(2) The second consists of custom APIs from the different GPU vendors.
2.1. In the case of AMD, we are talking about the library called AMD AGS. To use it in your C++ code, you need to:
- #include <ags_lib/inc/amd_ags.h>
- Link with “amd_ags_x64.lib”.
- Bundle the file “amd_ags_x64.dll” with your game.
- Call the agsInit function to initialize the library.
To query for monitor capabilities, inspect the AGSGPUInfo structure filled by agsInit. The contained AGSDisplayInfo structure provides information about each connected monitor, including displayFlags (telling which HDR modes are supported), color primaries, white point, and min/max/avg luminance.
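A rough sketch of the initialization and query is below. The structure layout follows the AGS 5.x headers, so verify the exact field and flag names against your version of amd_ags.h:

#include <ags_lib/inc/amd_ags.h>
#include <cstdio>

// Initialize AGS and dump HDR-related info for every connected display.
void QueryAmdHdrCaps()
{
    AGSContext* context = nullptr;
    AGSGPUInfo gpuInfo = {};
    if(agsInit(&context, nullptr, &gpuInfo) == AGS_SUCCESS)
    {
        for(int dev = 0; dev < gpuInfo.numDevices; ++dev)
        {
            const AGSDeviceInfo& device = gpuInfo.devices[dev];
            for(int disp = 0; disp < device.numDisplays; ++disp)
            {
                const AGSDisplayInfo& display = device.displays[disp];
                if(display.displayFlags & AGS_DISPLAYFLAG_HDR10)
                {
                    printf("Display %d supports HDR10, maxLuminance = %f nits\n",
                        disp, display.maxLuminance);
                }
            }
        }
        agsDeInit(context);
    }
}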
To enable HDR mode, call the agsSetDisplayMode function. Specify the requested parameters in an AGSDisplaySettings structure.
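Below is a sketch of what such a request might look like. Again, the field and enum names are based on the AGS 5.x headers and should be treated as assumptions; the chromaticity and luminance values are simply copied from the queried display info:

// Request HDR10 (scRGB) mode for the given display. Field names follow
// the AGS 5.x headers; verify against your amd_ags.h.
void EnableAmdHdr(AGSContext* context, int deviceIndex, int displayIndex,
    const AGSDisplayInfo& display)
{
    AGSDisplaySettings settings = {};
    settings.mode = AGSDisplaySettings::Mode_HDR10_scRGB; // Mode enum is nested in AGS 5.x
    settings.chromaticityRedX = display.chromaticityRedX;
    settings.chromaticityRedY = display.chromaticityRedY;
    settings.chromaticityGreenX = display.chromaticityGreenX;
    settings.chromaticityGreenY = display.chromaticityGreenY;
    settings.chromaticityBlueX = display.chromaticityBlueX;
    settings.chromaticityBlueY = display.chromaticityBlueY;
    settings.chromaticityWhitePointX = display.chromaticityWhitePointX;
    settings.chromaticityWhitePointY = display.chromaticityWhitePointY;
    settings.minLuminance = display.minLuminance;
    settings.maxLuminance = display.maxLuminance;
    agsSetDisplayMode(context, deviceIndex, displayIndex, &settings);
}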
2.2. In the case of Nvidia, there is a library called NVAPI (free registration is required to download it). To use it in your C++ code, you need to:
- #include <nvapi.h>
- Link with “nvapi64.lib”.
- There is no need to bundle a DLL file with your game. The library uses “nvapi64.dll” from the system directory, installed with the graphics driver.
- Call the NvAPI_Initialize function to initialize the library.
To query for monitor capabilities, first enumerate the available display IDs, then call NvAPI_Disp_GetHdrCapabilities. The returned NV_HDR_CAPABILITIES structure provides information about the HDR parameters of a monitor - color primaries, white point, min/max luminance, etc.
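A sketch of the query, assuming displayId was obtained from the enumeration mentioned above (for example via NvAPI_EnumPhysicalGPUs and NvAPI_GPU_GetConnectedDisplayIds):

#include <nvapi.h>
#include <cstdio>

// Query HDR capabilities of the display identified by displayId.
void QueryNvidiaHdrCaps(NvU32 displayId)
{
    if(NvAPI_Initialize() != NVAPI_OK)
        return;

    NV_HDR_CAPABILITIES caps = {};
    caps.version = NV_HDR_CAPABILITIES_VER;
    if(NvAPI_Disp_GetHdrCapabilities(displayId, &caps) == NVAPI_OK)
    {
        printf("isST2084EotfSupported = %d\n", (int)caps.isST2084EotfSupported);
        printf("desired_content_max_luminance = %u\n",
            (unsigned)caps.display_data.desired_content_max_luminance);
    }
}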
To enable HDR mode, call the NvAPI_Disp_HdrColorControl function. Use the NV_HDR_COLOR_DATA structure to specify the requested parameters.
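Here is a sketch of such a call. The field names follow my reading of the NVAPI headers, so treat them as assumptions rather than a verified recipe:

// Ask the driver to switch the display identified by displayId into HDR10 mode.
void EnableNvidiaHdr(NvU32 displayId)
{
    NV_HDR_COLOR_DATA hdrData = {};
    hdrData.version = NV_HDR_COLOR_DATA_VER;
    hdrData.cmd = NV_HDR_CMD_SET;
    hdrData.hdrMode = NV_HDR_MODE_UHDA;
    hdrData.static_metadata_descriptor_id = NV_STATIC_METADATA_TYPE_1;
    // Mastering display data (primaries, white point, luminance) could be
    // filled in hdrData.mastering_display_data here.
    NvAPI_Disp_HdrColorControl(displayId, &hdrData);
}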
That’s the theory. Below are the results of my experiments on 3 different GPUs. All use the same HDR monitor (LG 32GK850F), connected as the only active display via an HDMI or DisplayPort cable, with the monitor driver installed. In each case I used Windows 10 64-bit, version 1809 (OS Build 17763.316).
1. GPU = AMD Radeon RX Vega 56, driver = 19.2.3
IDXGIOutput6::GetDesc1 returned these DXGI_OUTPUT_DESC1 members:
BitsPerColor = 10
ColorSpace = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020
RedPrimary = 0.669921875, 0.3095703125
GreenPrimary = 0.2548828125, 0.6796875
BluePrimary = 0.150390625, 0.0751953125
WhitePoint = 0.3134765625, 0.3291015625
MinLuminance = 0 nits
MaxLuminance = 496 nits
MaxFullFrameLuminance = 496 nits
AGSDisplayInfo has these members:
displayFlags = AGS_DISPLAYFLAG_PRIMARY_DISPLAY | AGS_DISPLAYFLAG_HDR10 | AGS_DISPLAYFLAG_FREESYNC | AGS_DISPLAYFLAG_FREESYNC_2
chromaticityRed = 0.6699, 0.3096
chromaticityGreen = 0.2549, 0.6797
chromaticityBlue = 0.1504, 0.0752
chromaticityWhitePoint = 0.3135, 0.3291
screenDiffuseReflectance = 0
screenSpecularReflectance = 0
minLuminance = 0.1687 nits
maxLuminance = 496 nits
avgLuminance = 496 nits
As you can see, the parameters pretty much match between DXGI and AGS, except minLuminance.
Now to the test of enabling HDR: as it turns out, the only thing needed to make it work is to create the swap chain in the half-float format R16G16B16A16_FLOAT instead of a traditional 8-bits-per-component format like R8G8B8A8_UNORM.
Then values 0..1 map to SDR, while values above that make things brighter. The value that reaches maximum HDR brightness seems to be around 6.2. That makes sense, because we are using the scRGB standard here (designed to be backward-compatible with SDR), which defines the value 1 as 80 nits, so the value 6.2 gives 496 nits - exactly the maximum luminance reported by my monitor.
It is also important to note that you are expected to output color values in linear space, not gamma-corrected!
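For reference, here is a sketch of creating such a swap chain with CreateSwapChainForHwnd. The factory, the device (for D3D11) or command queue (for D3D12), and the window handle are assumed to exist already:

#include <windows.h>
#include <dxgi1_6.h>

// Create a swap chain with a half-float back buffer - the one thing that
// enabled HDR output in this test.
IDXGISwapChain1* CreateHdrSwapChain(IDXGIFactory2* factory, IUnknown* deviceOrQueue, HWND hwnd)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Width = 0;  // 0 = use the window size
    desc.Height = 0;
    desc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT; // instead of DXGI_FORMAT_R8G8B8A8_UNORM
    desc.SampleDesc.Count = 1;
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 3;
    desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD;

    IDXGISwapChain1* swapChain = nullptr;
    factory->CreateSwapChainForHwnd(deviceOrQueue, hwnd, &desc, nullptr, nullptr, &swapChain);
    return swapChain;
}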
I tried many different things that might be related to the display mode, but it seems they are neither required nor do they change anything in the HDR behavior:
- The window can be borderless and cover the whole screen, but it can also be a regular window.
- I may enter “true”, exclusive fullscreen mode by setting an appropriate resolution, RefreshRate, ScanlineOrder, Scaling etc. and calling IDXGISwapChain::SetFullscreenState(TRUE, NULL), but I don’t have to.
- I tried to call IDXGISwapChain4::SetHDRMetaData with various parameters - swapped the red and green primaries, changed luminances by orders of magnitude… No effect was observed.
- I tried to call IDXGISwapChain3::SetColorSpace1. With the value DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020 the call failed, despite this being the value reported by the monitor. With the value DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709 it succeeded, but it changed nothing, because that’s what we are using here anyway (RGB color space, G10 = gamma with exponent 1.0 = linear space, P709 = primaries similar to sRGB). I also tried to use gamma 2.2, but the call with DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709 failed as well.
- I also tried to call agsSetDisplayMode. Again, changing primaries or luminances had no effect, and neither did setting the mode to Mode_HDR10_scRGB, Mode_FreeSync2_scRGB, or Mode_FreeSync2_Gamma22. The only thing that made a difference was using Mode_SDR. My program then behaved strangely: it used SDR only, but it still expected the range of values 0..6.2, in linear space.
2. GPU = Nvidia GeForce GTX 1070, driver = 419.17
IDXGIOutput6::GetDesc1 returned these DXGI_OUTPUT_DESC1 members:
BitsPerColor = 8
ColorSpace = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020
RedPrimary, GreenPrimary, BluePrimary, WhitePoint same as on AMD
MinLuminance = 0 nits
MaxLuminance = 1499 nits
MaxFullFrameLuminance = 799 nits
Wow, that’s quite different! Is it because I used a different cable and connector type, or because of the different graphics card? I don’t know.
The NVAPI function NvAPI_Disp_GetHdrCapabilities returned the following NV_HDR_CAPABILITIES structure members:
isST2084EotfSupported = true
isTraditionalHdrGammaSupported = false
isEdrSupported = true
driverExpandDefaultHdrParameters = false
isTraditionalSdrGammaSupported = true
isDolbyVisionSupported = false
display_data.displayPrimary_x0 = 33496, which means red X = 0.66992
display_data.displayPrimary_y0 = 15478, which means red Y = 0.30956
display_data.displayPrimary_x1 = 12744, which means green X = 0.25488
display_data.displayPrimary_y1 = 33984, which means green Y = 0.67968
display_data.displayPrimary_x2 = 7519, which means blue X = 0.15038
display_data.displayPrimary_y2 = 3759, which means blue Y = 0.07518
display_data.displayWhitePoint_x = 15673, which means 0.31346
display_data.displayWhitePoint_y = 16455, which means 0.3291
display_data.desired_content_max_luminance = 0
display_data.desired_content_min_luminance = 0
display_data.desired_content_max_frame_average_luminance = 0
dv_static_metadata = all zeros
As you can see, we have the same red/green/blue primaries and white point as before, but we get no information about luminance.
Just like on AMD, to enable HDR mode it is enough to create the swap chain in R16G16B16A16_FLOAT format. We can do it fullscreen or in a window; it doesn’t matter. Colors are also expected in linear space, but the maximum seems to be around the value 12.5 (which corresponds to exactly 1000 nits).
Just as before, calling IDXGISwapChain3::SetColorSpace1 with DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709 does nothing, while other values make the function fail. I also tried to call NvAPI_Disp_HdrColorControl, but no values set in the NV_HDR_COLOR_DATA structure made any difference, not even changing hdrMode from NV_HDR_MODE_UHDA (“This is the only supported production HDR mode.”) to NV_HDR_MODE_OFF. HDR was still active in that case.
3. GPU = Intel Iris Plus Graphics 655, driver = 25.20.100.6444
IDXGIOutput6::GetDesc1 returned the same DXGI_OUTPUT_DESC1 members as on Nvidia:
BitsPerColor = 8
ColorSpace = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020
RedPrimary, GreenPrimary, BluePrimary, WhitePoint same as on AMD
MinLuminance = 0 nits
MaxLuminance = 1499 nits
MaxFullFrameLuminance = 799 nits
Experiments with various display modes gave very interesting results:
- When using a swap chain in the traditional R8G8B8A8_UNORM format and no exclusive fullscreen, values 0..1 of course map to SDR, in sRGB (gamma) space.
- When changing the swap chain format to R16G16B16A16_FLOAT, still in windowed mode, HDR starts working. Colors are expected in linear space, and the maximum brightness seems to be somewhere around the value 12.5.
- Interestingly, when I used the same swap chain format R16G16B16A16_FLOAT but entered exclusive fullscreen mode using IDXGISwapChain::SetFullscreenState(TRUE, NULL), HDR was activated, but maximum brightness was reached already at values around 0.6!
- When going fullscreen with the 8-bpc swap chain format R8G8B8A8_UNORM, HDR was still active, but this time maximum brightness was reached at the value 1.0, and sRGB gamma space was expected.
- Just as in the previous experiments, calling IDXGISwapChain4::SetHDRMetaData with any parameters had no effect.
- IDXGISwapChain3::SetColorSpace1 always failed on Intel, regardless of the parameter passed.
Conclusions
What a mess! It is possible that I still don’t understand something here. I may update this blog post or write a new one later. For now it seems to me that you can robustly support HDR monitors in your game (which is good news), but you need to disregard all these advanced parameters and just follow these simple steps:
- Use a swap chain in R16G16B16A16_FLOAT format.
- Give the player control over brightness and gamma exponent in the graphics options, because the valid parameters may vary depending on the graphics card (maximum HDR brightness may be reached at 0.6 or at 12.5) and there is no good way to query them.