I just tested it and it works for me. Hold down the Windows and Alt keys, then press B. Personally, I never toggle it since my monitor is always in HDR mode, but Windows will automatically switch to SDR for games that don't support HDR. Also, make sure you've switched to 10-bit color depth and RGB mode in your GPU driver settings if you haven't already. *Edit* Some keyboards have a switch that can disable the Windows key, so make sure it isn't switched on. I know my keyboard does (Logitech G910)
Thanks, but the Windows key works fine, it's just the on/off function that isn't working. I have it set to 10-bit and RGB mode. HDR always on is too bright for me, especially at night, so I just turn it on when I'm going to play a game that supports HDR.
That's really weird. Can't think of any other reason why it may not be working then. I'm assuming you already calibrated HDR using the Windows 11 HDR tool, and it was still too bright? That tool works really well to tone down excessive brightness.
Also, one can use a handy tool called AutoActions to automatically enable/disable HDR (among many other things) when starting up a game/application.
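AutoActions drives this from a GUI, but the basic idea can be sketched in a few lines: fire the Win+Alt+B toggle before a game launches and again when it exits. This is only a rough illustration of the concept, not how AutoActions actually works; the `keybd_event` approach and the example game path are my own assumptions.

```python
import ctypes
import subprocess

KEYEVENTF_KEYUP = 0x0002
VK = {"win": 0x5B, "alt": 0x12, "b": 0x42}  # Windows virtual-key codes

def hotkey_events(keys=("win", "alt", "b")):
    """Key-down events in press order, then key-ups in reverse order."""
    downs = [(VK[k], 0) for k in keys]
    ups = [(VK[k], KEYEVENTF_KEYUP) for k in reversed(keys)]
    return downs + ups

def press_hdr_hotkey():
    """Tap Win+Alt+B via the legacy keybd_event API (Windows only)."""
    user32 = ctypes.windll.user32
    for vk, flags in hotkey_events():
        user32.keybd_event(vk, 0, flags, 0)

def launch_with_hdr(game_path):
    """Toggle HDR on, run the game to completion, toggle HDR back off."""
    press_hdr_hotkey()
    try:
        subprocess.run([game_path])
    finally:
        press_hdr_hotkey()

# launch_with_hdr(r"C:\Games\witcher3.exe")  # hypothetical path
```

AutoActions also handles per-game profiles, resolution switching and so on, so for anything beyond a quick hack the tool itself is the better option.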
I have the same issue in Windows 10, I can't use the shortcut keys to enable/disable HDR. The Windows key works fine. I just hit Start and type "hdr", which takes me straight to the right place to change it.
Same here, but in Windows 11 I have to type it twice. It gets annoying. After a while I found that, of the games I tested, only The Witcher 3 and A Plague Tale: Requiem have a decent HDR implementation, and in both cases the results in some places are stunning, even on a crappy monitor like mine. The AC games Odyssey and Origins are also worth a try, the latter with the better result. AC: Valhalla looks terrible, too dark and washed out; GOW is also a bit dark and has no vibrancy. Bottom line, even if Windows does its job well and the hardware is capable, it's going to depend on the game whether the end result is favourable. I love it when it gets the result right, and can't wait for 27" HDR monitors to become available at a decent price. Edit: I forgot HZD, it looks a little darker, but that suits it well. Great too.
I must say thanks for this really useful topic. I was about to ignore my 4K TV's HDR10 ability because so few PC games seem to support it, or they just end up looking awful, which I don't really understand compared to PS4 Pro HDR gaming on the same TV. Literally all PS4 Pro games looked great on my previous Pro, but PC-wise it was a huge disappointment for me. Like some guys here mentioned, that annoying on/off toggling in Windows 10 is a massive pain in the a... Like I said, I'm currently running Win 10 and didn't think it would be worth changing to Win 11, but this thread encourages me to do so now.
HDR from my PC on a TCL 55S405 TV looks overly bright and washed out; I think I tried Destiny 2 a while back, and the same thing happens today with Diablo II: Resurrected. I recall HDR content played from the TV itself looking fine, but I haven't tested that in a year or so. I have a 6600 XT currently, but this also happened with a GTX 1060, RTX 3060, and RX 580, so it doesn't seem driver-related. It happens on W11 too with the 6600 XT, so it seems not OS-related either. My TV has the input on the Computer setting. The only thing I can think of is the color space? AMD defaults to YCbCr 4:4:4, but NVIDIA does RGB by default (full on Linux, limited on Windows). On AMD with past drivers and TV firmware, YCbCr was too dark. Do I need to adjust what color space SDR uses before switching to HDR?
It's possible the TV switched to BT.2020 mode for HDR but the game had a bug that kept it from extending the colour space. I had the opposite happen with a Tomb Raider game: it didn't switch the display to HDR mode but did extend the game's colour space. Either could have been a driver or game bug. With HDMI 2.0 there isn't enough bandwidth to display 10-bit RGB at UHD resolution. It must be YCbCr 4:2:2 or 4:2:0 for 10-bit output, or 8-bit RGB, which is what I use for HDR and have no complaints. This is with an NVIDIA card; I haven't tested on AMD. HDMI 2.1 can display 10-bit RGB at UHD resolution.
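The bandwidth claim above is easy to sanity-check with some quick arithmetic. This sketch assumes the standard CTA-861 4K60 timing (4400x2250 total pixels including blanking, giving a 594 MHz pixel clock) and HDMI 2.0's roughly 14.4 Gbit/s payload after 8b/10b coding; HDMI also carries YCbCr 4:2:2 (up to 12-bit samples) in a 24 bits-per-pixel container, which is why 10-bit 4:2:2 fits where 10-bit RGB doesn't.

```python
# Back-of-the-envelope check: which pixel formats fit 4K60 over HDMI 2.0?
PIXEL_CLOCK_HZ = 4400 * 2250 * 60   # 594 MHz, blanking included (CTA-861)
HDMI20_PAYLOAD_BPS = 18e9 * 8 / 10  # 14.4 Gbit/s after 8b/10b encoding

def required_bps(bits_per_pixel):
    """Raw video bit rate for the 4K60 timing at a given bits/pixel."""
    return PIXEL_CLOCK_HZ * bits_per_pixel

def fits_hdmi20(bits_per_pixel):
    return required_bps(bits_per_pixel) <= HDMI20_PAYLOAD_BPS

# 8-bit RGB = 24 bpp; 10-bit RGB = 30 bpp;
# 4:2:2 rides in a 24 bpp container; 10-bit 4:2:0 averages 15 bpp.
for label, bpp in [("8-bit RGB", 24), ("10-bit RGB", 30),
                   ("10-bit 4:2:2", 24), ("10-bit 4:2:0", 15)]:
    print(f"{label}: {required_bps(bpp) / 1e9:.2f} Gbit/s, "
          f"fits HDMI 2.0: {fits_hdmi20(bpp)}")
```

8-bit RGB lands at 14.26 Gbit/s, just under the limit, while 10-bit RGB needs 17.82 Gbit/s, which is why the driver forces chroma subsampling for 10-bit at 4K60 on HDMI 2.0.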
I changed to 11 because of HDR. I have it set to always on, after going back and forth with on/off and reduced brightness for SDR mode, and I couldn't be happier. Maybe setting it to always on with Auto HDR off made some difference, because games that weren't great with it, like GOW, look amazing now, and The Witcher is truly amazing. Until I find a game that looks worse, it's a keeper. Some games do have poor support for it no matter what I tweak, like AC: Valhalla, but setting HDR off in-game reverts to the SDR settings, which look as usual. Too bad some games don't invest much in HDR, it makes a huge impact on IQ.
If you have a self-emissive display such as an OLED, leaving HDR permanently on while using the desktop shouldn't cause excess wear. But on an LCD with HDR on, the backlight sits in a high-power state whenever anything is lit, with the LCD panel itself dimming the image down to normal SDR brightness. This makes the display run a lot hotter, use more power, and wear out the backlight faster. For these reasons, HDR is best left off for desktop use on an LCD.
Succinctly put. Once you do go to a self-emissive display, there is no turning back, as your eyes quickly become spoiled by and used to real HDR, since it looks more natural.
So HDR does make games look crisper. I can tell the difference on my G7 with 600 nits of HDR brightness. Definitely use SDR if you're just browsing the web or watching videos, and use the shortcut keys Win+Alt+B for gaming with HDR.
VA is a form of LCD; it's not emissive, it needs a backlight. An LCD blocks light passing through its RGB filters. HDR on requires the backlight to be in high-power mode.
For sure it's better in very dark scenes with bright objects, like stars or the moon in the night sky, or a candle or light in a dark room. But HDR scenes without a black background look better on my Samsung Q9FN 2000-nit TV; it's f'n marvellous! Colours are superb too. I'm surprised how well it compares. Because it's so good, I'm sticking with it until probably micro-LED or its equivalent arrives with HDR2000 or higher. Though another jump in efficiency would be welcome; my current 55" TV uses 200W producing near 2000 nits, so 4000 nits or more on a larger display, eek!
I will test those games now. I have GOW and the NG version of The Witcher 3 on my PC. I finished the Win 11 installation half an hour ago. I turned on Auto HDR, and I will also install the HDR calibration app from the Windows Store, as recommended in a previous post. I'm curious what else I can set up now on my HDMI 2.1 TV. If one of the pros could recommend something, I'd be grateful. I checked that my TV produces 1500 nits.
Turn Auto HDR off and use the in-game option for HDR. The Witcher doesn't have an in-game toggle, but HDR launches automatically with the game. Auto HDR works well in some older games, but others look bad with it.