HDR support on the PC platform is native. The moniker really only applies to the TV industry, which needed a new label to segregate its offerings, and even then HDR isn't always HDR on TVs, since sets have different levels of HDR capability. Does it make you feel better to know that you have an HDR set? If it does, then the TV industry's decision was justified. The HDR moniker used in video games is really sad, and if a game was able to receive an HDR update, then the game was sub-par to begin with. "We are releasing an HDR update for our game." These days, with the sheer size of textures, there is little reason to try to save room by using non-HDR textures. Who's trying to save space these days? "Let's make a dull-looking game loaded with dull-looking textures," said no one ever. Your GPU was almost always capable of HDR output (I say almost because for a while there Nvidia was handicapping its consumer cards to a lower bit depth versus its pro cards; this has since been rectified, and the competition never did such a disservice to its customers), though your monitor may not be able to handle the ranges. Traditionally, only professional monitors offered a large range of colours/light, since in most situations any 32-bit screen has enough colours for the average user, some 16,777,216. In 2018, I don't see how this is even a point of conversation. Any TV or monitor that doesn't have HDR is purposely built for one reason and one reason only: to create a product hierarchy, an illusion of superiority (or the lack of it), to entice the consumer to spend more for what they feel they deserve.
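For what it's worth, the 16,777,216 figure above is just the bit math: a "32-bit" desktop mode carries 8 bits per colour channel (the extra 8 bits are alpha/padding, not colour), while a 10-bit HDR panel jumps to over a billion colours. A quick sketch:

```python
# Colours available at a given bit depth per channel (3 channels: R, G, B).
def colour_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(colour_count(8))   # "32-bit" desktop colour (8 bits/channel): 16777216
print(colour_count(10))  # 10-bit HDR panel: 1073741824
```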
The effect that you're seeing on an HDR TV is the ability to show more colours, mostly due to higher light levels. You can take any signal and put it on an HDR TV and it will look better, regardless of the source signal, because the TV is effectively superior to a non-HDR set. My TV isn't HDR, though it has an "HDR mode" which makes all source signals look far better than when that mode is off, even though it's not HDR... HDR isn't something you turn on and off with a switch: the panel can either do it or it can't, regardless of the source signal. For now, HDR is a term used to describe colour ranges in a TV. Ranges that, for the most part, are already obtainable by a good-quality computer monitor, moniker not required (no, not a $100 LCD screen with a dead colour gamut). Sigh, I wish plasma would come back... proper colour levels at entry level, with no need to buy gimmicks like all-white LED backlights or all-blue LED backlights. Who cares if it uses "more power"? It's superior, and besides, it's a fraction more power use anyhow... 3 × 0.00001 watts is 0.00003, which is inconsequential.
To answer your question: yes, all games should be able to be HDR compliant regardless of the console's ability to display the image. I can't imagine someone going out of their way to make a garbage texture on purpose; that would be bad practice, making the original asset poor quality. An "HDR texture" can mean many things, like the colours in a texture covering a proper gradient of a colour, or simply that the artist had more colours to pick from when making the texture... Mostly a texture is just compressed from the high quality down to its new required level of quality. HDR for TVs basically provides a standard at the same level as what a computer monitor can already produce, though the HDMI standard isn't, or wasn't, capable of carrying it. So when a console says "4K HDR", it means it is passing the right amount of information to the TV set to decode and display a 10-bit image at 4K resolution. There is still a problem with FPS: as I understand it, the last version of HDMI was only able to provide HDR at 30 FPS at 4K resolution, and if you wanted to go to 60 FPS you had to shift down to 8-bit colour. So unless your console is running the newest HDMI standard, you don't have HDR, 60 FPS, and 4K resolution all at once. You can fill in the gaps with a few mirrors and a lot of smoke.
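That 30 FPS limit can be sanity-checked with a back-of-envelope bandwidth calculation. This sketch assumes the common CTA-861 4K timing (4400 × 2250 pixels per frame including blanking) and roughly 14.4 Gbps of usable video data on HDMI 2.0 (its 18 Gbps line rate minus 8b/10b encoding overhead); the exact numbers vary with timings and chroma subsampling:

```python
# Usable video bandwidth on HDMI 2.0: ~18 Gbps line rate, minus 8b/10b
# encoding overhead, leaves roughly 14.4 Gbps for pixel data.
HDMI_2_0_DATA_GBPS = 14.4

def required_gbps(h_total: int, v_total: int, fps: int, bits_per_pixel: int) -> float:
    """Raw video data rate for a given total frame timing and pixel depth."""
    return h_total * v_total * fps * bits_per_pixel / 1e9

print(required_gbps(4400, 2250, 60, 24))  # 4K60, 8-bit 4:4:4  -> ~14.26, just fits
print(required_gbps(4400, 2250, 60, 30))  # 4K60, 10-bit 4:4:4 -> ~17.82, too much
print(required_gbps(4400, 2250, 30, 30))  # 4K30, 10-bit 4:4:4 -> ~8.91, fits
```

So at full 4:4:4 you really do have to choose between 60 FPS at 8-bit or 30 FPS at 10-bit on HDMI 2.0; in practice devices also squeeze 4K60 10-bit through by dropping to 4:2:0 chroma subsampling (15 bits per pixel).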
But how about SLI support for this? I bet there's no chance to run it properly with one 1080 Ti if you want around 165 fps at 1440p. They are making monitors like 120 Hz 4K / 200 Hz 1440p, and I hope there will be decent SLI support too.
Where did I say that? And yes, I have a proper HDR TV... Assets have to be coded to support HDR. When it has been done for other platforms, then yes, it's there. Pcars 2 didn't include it on PC but did on consoles; they gave the same lame excuse others give, but also said it would be easy to enable on PC if people wanted it. I swear the people saying it's not a big deal haven't seen HDR in action on a high-end TV; it makes a bigger difference to the look of a game than using ultra settings. It's astonishing that PC is lagging behind, when it normally sets the standard. It's got to the point where if a game supports HDR on consoles but not PC, I buy the console version, which is something I thought I'd never do.
Doesn't matter what settings it has, because it's Ubisoft. Guaranteed even the best hardware won't be able to crank those up at any resolution greater than 1440p. Also expect abnormal CPU usage and lots of stuttering.
Lucky for you, FC5 will be one of the few games to offer SLI support. It's even in their recommended settings: "4K/60fps requires 1080 SLI", meaning single 1080 Ti users won't be able to run the game at that setting. https://www.digitaltrends.com/computing/far-cry-5-system-pc-specs-4k-sli-crossfire/
Looks good IMO, the amount of settings you'd expect from a PC title, plus FOV and resolution-scale adjustment. Also TAA as an anti-aliasing method.
I missed that recommendation. I believe only the heavier AAA games support SLI nowadays, but that's where we need it. Looks like this could be hard to run smoothly, and I'm afraid of what kind of stutter-show / frame-dropping freak it will be. But it could also be a nice graphics demo if it can achieve high framerates smoothly, at least with some settings. Thanks for the advice, sorry for the "rally English".
The point was that very few PC users have HDR capable displays, so it made more sense for game devs to do HDR for consoles due to their use on TVs.
IMO it does. I've said this before but I'd rather have a 1080p game with HDR support than a 4K game without it, given that both run at the same framerate. I've played Obduction, ME:A and SW:2 and all three looked significantly better on my LG OLED with HDR enabled. The first G-Sync Ultrawide 30+ inches with HDR will be my next monitor and it's entirely because I want HDR.
We are talking about the PC version... barely any PC monitors have HDR, and those few that do aren't the ones that sell. People mainly buy Nvidia, and people want G-Sync... there are currently no G-Sync HDR monitors available. I can totally see why they didn't bother including HDR. You play PC on your TV, but the vast majority don't. And I don't agree with you about HDR being more significant for gaming than 4K... at all. A correctly calibrated SDR 4K screen will always be superior in image quality in games, compared to a 1080p HDR screen, solely due to the vastly superior clarity and detail of the 4K screen.
This. I have an ASUS VG278H, and the colors and clarity of this monitor are amazing; even after all these years I'm comparing monitors to mine and seeing they are not worth "upgrading" to. I've been playing Destiny 2 on PC and recently switched it over to my Vizio P series, and finally saw something worth making me use my "TV monitor" to game on, because of the HDR. I can barely stand to play it anymore without HDR; IMO it makes a dramatic difference. I'm glad to see this debate on HDR happening. What's funny is that everyone I know who games on consoles doesn't even own an HDR-capable TV or really know anything about it, but almost everyone I talk to who PC games, even if they don't own an HDR TV, at least knows about it and wants it.
If your argument was "HDR support isn't worth it because no HDR monitors are available" I would still argue you're wrong, but I wouldn't have responded the way I did. You specifically made a sarcastic response implying that HDR doesn't provide a benefit in games, not that HDR isn't available. The difference between 4K and HDR is obviously a matter of opinion, but again, your initial post implied that it doesn't make a big difference and I disagree. That's coming from my experience with my LG 65" C7 from about 14 feet viewing distance and the three games I mentioned above. With a monitor I may prefer resolution due to sitting distance, but either way HDR is going to have a significant impact on the image, and I want it, and clearly a bunch of others in this thread do as well, some even going as far as to play on a console for it (gross!).
I agree that having to revert to consoles is atrocious! And I will buy into the argument that when the consoles have HDR support, the PC version ought to have it as well. BUT I do believe that it's a very small number of users that will miss it; as per usual, this forum is hardly representative of the average PC gamer. In regards to HDR making a difference, I agree that the difference it makes in movies is quite immense (Planet Earth 2 in 4K HDR... wow), but IMO the difference for games is a lot more subtle. I tried SW BF2 and AC Origins on my TV (just to test it out; I hate playing without G-Sync at this point), and I didn't feel that HDR made any significant difference. But as you quite rightly say, it's entirely subjective.
The only title I have found where it makes a huge difference is Mass Effect: Andromeda. Not enough of a difference to make me want to go back and grind through it again, but the job they did optimizing for WCG, HDR and 4K was pretty amazing.
........ SMAA and blurry TAA anti-aliasing ........ hopefully there is no blur when moving like Fallout 4 has. Dunno if I'll buy it ....
It's funny that most people arguing that HDR makes little difference, and that there are not many monitors out there supporting it, are forgetting a few things:
- The HDR standard was finalized recently, and due to its somewhat extreme requirements compared to normal screens, it's going to take a while to make acceptable screens.
- Even where screens supporting HDR exist, it's not currently possible to add that support to G-Sync monitors, as that would require new G-Sync modules.
- The cost of these screens will be high, and it will take a while to reduce it to acceptable levels for most users.
- Most users upgrade monitors at 5+ year intervals.
- For TVs, however, most of the points above do not apply; most likely we will have a lot of HDR TVs out there soon, and monitors will follow later.