DirectX 12 Adoption Big for Developers - Microsoft Shares New Info

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 23, 2016.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    42,123
    Likes Received:
    10,099
    GPU:
    AMD | NVIDIA
  2. KFBR392

    KFBR392 Member

    Messages:
    34
    Likes Received:
    0
    GPU:
    RTX 3090
    Really can't wait for HDR displays. I think for gamers the ultimate monitors in the next few years are gonna be 1440p IPS HDR 144Hz + FreeSync/GSync!

    We should hopefully be able to see on DP 1.3 (all of em IPS):

    2560x1440 @ 168Hz + HDR + FreeSync/GSync
    3440x1440 @ 144Hz + HDR + FreeSync/GSync
    3840x2160 @ 60Hz + HDR + FreeSync/GSync (only 60Hz on DP 1.3, afaik)
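    Those DP 1.3 limits can be sanity-checked with some rough arithmetic. The sketch below ignores blanking overhead (typically another 5-20%), assumes uncompressed 10-bit RGB, and uses DP 1.3's ~25.92 Gbit/s effective rate (32.4 Gbit/s raw minus 8b/10b encoding), so the real limits are a bit tighter than shown.

```python
# Rough bandwidth check for the display modes listed above.
# Assumptions: uncompressed 10-bit RGB, no blanking overhead.

DP13_EFFECTIVE_GBPS = 25.92  # 32.4 Gbit/s raw, minus 8b/10b line coding

def required_gbps(width, height, hz, bits_per_channel=10):
    """Uncompressed RGB pixel bandwidth in Gbit/s (blanking ignored)."""
    return width * height * hz * 3 * bits_per_channel / 1e9

modes = [(2560, 1440, 168), (3440, 1440, 144), (3840, 2160, 60), (3840, 2160, 120)]
for w, h, hz in modes:
    need = required_gbps(w, h, hz)
    verdict = "fits" if need <= DP13_EFFECTIVE_GBPS else "exceeds DP 1.3"
    print(f"{w}x{h}@{hz}Hz 10-bit: {need:.1f} Gbit/s -> {verdict}")
```

    Even this optimistic math shows 4K at 10-bit is stuck around 60Hz on DP 1.3, while both 1440p modes fit.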
     
    Last edited: Mar 23, 2016
  3. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,476
    Likes Received:
    4,777
    GPU:
    2080Ti @h2o
    For me personally, there is not much of interest besides them admitting that they broke vsync / FreeSync / G-Sync with DX12, and that they need to fix this.
     
  4. mbk1969

    mbk1969 Ancient Guru

    Messages:
    11,293
    Likes Received:
    8,873
    GPU:
    GF RTX 2070 Super
    Half the list is devoted to HDR, which needs new monitors capable of driving you temporarily blind with mega-bright splashes.
     

  5. KFBR392

    KFBR392 Member

    Messages:
    34
    Likes Received:
    0
    GPU:
    RTX 3090
    On Windows Store games only, right (due to the new format)? Ashes on DX12 works with FreeSync.
     
  6. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,476
    Likes Received:
    4,777
    GPU:
    2080Ti @h2o
    Well, for Store games certainly; not so sure about the others (DX12 games, that is, because of that unified pipeline). I haven't really been able to tell the difference, and haven't had enough time to read into it.
     
  7. AlmondMan

    AlmondMan Master Guru

    Messages:
    818
    Likes Received:
    216
    GPU:
    5700 XT Red Devil
    So I read up a little on HDR monitors, but what I could find in the first couple of hits doesn't really yield much info beyond what appears to be marketing buzzwords.

    I mean, what's the difference between an HDR monitor and the wide-color-gamut Dell U2711 I have? (Obviously this is an "old" monitor and newer ones provide better quality, but compared to a newer version of the same wide-gamut type of display.)
     
    Last edited: Mar 23, 2016
  8. zzzaac

    zzzaac Member

    Messages:
    39
    Likes Received:
    0
    GPU:
    -
    For me, at least so far, I haven't seen massive gains or improvements, though I guess the games I played weren't "pure" DirectX 12 games (ROTR, GOW).
     
  9. xIcarus

    xIcarus Master Guru

    Messages:
    954
    Likes Received:
    96
    GPU:
    1080 Ti AORUS
    So we're moving to OLED monitors, or..?

    In their current state, IPS and TN monitors do not have the necessary accuracy for the HDR described here, as far as I know.

    Interesting, but the presentation reads fishy to me.
     
    Last edited: Mar 23, 2016
  10. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    Realistically speaking... We have seen several DX12 implementations, and they really did not hit the mark as promised.
    So now, instead of classical HDR, we get a new level of blinding light. That's a gimmick at best. At worst, people will hate it.

    I am sure many people here have screens calibrated for use during the day, and even with proper calibration those screens feel a bit too bright at night. That's unless you have another light source in the room which sufficiently illuminates the wall/objects behind the screen to create a day-like feeling for the eye.

    I do not see myself being happy about a spot on screen which suddenly becomes 2-3 times brighter than the maximum brightness the screen was calibrated to.

    And in total... I see a list of features meant to serve as a smoke screen, to let everyone forget that DX12 failed at its main promise.
     

  11. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    You will need new monitors capable of the REC 2020 color space, and GPUs for it (so far only AMD has spoken about HDR support)...

    This said, most future UHD monitors should be 10-bit REC 2020, as it is part of the standard.

    http://www.businesswire.com/news/ho...Defines-Premium-Home-Entertainment-Experience

    https://en.wikipedia.org/wiki/Rec._2020

    The more important thing with HDR is the need and the will from game developers... It's incredible how much you can increase the image quality with it...
     
    Last edited: Mar 23, 2016
  12. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,693
    Likes Received:
    360
    GPU:
    MSI GTX1070 GamingX
    DX12 isn't a failure yet. What was promised as "better" performance has ended up with async compute touted as the important feature for real-world performance benefits. This isn't what DX12 was supposed to do. This one feature is also not exclusive to DX12; OpenGL and Vulkan can both use it. In effect, this negates Windows 10.

    We were promised orders of magnitude better performance under DX12 (especially if you look at early benchmarks). Overall, this hasn't actually happened. What it has actually done is cause developers to spend much more time tweaking code, whereas before some of this work was automated by the schedulers. DX12, then, has actually made devs' jobs harder, not simpler.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    13,527
    Likes Received:
    3,073
    GPU:
    EVGA RTX 3080
    I don't think it failed its main promise. I just think that, once again, gamers created unrealistic expectations for it.

    I don't recall Microsoft ever saying it would bring massive performance improvements across the board. I recall them saying that it would reduce CPU overhead. I recall them saying it would allow developers to have their games reach deeper into the architecture itself. That it would be consistent across multiple platforms, and that it would have a decent toolset to accompany it.

    Multiple times in the last few years I've said that it wouldn't make much of a difference in GPU-heavy games. The performance improvement you see in games like Ashes is a combination of RTS games being heavily CPU-throttled and Oxide making good use of Async Compute to eliminate that bottleneck by offloading more of the CPU calculations to the GPU. Again, I've stated multiple times that that game, and RTS in general, is literally the perfect genre to exploit DX12. MMOs might be a close second. Most FPS/RPGs/etc. are GPU-bound, though.
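    The CPU-bound vs GPU-bound point above can be sketched as a toy model: frame rate is limited by whichever side is slower, so cutting CPU submission cost only helps when the CPU was the bottleneck. The numbers below are illustrative, not measurements from any real game.

```python
# Toy model: CPU and GPU work overlap per frame; the slower side sets the fps.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower of CPU and GPU limits the frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# RTS-like scene: thousands of draw calls, CPU-bound under DX11.
print(fps(cpu_ms=20.0, gpu_ms=8.0))   # DX11: 50 fps, GPU sits half idle
print(fps(cpu_ms=6.0,  gpu_ms=8.0))   # DX12 slashes submission cost: 125 fps

# GPU-bound FPS/RPG: the same CPU saving barely moves the needle.
print(fps(cpu_ms=10.0, gpu_ms=16.0))  # DX11: 62.5 fps
print(fps(cpu_ms=4.0,  gpu_ms=16.0))  # DX12: still 62.5 fps
```

    Which is roughly why Ashes-style CPU-heavy workloads show big DX12 wins while GPU-bound titles show almost none.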

    In games like Hitman, which AMD calls "the best use of Async Compute", the maximum difference between DX11 and DX12 is on the 390, at ~10%. Most GPUs only see a ~3% gain; some see none. Then I see multiple posts here on Guru3D going "Where is DX12's performance? Must be immature drivers." Nope, that's probably actually it. By the time game engines really start taking advantage of DX12's low-level features and creating some new rendering methods, the DX11/DX12 comparisons will be gone, because there will be no DX11 variant. Without the comparison, any visual gain is essentially lost in terms of gamers being able to see it.

    I personally think DX12 did exactly what it promised. People just falsely expected more.

    Tom Petersen from Nvidia indirectly said on PCPer's podcast that Pascal will support it.
     
    Last edited: Mar 23, 2016
  14. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,623
    Likes Received:
    258
    GPU:
    RX 580 8GB
    Another possibility is that PC low-level APIs will give us effectively more powerful GPUs, as developers won't need to be as careful about draw calls bottlenecking the driver thread(s); that should increase game hardware requirements. VR and UHD are helping too.
     
    Last edited: Mar 23, 2016
  15. nz3777

    nz3777 Ancient Guru

    Messages:
    2,468
    Likes Received:
    210
    GPU:
    Gtx 980 Strix
    They will have to do a lot more than what's mentioned above to sell me on DX12, I'm sorry. First you need a DX12-capable card (GPU), then Windows 10 for obvious reasons, which I do not like at all; I will stay with 7 until I have no choice. Where are the performance benefits they bragged about so much?

    How come v-sync does not work with DX12? That is a MAJOR kick in the balls!
     

  16. Juliuszek

    Juliuszek Member Guru

    Messages:
    106
    Likes Received:
    2
    GPU:
    GTX1060/3GB
    So I need an HDR display now? And I was sooo happy when I got my G-Sync monitor at the end of 2014... I thought it was going to last a couple of years longer...
    Hmmm - maybe my old CRT will be able to do it :) You could adjust the brightness to unimaginable levels on that thing :)
     
  17. Denial

    Denial Ancient Guru

    Messages:
    13,527
    Likes Received:
    3,073
    GPU:
    EVGA RTX 3080
    Idk, my Samsung 8500 has HDR, although I guess the newer Samsungs announced this year are supposed to be better at it. I watched an HDR demo on it, and while it was kind of cool, it wasn't something I'd want on all the time. I personally think it's overrated.

    I'd rather monitor companies sell me a $100 premium on a guarantee that my monitor will come out of the box with zero defects: no backlight bleed, sub-10% delta in gray uniformity, factory calibrated to sRGB, zero dead pixels.

    I had to go through multiple ROG Swifts and multiple XB270HUs before I found a decent one. And even my current one still has some bleed in the lower-left corner. Unacceptable for an $800 monitor.
     
  18. kegastaMmer

    kegastaMmer Master Guru

    Messages:
    326
    Likes Received:
    40
    GPU:
    strix 1070 GTX
    Geez, I too am glad I never threw mine out the window! Gonna try it with the Fermi DX12 drivers... oh wait.
     
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    Thing is, we have not seen that reduced CPU utilization. Have we? The UE4 Infiltrator demo tanked in DX12 the same way as it did in DX11 upon reaching the last outdoor scenes, where GPU utilization went down to nothing as the CPU could not keep up.

    I want to see people get the same performance with regular, much cheaper 3.2GHz CPUs under DX12 as they do with a 4.5GHz OC CPU under DX11.

    We have read about new concepts for graphics, and we have seen "new" ways to use GPUs. But where is that reduced use of the CPU?

    It would deliver a great performance improvement for Intel's/AMD's SoCs, as in those the CPU and iGPU are fighting over a limited TDP. And DX12 was supposed to free up this CPU TDP headroom and allow the iGPU to truly shine.
     
  20. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,693
    Likes Received:
    360
    GPU:
    MSI GTX1070 GamingX
