DirectX 12 Adoption Big for Developers, Microsoft Shares New Info

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 23, 2016.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I don't know. I keep waiting to see Hitman/Tomb Raider CPU scaling tests, but no one does them.

    I feel like DX12 UE games are irrelevant at the moment. By Epic's own admission, their engine does not support DX12 yet. On the forums they are saying it might be feature-complete by 4.12, and they aren't even shipping 4.11 yet.

    https://trello.com/b/gHooNW9I/ue4-roadmap

    4.11 apparently has a bunch of new DX12 stuff, but it's still not "officially supported".

    Yeah, in CPU tests, and not only that, but the gains are tied to very specific CPU functions. If those specific functions are not bottlenecking performance, then why would speeding them up increase the performance of a game?

    I mean, that's the issue I have with what people are saying. Yeah, we saw Intel render 50K unique asteroids on its processor, swap to indirect drawing, and get 3x the framerate, but no real game is doing that, aside from maybe Ashes. In a real game you're going to instance like 90% of those asteroids and reduce the draw calls to nearly nothing. DX12 won't make a difference in that case.
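    To make that concrete, here's a minimal C++/D3D12 sketch (my illustration, not from the article; the counts and the bound state are assumptions): drawing 50K unique asteroids costs 50K draw calls, while instancing collapses them into one.

    Code:
    #include <d3d12.h>

    // Sketch only: assumes 'cmdList' is an open ID3D12GraphicsCommandList with
    // root signature, PSO, and vertex/index buffers already bound.
    constexpr UINT kAsteroidCount  = 50000; // hypothetical scene size
    constexpr UINT kIndicesPerMesh = 3000;  // hypothetical mesh size

    // Naive path: one draw per unique asteroid. Validating and submitting
    // 50K calls is exactly the CPU overhead DX12 is meant to shrink.
    void DrawUnique(ID3D12GraphicsCommandList* cmdList)
    {
        for (UINT i = 0; i < kAsteroidCount; ++i)
        {
            // Per-asteroid transform/material would be set here, e.g. via
            // SetGraphicsRoot32BitConstants or a root CBV per draw.
            cmdList->DrawIndexedInstanced(kIndicesPerMesh, 1, 0, 0, 0);
        }
    }

    // Instanced path: same geometry repeated, per-instance data living in an
    // instance buffer, one call. The draw-call count drops to almost nothing,
    // so lower per-call overhead buys you little.
    void DrawAllInstanced(ID3D12GraphicsCommandList* cmdList)
    {
        cmdList->DrawIndexedInstanced(kIndicesPerMesh, kAsteroidCount, 0, 0, 0);
    }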
     
    Last edited: Mar 23, 2016
  2. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
    Give me my OLED... then I can buy into that HDR thing.
     
  3. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Yes, I know UE4 was not much of a DX12 engine at the time Infiltrator came out, but I believe the people working on it are doing their best to keep it the most popular engine.

    I keep an eye on UE4 through the news on ARK: Survival Evolved's planned patches, and they keep pushing DX12 further and further away.

    It matters little if Hitman does 5~10% better on a high-end GPU when it does not deliver a greatly reduced CPU load. That same Hitman with a very low CPU requirement would run on notebooks that have only an iGPU, since the APU's TDP budget could go to graphics rendering instead of to managing rendering paths.

    Reduced CPU overhead may deliver maybe 15~20% better desktop performance, but on mobile we may see twice the fps we had before. (That's for Atoms. Where it may lead for APUs, I do not know.)
     
  4. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Is there a benchmark of Hitman running on an APU? The old Star Swarm benchmarks show APUs running double under DX12. I haven't seen a Hitman APU benchmark though. I don't know if we would see double in Hitman, but I imagine we would see a much larger increase than we see on dGPUs.
     

  5. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,750
    Likes Received:
    9,641
    GPU:
    4090@H2O
    QFT.

    What I had been thinking after the first DX12 demos / benches / games launched (and had almost forgotten, but remembered lately): if you don't run into any bottlenecks with a high-performance gaming rig under DX11 (as in, a top DX11 gaming rig), I wouldn't expect huge gains under DX12.

    Also, don't forget, how should Intel sell those expensive CPUs to gamers if you don't need them anymore? :D
     
  6. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    I have no doubt about Pascal supporting it, or at least being able to output it, because it is the next leap in display image quality.
     
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The entire DX12 business is being promoted the wrong way, and it makes me feel that the industry is changing direction and heading somewhere we do not need it to go.

    As for tests, not even NotebookCheck makes a distinction between DX11 and DX12 in its game tests for Hitman 2016 and RotTR.
     
  8. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Hilbert, would it be possible to get an APU test for Hitman, DX11 vs DX12? I'd be interested in seeing it.

    I'm kind of blown away by the fact that I can't find a single APU test for Ashes, Tomb Raider, Gears, or Hitman. The benefit for APUs was one of the biggest advertising points going into DX12's announcement. Now I can't find anything.

    Reporting my own post to get his attention lol
     
    Last edited: Mar 23, 2016
  9. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
    Hmm, new monitors?

    G-Sync and FreeSync did not have much success, so they (the industry) want to introduce new monitors (like 3D TVs some years ago), and then it's more money wasted on new monitors.

    Why don't they put that new HDR thing in the GPU, in post-processing, or in the drivers, so that current monitors can display it?
    More money, more problems.
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Uh, because the monitor has to be able to display a higher brightness? That's like asking why they don't put 4K in graphics cards so nobody has to buy a new monitor. It's not physically possible.
     

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    DX12 is fine. People KEEP confusing it with the push for Windows Store games that Microsoft is going for. The whole promise of DX12 is that you get almost console-like, low-level utilization of the computer. One of the side effects is that AMD's GCN architecture can finally be "fed" more effectively, but that's not really the reason for it. The reason is to lower latencies, drop power consumption, and/or create CPU-based effects that can finally use all the CPU power that modern computers have.
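    As a rough sketch of what "use all the CPU power" means in practice (my illustration, not from the article; device/queue creation and error handling are assumed elsewhere): D3D12 lets every thread record its own command list against its own allocator, where DX11 funneled everything through one immediate context.

    Code:
    #include <d3d12.h>
    #include <thread>
    #include <vector>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    // Sketch: parallel command-list recording, the core of DX12's CPU-side win.
    void RecordAndSubmit(ID3D12Device* device, ID3D12CommandQueue* queue)
    {
        const int kThreads = 4;
        std::vector<ComPtr<ID3D12CommandAllocator>>    allocs(kThreads);
        std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
        std::vector<std::thread> workers;

        for (int i = 0; i < kThreads; ++i)
        {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocs[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocs[i].Get(), nullptr,
                                      IID_PPV_ARGS(&lists[i]));
            // Each thread records into its own list; no driver-side lock.
            workers.emplace_back([list = lists[i].Get()] {
                // ... record draws/dispatches for this thread's scene slice ...
                list->Close();
            });
        }
        for (auto& w : workers) w.join();

        // Submit everything to the GPU in one call.
        std::vector<ID3D12CommandList*> raw;
        for (auto& l : lists) raw.push_back(l.Get());
        queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    }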
    As for the rest of the news, the MOST important announcement was Shader Model 6. Will it require new hardware? Will current hardware support it correctly?
     
  12. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500

    I understand, but I think this will be another over-hyped product on the market, and after maybe 2-3 years they will cancel it.
     
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,750
    Likes Received:
    9,641
    GPU:
    4090@H2O
    Probably not; it could be like it was back then with SM 2.0 and 3.0 (2.0 wouldn't start BioShock without a workaround / patch, for instance; I did this myself back then with an SM 2.0 card, a Sapphire 4850 Toxic I think).
    Thinking further, will Polaris and Pascal be SM 6.0 compliant, or will we have to wait until 2017 to see GPUs that fully support the new SM?
     
  14. slyphnier

    slyphnier Guest

    Messages:
    813
    Likes Received:
    71
    GPU:
    GTX1070
  15. MBTP

    MBTP Member Guru

    Messages:
    143
    Likes Received:
    11
    GPU:
    Sapphire RX590
    People still don't understand how important DX12 is, and I'm not the one who will make it clear; there's just too much info on the internet already. HDR will not be a gimmick; it will be like the move from black and white to color in the past. VR will not be a gimmick either, although it won't be feasible for the majority of content and people for now, as it cuts people off from reality.
    The major problem with HDR is that you will need a special room with little to no light to really get the experience it is capable of delivering. OLEDs should be watched in full darkness, and the other display types will tolerate a little more ambient light. Curved displays will help get rid of reflections, I guess, but they will still be very prominent.
     

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I suspect that we might not actually need new hardware. From the information I found in the TechPowerUp article about it, it seems that it's just accessing the same hardware in a new way, more consistent with the new API, and it uses tiled resources. If that's the case, it doesn't seem to have extra hardware requirements.
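    If that's right, it should be answerable with a capability query rather than a new card. A sketch of the query (my illustration, against the D3D12 headers that later SDK updates expose; not from the article):

    Code:
    #include <d3d12.h>

    // Sketch: ask the runtime/driver for the highest shader model it exposes.
    // Whether existing GPUs get SM6 then becomes a driver question,
    // not a hardware purchase.
    bool SupportsShaderModel6(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_SHADER_MODEL sm = { D3D_SHADER_MODEL_6_0 };
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_SHADER_MODEL,
                                               &sm, sizeof(sm))))
            return false; // older runtime: treat as unsupported
        return sm.HighestShaderModel >= D3D_SHADER_MODEL_6_0;
    }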

    Interestingly, I saw a presentation about HDR from the Radeon Group recently. They seem to keep cooking up DX12 features closely with Microsoft.
     
    Last edited: Mar 23, 2016
  17. MBTP

    MBTP Member Guru

    Messages:
    143
    Likes Received:
    11
    GPU:
    Sapphire RX590
    Why isn't it? It definitely is; some monitors have enough contrast to produce something close to it. It's just a matter of the standards and the hardware acceleration needed to make the process faster, produce less heat, and consume fewer resources.
    LOL
     
  18. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    What are you talking about? All it needs is a compatible decode engine as far as hardware goes. Most monitors cap out at 350-400 nits. The Samsung HDR displays and future HDR displays are boosting up to 1000 nits and have a lower black level. Can you artificially boost contrast? I guess, if you enjoy a washed-out garbage image.

    That's like saying DSR 4K is the same as a 4K monitor. Sorry but no. And no amount of random "LOL"s at the end of your posts will change that.
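    For what it's worth, the display side is something the swap chain has to negotiate with the monitor, not something a driver can fake. A rough DXGI sketch (my illustration; enum names from the Windows 10 headers):

    Code:
    #include <dxgi1_4.h>

    // Sketch: before switching to HDR10 output (ST 2084 / BT.2020), the app
    // asks whether the current output actually supports that color space.
    // On an SDR-only display the answer is no, and boosting contrast in
    // software just crushes or washes out the image.
    bool EnableHdr10IfSupported(IDXGISwapChain3* swapChain)
    {
        const DXGI_COLOR_SPACE_TYPE hdr10 =
            DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

        UINT support = 0;
        if (FAILED(swapChain->CheckColorSpaceSupport(hdr10, &support)) ||
            !(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
            return false; // display can't take an HDR10 signal

        return SUCCEEDED(swapChain->SetColorSpace1(hdr10));
    }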
     
  19. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    It's not that "cutting-edge" until it can burn your retina. Then, we'll have real retina displays :D
     
  20. TimmyP

    TimmyP Guest

    Messages:
    1,398
    Likes Received:
    250
    GPU:
    RTX 3070
    It's not even Windows Store games.

    Vsync is perfectly fine in DX12 / Windows Store games. Borderless fullscreen = vsync on.

    The problem is that you can't turn it off. That will be fixed once exclusive fullscreen gets implemented.
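    For reference, a sketch of where that knob lives (my illustration; the tearing flag is what Microsoft later shipped in a DXGI update, and it requires the swap chain to be created with DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING):

    Code:
    #include <dxgi1_5.h>

    // Sketch: vsync in the flip model. SyncInterval=1 waits for vblank
    // (the "borderless fullscreen = vsync on" behavior); SyncInterval=0
    // only actually tears if tearing support is present (windowed/borderless).
    void PresentFrame(IDXGISwapChain1* swapChain, bool vsync, bool tearingOk)
    {
        if (vsync || !tearingOk)
            swapChain->Present(1, 0);                          // locked to refresh
        else
            swapChain->Present(0, DXGI_PRESENT_ALLOW_TEARING); // vsync off
    }

    // Tearing support itself is a factory-level query:
    bool QueryTearingSupport(IDXGIFactory5* factory)
    {
        BOOL allow = FALSE;
        if (FAILED(factory->CheckFeatureSupport(
                DXGI_FEATURE_PRESENT_ALLOW_TEARING, &allow, sizeof(allow))))
            allow = FALSE;
        return allow == TRUE;
    }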
     
