DirectX 12 Adoption Big for Developers - Microsoft Shares New Info

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 23, 2016.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,889
    Likes Received:
    755
    GPU:
    Inno3D RTX 3090
You can. NVIDIA has it implemented, and AMD did too in their latest driver. From that point on, it's in the hands of the developers.
     
  2. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,113
    Likes Received:
    634
    GPU:
    .
SM6 is more about HLSL language and compiler features; the comparison with the SM2-SM3 transition is unsuitable. Yes, new shader semantics will be added, but support for most of them should vary across current hardware (though I am not aware of which hardware will support what). Finally, we are moving away from SM4.0 bytecode (which is the current base for SM 4.1, 5.0 and 5.1).
    Hopefully SM6.0 and the new compiler will be better suited to GPUs than SPIR-V, which exposes a lot of OpenCL horrors (though with OCL 2.x some cleanup has been applied). I really hope DirectCompute will become a valid alternative to OpenCL and CUDA, which it currently is not, essentially due to missing a lot of core features...

The upcoming version of DXGI (1.6 I guess; 2.0 sounds too much like "everything is broken") will also allow a swap-chain to present at an uncapped frame rate via the immediate/independent flip model (aka borderless maximized window), like it currently does in all full-screen modes.
     
    Last edited: Mar 23, 2016
  3. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    9,860
    Likes Received:
    3,725
    GPU:
    ASUS RX 470 Strix
    I am more interested in Vulkan to be honest.
     
  4. __hollywood|meo

    __hollywood|meo Ancient Guru

    Messages:
    2,990
    Likes Received:
    139
    GPU:
    MSI 970 @1.55ghz
HDR? They could talk about anything, but they focus on HDR? Is this a joke?

If somebody responds to me mentioning vibrancy or immersion or something, I will smack 'em :D

This is of crucial importance in the near future as we hit the brick wall of die shrinks with the current limitations of silicon. We need more efficient CPU utilization and properly supported parallel threads.

I'd be interested to see the specs. Are they floating around somewhere? I haven't looked. I literally could have looked it up instead of typing this. I'm lazy :wanker:
     

  5. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,183
    Likes Received:
    940
    GPU:
    EVGA 1070Ti Black
Did they really say the "average consumer" can't see the difference between 1080p and 4K??

    The only way that is possible is if the feed going to the 4K TV is really 1080p or lower, which most are (lower than 1080p, short of Blu-ray), in which case the TV is just upscaling. Or they are blind as a bat, because there is a night-and-day difference between a 1080p feed scaled to 4K and an actual 4K feed sent to a 4K TV. At this point almost all feeds from over-the-air/cable/satellite TV are 720p, with the rare exception of some being 1080i and fewer being 1080p, and the stores trying to sell 4K TVs mostly send those same feeds to them. So yeah, people are not going to see a difference if no actual 4K feed is sent to these TVs when they look at them.
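To put rough numbers on the scaling gap described above (a back-of-the-envelope sketch; the figures are just the standard 720p/1080p/UHD pixel grids, nothing specific to any particular TV):

```python
# Pixel counts of common feed resolutions vs. a 4K UHD panel, and how many
# panel pixels each source pixel must cover when the TV upscales the feed.
resolutions = {
    "720p":   (1280, 720),
    "1080p":  (1920, 1080),
    "4K UHD": (3840, 2160),
}

uhd_pixels = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px, upscale factor to 4K: {uhd_pixels / pixels:.1f}x")
```

A 720p broadcast feed carries one ninth of the pixels of the 4K panel showing it, which is why store demos running broadcast content look nothing like a native 4K source.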

My dad has a 65" Sony XBR 4K TV that is 5" bigger than his old 1080i Sony XBR projection/LCD TV, and 720p-1080i/p look worse on his 4K TV than they did on his old TV. Blu-ray scaled to 4K looks as good as Blu-rays on an actual 1080p TV, but an actual 4K feed sent to that 4K TV is a night-and-day difference.

As for DX12, the only thing I want from it is for it to put an end to DX9 being used forever; hell, everything below DX11 should no longer be used.
     
    Last edited: Mar 23, 2016
  6. MBTP

    MBTP Member Guru

    Messages:
    142
    Likes Received:
    10
    GPU:
    Sapphire RX590
It's quite different. We will probably see many TVs now claiming to be HDR capable that will not meet the true standard specs. In reality, no games, TV shows, or movies take full advantage of panels with contrast ratios higher than 1000:1.
    You should be prepared for the "1 billion nits master race": a minority will really deliver those nits to true spec, while most will have a variety of problems with blooming, backlight bleeding and so on...
    What I'm saying is that you don't really need those 1000 nits to perceive the difference, and it can be emulated on normal TVs and monitors, but they will not achieve the perfect envisioned image, as 99% of today's monitors don't do justice to the actual standard specs. That's why CRTs are still used in professional applications and monitoring.
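As a hedged illustration of why black level matters as much as peak nits: dynamic range in photographic stops is log2(peak/black). The luminance figures below are assumptions for a typical SDR LCD versus a hypothetical HDR panel, not measurements of any real display.

```python
import math

# Dynamic range in photographic stops = log2(peak luminance / black level).
def stops(peak_nits, black_nits):
    return math.log2(peak_nits / black_nits)

# Assumed figures: a 1000:1 SDR LCD (250 nits peak, 0.25 nits black)
# vs. a hypothetical HDR panel (1000 nits peak, 0.05 nits black).
sdr = stops(250, 0.25)    # ~10.0 stops
hdr = stops(1000, 0.05)   # ~14.3 stops
print(f"SDR: {sdr:.1f} stops, HDR: {hdr:.1f} stops")
```

Note that raising peak brightness 4x while the backlight still bleeds (i.e., the black level rises too) adds far fewer stops than the marketing number suggests.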
     
  7. KingK76

    KingK76 Member Guru

    Messages:
    107
    Likes Received:
    12
    GPU:
    Pascal Titan X Under H2O
This point: "Microsoft predicts that adoption of High Dynamic Range will be faster than that of 4K. It's more clearly noticeable to users than 4K, with several TV models that have begun shipping. It's very difficult for the average consumer to tell the difference between 1080p and 4K unless they get very close to the screen, whereas HDR is something that end users, not just professional graphics artists, can see the advantage of." is bogus to me... I can report that the difference between 1080p and 4K is MASSIVE on my 70" Vizio P Series. Maybe not so much on small computer monitors, but on an HDTV the difference is huge. I am talking about gaming, though, and not regular video... Haven't experienced HDR yet, but it does look interesting. Still, 4K FTW!!!
     
  8. GenClaymore

    GenClaymore Ancient Guru

    Messages:
    6,070
    Likes Received:
    52
    GPU:
    XFX 5700 XT Thic II
I'd be even happier if either of those 144 Hz or 168 Hz displays supported 3D Vision 2; then I'd be game. But that would be wishful thinking.
     
  9. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,474
    Likes Received:
    4,776
    GPU:
    2080Ti @h2o
Well, with a screen that big you could probably feel the pixel-size difference with your fingers; 1080p has to be horrid on that :D
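For what it's worth, the pixel size on a screen that big is easy to estimate (a small sketch assuming a flat 16:9 panel at the 70" diagonal mentioned in the post above):

```python
import math

# Approximate pixel pitch (mm per pixel) for a 16:9 panel of a given diagonal.
def pixel_pitch_mm(diagonal_inch, horizontal_px, vertical_px):
    # Pixel-grid diagonal via Pythagoras; pitch = physical diagonal / pixel diagonal.
    diag_px = math.hypot(horizontal_px, vertical_px)
    return diagonal_inch * 25.4 / diag_px

print(f'70" 1080p: {pixel_pitch_mm(70, 1920, 1080):.2f} mm per pixel')  # ~0.81 mm
print(f'70" 4K:    {pixel_pitch_mm(70, 3840, 2160):.2f} mm per pixel')  # ~0.40 mm
```

At roughly 0.8 mm per pixel, a 70" 1080p panel really does have pixels you can pick out with the naked eye at couch distance; 4K halves the pitch in each direction.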

But yeah, it's BS. They just want people to go head over heels for anything new they do, whether or not it makes sense at all.
     
  10. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    4,375
    Likes Received:
    1,139
    GPU:
    3070 GTX
    :cop:

    I think you mean farther and farther away.

OT: yes, according to Epic and others, they do not really appear to care about DX12.

The thing about Unreal Engine is that it is quite clearly a DX9 engine with PBR bolted on top. I think that is the case with so many of the game engines out there. It doesn't even appear to be multi-core; it's more like a single-core design with additional cores bolted onto it.

    Snowdrop is the only engine which looks as if it was coded brand new from the ground up with multicore and DX12 in mind. All other engines are a rehash of existing code.
     

  11. __hollywood|meo

    __hollywood|meo Ancient Guru

    Messages:
    2,990
    Likes Received:
    139
    GPU:
    MSI 970 @1.55ghz
Nah, he's right :p "Farther" refers to tangible distance, while "further" implies time... or another type of figurative abstraction.

    /grammar nazi
     
  12. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    42,121
    Likes Received:
    10,096
    GPU:
    AMD | NVIDIA
Thing is, how many people here who visit Guru3D really use an APU for gaming? I mean, 99.99% of you have a dedicated graphics card installed. I'll consider it though, and if I can find some free time I'll install and test it.
     
  13. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    4,375
    Likes Received:
    1,139
    GPU:
    3070 GTX
    If you push something away from you - that's distance, if one is speaking figuratively then it must be stated, otherwise it lacks precision and clarity.
     
  14. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,889
    Likes Received:
    755
    GPU:
    Inno3D RTX 3090
There are a lot of us whose inner geek simply has questions. There isn't any other website where we get this kind of access to reviewers as we do with you, man.
     
  15. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
Hilbert, thing is... we see very small performance gains on those high-end systems because they are not limited by TDP.
    But Intel's and AMD's SoCs are more often than not TDP-starved. AMD mobile chips share something like 60-70% of their TDP between CPU and iGPU; Intel chips fight over a much higher TDP than their paper rating.

    By that I mean a 2W SDP Atom, which idles at 0.7W, eats 2.7W (and downclocks from 2.14GHz to 1.44GHz) while doing heavy CPU-based calculations.
    When only the iGPU is under heavy load, it eats 3.5W (and downclocks from 600MHz to 400MHz).
    When the chip is fully loaded, CPU+iGPU, the thing eats 5.0W.

    Then you run ThrottleStop and the CPU can stay at 2.24GHz while eating 5.4W (nearly doubling real-world performance).
    And the iGPU can stay at 600MHz too, while eating 5.1W under load.
    When the chip is fully loaded, CPU+iGPU, the thing eats 9.4W with ThrottleStop (still some starvation: 2.1GHz CPU + 540MHz iGPU).

    And that's where freeing up CPU resources truly shows its impact. There are Vulkan benchmarks from Imagination Tech running on mobile devices with reduced CPU overhead; that means the CPU-side cost of driving the GPU is much lower.
    CPU utilization vs. fps tells the entire story of Vulkan. It would be interesting to read the same story for DX12.
     

  16. Denial

    Denial Ancient Guru

    Messages:
    13,527
    Likes Received:
    3,073
    GPU:
    EVGA RTX 3080
    Couldn't have said it better myself.

I personally would never use an APU for gaming, but I'd like to see how DX12 affects lower-end CPU/GPU combinations.
     
  17. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
Don't say never. In a year we may have mobile APUs with surprisingly high performance :)

    AMD must have a reason for not offering a consumer card lower than the R7 360 (768 SPs + 128-bit bus). Lower parts are only for OEMs, or are old unsold and already obsolete chips.
     
  18. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,474
    Likes Received:
    4,776
    GPU:
    2080Ti @h2o
    This.
     
  19. sykozis

    sykozis Ancient Guru

    Messages:
    21,994
    Likes Received:
    1,181
    GPU:
    MSI RX5700
    Thank you Hilbert!!

AMD's mobile chips suck for gaming performance. If the GPU is under heavy load, the chip (both CPU and GPU) has to downclock due to power limitations. The result is an A8-6410 that's slower than an Athlon 5350...

    I have a laptop that has an A8-6410 APU in it.... I also have an Athlon 5350 sitting here... I'd love to see a DX12 game that could actually run on either of them. Currently, the A8-6410 struggles to run Torchlight 2, even though the Athlon 5350 runs it just fine.
     
  20. Bladeforce

    Bladeforce Active Member

    Messages:
    50
    Likes Received:
    4
    GPU:
    nVidia Titan
    People still fall for this BS from Microsoft?
     
