Crytek releases Neon Noir Ray Tracing Benchmark

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 14, 2019.

  1. Astyanax

    Astyanax Ancient Guru

    Messages:
    5,397
    Likes Received:
    1,602
    GPU:
    GTX 1080ti
Ray tracing is not done the way you think it's done. The RT cores are for sorting rays (is this on screen, or isn't it on screen?), and the results get passed on to the next stage of rendering, which is done in the traditional way.

The shadow, reflection, or lighting you end up with gets done by the traditional shading pipe.
You can sort faster, but it's the traditional shaders that are the bottleneck.

    https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/5
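To illustrate the split being described, here is a toy sketch in Python (my own simplified model, not Nvidia's actual hardware design): an "RT stage" only answers the visibility question (which surface does this ray hit?), and a conventional "shading stage" then computes the final color, just as rasterized shading would.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """RT-stage work: ray/sphere hit test; returns distance t or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(normal, light_dir):
    """Shading-stage work: plain Lambert diffuse, i.e. the traditional pipe."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# One ray: the RT stage sorts out *what* is visible...
origin, direction = [0, 0, 0], [0, 0, 1]
center, radius = [0, 0, 5], 1.0
t = intersect_sphere(origin, direction, center, radius)
# ...then the shading stage decides what it *looks like*.
if t is not None:
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    brightness = shade(normal, [0, 0, -1])
```

The point of the split: making `intersect_sphere` faster (hardware ray sorting) does nothing for the cost of `shade`, which still runs on the ordinary shader cores.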
     
  2. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    7,795
    Likes Received:
    268
    GPU:
    Zotac GTX1080Ti AMP
This makes more sense to me. If this is the case, then Turing was an even bigger mistake. Why on earth put RT/Tensor cores onto your chip when they end up starving your CUDA cores and in turn massively decreasing performance? On the one hand they were touting Turing as a 4K 60fps monster; on the other hand they were pushing RTX, which tanks performance so the cards become 1080p cards... More proof the tech is not ready. It's like Bugatti making a new car that can do 1000mph... but when it does, it explodes...

Also, Mantle was waaaaay more ready than RTX. I had an HD 7970 GHz GPU when Mantle came out, and in BF4 my performance shot up: I was able to max the game out at Ultra settings at 1080p and gained about ±30% better performance with MUCH higher minimum frame rates. Mantle eventually pushed the industry to adopt low-level APIs after years and years of massive overheads, which eventually became Vulkan, and Microsoft followed suit with DX12. Not to mention AMD basically gave their Mantle code away to the Khronos Group. Can you see Nvidia doing the same with their tech? Unless it becomes unprofitable, they will never give it away to benefit the whole industry, just their wallets.

    All RTX has done is show how far behind we are with the hardware needed to run this properly.
     
  3. Cyberdyne

    Cyberdyne Ancient Guru

    Messages:
    3,402
    Likes Received:
    189
    GPU:
    2080 Ti FTW3 Ultra
Turing is both of those things, just not at the same time. I never felt misled by their marketing; I certainly was not expecting the 2080 Ti to do 4K 60fps with RT.
But Turing is the best at 4K, and it's the best at RT. Real-time ray tracing has to start somewhere, and NV is willing to invest.

Mantle worked out of the gate, and RT also works out of the gate. "It just works!" lol. Mantle was focused on more FPS; RT never made such claims. RT offers ray tracing in real time, and it does that.
RT is also not proprietary. When AMD supports RT, these current RTX games will work on AMD GPUs out of the box. That's been the case since RTX was a thing; can't say the same thing about Mantle.
     
  4. Cyberdyne

    Cyberdyne Ancient Guru

    Messages:
    3,402
    Likes Received:
    189
    GPU:
    2080 Ti FTW3 Ultra
Idk, that depends on whether that sorting is fast enough, right? There's a reason they increase the number of RT cores per GPU; clearly they are in contention as well.
     

  5. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    1,737
    Likes Received:
    443
    GPU:
    RTX 2080 Ti FE
    I don’t look at Turing as a failure like most do. It was ahead of the curve in tech and is able to process INT and Float simultaneously.

    Ampere will be amazing compared to Turing at an architectural level in ways we have yet to understand. Turing’s other features are slowly making their way into DX12 and Vulkan. Mesh shaders and adaptive shading will be huge when they’re used in a few years.
     
    Aura89 likes this.
  6. no_1_dave

    no_1_dave Master Guru

    Messages:
    220
    Likes Received:
    4
    GPU:
    1080 Ti
This is more proof that the 1080 Ti was a DX11 powerhouse, but in DX12 it falls short.
     
    angelgraves13 likes this.
  7. Cyberdyne

    Cyberdyne Ancient Guru

    Messages:
    3,402
    Likes Received:
    189
    GPU:
    2080 Ti FTW3 Ultra
Pretty sure that's a DX11 benchmark.
     
    SpajdrEX likes this.
  8. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,389
    Likes Received:
    3,408
    GPU:
    2080Ti @h2o
It falls short on compute (which we knew, compared to AMD for instance). Since this benchmark doesn't require DX12 / DXR, I doubt it shows much about Pascal's DX12 performance; just compute?
     
  9. XenthorX

    XenthorX Ancient Guru

    Messages:
    2,928
    Likes Received:
    828
    GPU:
    EVGA XCUltra 2080Ti
If I get this right, the demo features a mix of reflection solutions, seamlessly switching between different techniques at runtime to optimize performance?
     
  10. fellix

    fellix Member Guru

    Messages:
    176
    Likes Received:
    13
    GPU:
    KFA² GTX 1080 Ti
    The RT solution here is only used for sharp single-bounce reflections (high ray coherency) with very few dynamic objects, all placed in a narrow/closed low-polygon environment with aggressive LOD scaling. The goal is to minimize the traversal structure re-building (for each frame update) and avoid ray scattering. In other words, the demo itself is very closely designed around the tech's performance limitations.
     
    Denial and XenthorX like this.

  11. SpajdrEX

    SpajdrEX AMD Vanguard

    Messages:
    2,257
    Likes Received:
    592
    GPU:
    Sapphire RX5700XT
DX12 and Vulkan support is planned for next year; only the DX11 version is available for now.
     
  12. JASON_Q

    JASON_Q New Member

    Messages:
    3
    Likes Received:
    0
    GPU:
    ZOTAC RTX 2070
    ZOTAC RTX 2070 AMP
    i5 2500k @4.8GHz
    P8Z68-V PRO
    16GB DDR3 2200MHz


    1080P - ULTRA - 9321 points

The benchmark uses ONLY DirectX 11 (you can run it on Windows 7 and ANY DX11 graphics card).
It does not use the Nvidia hardware RT cores (Nvidia 16/20-series cards).
It does not use DXR on Windows 10 (Windows 10 1803 or higher).
It does not use Vulkan ray tracing on Windows 7/10.
     
  13. Valerys

    Valerys Master Guru

    Messages:
    379
    Likes Received:
    14
    GPU:
    Gigabyte RTX2080S
I'm worried that my RTX 2080 Ti at 4K runs about the same as a GTX 1060 at 1080p. We really need a ray scaler in the future; I'd feel more comfortable with slightly noisier/less accurate ray-tracing effects alongside fully rasterized shaders. Right now it's all or nothing, of sorts.
Like in this demo, the reflections are rendered at 1/4 of the screen resolution. At 4K I wouldn't mind having them at 720p-like quality instead of 1080p; with temporal reconstruction the visual hit would likely be a lot smaller than the performance gain.
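The quarter-resolution trick mentioned above can be sketched in a few lines of Python (my own toy model, not Crytek's code): trace reflections at half width and half height, which is 1/4 of the pixels, then upsample back to full resolution. A real engine would use temporal reconstruction instead of the nearest-neighbour upsample shown here.

```python
def render_reflections_quarter(width, height, trace):
    """Call the expensive trace() only for every other pixel in x and y."""
    half = [[trace(x, y) for x in range(0, width, 2)]
            for y in range(0, height, 2)]
    # Nearest-neighbour upsample back to full resolution.
    return [[half[y // 2][x // 2] for x in range(width)]
            for y in range(height)]

# The expensive trace runs width*height/4 times instead of width*height.
calls = []
image = render_reflections_quarter(
    8, 8, lambda x, y: calls.append((x, y)) or (x + y))
```

For an 8x8 target, `trace` is invoked only 16 times instead of 64; that 4x saving is exactly the performance/quality trade being discussed.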
     
  14. seaplane pilot

    seaplane pilot Maha Guru

    Messages:
    1,295
    Likes Received:
    2
    GPU:
    2080Ti Strix
    3440x1440
    Fullscreen
    Ultra
    i9- 9900x @ 5.0GHz / 2080ti

    8004 OC +130 core / +950 Mem
    8054 OC +145 core / +1100 Mem
    8097 OC +150 core / +1150 Mem
    8102 OC +155 core / +1200 Mem
    8133 OC +160 core/ +1300 Mem
    +165 locked up the benchmark :(

    1920 x 1080
    Fullscreen
    Ultra

    14284 OC +130 core / +950 Mem
    14981 OC +160 core/ +1300 Mem
     
    Last edited: Nov 18, 2019
    SpajdrEX likes this.
  15. Venix

    Venix Maha Guru

    Messages:
    1,099
    Likes Received:
    380
    GPU:
    Palit 1060 6gb
Mantle was always open source; the reason Nvidia acted like it didn't exist was because AMD's name was so tightly connected with it.
     

  16. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,389
    Likes Received:
    3,408
    GPU:
    2080Ti @h2o
And, under DX11, Nvidia saw smaller gains than AMD due to different driver architecture, iirc.
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    5,397
    Likes Received:
    1,602
    GPU:
    GTX 1080ti
Nvidia acted like it didn't exist because they were already doing in their drivers what AMD was trying to achieve with Mantle on the game-development end.
     
  18. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,301
    Likes Received:
    170
    GPU:
    MSI GTX1070 GamingX
Without Nvidia pushing RTX out, this demo wouldn't even exist. All I'm seeing is everyone jumping on the RT bandwagon. The whole industry has either gone this way or is deep into R&D to implement it into their products within 1-2 years.
     
  19. Venix

    Venix Maha Guru

    Messages:
    1,099
    Likes Received:
    380
    GPU:
    Palit 1060 6gb
The DX11 drivers on Nvidia were for sure superior to AMD's at taking advantage of their cards, while AMD cards were to a point held back by their own DX11 drivers. Now, since your reasoning is that they ignored Mantle because they were doing so well on DX11, why did they support Vulkan from the get-go? Tech-savvy people already know that a huge portion of the work in it is a result of direct contributions from the Mantle code... Again, the reason they supported Vulkan so fast was because Vulkan is not AMD's Vulkan. But this was not even the reason I made that post; if you see the part I quoted, he made it appear like AMD was the reason Nvidia chose to ignore Mantle, as if AMD had locked it down or something.

Edit: I wanted to add that AMD's approach to graphics APIs with Mantle seems to have been the way to go after all; the two dominant APIs went all in on what AMD wanted to do with Mantle. Mantle might have failed to stay alive, but it sure changed the two most dominant APIs to follow its ways, so in a sense it did what it was supposed to do.
     
    Last edited: Nov 18, 2019
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    5,397
    Likes Received:
    1,602
    GPU:
    GTX 1080ti
    DirectX11 hit a wall and wasn't getting any better.
     
    Valerys likes this.
