Remedy Shows RTX Raytracing performance cost

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 17, 2018.

  1. TieSKey

    TieSKey Member Guru

    Messages:
    180
    Likes Received:
    61
    GPU:
    Gtx870m 3Gb
    Completely agree on that. I think they just added it to boost CUDA/AI performance on non-professional cards, and maybe as an excuse to increase prices, cuz it does sound "cool" :S
    The new AA technique uses the tensor hardware and it performs reasonably well, but not well enough to justify the almost-dedicated cores.

    Lol, that's a good reason.
     
    -Tj- likes this.
  2. tunejunky

    tunejunky Maha Guru

    Messages:
    1,217
    Likes Received:
    433
    GPU:
    RadeonVII RTX 2070

    to me the real feature of the RTX cards is DLSS
     
  3. Denial

    Denial Ancient Guru

    Messages:
    13,234
    Likes Received:
    2,725
    GPU:
    EVGA RTX 3080
    Tensors can also be utilized to accelerate the denoising process for RT, offloading it from the FP cores. No devs are choosing to utilize them thus far, probably because a tensor denoiser can't just be directly ported to other, non-tensor platforms, while an OpenCompute/DirectCompute variant can be. Not to mention they also plan on adding image upscaling, video upscaling and other features over time via NGX that all utilize the tensor cores.
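
    Purely as an illustration (a crude NumPy box filter, not NVIDIA's actual denoiser or anything a dev would ship), the kind of post-process denoising pass being talked about looks roughly like this:

    Code:
        import numpy as np

        rng = np.random.default_rng(1)

        # Fake a 1-ray-per-pixel RT image: smooth "ground truth" plus heavy per-pixel noise
        h, w = 64, 64
        yy, xx = np.mgrid[0:h, 0:w]
        truth = 0.5 + 0.5 * np.sin(xx / 8.0) * np.cos(yy / 8.0)  # stand-in for real shading
        noisy = np.clip(truth + rng.normal(0.0, 0.3, (h, w)), 0.0, 1.0)

        def box_denoise(img, radius=2):
            # Average each pixel with its neighbours. Real RT denoisers are
            # edge-aware and temporal; this only shows the "trade noise for blur"
            # compute pass that tensor cores could take off the FP units.
            padded = np.pad(img, radius, mode="edge")
            out = np.zeros_like(img)
            k = 2 * radius + 1
            for dy in range(k):
                for dx in range(k):
                    out += padded[dy:dy + h, dx:dx + w]
            return out / (k * k)

        for name, img in (("noisy", noisy), ("denoised", box_denoise(noisy))):
            print(f"{name:9s} RMSE vs truth: {np.sqrt(np.mean((img - truth) ** 2)):.3f}")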
     
    -Tj- likes this.
  4. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,124
    Likes Received:
    1,902
    GPU:
    Zotac GTX980Ti OC
    Ok, thanks. I used to check e.g. the white papers a lot with each new architecture, but this gen I was like meh.. guess the price put me off, and also the minimal boost over the older gen in normal rasterization.
     

  5. pharma

    pharma Ancient Guru

    Messages:
    1,660
    Likes Received:
    475
    GPU:
    Asus Strix GTX 1080
    Someone's got to be first.
    Where did you come up with this idea?
     
  6. pharma

    pharma Ancient Guru

    Messages:
    1,660
    Likes Received:
    475
    GPU:
    Asus Strix GTX 1080
  7. Denial

    Denial Ancient Guru

    Messages:
    13,234
    Likes Received:
    2,725
    GPU:
    EVGA RTX 3080
    This quote from AnandTech, along with multiple developers stating that they are implementing their own denoising methods.
     
    Last edited: Oct 18, 2018
    Maddness likes this.
  8. pharma

    pharma Ancient Guru

    Messages:
    1,660
    Likes Received:
    475
    GPU:
    Asus Strix GTX 1080
    Denoising is primarily used by movie studios, designers, etc., where it cuts the processing time required from days down to hours, and the adoption rate is increasing. The RTX hardware gives game developers the ability to use these same denoising techniques if they desire. Those that already have their own in-house denoisers will likely keep using them, given the investment already made as well as the "learning" curve associated with implementing an AI-based denoiser (for both developer and consumer). It's not due to the "portability" of not having an "OpenCompute/DirectCompute" solution, if that is even a viable solution to begin with.

    Game designers will likely focus on these RTX features:
    Ray tracing (RT Ambient Occlusion, RT Global Illumination, RT Shadows, etc.)
    Variable Rate Shading
    Motion Adaptive Shading
    Content Adaptive Shading
    Mesh Shading
    Texture Space Shading
    AI-UpRes (tensor)
    DLSS (tensor)
     
    Last edited: Oct 18, 2018
    Maddness likes this.
  9. JJayzX

    JJayzX Master Guru

    Messages:
    485
    Likes Received:
    13
    GPU:
    Evga RTX2070XCUltra
    The noise and graininess you see isn't a film-grain effect or video noise. This is what happens when ray tracing doesn't get to completely render a scene. They're not letting each frame render completely, or it would be a stutter-fest.
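
    To put rough numbers on that, here is a toy sketch (plain NumPy, purely illustrative, nothing from a real renderer) of why a pixel estimated from only a few rays looks grainy, and why throwing a few more rays at it barely helps:

    Code:
        import numpy as np

        rng = np.random.default_rng(0)

        # Toy model of one pixel in a path tracer: the "true" radiance is the
        # mean of a random integrand; each ray is one noisy sample of it.
        def render_pixel(samples_per_pixel):
            rays = rng.uniform(0.0, 1.0, samples_per_pixel)  # stand-in for real ray results
            return rays.mean()

        for spp in (1, 4, 16, 64, 256):
            estimates = np.array([render_pixel(spp) for _ in range(10000)])
            # Standard deviation of the pixel estimate = the visible grain
            print(f"{spp:4d} rays/pixel -> noise (std dev) ~ {estimates.std():.3f}")

        # The noise falls off as 1/sqrt(rays): 4x the rays only halves the grain,
        # which is why real-time RT runs at ~1-2 rays/pixel and leans on a denoiser.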
     
  10. Dazz

    Dazz Master Guru

    Messages:
    907
    Likes Received:
    98
    GPU:
    ASUS STRIX RTX 2080
    Gonna pass on this generation; the performance impact is still too great to be useful, not when you're used to 60-100fps. Although I have a G-Sync monitor, its range is like 25fps to 100fps at 3440x1440. Playing a game averaging 30fps at 1080p is not to my liking. I will plod on with the 1080Ti for a bit longer.
     

  11. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,189
    Likes Received:
    169
    GPU:
    MSI RTX 2080
    Good lord, the noise looks awful. What a terrible image-quality trade-off for so much performance. You'd really be better off investing that rendering time into something else.
    As I've said many times, it's just one more awful artifact they try to correct with temporal filtering or AA, with ugly, artifact-filled results.

    Coming from someone who actually likes effects like Film Grain. That kind of noise is a different beast.
     
  12. sykozis

    sykozis Ancient Guru

    Messages:
    21,793
    Likes Received:
    1,052
    GPU:
    MSI RX5700
    For price, yes. For RayTracing, we don't know yet. This was only a tech demo and should not be used to judge the success or failure of the RTX cards. RayTracing is the future, either way. It's the only way to create the realism that people want.
     
    Maddness and Caesar like this.
  13. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    821
    Likes Received:
    137
    GPU:
    RTX3090
    Same here.
    A properly built game that utilizes RTX MUST also utilize DLSS.
    1080p/60fps with DLSS can become 1440p, and that's a better resolution for a 4K monitor.

    P.S. Although I noticed a strange thing on the OLED that I use instead of a monitor: 1080p looks better than 1440p, even when I tested with just text. It's like the TV has some sort of 1080p-to-4K optimized upscaler that only works for the 1080p resolution. It somewhat makes sense, since most TV sources still output 1080p and have to look good on a 4K TV.
     
    tunejunky likes this.
  14. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,501
    Likes Received:
    3,193
    GPU:
    5700XT+AW@240Hz
    Display on it a pattern of vertical lines: alternating black and white, each 1 pixel wide.
    Then do the same with horizontal lines, or use a 1-pixel checkerboard. All pixels should be clearly visible, with no blurring.
    That holds as long as you are at the native resolution, or at a resolution resulting from the native one being divided by a whole number with no decimals in the result.
    If your TV is 4K, then it is exactly 2x the horizontal and vertical of 1080p, therefore a crisp image.
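
    If anyone wants to generate those test patterns, here is a small sketch (assuming Python with NumPy and Pillow installed; the 3840x2160 size is just an assumed 4K native resolution, change it to match your panel):

    Code:
        import numpy as np
        from PIL import Image

        w, h = 3840, 2160  # assumed 4K panel; set to your native resolution
        yy, xx = np.mgrid[0:h, 0:w]

        # 1-pixel vertical lines, 1-pixel horizontal lines, and a 1-pixel checkerboard
        patterns = {
            "vlines_1px.png":  (xx % 2) * 255,
            "hlines_1px.png":  (yy % 2) * 255,
            "checker_1px.png": ((xx + yy) % 2) * 255,
        }
        for name, img in patterns.items():
            Image.fromarray(img.astype(np.uint8), mode="L").save(name)
        # Shown at native resolution (or an exact integer division of it), every
        # pixel stays distinct; any non-integer scaling smears the pattern to grey.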
     
    MegaFalloutFan likes this.
  15. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,185
    Likes Received:
    191
    GPU:
    EVGA GTX 1080@2,025
    Because... (choose one or more of the following)

    1. ... they crap in every nVidia or Intel related thread no matter the content.
    2. ... they can't afford the card.
    3. ... they are AMD fanboys, and since AMD has nothing remotely comparable, they talk smack.
    4. ... they don't understand what raytracing, DLSS, etc. are. (I hope they allow raytracing and DLSS to be enabled on ALL cards, just so people can see how much faster these RTX cards are.)
     
    pharma and tensai28 like this.

  16. nicugoalkeper

    nicugoalkeper Master Guru

    Messages:
    895
    Likes Received:
    23
    GPU:
    ASUS GTX 1060 DUAL OC 6GB
    Everyone is entitled to an opinion, and this is mine:
    1. No real-life games, only demos that may or may not be accurate. A developer invests only a small amount of time in demos (so poor optimization), not to mention the drivers for the card.
    2. It is a new tech; it has flaws (in regard to consumer graphics cards), drivers and so on...
    3. Nvidia is doing a smart move adding raytracing; AMD is lagging behind. Not to mention that I don't think NVIDIA will make the same mistake Intel did with CPUs. Nvidia is keeping the big guns ready, just in case AMD gets a good card out.
    4. This is a good way for Nvidia to see the adoption rate of this new tech and fix any flaws for the next-gen card.
    We need to wait and see once we get games that support the tech.
     
  17. pharma

    pharma Ancient Guru

    Messages:
    1,660
    Likes Received:
    475
    GPU:
    Asus Strix GTX 1080
    Funny how things get overblown when taken out of context. One of the editors at PCGH (PCGamesHardware.de) said the following regarding the Remedy tech demo.

     
    Last edited: Oct 19, 2018
  18. Turanis

    Turanis Ancient Guru

    Messages:
    1,748
    Likes Received:
    434
    GPU:
    Gigabyte RX500


    If you look at a game through a microscope you will see the differences; otherwise not.
    DLSS ("Deep Learning" Super Sampling) is a feature not implemented anywhere yet, but it will have a future sometime, if more devs want it.
     
    Last edited: Oct 19, 2018
  19. Maddness

    Maddness Maha Guru

    Messages:
    1,459
    Likes Received:
    674
    GPU:
    3080 Aorus Xtreme
    The thing with DLSS is, the game devs don't have to do any work. They simply supply their game code to Nvidia, who use their supercomputer to run the algorithm; then Nvidia releases that in a driver update. So dev support isn't needed; there's nothing for them to do. As long as Nvidia supports it and there are no compatibility issues with games, almost all games could use it.
     
    pharma and tunejunky like this.
  20. tunejunky

    tunejunky Maha Guru

    Messages:
    1,217
    Likes Received:
    433
    GPU:
    RadeonVII RTX 2070

    ...and DLSS gets better the more data points it has.
    Nothing in the first six to eight months will be all that impressive, but once it has collected the data it could be amazeballs.
     
    Maddness and pharma like this.
