Remedy Shows RTX Raytracing performance cost

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 17, 2018.

  1. TieSKey

    TieSKey Member Guru

    Messages:
    179
    Likes Received:
    60
    GPU:
    Gtx870m 3Gb
    Completely agree on that. I think they just added it to boost CUDA/AI performance on non-professional cards, and maybe as an excuse to increase prices, cuz it does sound "cool" :S
    The new AA technique uses the tensor hardware and it performs reasonably well, but not well enough to justify the almost-dedicated cores.

    Lol, that's a good reason.
     
    -Tj- likes this.
  2. tunejunky

    tunejunky Maha Guru

    Messages:
    1,081
    Likes Received:
    387
    GPU:
    RadeonVII RTX 2070

    to me the real feature of the RTX cards is DLSS
     
  3. Denial

    Denial Ancient Guru

    Messages:
    13,151
    Likes Received:
    2,648
    GPU:
    EVGA RTX 3080
    Tensors can also be utilized to accelerate the denoising process for RT, offloading it from the FP cores. No devs are choosing to utilize them thus far, probably because a tensor-based denoiser can't just be direct-ported to other non-tensor platforms, while an OpenCompute/DirectCompute variant can be. Not to mention they also plan on adding image upscaling, video upscaling and other features over time via NGX that all utilize the tensor cores.
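
    For anyone curious what "denoising" means here in practice, below is a minimal Python/NumPy sketch of the simplest possible spatial denoiser (a plain box filter that averages each pixel with its neighbours). It is purely illustrative: real RT denoisers use guide buffers, temporal accumulation or a trained network on the tensor cores, and the function name and parameters here are made up for the example.

    import numpy as np

    def box_denoise(noisy, radius=2):
        # Naive spatial denoiser: average each pixel over its (2*radius+1)^2 neighbourhood.
        # noisy: HxW or HxWx3 float array from a low-sample-count ray-traced render.
        h, w = noisy.shape[:2]
        pad_widths = ((radius, radius), (radius, radius)) + ((0, 0),) * (noisy.ndim - 2)
        padded = np.pad(noisy, pad_widths, mode="edge")
        out = np.zeros_like(noisy)
        for dy in range(2 * radius + 1):
            for dx in range(2 * radius + 1):
                out += padded[dy:dy + h, dx:dx + w]
        return out / (2 * radius + 1) ** 2

    The point is simply that this kind of neighbourhood averaging is heavy, regular arithmetic: exactly the sort of work that can be expressed either as a generic compute shader (portable) or mapped onto tensor hardware (fast, but Nvidia-only).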
     
    -Tj- likes this.
  4. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,037
    Likes Received:
    1,863
    GPU:
    Zotac GTX980Ti OC
    Ok thanks. I used to check e.g. the white papers a lot with each new architecture, but this gen I was like meh.. guess the price put me off, plus the minimal boosts compared to the older gen in normal rasterization.
     

  5. pharma

    pharma Ancient Guru

    Messages:
    1,561
    Likes Received:
    409
    GPU:
    Asus Strix GTX 1080
    Someone's got to be first.
    Where did you come up with this idea?
     
  6. pharma

    pharma Ancient Guru

    Messages:
    1,561
    Likes Received:
    409
    GPU:
    Asus Strix GTX 1080
  7. Denial

    Denial Ancient Guru

    Messages:
    13,151
    Likes Received:
    2,648
    GPU:
    EVGA RTX 3080
    This quote from Anandtech, along with multiple developers stating they are implementing their own denoising methods.
     
    Last edited: Oct 18, 2018
    Maddness likes this.
  8. pharma

    pharma Ancient Guru

    Messages:
    1,561
    Likes Received:
    409
    GPU:
    Asus Strix GTX 1080
    Denoising is primarily used by movie studios, designers, etc., since it cuts the required processing time down to hours or days, and the adoption rate is increasing. The RTX hardware gives game developers the ability to use these same denoising techniques if they desire. Those that already have their own in-house denoisers will likely continue using them, both because of the existing investment and because of the "learning" curve associated with implementing an AI-based denoiser (for both developer and consumer). It's not about the "portability" of an "OpenCompute/DirectCompute" solution, if that is even a viable solution to begin with.

    Game designers will likely focus on these RTX features:
    Ray tracing (RT Ambient Occlusion, RT Global Illumination, RT Shadows, etc.)
    Variable Rate Shading
    Motion Adaptive Shading
    Content Adaptive Shading
    Mesh Shading
    Texture Space Shading
    AI-UpRes (tensor)
    DLSS (tensor)
     
    Last edited: Oct 18, 2018
    Maddness likes this.
  9. JJayzX

    JJayzX Master Guru

    Messages:
    483
    Likes Received:
    13
    GPU:
    Evga RTX2070XCUltra
    The noise and graininess you see isn't a film grain effect or added noise. This is what happens when ray tracing doesn't get to completely render a scene. They're not letting each frame converge fully, or it would be a stutter fest.
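
    To put numbers on "doesn't get to completely render": a path tracer estimates each pixel from a handful of random samples, and the error of that estimate only shrinks with the square root of the sample count. A tiny illustrative Python sketch (the brightness value and noise level are made-up numbers, not from any real renderer):

    import random

    def pixel_estimate(true_value, spp):
        # Monte Carlo estimate of one pixel: average `spp` noisy samples.
        samples = [true_value + random.gauss(0.0, 0.5) for _ in range(spp)]
        return sum(samples) / spp

    true_value = 0.7  # the fully converged pixel brightness
    for spp in (1, 4, 16, 64, 256):
        errors = [abs(pixel_estimate(true_value, spp) - true_value) for _ in range(1000)]
        print(f"{spp:4d} spp -> mean error {sum(errors) / len(errors):.3f}")

    The error falls roughly as 1/sqrt(spp), so a real-time budget of around 1 sample per pixel leaves exactly the grain you see, which is why a denoiser has to run afterwards.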
     
  10. Dazz

    Dazz Master Guru

    Messages:
    901
    Likes Received:
    97
    GPU:
    ASUS STRIX RTX 2080
    Gonna pass on this generation; the performance impact is still too great to be useful, not when you've been used to 60-100 fps. Although I have a G-Sync monitor, its range is like 25 to 100 fps at 3440x1440. Playing a game averaging 30 fps at 1080p is not to my liking. I will plod on with the 1080Ti for a bit longer.
     

  11. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,152
    Likes Received:
    149
    GPU:
    MSI RTX 2080
    Good lord, the noise looks awful. What a terrible image-quality trade-off for so much performance. You'd really be better off investing that rendering time into something else.
    As I've said many times, it's just one more ugly artifact they try to correct with temporal filtering or AA, with artifact-filled results.

    Coming from someone who actually likes effects like Film Grain. That kind of noise is a different beast.
     
  12. sykozis

    sykozis Ancient Guru

    Messages:
    21,783
    Likes Received:
    1,048
    GPU:
    MSI RX5700
    For price, yes. For RayTracing, we don't know yet. This was only a tech demo and should not be used to judge the success or failure of the RTX cards. RayTracing is the future, either way. It's the only way to create the realism that people want.
     
    Maddness and Caesar like this.
  13. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    806
    Likes Received:
    131
    GPU:
    RTX3090
    Same here.
    A properly built game that utilizes RTX MUST also utilize DLSS.
    1080p/60fps with DLSS can become 1440p, and that's a better resolution for a 4K monitor.

    P.S. Although I noticed a strange thing on the OLED I use instead of a monitor: 1080p looks better than 1440p, even when I tested with just text. It's like the TV has some sort of 1080p-to-4K optimized upscaler that only works for the 1080p resolution. It somewhat makes sense, since most TV sources still output 1080p and have to look good on a 4K TV.
     
    tunejunky likes this.
  14. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,187
    Likes Received:
    2,983
    GPU:
    5700XT+AW@240Hz
    Display a pattern of vertical lines on it: alternating black and white, 1 pixel wide.
    Then do the same with horizontal lines, or use a 1-pixel checkerboard. All pixels should be clearly visible, with no blurring.
    That holds as long as you are at the native resolution, or at a resolution you get by dividing the native one by a whole number with no remainder.
    If your TV is 4K, then it is exactly 2x the horizontal and vertical resolution of 1080p, hence the crisp image.
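
    If anyone wants to try the test above without drawing patterns by hand, here is an illustrative Python/Pillow sketch (the file name and function names are just placeholders) that generates a 1-pixel checkerboard and checks whether a source resolution maps onto the native panel by a whole-number factor:

    from PIL import Image

    def checkerboard(width, height):
        # 1-pixel black/white checkerboard test pattern.
        img = Image.new("L", (width, height))
        img.putdata([255 * ((x + y) % 2) for y in range(height) for x in range(width)])
        return img

    def integer_scale(native, source):
        # Return the scale factor if `source` divides `native` evenly, else None.
        fx, fy = native[0] / source[0], native[1] / source[1]
        return fx if fx == fy and fx.is_integer() else None

    native = (3840, 2160)
    for source in [(1920, 1080), (2560, 1440)]:
        print(source, "->", integer_scale(native, source))  # 2.0 for 1080p, None for 1440p

    checkerboard(*native).save("checkerboard_4k.png")  # view at native res: every pixel should stay distinct

    That matches the observation above: 1080p maps to clean 2x2 pixel blocks on a 4K panel, while 1440p needs a 1.5x scale and therefore has to blur.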
     
    MegaFalloutFan likes this.
  15. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,177
    Likes Received:
    189
    GPU:
    EVGA GTX 1080@2,025
    Because... (choose one or more of the following)

    1. ... they crap in every nVidia or Intel related thread no matter the content.
    2. ... they can't afford the card
    3. ... they are AMD fanboys and since AMD has nothing remotely comparable, they talk smack
    4. ... they don't understand what Raytracing, DLSS, etc are (i hope they allow raytracing and DLSS to be enabled on ALL cards just so people can see how much faster these RTX cards are)
     
    pharma and tensai28 like this.

  16. nicugoalkeper

    nicugoalkeper Master Guru

    Messages:
    893
    Likes Received:
    12
    GPU:
    ASUS GTX 1060 DUAL OC 6GB
    Everyone is entitled to an opinion and this is mine:
    1. There is no real-life game yet, only demos that may or may not be representative. Developers invest only a small amount of time in demos (so optimization is poor), not to mention the state of the drivers for the card.
    2. It is new tech, so it has flaws (with regard to consumer graphics cards), driver issues and so on...
    3. Nvidia is making a smart move by adding raytracing while AMD is lagging behind. Not to mention I don't think Nvidia will make the same mistake Intel did with CPUs; Nvidia is keeping the big guns ready, just in case AMD gets a good card out.
    4. This is a good way for Nvidia to gauge the adoption rate of this new tech and fix any flaws for the next-gen card.
    We need to wait and see until we get games that support the tech.
     
  17. pharma

    pharma Ancient Guru

    Messages:
    1,561
    Likes Received:
    409
    GPU:
    Asus Strix GTX 1080
    Funny how things get overblown when taken out of context. One of the editors at PCGH (PCGamesHardware.de) said the following regarding the Remedy tech demo.

     
    Last edited: Oct 19, 2018
  18. Turanis

    Turanis Ancient Guru

    Messages:
    1,699
    Likes Received:
    394
    GPU:
    Gigabyte RX500


    If you look at a game through a microscope you will see the differences, otherwise not.
    DLSS ("Deep Learning" Super Sampling) is a feature not implemented anywhere yet, but it will have a future at some point if more devs want it.
     
    Last edited: Oct 19, 2018
  19. karma777police

    karma777police Master Guru

    Messages:
    237
    Likes Received:
    81
    GPU:
    1080
    If you bought a 2080 RTX for the ray tracing, boy, you made a mistake. Ray tracing is the future, but not now; more like in 3 years at least. In other words, the 4000-series card is going to be the one for actual ray-tracing-enabled games. The 2080 RTX is nothing but a demo card... and last time I checked, I don't play or care about demos, and that demo costs $1200. No, thanks.

    Speaking of DLSS, I find it useless. The TAA implementation in the new Tomb Raider, for example, is just awesome, and I get 60+ FPS (all the time) in that game at Ultra settings in 4K with 1080 Ti SLI. Two 1080 Tis cost me less than a single 2080 RTX, which dips into the 30-40s in this game without ray tracing enabled. I got lucky and bought a 1080 Ti for $525 when the 2080 was announced; unfortunately for others, the price of the 1080 Ti went up afterwards.
     
  20. Maddness

    Maddness Maha Guru

    Messages:
    1,447
    Likes Received:
    649
    GPU:
    3080 Aorus Xtreme
    The thing with DLSS is, the game devs don't have to do any work. They simply supply their game code to Nvidia, Nvidia uses their supercomputer to run the algorithm, and then Nvidia releases the result in a driver update. So dev support isn't needed; there's nothing for them to do. As long as Nvidia supports it and there are no compatibility issues with particular games, almost all games could use it.
     
    pharma and tunejunky like this.

Share This Page