Guru3D Reviews: GeForce RTX 2080 and 2080 Ti Founders Edition

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 19, 2018.

  1. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    3,053
    Likes Received:
    1,874
    GPU:
    Rtx 3090 Strix OC
    Easy, I don't even have to look at the frame times: the bottom one (DLSS) is much more aliased, just as in the comparison shot made by aNoN_.

    I will repeat what I already said: DLSS just looks like lower-res upscaling with TAA and a sharpening filter applied.
     
  2. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,515
    Likes Received:
    2,912
    GPU:
    MSI 6800 "Vanilla"
    Is that not how it works? AI is used to learn the scaling for a cheaper (less costly) temporal AA solution, with an option for DLSS 2X to super-sample the image instead. (According to the white paper; I guess it's not an option here.)
    Not really sure on the tech, though. Perhaps it's 1920x1080 -> 3840x2160 upscaling, and then a custom TAA to handle aliasing, with some chance of errors but at much faster performance comparably.
    (2x according to the white paper, which I assume refers to the TAA and not the entire scene or overall performance, or maybe to the scaling being faster and using fewer GPU resources.)

    EDIT: Although if it is upscaling from 1920x1080, shouldn't performance be even better? Then again, it might put some strain on the GPU cores just to do that, even when aided by the Turing GPU improvements.
    (It's nice to see a sharper TAA mode though; blurriness was one of TAA's bigger downsides before now.)
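    A quick back-of-the-envelope check of the 1080p hypothesis (my own Python sketch, assuming shading cost scales roughly with pixel count, which real frames only approximate):

        # Hypothetical: DLSS renders internally at 1920x1080 and upscales
        # to 3840x2160 (this post's assumption, not confirmed by nVidia).
        native = 3840 * 2160    # 8,294,400 pixels
        internal = 1920 * 1080  # 2,073,600 pixels
        print(native / internal)  # 4.0 -> up to 4x fewer shaded pixels

    Getting only ~2x faster frames out of 4x fewer shaded pixels would be consistent with fixed per-frame costs (geometry, post-processing, the DLSS pass itself) eating into the savings.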
     
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    Funny thing is: while the lower screenshot is clearly the DLSS one and the upper is TAA... it is not that one-sided in IQ.
    DLSS causes those artifacts on horizontal lines.
    But its AA quality is far superior on nearly vertical edges. (Check the edges around the windows.)
    Now check the cover disc on the front tire. That's a night-and-day difference where DLSS wins by a mile.

    This means they have some more work to do to ensure that the goodness it delivers on vertically oriented edges is applied in similar fashion to horizontal edges.
     
    pharma likes this.
  4. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    42,418
    Likes Received:
    10,239
    GPU:
    AMD | NVIDIA
    No single AA technique is perfect (and the ones that come close take too hard a performance hit to run). DLSS isn't perfect either, but it's pretty good. What is exciting about it, though, is that it does not bring an extra performance hit. Regardless of me saying that, people are of course free to think what they want and prefer what they like. To each his own, and each opinion is right at some level.

    However, if in the future you own a graphics card with AI Tensor cores, you'll be hard-pressed not to use this technique over other AA solutions. But that's just my subjective observation, of course.
     
    Koniakki, Noisiv, XenthorX and 2 others like this.

  5. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    3,053
    Likes Received:
    1,874
    GPU:
    Rtx 3090 Strix OC
    According to what I've read, it's upscaling from 1440p, which would match the performance increase from using DLSS.
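    The same pixel-count arithmetic for the 1440p reading (a sketch under this post's assumption; nVidia has not stated the internal resolution):

        # Hypothetical: DLSS renders at 2560x1440 and upscales to 3840x2160.
        print((3840 * 2160) / (2560 * 1440))  # 2.25 -> at most 2.25x fewer shaded pixels

    That caps the theoretical speedup at 2.25x before fixed per-frame costs, which is in the ballpark of the gains this post refers to.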
     
  6. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,206
    Likes Received:
    202
    GPU:
    EVGA GTX 1080@2,025
    Never saw anyone saying that. What I did see is the usual suspects proclaiming the 2080 Ti to be slower than the 1080 Ti in rasterized games, or only able to do 30 fps in the Tomb Raider game, and that the 2080 would never be faster than the 1080 Ti. None of which is true.

    For current rasterized games, I can see your point. But that's not the point of this card. There are 25 soon-to-be-released games utilizing DLSS, which not only greatly improves image quality, it does so while freeing CUDA cores, which results in much higher framerates.

    It's quite entertaining to see the same people praising AMD for having the first cards with full DX12 support, which had a grand total of 15 games utilizing it in the first two years, while complaining about nVidia bringing far more revolutionary graphics technology like DLSS, which is already listed for 25 incoming games. Oh wait... I get it now. I'd be upset too if I were an AMD owner, looking at this awesome new tech while knowing I can never have it without buying an nVidia card. You can't blame nVidia for that though. Blame AMD for their lack of innovation.
     
  7. msotirov

    msotirov Active Member

    Messages:
    54
    Likes Received:
    28
    GPU:
    rx 480
    What's with the Sapphire Nitro style cooler?
     
  8. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    Actually, the only one to blame here is Microsoft. You have been around for a very long time. I am sure you remember those non-standard, vendor-limited/specific extensions everywhere. It was like a minefield back in the day.
    Guess what happens if AMD, Intel, and nVidia start bringing their specific features around.

    I do not want those old days back; that was some serious sh*7 to deal with. If I wanted to blame nVidia for something, it would not be for abusing their dominant position within the boundaries set by MS. I would blame them for not adhering to a globally available standard like VESA Adaptive-Sync. (I totally understand why they do it, but it is a short-sighted reason.)

    If nVidia supported Adaptive-Sync, it would likely not make me jump ship. (Unless AMD no longer has GPUs sufficient for 1080p gaming.)
    But imagine how many people owning Fiji, Polaris, and Vega GPUs would jump to a 2080 (Ti) the moment VESA Adaptive-Sync works there.
     
    Singleton99 and Dragam1337 like this.
  9. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,206
    Likes Received:
    202
    GPU:
    EVGA GTX 1080@2,025
    Yet people praised AMD for being the first to market with full DX12 capability while no games existed, and only 15 existed in the first two years. I don't recall anyone blaming them for the lack of DX12 games. Since clearly you don't understand this simple concept, I'll explain: nVidia doesn't make video games. AMD doesn't make video games. Every time a groundbreaking graphics technology comes to market, the hardware comes first. Then the software.

    Why weren't the same complainers railing against AMD for there not being any consumer software that utilizes 16+ CPU cores other than a few synthetic benchmarks? Last I checked, you were all telling everyone that high-core-count support in software was coming soon. See the hypocrisy? The same group of people doing this are the same ones who take a dump in every Intel-related thread as well. It's clear that none of you are ever going to buy these products, so why even bother? It's not like it's going to win anyone over to start buying AMD garbage. In fact, I bet it does more to hurt their business than it helps.
     
    pharma likes this.
  10. Clawedge

    Clawedge Ancient Guru

    Messages:
    2,599
    Likes Received:
    923
    GPU:
    Radeon 570
    Is my maths wrong, or is the price for a given level of performance a lot more expensive with this generation?
     

  11. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,206
    Likes Received:
    202
    GPU:
    EVGA GTX 1080@2,025
    The only way to not have vendor-specific features is by ensuring all graphics card manufacturers adhere to a very standardized architecture, and doing so would KILL innovation and competition, and likely bring about an nVidia monopoly far sooner, since they wouldn't have to spend nearly as much on R&D and instead could drop their prices to the point that AMD would never be able to compete. I don't want graphics technology to stagnate. Vendor-specific hardware-based features are what drive the industry forward. I guess this explains why AMD fans constantly rail against them, because those features have a good amount to do with nVidia dominating the market.

    All one needs to do is look at history to see that hardware-based graphics technologies eventually become available to everyone once system performance gets to the point where specific hardware is no longer required. Look at PhysX, for example. When it first came out, there was no way in hell it could be properly done on a CPU. But once it could, it was no longer proprietary. The last thing a company wants is for consumers to have their tech run like absolute garbage, since the average Joe has little technological knowledge and would blame nVidia. It's exactly why you won't be seeing ray tracing and DLSS on the lower-end 20xx cards.
     
    Finisterre likes this.
  12. Finisterre

    Finisterre Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    NV
    So far I have not seen any article about a very important feature of DLSS:
    it has the capability to improve IQ over time. If the AI model for a game title is re-trained and a new download becomes available with a better-optimized DLSS profile, the IQ will improve even further.
     
  13. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,206
    Likes Received:
    202
    GPU:
    EVGA GTX 1080@2,025
    Not when you realize that the RTX 2080 Ti is the new Titan ($1k+ top tier), because nVidia peeled the GTX branding off those cards and put them in their own non-gaming family.

    Jay explains it fairly well around the 6-minute mark.

     
  14. Clawedge

    Clawedge Ancient Guru

    Messages:
    2,599
    Likes Received:
    923
    GPU:
    Radeon 570
    So this is the first time price-for-performance has gone up for a Ti card. Shouldn't the IT industry be progressing, not devolving into Apple, I mean appalling, regression?
     
  15. Finisterre

    Finisterre Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    NV
    Your math is wrong. You did not include some very important variables next to the performance increase in your formula:
    • Image quality improvements.
    • The addition of new visual effects previously not available in real time.
     

  16. HybOj

    HybOj Master Guru

    Messages:
    226
    Likes Received:
    161
    GPU:
    ASUS GTX 970 DCmini
    Clawedge, your math is right, and the 2080 Ti being the new Titan or whatever is just nonsense, because the RTX 2080 Ti is a gaming card and the Titan is not. Also, it's called the 2080 Ti and not TITAN, btw.

    Someone posted a frame/USD comparison on page 4 of this discussion.
    Example:
    1070 Ti = 8.57 USD/FPS
    1080 Ti = 9.57 USD/FPS
    2080 Ti = 12.90 USD/FPS

    So that's roughly a 35% price-per-frame increase going from the 1080 Ti to the 2080 Ti... LOL
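    Those figures are easy to double-check (a minimal Python sketch taking the quoted USD/FPS numbers as given; they come from another poster, not my own benchmarks):

        # Relative price-per-frame versus the 1080 Ti, using the quoted figures.
        usd_per_fps = {"1070 Ti": 8.57, "1080 Ti": 9.57, "2080 Ti": 12.90}
        base = usd_per_fps["1080 Ti"]
        for card, cost in usd_per_fps.items():
            change = (cost / base - 1) * 100
            print(f"{card}: {cost:.2f} USD/FPS ({change:+.0f}% vs 1080 Ti)")

    The 2080 Ti comes out at about +35% per frame, the 1070 Ti at about -10%.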

    EDIT:
    A) DLSS looks worse than standard AA, so I fail to see the image quality improvements.
    B) The new visual effects are not available anywhere; all we know is that they hit the frametimes very hard, and that's it.
     
    Last edited: Sep 20, 2018
  17. gx-x

    gx-x Ancient Guru

    Messages:
    1,509
    Likes Received:
    156
    GPU:
    1070Ti Phoenix
    Guilty as charged. Oh well... It SHOULD be faster...
     
    SerotoNiN and Dragam1337 like this.
  18. alanm

    alanm Ancient Guru

    Messages:
    10,663
    Likes Received:
    2,760
    GPU:
    Asus 2080 Dual OC
    The only positive thing I can say about the RTX launch is that Nvidia finally set the ball rolling with real-time ray tracing. Game studios would not have wasted their time with it if the hardware was not available first. And Nvidia, at some risk, brought out the first prototype cards to demonstrate it. I am sure they knew they would face strong criticism for the cards' weak capability to perform RT at high res/FPS, but they were also confident that they could still reel in a LOT of first-time adopters (albeit with a lot of hype, smoke, and mirrors). So a big thank you to all the early adopters for pitching in to help finance Nvidia's RT development efforts, the fruits of which I can reap in the next or third-gen RTX cards to come.
     
    StewieTech and Clawedge like this.
  19. Texter

    Texter Ancient Guru

    Messages:
    3,175
    Likes Received:
    256
    GPU:
    Club3d GF6800GT 256MB AGP
    Seems to me nVidia are also letting them pay for Volta...o_O
     
  20. aNoN_

    aNoN_ Member

    Messages:
    47
    Likes Received:
    3
    Thanks for the download link, H! Appreciate it; now I can do a somewhat proper analysis. Yeah, I really just did that, because there wasn't any other way at that moment. I don't have access to review samples of the RTX cards or fancy experimental DLSS demos :p But you do, which is why I asked for raw sources ;)

    Anyway, here are my latest comparisons based on your shared material:

    1. FFXV - http://screenshotcomparison.com/comparison/120782
    2. Infiltrator (PS) - http://screenshotcomparison.com/comparison/120783
    3. Infiltrator (RAW) - http://screenshotcomparison.com/comparison/120784

    My findings:

    In FFXV, with DLSS, aliasing is much better resolved, but the whole image is blurrier and textures seem far less sharp; this is especially evident when looking at the ground below the car.
    In Infiltrator, it's kind of hard to do a proper comparison due to the sh*t-ton of post-processing going on and the fact that there's still some compression in the videos, but with DLSS ON, textures seem to present slightly less detail. I also found that aliasing is a little better resolved without DLSS. It's strange, but for some reason texture shaders or detail texture layers don't seem to load in properly, or at all, with DLSS. However, I did also spot that reflections are of better quality with DLSS; this is evident from the reflection of the valve positioned on the wall to the right, just above the rightmost bridge. Overall though, the differences are quite small and we are nitpicking here. I think nVidia has done a pretty good job, and this DLSS technique is something really interesting, especially considering it's in its infancy. I mean, gaining twenty-something extra FPS from activating DLSS can make a supported GPU output 30 FPS instead of 10, or 60 FPS instead of 40, which is a pretty big deal, especially the former in terms of playability. So I approve of this new technique!
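    To put numbers on why the 10 -> 30 case matters more than 40 -> 60 (my own frame-time sketch, not derived from the comparisons above):

        # Per-frame time shows where extra FPS helps playability most.
        def frame_ms(fps: float) -> float:
            return 1000.0 / fps

        for before, after in [(10, 30), (40, 60)]:
            saved = frame_ms(before) - frame_ms(after)
            print(f"{before} -> {after} FPS: {frame_ms(before):.1f} ms -> "
                  f"{frame_ms(after):.1f} ms per frame ({saved:.1f} ms saved)")

    Going from 10 to 30 FPS saves 66.7 ms per frame, while 40 to 60 saves only 8.3 ms, which is why the former transforms playability.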

    If Hilbert could take raw screenshots from the opening scene of the Infiltrator demos, timing it a second or two after the machine's "arms" stop moving so as to get two similar timestamps, that would be even more awesome and we could do a really proper comparison.
    But that's asking for too much, I suppose; he has already done so much for us.

    Thanks again Hilbert! Stay awesome!
     
    Prince Valiant likes this.
