NVIDIA Ampere rumored to get four times more Ray-Tracing perf, no more GTX either

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 6, 2020.

  1. Mesab67

    Mesab67 Guest

    Messages:
    244
    Likes Received:
    85
    GPU:
    GTX 1080
    Categorically and stupidly stating that "those who were going to buy a ti did" pretty much sums up the mentality of Ngreedia's paid shill army. The primary rule of price-to-performance applies, just as it did previously and will continue to do in the future. Note the emphasis, as always, on price...particularly in the current climate.
     
    HARDRESET likes this.
  2. Valerys

    Valerys Master Guru

    Messages:
    395
    Likes Received:
    18
    GPU:
    Gigabyte RTX2080S
    The "primary rule of price-to-performance" got shaky at the higher end of the spectrum. You can get a gpu with great price-to-performance that does let's say 45 fps in a game at 4K and another one that does 60 but the price/performance ratio is skewed towards the price. It's not fair but it's the only product that can give that performance. Buying it is frustrating because it may feel like a ripoff but buying the lower tier with great price is also frustrating because it makes the games less playable at 4K. Not buying anything is also frustrating because no one really knows if things will improve significantly in the price/performance ratio and postponing playing the games also makes them lose some their value. Personally I found the RTX 2080 Super as the middle ground this generation, slightly expensive for what it offers but still adequate enough for 4K without going all the way to the Ti.
     
  3. Venix

    Venix Ancient Guru

    Messages:
    3,473
    Likes Received:
    1,972
    GPU:
    Rtx 4070 super
    @Valerys agreed, but the lines are getting blurry. At the enthusiast end of the market there are people who will not compromise; they have the money to spend and they want the best regardless of cost. But there are also a lot of enthusiasts whose budget is not virtually unlimited, and they will choose an RX 5700 XT or RTX 2070 and play at 1440p, because beyond that the choice is 1000-1200 for a GPU and aiming at 4K.....or the same amount for the whole PC and 1440p? I, for example, am very much in the second category, and I am not convinced 4K is worth the cost over 1440p. In 4-5 years I will most likely make that jump, when everyone is aiming at 6K or 8K :p
     
  4. Silva

    Silva Ancient Guru

    Messages:
    2,051
    Likes Received:
    1,201
    GPU:
    Asus Dual RX580 O4G
    I understand we now have dedicated hardware for the current way of processing graphics. But having two different technologies on the same silicon, when we mostly use only the old one, doesn't seem smart to me.
    I'll stick with the old tech and high fps; I don't see that big a difference with RTX. It needs to evolve into the dominant way of computing graphics before I'll adopt it.
     

  5. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,381
    GPU:
    GTX 1080ti
    Again, if you use words like ngreedia, you disqualify yourself from being part of a discussion.
     
  6. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
    <- ngreedia product owner.

    Doesn't change the fact that their product prices are creeping up uncontrollably.
    The GTX 1080 already felt very expensive for an otherwise mid-range product (only the 1080 Ti and Titan were the real "high-end" ones).

    Turing? Simply unacceptable.
    Sorry, excuse me: Turding.
     
    HARDRESET likes this.
  7. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Maybe, but you can say this about basically every new feature that requires hardware.

    For example, when tessellation was added to DX11, there was a lag during which it was basically just useless transistors on every card before it started showing up in games. Now it's basically ubiquitous. GCN had asynchronous compute hardware for nearly three years before it was utilized. Honestly, most of GCN in general was underutilized and still is. The list goes on and on.

    Right now, sure, RT cores and Tensor cores are for the most part wasted silicon, but at some point that technology will be ubiquitous. The consoles having support is definitely going to drive this forward. If that happens quickly and Turing gets carried along, we can all sit here and say "fine wine"; if not, then maybe it was a little premature. Regardless, it's a chicken-and-egg problem. I'd personally rather companies push for new innovation than sit around. That being said, the price of Turing is obviously very prohibitive. I've purchased almost every generation of Nvidia cards since the 8800 GTX (the GTX 780 Ti is the only one I skipped, because I had a 690), and now I'm skipping the 2080 Ti. I can't justify shelling out $1200 for a marginal upgrade.
     
    Exodite, Aura89 and Maddness like this.
  8. H83

    H83 Ancient Guru

    Messages:
    5,512
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    I think the problem is that when we finally reach the point where the tech is mature, the current cards are probably going to be too old and too weak to take proper advantage of the features they are currently promising... Anyway, my next card is going to depend on the overall performance it provides, not on RT performance.
     
    Backstabak and Silva like this.
  9. alanm

    alanm Ancient Guru

    Messages:
    12,273
    Likes Received:
    4,477
    GPU:
    RTX 4080
    Oh, I think just about everyone knew that even before Turing's release. We all saw the poor performance of the RT demos with a 2080 Ti @ 1080p on launch day. I doubt anyone bought their Turing cards with RT as the main buying criterion. It was obvious to most that Turing was just there to lay the groundwork for RT and to get developers to use it in their games, and that Ampere may be the point where we start taking it more seriously.
     
  10. Mesab67

    Mesab67 Guest

    Messages:
    244
    Likes Received:
    85
    GPU:
    GTX 1080
    ...and yet you felt the need to reply...again ;)
     
    Silva likes this.

  11. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Let's stop. Both of you.
     
  12. metagamer

    metagamer Ancient Guru

    Messages:
    2,596
    Likes Received:
    1,165
    GPU:
    Asus Dual 4070 OC
    NVIDIA have been smashing it out of the park recently. These new GPUs will offer a lot more grunt for RTX, and let's hope DLSS 2.0 really catches on, because it's a game changer. I would go as far as to say it's the coolest bit of tech NVIDIA have come up with recently; the performance gain is ridiculous. And then we have RTX Voice.

    All this on top of the CPU battle that is really starting to look interesting. Intel is releasing a 10c/20t 5 GHz+ CPU for under $500, and their 6c/12t parts also look great at $160. Then there's Ryzen to come with Zen 3 later on. And then we'll have the 11xxx Intels; that's when it really starts heating up.

    Good times. Finally.
     
    Last edited: May 7, 2020
    Noisiv, jbscotchman and CPC_RedDawn like this.
  13. user1

    user1 Ancient Guru

    Messages:
    2,782
    Likes Received:
    1,305
    GPU:
    Mi25/IGP
    The more you buy, the more you save!
     
    Undying likes this.
  14. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,451
    Likes Received:
    3,131
    GPU:
    PNY RTX4090
    It needs to be at least 80% faster than my 1080 Ti for me to consider dropping the kind of money Nvidia is going to be charging for these. Unless AMD can bring some competition back to the table and force a price war! :D
     
  15. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,793
    Likes Received:
    1,396
    GPU:
    黃仁勳 stole my 4090
    Joke's on you, there are no arms or legs left for nVidia to try to collect, so now they charge in virgin blood and souls. You'd think virgin blood would be common but it's very rare in places like Canada because our government fracks us all on a regular basis.

    As much as I'd like an upgrade in order to play Cyberpunk 2077 at high frame rates, my wallet is gonna laugh at me, pause and ask if I'm serious, then laugh harder.
     
    -Tj- and carnivore like this.

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Currently the RT core cluster takes around 15% of an SM. With some optimizations, nVidia could fit 4 times as many RT cores per SM without them taking more than 30% of the SM.
    I guess they may have 3 times as many, with the rest of the performance coming from a small clock bump and optimizations.

    That really means the next generation does not have to be that much more expensive while providing the same rasterization performance plus double or triple the RT.
    But as I wrote in the thread for the rumor claiming double the RT cores per SM plus a small clock bump: such an increase in RT performance will place all existing Turing cards at the bottom of the food chain in RT-enabled games.
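    A quick back-of-the-envelope check of that area claim (a sketch only; the 15% RT share is my die-shot estimate above, not a confirmed nVidia figure):

        # Sanity-check the SM-area claim above. The 15% RT share is the
        # poster's own estimate, not a confirmed nVidia number.
        sm_area = 100.0             # normalize the current SM to 100 units
        rt_area = 15.0              # RT cluster at ~15% of the SM
        other = sm_area - rt_area   # shaders, schedulers, cache, etc.

        # Naive 4x: quadruple the cluster with no per-core shrink.
        naive = 4 * rt_area
        print(f"naive 4x RT share: {naive / (other + naive):.0%}")   # ~41%

        # To keep 4x the cores at <= 30% of the new SM, solve x / (other + x) = 0.30.
        budget = 0.30 * other / (1 - 0.30)
        print(f"allowed RT area: {budget:.1f} units = {budget / naive:.0%} of naive")
        # ~36.4 units, i.e. each optimized RT core at ~61% of today's size.

    In other words, the "optimizations" would have to shave roughly 40% off each RT core's area for the 4x-cores-at-30%-of-SM numbers to work out.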
     
  17. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Where are you getting the 15% from?

    Based on DF's video of Minecraft RT, the Nvidia devs made it sound like the biggest bottleneck is the denoising process - which is currently not an AI-based denoiser. I'm not even sure it runs on the tensor cores at all. I have a feeling all these rumors talking about double or more RT cores per SM are fake, tbh. Nvidia has already talked about how the RT algorithms can be improved within the RT core for future architecture iterations - so you could see a percentage improvement on that side even with the same RT core count.

    Depending on how Turing's RT performance lines up with RDNA 2, Ampere might just be really far ahead, while Turing and RDNA 2 are just the baseline for console ports. Hopefully VRS and mesh shaders will come along for that ride as well.
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    I did not write about Tensor cores, but RT cores.
    Secondly, if you have 4 times as many rays per frame, you do not need as good a denoiser.
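    Assuming the standard Monte Carlo sampling model (my assumption here; per-pixel noise falls with the square root of the ray count), the effect of 4x the rays is easy to quantify:

        \sigma_N \propto \frac{1}{\sqrt{N}}
        \qquad\Rightarrow\qquad
        \frac{\sigma_{4N}}{\sigma_N} = \sqrt{\frac{N}{4N}} = \frac{1}{2}

    So quadrupling the rays per pixel halves the raw noise before the denoiser even runs, which is why a less aggressive denoiser becomes acceptable.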

    15% is my guess from comparing nVidia's die shots. Do you have another guess? I doubt it would be bigger. If anything, it will be smaller. But in Turing I tend to believe they had to spend extra I/O and cache budget, since they "glued" it together.
    They have surely had room for improvements since then. And please, do not take my post as an attack. If you remember, when nVidia came out with RT, I wrote that it was a bad move at the given performance level.
    And that I would rather see a pure RT accelerator card, like the original 3dfx. Because one really needs much higher RT performance than Turing delivered for really interesting implementations.
    - - - -
    And as far as the fake "2x RT per SM" rumor goes: 2x is believable to me. And I see it as a good thing, even though it kills the resale value of current RTX cards.
    As I wrote before: as long as RT provides a slightly better alternative to planar reflections on top of the same old "ugly" non-realistic renders, I am not very interested.
    I want RT results like in CGI movies. I want it to move from the usual computer game render graphics to something closer to the photo-realism of film.
     
  19. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I know you wrote RT.

    Honestly, I was just curious, because I've seen a lot of discussion about the size of the RT cores but I've never seen any concrete proof of how large they are.

    And yeah I think that's a good point about the denoising.
     
    Fox2232 likes this.
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,381
    GPU:
    GTX 1080ti
    Last edited: May 8, 2020
