AMD Radeon 7900 XT / RDNA3 series announcements and preview discussion

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 3, 2022.

  1. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,227
    Likes Received:
    3,636
    GPU:
    TUF 4090
Very possibly; if so, then maybe that's what happened.
     
  2. vestibule

    vestibule Ancient Guru

    Messages:
    2,237
    Likes Received:
    1,474
    GPU:
    Radeon RX6600XT
    Strange how they have not given the card exhaust capability. It kind of implies that it runs cool.
     
  3. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,227
    Likes Received:
    3,636
    GPU:
    TUF 4090
    The cooling fins don't run end to end.
     
    vestibule likes this.
  4. Brasky

    Brasky Ancient Guru

    Messages:
    2,611
    Likes Received:
    650
    GPU:
    Gigabyte 4070 Ti Su
    I'll wait for the reviews, but I'm optimistic that it'll be solid competition for Nvidia.
     

  5. barbacot

    barbacot Maha Guru

    Messages:
    1,005
    Likes Received:
    986
    GPU:
    MSI 4090 SuprimX
I am bothered by the fact that they gave little to no detail about their benchmarks: which CPU did they use? Did they use Smart Access Memory?
     
    AuerX likes this.
  6. brogadget

    brogadget Master Guru

    Messages:
    289
    Likes Received:
    78
    GPU:
    2xR9 280x 3GB
As far as I know, this is a tech forum, so never say impossible: there have always been people saying "not possible". What about an add-on card "plugged" directly onto the main card with an interconnect like SLI, no PCIe bus needed? There will always be many challenging problems in hardware and software, but if you don't pay attention and become fat, lazy, and greedy, you have already lost. Keep your mind open to new ideas. Interconnects have gotten incredibly fast: e.g. you can run a current modern GPU in PCIe 5.0 x4 mode without losing performance.
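To put that PCIe 5.0 x4 claim in numbers, here is a rough back-of-envelope sketch using the nominal per-lane rates and 128b/130b encoding from the PCIe spec (peak one-direction figures; real-world throughput is lower due to protocol overhead):

```python
# Rough peak PCIe bandwidth comparison (nominal rates, 128b/130b encoding;
# ignores protocol overhead, so real throughput is somewhat lower).
def pcie_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    """Peak one-direction bandwidth in GB/s for PCIe 3.0+ link widths."""
    encoding_efficiency = 128 / 130  # 128b/130b line code
    return gt_per_s * lanes * encoding_efficiency / 8  # bits -> bytes

gen5_x4  = pcie_bandwidth_gbs(32, 4)    # ~15.8 GB/s
gen4_x16 = pcie_bandwidth_gbs(16, 16)   # ~31.5 GB/s
print(f"PCIe 5.0 x4:  {gen5_x4:.1f} GB/s")
print(f"PCIe 4.0 x16: {gen4_x16:.1f} GB/s")
```

So a 5.0 x4 link carries about half the peak bandwidth of a 4.0 x16 slot, which is why current GPUs often lose little performance at reduced widths, but it is still far below on-package interconnect speeds.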

Imo, even if you don't use RT + AI, you must still power and cool those units; what a waste of energy and what a waste of die space. As a designer, this is the stupidest thing you can do. Anyway, the main reason to separate RT + AI from the GPU is that the customer then has a choice.
Second, simply because the current RT implementation is just much too weak, you will never get good RT performance, which is needed for a fully ray-traced game, at least not for years (never say never..lol..). If you want to do it right, you need a special RT GPU with 100x the RT performance of current Ada.

Edit: btw, this is not at all a "new idea"; it happened a very long time ago: remember the dedicated physics cards, and 3dfx's Voodoo, the add-on cards which made Nvidia "rich".
     
    Last edited: Nov 4, 2022
  7. brogadget

    brogadget Master Guru

    Messages:
    289
    Likes Received:
    78
    GPU:
    2xR9 280x 3GB
yeah, getting old... honestly, I can't see a big difference with RT on/off; for me it is just a very expensive gimmick I don't need...
     
  8. Undying

    Undying Ancient Guru

    Messages:
    25,529
    Likes Received:
    12,916
    GPU:
    XFX RX6800XT 16GB
Raytracing is nice where it counts, but people don't get it: there are no games to play. When you finish Dying Light 2, Control, and Metro, you are left wanting more, but there is no more. Well, there is, but those games don't make such a big difference and weren't made with raytracing in mind. You can run 3DMark and Cyberpunk on your 4090 all day long, that must be fun lol
     
    pegasus1 likes this.
  9. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080

I'll start off by saying it's not technically "impossible", but it's definitely not viable, not worth the tradeoff, and I think it is *practically* impossible to make it so.

You say "what about SLI", but if you know anything about SLI you know that it doesn't send frame data between cards. It sends a completed final frame at the end of the render process, because that's the best way to solve the latency problem. Your hypothetical RT SLI card would need tons of inter-frame data to calculate the BVH; it would then need to send that data back to the "main GPU" to render the final image. That's not going to happen at SLI latency. In fact, neither AMD nor Nvidia has even demonstrated that at chiplet latency, which is orders of magnitude faster. All the white papers we have on that issue (mostly from Nvidia) indicate that you'd need 3-4x the bandwidth current interconnect technology allows to overcome the latency problems with chiplets and break up the GPU cluster. You'd also need a completely new scheduler designed to overcome a ton of obstacles related to that, and even then it doesn't scale well with most workloads.
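A quick sketch of why shipping a finished frame over a bridge is tractable while streaming per-frame BVH data is not (the 4K frame size is standard arithmetic; the BVH node count and size are purely illustrative assumptions, not figures from any real engine):

```python
# Finished-frame traffic (what SLI AFR actually sends):
W, H, BPP = 3840, 2160, 4            # 4K frame, 32-bit color (assumption)
FPS = 60

frame_bytes = W * H * BPP             # ~33.2 MB per completed frame
frame_gbps = frame_bytes * FPS / 1e9  # ~2.0 GB/s -- feasible over a bridge

# Hypothetical per-frame BVH traffic: 10M nodes at 32 bytes/node streamed
# each frame (illustrative numbers only), before even counting latency.
bvh_bytes = 10_000_000 * 32
bvh_gbps = bvh_bytes * FPS / 1e9      # ~19.2 GB/s of raw structure data
print(f"frame traffic: {frame_gbps:.1f} GB/s, BVH traffic: {bvh_gbps:.1f} GB/s")
```

Even under these generous assumptions, the structure traffic dwarfs the finished-frame traffic by an order of magnitude, and the round-trip latency problem comes on top of that.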

Tensor cores are power gated when off. RT cores use power all the time because they're connected to the SM, but they represent a small part of the SM, and no one has shown any real data indicating that removing them would yield large die savings. It's not like AMD's dies are drastically smaller per teraflop of FP32.

I don't really see how it's too weak in its current form. The 4090 is running games with pretty heavy raytracing features at ~60 FPS at 4K, some at 100 FPS at 4K, and this is without the SER optimizations and new DXR features/API changes introduced with the 4090.

    I've found plenty of games with RT so idk
     
    Last edited: Nov 4, 2022
    yasamoka likes this.
  10. alanm

    alanm Ancient Guru

    Messages:
    12,286
    Likes Received:
    4,488
    GPU:
    RTX 4080
Since RDNA 3 (N31) is a big arch departure from last gen, I hope they get the drivers right early at release, because I'm thinking of getting the XT (if its performance isn't far off from the XTX).
     
  11. Undying

    Undying Ancient Guru

    Messages:
    25,529
    Likes Received:
    12,916
    GPU:
    XFX RX6800XT 16GB
Which are those? I played many as well and I'm still not sold.
     
    pegasus1 likes this.

  12. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,227
    Likes Received:
    3,636
    GPU:
    TUF 4090
I know what you're saying bud, I'm worried games will go the way of eye candy over substance. I mean, Metro is a great-looking game, even with mediocre levels of RT (6900 XT @ 4K), but I'll never replay it as much as, say, FC5 or FCND.
     
  13. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    13,146
    Likes Received:
    1,096
    GPU:
    MSI 2070S X-Trio
  14. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Ascent, Doom, Hitman 3 come to mind.

    RE7 had it but it was less impactful graphically.

    Forza 5 mod is pretty cool.

I replayed Hellblade and that was also much better - the reflections aren't a big deal, but the torches and darker scenes later in the game were massively improved with RT.

I'm sure there's more - I know I played both Battlefields with it on, but I hated those games, so idk
     
  15. Undying

    Undying Ancient Guru

    Messages:
    25,529
    Likes Received:
    12,916
    GPU:
    XFX RX6800XT 16GB
What I've noticed as a gamer, without zooming in 5x:

Deathloop - didn't notice any difference
WoW - no difference
Tomb Raider - didn't notice a difference
Bright Memory - better reflections, noticed it right away, but that's it
Metro EE - lighting is amazing; RT GI makes a big difference
Far Cry 6 - shadows/reflections very subtle, almost unnoticeable
Dying Light 2 - transformed with RT; noticeable difference in GI, reflections, and shadows; one of the best examples
Control - another great example of reflections; with RT enabled the concrete surfaces look very nice
RE8 - no difference
RE3 - subtle difference, better reflections
Doom Eternal - better reflections
Spiderman - quite a difference with RT enabled, swinging around the city

Let's be honest, all those games look good even without it. I finished most of them before they got an RT update and still enjoyed them just as much.
     
    tty8k likes this.
  16. brogadget

    brogadget Master Guru

    Messages:
    289
    Likes Received:
    78
    GPU:
    2xR9 280x 3GB
@Denial
I fully agree with you; you described the actual "state of the art". But I do not mean SLI as a simple bus connector (maybe I should have written Crossfire as an additional bad example). What I mean is more like a future on-board slot for an additional ASIC chip; sorry for the confusion, my mistake. Maybe I just don't like raytracing "in its current form", because in its current form, for me it has nothing to do with raytracing; it's just another almost invisible gimmick, same with "fake frames", another expensive interpolation joke. But maybe I am just too harsh, or just don't like the way people are being dazzled. In fact, without an RT implementation "in any form" we wouldn't have any of the latency problems you described. RT is an ultra-high effort for almost nothing. Without adding RT + AI, we would have GPUs with much more space for pure raster performance at much lower energy consumption. But if people want RT, go for it; let's see where the journey ends.
Imo AMD and Intel should leave the RT domain as fast as they can; they can only lose in this area. Let Nvidia stomp around with it.
     
    Last edited: Nov 4, 2022

  17. AuerX

    AuerX Ancient Guru

    Messages:
    2,680
    Likes Received:
    2,540
    GPU:
    Militech Apogee
Let's be honest, some people don't want to see a difference, for whatever reason.
     
    Valken and Maddness like this.
  18. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
Completely unrelated, but this post just made me really miss @fox2232 - it brought back memories of him commenting on all the raytracing stuff and going back and forth with him on the specific details of why it rendered things incorrectly lol

    He was great in threads like this

    I really hope he's doing well
     
    Truder, Embra, mohiuddin and 4 others like this.
  19. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,555
    Likes Received:
    609
    GPU:
    6800 XT
RT will be fine with these. Not as fast as the 4090, but they're playing catch-up with less R&D money anyway. If they had caught up, that would have been almost a miracle. Unlike Intel, Nvidia doesn't really rest on its laurels.
     
    Picolete, wavetrex and Maddness like this.
  20. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    "List of games that support ray tracing

    Total number of games: 144"


    Strange
     
  21. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,031
    Likes Received:
    4,407
    GPU:
    Asrock 7700XT
I don't think it really matters which CPU they picked, because it'd be whichever causes the least bottleneck. Whatever CPU they used, the next best alternative would only affect framerates by a few percent.
From what I recall, SAM doesn't have much impact on GPUs that aren't starved for VRAM, but I think it's safe to assume it was enabled. If you're building a new high-end system, there's no reason to leave it off, and much like with the CPU choice, they're going to present their results in the best-case scenario.

To me, it's more a matter of whether the difference is worth the tremendous performance hit. I feel we're better off sticking with rasterized shadows and reflections, and using RT for secondary lighting effects. That's why I felt Minecraft was one of the better first implementations: it showed the true value of RT and didn't seem to completely tank in performance.
I think RT is a great future to look forward to, but I'm not willing to spend extra for it. I just want cheap 4K raster performance for now. By the time <$350 can get me a GPU that does 4K @ 60 FPS with RT enabled, that'll be my next upgrade.

    I forgot about him. Know what happened?
     
    tty8k likes this.
