New Upcoming ATI/AMD GPU's Thread: Leaks, Hopes & Aftermarket GPU's

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

  1. itpro

    itpro Maha Guru

    Messages:
    1,364
    Likes Received:
    735
    GPU:
    AMD Testing
    I agree. I was too harsh above. If I can play all the latest games at ultra settings with DXR at 1440p and a stable 60 fps minimum, that's a great achievement for now.
     
    lukas_1987_dion and OnnA like this.
  2. pharma

    pharma Ancient Guru

    Messages:
    2,495
    Likes Received:
    1,196
    GPU:
    Asus Strix GTX 1080
    Some early benchmarks were posted, but then pulled. They were compared to the 5700 XT at 4K (a rough conversion into estimated frame rates is sketched below the list):

    2460 MHz @ Valhalla: +66% perf vs 5700 XT (4K)
    2450 MHz @ RDR2: +69% perf vs 5700 XT (4K)
    2330-2450 MHz @ Watch Dogs Legion: +82% perf vs 5700 XT (4K)
    No clock info @ HZD: +109% perf vs 5700 XT (4K)
    No clock info @ SotTR: +95% perf vs 5700 XT (4K)

    Twenty more minutes.
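    For scale, a minimal Python sketch turning those leaked uplift percentages into estimated frame rates. The 5700 XT baseline FPS values are placeholders picked purely for illustration, not part of the leak:

    ```python
    # Rough conversion of the leaked 4K uplift percentages into estimated frame rates.
    # The 5700 XT baseline FPS values are hypothetical placeholders, NOT from the leak.
    baseline_5700xt_fps = {
        "Valhalla": 35,
        "RDR2": 38,
        "Watch Dogs Legion": 30,
        "HZD": 45,
        "SotTR": 40,
    }

    leaked_uplift_pct = {
        "Valhalla": 66,
        "RDR2": 69,
        "Watch Dogs Legion": 82,
        "HZD": 109,
        "SotTR": 95,
    }

    for game, base in baseline_5700xt_fps.items():
        estimated = base * (1 + leaked_uplift_pct[game] / 100)
        print(f"{game}: ~{base} fps -> ~{estimated:.0f} fps (+{leaked_uplift_pct[game]}%)")
    ```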
     
    JonasBeckman likes this.
  3. bombardier

    bombardier Master Guru

    Messages:
    268
    Likes Received:
    33
    GPU:
    4090 Phantom
  4. endbase

    endbase Maha Guru

    Messages:
    1,250
    Likes Received:
    327
    GPU:
    Asus RTX4070 OC

  5. OnnA

    OnnA Ancient Guru

    Messages:
    17,953
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
     
    Maddness likes this.
  6. OnnA

    OnnA Ancient Guru

    Messages:
    17,953
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    AMD Radeon RX 6700 Series

    Patrick Schur published the first information on Navi 22 GPU power targets.
    This new GPU is expected to come to the upper mid-range Radeon RX 6000 series as the RX 6700 models.

    AMD Navi 22 GPU is not a mystery to us anymore. Similarly to Navi 21 and Navi 23, it leaked through macOS firmware updates, which confirmed that the GPU features up to 40 Compute Units.
    This is the same CU count as the Navi 10 GPU from the previous RDNA generation. What this means is that the upcoming Radeon RX 6700 series will succeed the RX 5700 series with the same core count, but thanks to improved power efficiency and higher clock speeds the cards are expected to be noticeably faster.

    New information from Patrick sheds some light on the power requirements of the upcoming models. According to him, the Navi 22 XT, which is expected to be used by the Radeon RX 6700 XT, is believed to have a total graphics power of 186 to 211 W.
    Even if AMD uses the higher value, it will still be lower than the RX 5700 XT, which had a power consumption of 225 W.

    When it comes to the RX 6700 non-XT, believed to feature Navi 22 XL, this model would require between 146 and 156 W.
    This value is lower than the RX 5700 non-XT as well (which had a TBP of 180 W).

    Unlike Navi 21, Navi 22 is equipped with a 192-bit memory bus, which means that models based on this GPU should offer either 6 GB or 12 GB of memory.

    -> https://videocardz.com/newz/amd-rad...ature-navi-22-gpu-and-up-to-12gb-gddr6-memory
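    As a quick sanity check on the 6 GB / 12 GB claim, a minimal Python sketch; the per-chip densities and the 16 Gbps data rate are assumptions for illustration, not figures from the article:

    ```python
    # Why a 192-bit bus implies 6 GB or 12 GB: the bus is six 32-bit GDDR6 channels,
    # and capacity is channels x per-chip density (1 GB or 2 GB chips assumed here).
    bus_width_bits = 192
    channels = bus_width_bits // 32  # 6 channels

    for chip_density_gb in (1, 2):
        print(f"{channels} x {chip_density_gb} GB chips = {channels * chip_density_gb} GB total")

    # Peak bandwidth at an assumed 16 Gbps data rate (not confirmed for Navi 22):
    data_rate_gbps = 16
    print(f"{bus_width_bits}-bit @ {data_rate_gbps} Gbps = {bus_width_bits * data_rate_gbps / 8:.0f} GB/s")
    ```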

     
  7. Undying

    Undying Ancient Guru

    Messages:
    25,473
    Likes Received:
    12,881
    GPU:
    XFX RX6800XT 16GB
    6700 XT 12 GB for $400 with performance between the 2080 Super and 2080 Ti. That card might be worth waiting for.
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    If it has at least 96 MB of IC, it will be able to do superb 1080p and good 1440p. With 64 MB, I guess it's 1080p only.
    Or maybe they'll use 18 Gbps memory to keep the actual memory bandwidth from falling below the RX 5700 (XT), because otherwise memory-demanding scenarios would choke the GPU down to 5700 (XT) level or worse.
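    For reference, a minimal sketch of the raw bandwidth math behind that concern; the 192-bit configurations are speculation from the post above, while the 5700 XT figure (256-bit @ 14 Gbps) is its actual spec:

    ```python
    def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
        """Peak memory bandwidth in GB/s = bus width (bits) x data rate (Gbps) / 8."""
        return bus_width_bits * data_rate_gbps / 8

    # RX 5700 XT reference point: 256-bit GDDR6 @ 14 Gbps = 448 GB/s
    print(f"RX 5700 XT (256-bit @ 14 Gbps): {bandwidth_gbs(256, 14):.0f} GB/s")

    # Speculative Navi 22 configurations on a 192-bit bus
    for rate in (14, 16, 18):
        print(f"Navi 22?   (192-bit @ {rate} Gbps): {bandwidth_gbs(192, rate):.0f} GB/s")
    ```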
     
  9. Undying

    Undying Ancient Guru

    Messages:
    25,473
    Likes Received:
    12,881
    GPU:
    XFX RX6800XT 16GB
    I hope they use the same 128 MB but on a 192-bit bus. That would be a 1440p beast.
     
  10. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Well, I would not call it a beast. But it would be able to keep a comfortable 60+ fps without sacrifices.
    And if those cards OC reasonably well...

    I have a feeling that IC is going to change the future of gaming a bit. Not with this generation, but it will shape the next one on 5nm.
    We can expect GDDR6 @ 18 Gbps by then, or GDDR6X with a similar transfer rate.
    And an increased IC will reduce the need for actual bandwidth a bit more. By then, AMD will have figured out the best way to use it to save bandwidth.

    And IC not only saves bandwidth, it saves energy, as memory reads/writes are costly (leaving more power for the GPU).
    Plus, the latency for data in the IC is not even remotely comparable to that of the memory chips.

    Older generations without IC can be angry all they want. But operations on smaller data sets, which benefit greatly from low latency/high bandwidth, will prove RDNA2 much better than anything else to date. And a larger IC in RDNA3 will show an even bigger benefit (as RDNA2 at 1080p shows, in contrast to 1440p/4K where the capacity is not sufficient).

    Same as with the PCIe 4.0 feature test: sure, until there is an actual game that uses the scenario shown in the feature test, there is nearly no benefit.
    But as shown, IC already gives a big advantage when its capacity is sufficient. (A toy effective-bandwidth model is sketched below.)
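    A toy Python model of the argument above; the hit rates and the on-die cache bandwidth are illustrative assumptions, not AMD figures:

    ```python
    # Toy model: a large on-die cache stretches limited DRAM bandwidth, but only
    # while the working set (which grows with resolution) still fits reasonably well.
    def effective_bandwidth(hit_rate: float, cache_bw_gbs: float, dram_bw_gbs: float) -> float:
        """Blend cache and DRAM bandwidth by hit rate (very simplified)."""
        return hit_rate * cache_bw_gbs + (1 - hit_rate) * dram_bw_gbs

    DRAM_BW = 432    # e.g. 192-bit @ 18 Gbps, in GB/s
    CACHE_BW = 1600  # assumed on-die cache bandwidth, in GB/s

    # Assumed hit rates dropping as resolution (working set) grows.
    for res, hit in (("1080p", 0.70), ("1440p", 0.55), ("4K", 0.40)):
        print(f"{res}: ~{effective_bandwidth(hit, CACHE_BW, DRAM_BW):.0f} GB/s effective")
    ```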
     
    Last edited: Nov 21, 2020
    Undying likes this.

  11. Undying

    Undying Ancient Guru

    Messages:
    25,473
    Likes Received:
    12,881
    GPU:
    XFX RX6800XT 16GB
    Close to a 2080 Ti would be reasonable to expect; that's high-refresh-rate 1440p right there. I'm also hoping for the 5700 XT price range.
     
    Fox2232 likes this.
  12. ninja750

    ninja750 Master Guru

    Messages:
    366
    Likes Received:
    89
    GPU:
    RX 6800 REF
  13. OnnA

    OnnA Ancient Guru

    Messages:
    17,953
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    ... But it's still not enough for 4K. They should go for 256 MB (no biggie IMHO) and the 6000 series would have a win in every scenario (aside from RT).
    Good GPU, but without the Whoooa ;)

    IMO this IC is not only about the power envelope, but also about cutting GPU costs (cheapest GDDR6) and better % margins for ATI, and while 128 MB is OK for 1440p, it is not for 4K....
    They should give us 2x8 GB HBM2 + IC :p and then price it at those 649€.
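    For what it's worth, a rough Python sketch of what that 2x8 GB HBM2 idea would mean for bandwidth; the per-pin rates are typical values assumed for illustration, and the GDDR6 line is the 6800/6900 XT's 256-bit @ 16 Gbps configuration:

    ```python
    # Rough bandwidth comparison: two HBM2 stacks vs the 256-bit GDDR6 used on Navi 21.
    def hbm_bandwidth_gbs(stacks: int, pin_rate_gbps: float, bus_per_stack_bits: int = 1024) -> float:
        return stacks * bus_per_stack_bits * pin_rate_gbps / 8

    print(f"2x HBM2  @ 2.0 Gbps/pin: {hbm_bandwidth_gbs(2, 2.0):.0f} GB/s")
    print(f"2x HBM2E @ 3.2 Gbps/pin: {hbm_bandwidth_gbs(2, 3.2):.0f} GB/s")
    print(f"GDDR6 256-bit @ 16 Gbps: {256 * 16 / 8:.0f} GB/s")
    ```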

    We shall see in the future whether our ATI fine wine is still the case.
     
    Last edited: Nov 22, 2020
  14. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
    The IC would be a huge part of the die at that point and might raise latency too much.

    But let's say HBM or GDDR6X plus a 128 MB cache would be amazing.

    Anyway, the drop in performance from 1440p to 4K is similar to what the 3080 experiences. IC isn't the issue; it's the raw power of the GPUs, which is barely enough for 4K. The 4K performance is 58% of the 1440p performance, whilst 1440p has about 44% of the pixels of 4K.... It's no wonder AMD and Nvidia struggle when they need to go up from roughly 3.7 million pixels to 8.3...
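    The arithmetic behind that, as a small Python check (the 58% scaling figure is the one quoted above):

    ```python
    # Pixel counts behind the 1440p -> 4K scaling argument.
    pixels_1440p = 2560 * 1440   # 3,686,400 (~3.7 million)
    pixels_4k = 3840 * 2160      # 8,294,400 (~8.3 million)

    pixel_ratio = pixels_1440p / pixels_4k
    print(f"1440p has {pixel_ratio:.0%} of the pixels of 4K")  # ~44%

    # If performance scaled purely with pixel count, 4K would run at ~44% of 1440p.
    # The observed ~58% (figure quoted above) means scaling is better than linear.
    observed_4k_perf = 0.58
    print(f"observed 4K perf: {observed_4k_perf:.0%} of 1440p vs {pixel_ratio:.0%} if purely pixel-bound")
    ```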

    With the specs you listed they should ask 1000€; it would be a massive die. They ain't a charity, after all.
     
  15. OnnA

    OnnA Ancient Guru

    Messages:
    17,953
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    Maddness, holeindalip and PrEzi like this.

  16. Truder

    Truder Ancient Guru

    Messages:
    2,400
    Likes Received:
    1,430
    GPU:
    RX 6700XT Nitro+
    So..... can we get it on older Ryzen?
     
  17. OnnA

    OnnA Ancient Guru

    Messages:
    17,953
    Likes Received:
    6,811
    GPU:
    TiTan RTX Ampere UV
    IMHO yes, I will post a How-To in my thread when it's available :D
    100% we can have the BAR resize tweak, so no more 256 MB limit.
     
    holeindalip and Truder like this.
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Older Ryzens? Zen 2 has the same IO die as Zen 3. I guess we can have it as long as we get a new AGESA and the motherboard manufacturer gives us access to the option.
    The question is whether the GPUs need vBIOS support, and if so, which GPUs will get an update.

    But considering that AMD more or less said the feature will be enabled automatically in the OS (drivers) as long as a compatible CPU + GPU is detected, and that the thing which enables it is part of the PCIe standard, I guess it is more about the GPU driver and what's enabled in the BIOS than about the CPU itself.
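    Not authoritative, but on Linux you can at least check what BAR sizes the GPU currently exposes via sysfs; a minimal sketch, assuming an AMD card (PCI vendor ID 0x1002) and standard sysfs paths:

    ```python
    # List BAR sizes of AMD display devices from sysfs to see whether a large
    # (resizable) BAR is exposed, i.e. more than the classic 256 MiB aperture.
    from pathlib import Path

    for dev in Path("/sys/bus/pci/devices").iterdir():
        vendor = (dev / "vendor").read_text().strip()
        pci_class = (dev / "class").read_text().strip()
        if vendor != "0x1002" or not pci_class.startswith("0x03"):
            continue  # skip anything that is not an AMD display device
        print(dev.name)
        # The first six entries in the resource file are the standard BARs.
        for i, line in enumerate((dev / "resource").read_text().splitlines()[:6]):
            start, end, _flags = (int(x, 16) for x in line.split())
            if end > start:
                size_mib = (end - start + 1) / (1 << 20)
                print(f"  BAR{i}: {size_mib:.0f} MiB")
    ```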
     
    Truder likes this.
  19. Undying

    Undying Ancient Guru

    Messages:
    25,473
    Likes Received:
    12,881
    GPU:
    XFX RX6800XT 16GB
    OnnA likes this.
  20. Elder III

    Elder III Guest

    Messages:
    3,737
    Likes Received:
    335
    GPU:
    6900 XT Nitro+ 16GB
    If I could use SAM in the future on my current system (550 mobo, 3700X CPU), that would be very, very nice for me.
     
    Embra likes this.
