Radeon RX 7900 Series will compete with GeForce RTX 4080, not RTX 4090, says AMD

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 7, 2022.

  1. Goiur

    Goiur Maha Guru

    Messages:
    1,341
    Likes Received:
    632
    GPU:
    ASUS TUF RTX 4080
Not sure why, but I've got a feeling AMD will cut the 7900's MSRP by $100 after the 4080 launch.
     
  2. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
Eh, the packaging isn't cheap. The individual pieces of the chip are cheaper (6nm and better yields), and thus the GPU overall should be cheaper, but the packaging itself is far more expensive than for a traditional monolithic chip.

I also disagree about the "4090 killer" - the issue with competing with the 4090 isn't die size. Nvidia isn't limited on die size; it's power consumption.

I'm also really curious to see how AMD's chips perform across a range of games. AMD moved away from GCN because they had trouble extracting ILP out of workloads, and now with RDNA3 they've brought back dual-issue SIMDs. It's going to be interesting to see how that plays out. Did AMD cherry-pick the results in the demo from games that had good ILP extraction? Remains to be seen.
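As a rough intuition for why ILP extraction matters for dual-issue hardware, here's a toy model (purely illustrative - real RDNA3 co-issue rules are far stricter and depend on operand types and register ports):

```python
# Toy model of a dual-issue unit: two instructions can issue in the
# same cycle only if the second doesn't depend on the first's result.
# This is NOT RDNA3's actual scheduler - just an illustration of ILP.
def cycles(deps):
    """deps[i] is True if instruction i depends on instruction i-1."""
    total = 0
    i = 0
    while i < len(deps):
        if i + 1 < len(deps) and not deps[i + 1]:
            i += 2          # independent pair: co-issue both
        else:
            i += 1          # dependency chain: issue alone
        total += 1
    return total

independent = [False] * 8          # perfect ILP: co-issue every cycle
chained = [False] + [True] * 7     # serial dependency chain

print(cycles(independent))  # 4 cycles - dual issue doubles throughput
print(cycles(chained))      # 8 cycles - dual issue buys nothing
```

A workload full of dependency chains sees no benefit from the second issue slot, which is exactly why per-game results could vary.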
     
    JamesSneed likes this.
  3. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
The 6nm chiplets are cheaper, but here's the thing: AMD is still coming in cheaper than a monolithic 5nm die even after including the advanced packaging, the 5nm GPU die, and the memory chiplets. TSMC's InFO packaging is very cheap, and that small added cost still keeps the total under TSMC's 5nm costs, which are much higher than 6nm. I've already run this past a couple of folks who really know. To say it clearly: if AMD's chip were monolithic on 5nm, it would cost slightly more than the chiplet design split across 5nm and 6nm, including the advanced packaging costs.

See some of the power-limited tests below; they clearly show what you can do with a 4090. At 40% less power you lose only about 10% of the performance. Now imagine using 40% more transistors at the same power, or 50% more at slightly lower frequency. There is a lot of performance left for more shaders if you keep the power down.
    TDP-Skalierung: 250 bis 550 W - Seite 10 - Hardwareluxx
    Improving Nvidia RTX 4090 Efficiency Through Power Limiting | Tom's Hardware (tomshardware.com)
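The trade-off described above can be sketched with rough numbers (the "40% less power, ~10% less performance" figure comes from the linked tests; the 450W stock limit and normalization are just for illustration):

```python
# Rough perf/W math for the power-limited RTX 4090 scenario above.
# Numbers are approximate, taken from the linked power-scaling tests.
full_power = 450.0      # W, assumed stock power limit
full_perf = 100.0       # normalized performance at stock

limited_power = full_power * 0.60   # 40% less power
limited_perf = full_perf * 0.90     # only ~10% performance lost

eff_stock = full_perf / full_power
eff_limited = limited_perf / limited_power

print(f"stock:   {eff_stock:.3f} perf/W")
print(f"limited: {eff_limited:.3f} perf/W")
print(f"efficiency gain: {eff_limited / eff_stock - 1:.0%}")  # +50%
```

That ~50% perf/W headroom is the point: spent on more shaders at lower clocks instead of raw frequency, it buys a lot of performance at the same power.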

    Yes I agree we need to see AMD cards third party reviewed before we make too many assumptions.
     
    Last edited: Nov 8, 2022
  4. Dribble

    Dribble Master Guru

    Messages:
    369
    Likes Received:
    140
    GPU:
    Geforce 1070
Which makes you wonder why they went with such a small amount of cache on each chiplet. If the cost is in the packaging and all the interconnect, why go to the hassle of six chiplets with all that interconnect and a mere 16MB of cache on each one? Why not give each one 32MB or 64MB? Wouldn't that barely increase cost? I mean, they vastly increased all the other (on-chip) caches, but the final level is smaller than on the 6xxx series?
     

  5. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,739
    GPU:
    3080 Aorus Xtreme
    They have to leave something for a refresh. That probably won't happen for at least a year though.
     
  6. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,449
    Likes Received:
    3,128
    GPU:
    PNY RTX4090
The thing is, Nvidia shot for the moon with power, so the 4090 was pushed hard for performance. Insane performance, but at an insane price. Most of the time Nvidia aims for 30-50% more performance gen over gen; now we got nearly 2x the performance in some instances.

This made Nvidia think they could charge more for a non-Ti model, so they released the 4080 at $1,200.

AMD was never going to push power in the first place; they always aimed for where the 7900 XTX now sits, so it was always going to be $999. The XT is the only really iffy pricing, being only $100 lower while all the specs have been cut. This is the real reason I think the 4080 12GB was cancelled: Nvidia knew the 7900 XT was going to come out and destroy the 12GB, and the XTX will probably beat the 16GB too, by around 20-25% in raster.

I think the 7900 XT was originally going to be a 7800 XTX, but AMD saw the pathetic 4080 12GB and knew they were faster, so they bumped it up to a 7900 XT while charging only $100 less than the XTX in order to push people toward the higher-end model.

AMD kept their pricing structure from last gen; Nvidia raised theirs massively, especially with the 4080 SKU at nearly double the price - for what? 100% more cost for 30-40% more performance? AMD is charging 20% more with the XT for what looks like 30-40% more performance over the 6950 XT. Even if the XT's pricing is iffy, it still makes far more sense than the 4080's.
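A quick perf-per-dollar check using the rough figures above (these are the post's estimates, not measured results; ~35% is taken as the midpoint of the 30-40% range):

```python
# Value comparison using the rough generational estimates from the post.
def value(price_ratio, perf_ratio):
    """Performance per dollar relative to the previous generation."""
    return perf_ratio / price_ratio

# RTX 4080 16GB vs last gen: ~2x the price for ~35% more performance
nvidia = value(2.0, 1.35)
# RX 7900 XT vs RX 6950 XT: ~20% more price for ~35% more performance
amd = value(1.2, 1.35)

print(f"4080 value vs last gen:    {nvidia:.2f}x")  # below 1.0: worse value
print(f"7900 XT value vs last gen: {amd:.2f}x")     # above 1.0: better value
```

By this crude metric the 4080 is a value regression, while the 7900 XT still improves perf/$ gen over gen, iffy pricing or not.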
     
  7. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
Almost certainly to keep latency down. Splitting the cache off in the first place incurs a latency penalty, so I imagine all their optimization went into hiding or regaining latency where they could.

Also, to be clear, I'm not saying the packaging cost is some outrageous price. His initial post made it sound like the packaging would be cheaper than the alternative solutions, and I was merely clarifying that it's not - it's more expensive. But you save cost everywhere else.
     
  8. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
Judging by the forums here, I could say Nvidia's drivers right now are behind AMD's - maybe not by a mile...
     
  9. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,739
    GPU:
    3080 Aorus Xtreme
With 80-odd percent of the market, there are always going to be more reported issues on the Nvidia side. That's just a fact until AMD can reach some sort of parity.
     
  10. brogadget

    brogadget Master Guru

    Messages:
    289
    Likes Received:
    78
    GPU:
    2xR9 280x 3GB
If you're willing to drop $1.2-1.3K for better RT + DLSS, then you're willing to drop $1.6-1.7K for max performance without hesitation.
     
    Last edited: Nov 8, 2022

  11. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
Also true, but that doesn't mean AMD's drivers aren't better right now :)
     
    Maddness likes this.
  12. user1

    user1 Ancient Guru

    Messages:
    2,782
    Likes Received:
    1,305
    GPU:
    Mi25/IGP
It's worth noting that the cache helps compensate in bandwidth- and latency-constrained situations. With a 384-bit bus and 20Gbps GDDR6 hitting 960GB/s, the GPU is much less starved per compute unit than the RDNA 2 cards were, so it's possible it simply doesn't benefit as much from increases in cache. 3D-stacked variants of the cache dies are also possible for future products.
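The 960GB/s figure checks out from the standard bandwidth formula (bus width in bits, divided by 8 to get bytes, times the per-pin data rate):

```python
# Memory bandwidth = (bus width in bits / 8) * data rate per pin (Gbps)
bus_bits = 384          # 7900 XTX memory bus
gbps_per_pin = 20       # GDDR6 data rate
bandwidth_gbs = bus_bits / 8 * gbps_per_pin
print(bandwidth_gbs)    # 960.0 GB/s
```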
     