Leaked Details Emerge for AMD Radeon RX 7600 GPU

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 19, 2023.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,561
    Likes Received:
    18,886
    GPU:
    AMD | NVIDIA
    As the late May launch date approaches, leaks regarding the AMD Radeon RX 7600 GPU continue to surface. This entry-level model is expected to compete with NVIDIA's upcoming GeForce RTX 4060 graphics ...

    Leaked Details Emerge for AMD Radeon RX 7600 GPU
     
    fantaskarsef likes this.
  2. mohiuddin

    mohiuddin Maha Guru

    Messages:
    1,008
    Likes Received:
    206
    GPU:
    GTX670 4gb ll RX480 8gb
Any 8GB card is a DOA card. The minimum should be 10GB now. An 8GB 3070 holds less value for future-proofing than a 12GB 3060.
     
    vestibule likes this.
  3. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,740
    Likes Received:
    10,836
    GPU:
    RX 6800 XT
AMD releasing a GPU with 8GB of VRAM, after spending so much time bragging about having more VRAM, just feels like they are taking the piss out of PC gamers.
     
    pharma, AuerX and vestibule like this.
  4. Embra

    Embra Ancient Guru

    Messages:
    1,601
    Likes Received:
    956
    GPU:
    Red Devil 6950 XT
For $250 and someone playing at 1080p, this card would be a fair deal... even with 8GB of VRAM. But I doubt it will sell at that price when the 4060 will cost more.
     
    schmidtbag and fantaskarsef like this.

  5. Undying

    Undying Ancient Guru

    Messages:
    25,501
    Likes Received:
    12,901
    GPU:
    XFX RX6800XT 16GB
At least they don't ask $399 for it like Nvidia.
     
  6. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,740
    Likes Received:
    10,836
    GPU:
    RX 6800 XT
$350 to $375 isn't that much of a difference.

Considering that 1GB of GDDR6 is now going for $3.40, there is no excuse for Nvidia or AMD to release a GPU with only 8GB.
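The arithmetic behind that claim is easy to check. As a back-of-the-envelope sketch (taking the poster's quoted $3.40/GB spot price at face value; actual BOM cost per board will differ):

```python
# Rough memory-chip cost of an 8GB vs. 16GB card at the quoted spot price.
PRICE_PER_GB = 3.40  # USD per GB of GDDR memory (poster's figure, unverified)

cost_8gb = 8 * PRICE_PER_GB
cost_16gb = 16 * PRICE_PER_GB

print(f"8 GB:  ${cost_8gb:.2f}")                      # 8 GB:  $27.20
print(f"16 GB: ${cost_16gb:.2f}")                     # 16 GB: $54.40
print(f"Doubling delta: ${cost_16gb - cost_8gb:.2f}")  # Doubling delta: $27.20
```

In other words, at that spot price the jump from 8GB to 16GB is under $30 in memory chips, which is the point being made.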
     
  7. vestibule

    vestibule Ancient Guru

    Messages:
    2,218
    Likes Received:
    1,444
    GPU:
    Radeon RX6600XT
I blame AI.
It's making all the decisions.
Like with recruitment, where no one gets a job.
     
    pegasus1 and tunejunky like this.
  8. daffy101

    daffy101 Active Member

    Messages:
    88
    Likes Received:
    4
    GPU:
    7800XT
Need to see reviews for these and the 4060, and then I'll probably end up buying a 7800 XT and eating beans for a year.
     
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,023
    Likes Received:
    4,400
    GPU:
    Asrock 7700XT
As Embra pointed out, if it's marketed as a 1080p (with DXR on) GPU and priced appropriately, the VRAM is fine. Underwhelming, but fine. If the MSRP is over $300 and/or it's marketed as a 1440p GPU, then I agree with you.
     
  10. H83

    H83 Ancient Guru

    Messages:
    5,515
    Likes Received:
    3,037
    GPU:
    XFX Black 6950XT

I agree. Not every card needs 20GB of VRAM...



    That would explain a lot...:eek:
     

  11. barbacot

    barbacot Maha Guru

    Messages:
    1,004
    Likes Received:
    984
    GPU:
    MSI 4090 SuprimX
:) That brings back memories - you may be saying this for fun, but back in my youth, when I bought an Athlon X2 4800+, I remember eating Heinz canned beans for three weeks. I can't stand beans now...

This is an entry-level card, and for entry level (1080p), 8GB is enough, as was already said here. It doesn't have DLSS and probably has weaker ray tracing than the competition, but also a lower price - anyway, the price should be much lower, but who knows these days...
     
    Typhon Six Six Six likes this.
  12. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
Whether 8GB is enough or not depends on the price and marketing. The 3070 was sold as the 2080 Ti killer and a 4K ray-tracing card, and was over a grand in many countries. In that context, 8GB was clearly not enough.

As long as a card is sold as a 1080p-only card at less than $300 USD, 8GB is perfectly fine. This card will likely be more than $300 USD though, so yeah, shame on AMD if that's the case.
     
  13. vestibule

    vestibule Ancient Guru

    Messages:
    2,218
    Likes Received:
    1,444
    GPU:
    Radeon RX6600XT
    @H83
There is no getting away from it: there is something going on with AMD and Ngreedia that cannot be easily explained or understood.
    On a less serious note. Bahhh to them. :D
    I think I need to let this go. :p and so I will. :)
    Game on. ;)
     
    Last edited: May 19, 2023
  14. Venix

    Venix Ancient Guru

    Messages:
    3,474
    Likes Received:
    1,973
    GPU:
    Rtx 4070 super
Same CUs as the 6600 XT, lower clocks... hmmm, so this will be about as fast as a 6650 XT at best, I reckon... I mean, what kind of IPC improvements does RDNA 3 have over RDNA 2?


Now, about the people who say 8GB is enough for a $300 1080p card... Really? Since when was it the norm to have to lower the texture quality to medium from day 1 on a $300 card? That hasn't happened since the GTX 460 days! Maybe even further back!
     
  15. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,023
    Likes Received:
    4,400
    GPU:
    Asrock 7700XT
8GB has always been enough for 1080p and perhaps always will be. The problem is, texture detail (which seemingly accounts for the majority of VRAM consumption) is always measured as "low, medium, high, ultra" when really it should be based on what output resolution it is most optimal for. You don't need to play a modern game with ultra texture detail at 1080p; it's simply unnecessary, because those textures were meant to look crisp at 4K. Do side-by-side comparisons at 1080p with different texture settings, and unless you're point-blank with a wall, you won't see an appreciable difference.
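To put rough numbers on that argument, here's a minimal sketch of the VRAM footprint of a single texture at the kinds of resolutions that often sit behind "medium/high/ultra" presets (the preset-to-resolution mapping is an illustrative assumption, and real games use BCn block compression, which shrinks these figures 4-8x):

```python
# Footprint of one uncompressed RGBA8 texture at increasing resolutions.
# Each doubling of texture resolution quadruples the memory cost.
BYTES_PER_TEXEL = 4  # RGBA8, 4 bytes per texel, uncompressed

def texture_mb(size):
    """Memory in MiB of a square size x size RGBA8 texture (no mips)."""
    return size * size * BYTES_PER_TEXEL / (1024 * 1024)

for label, size in [("medium", 1024), ("high", 2048), ("ultra", 4096)]:
    print(f"{label:>6} ({size}x{size}): {texture_mb(size):5.1f} MB")
# medium (1024x1024):   4.0 MB
#   high (2048x2048):  16.0 MB
#  ultra (4096x4096):  64.0 MB
```

The 4x jump at every step is why "ultra" textures authored to stay crisp at 4K output eat VRAM so much faster than anything a 1080p display can resolve.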

Now, to be clear: I'm not happy that for almost a decade, 1080p GPUs have cost around $300. That is definitely a problem, though I'm a little less irritated when that's 1080p with DXR enabled. For 1080p without DXR, even $250 with 8GB is asking too much as far as I'm concerned.
     

  16. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,579
    GPU:
    ROG RTX 6090 Ultra
    So, where is this ?
What the f!&# went wrong at AMD? No new releases, no actual announcements, nothing after the 7900 XT(X)... That's more than 5 months ago.
     
    Airbud, Venix, Undying and 1 other person like this.
  17. Silva

    Silva Ancient Guru

    Messages:
    2,051
    Likes Received:
    1,201
    GPU:
    Asus Dual RX580 O4G
You clearly don't fully understand how textures work. Resolution impacts VRAM a lot more than texture quality, due to the mip maps being loaded. That's why 4K eats so much VRAM.
That said, most of the time High and Ultra look almost the same, and there's a big penalty to take. Also shadows: they never look good on the highest setting; I always prefer medium.
As for a lot of other stuff, I like to tune it to my taste and optimize. At the end of the day, I usually get 2x the advertised FPS, and IQ is virtually the same (or at least I'm satisfied with it).
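For reference, the memory cost of a full mip chain can be sketched like this (assuming uncompressed RGBA8 for simplicity; each mip level is a quarter the size of the one above it, so the whole chain converges to about 4/3 of the base level):

```python
# Memory of a full mip chain for a square texture: levels shrink by 4x
# each step, so the total is ~1/3 more than the base level alone.
BYTES_PER_TEXEL = 4  # RGBA8

def mip_chain_bytes(size):
    """Total bytes for all mip levels from size x size down to 1x1."""
    total = 0
    while size >= 1:
        total += size * size * BYTES_PER_TEXEL
        size //= 2
    return total

base = 2048 * 2048 * BYTES_PER_TEXEL
chain = mip_chain_bytes(2048)
print(f"base level: {base / 2**20:.2f} MB")   # base level: 16.00 MB
print(f"full chain: {chain / 2**20:.2f} MB")  # full chain: 21.33 MB
```

So the mip chain itself adds roughly 33% on top of the base texture; the rest of 4K's VRAM appetite comes from larger framebuffers, render targets, and higher-resolution assets being streamed in.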

Pricing is a joke; depending on how they perform, I'd rather pay 223€ for an RX 6600. Still waiting to see if the RX 6700 comes down in price. Come on, AMD: release the 7700/7800 too!
     
  18. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,209
    GPU:
    AD102/Navi21
yup, shadows are usually unnaturally crisp and detailed at ultra, while in reality they're softer and not as dark.
btw I've seen 6700 XTs at 320eur new here, I think it was the ASRock Challenger. imo you'll soon find one at 300.
     
    Last edited: May 19, 2023
    Silva likes this.
  19. LimitbreakOr

    LimitbreakOr Master Guru

    Messages:
    621
    Likes Received:
    158
    GPU:
    RTX 4090
If it's true that it's only 8GB, AMD missed an opportunity here to come out with a clearly better card than the Nvidia counterpart. Very disappointing...
     
  20. Venix

    Venix Ancient Guru

    Messages:
    3,474
    Likes Received:
    1,973
    GPU:
    Rtx 4070 super
Textures are not just the diffuse... that was the DX7 era! A lot of techniques use far more textures than people realize - techniques that add "almost free" eye candy, as long as you have enough VRAM, with our beloved materials! Say you have a 2-million-triangle face in Blender and you drop that to 2k triangles. With just the diffuse, the difference would be colossal; but if you bake a normal map from the high-poly model and apply it to the low-poly one, you can fake the missing geometry and get visually 99.9% the same results for a fraction of the computational cost. Now, the normal map alone is not enough: together with it you have to make a specular map as well, which contains the information about how light behaves on the surface. So one surface already needs 3 textures, not just one. Since then, more and more things got added - roughness maps, height maps and alpha maps - so when you scale textures from, say, 1024x1024 to 2048x2048, it might mean 6 textures for each surface!

Although not all the maps in a material have to match the diffuse texture's resolution... but there you go, this is why VRAM requirements go up. Now... Nanite and Lumen kiiiinda throw all that out of the window: you use the ultra-high-poly model out of the box, so you don't need materials to fake geometry, and Lumen handles the lighting great as well. So when you use those, VRAM usage actually drops (only if you drop material usage! Else it goes waaaay up). It doesn't drop as much as you'd think - high-poly models are highly compressible but huge, so compressed down they end up a bit smaller than using fully fledged materials in-game. Anyway, long story short: VRAM usage is a very complex matter and not easy to nail down, but one thing is certain - over time it goes up, and even if it stalled for a while and might stall again for a bit, it will keep going up and up! (Unless we find some miracle compression techniques.)
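The scaling described above is easy to sketch: per-material cost is (resolution squared) x (bytes per texel) x (number of maps), so adding maps multiplies it and doubling resolution quadruples it. A minimal illustration (assuming uncompressed RGBA8 and all maps at the same resolution, which real engines vary):

```python
# VRAM for one material as map count and resolution grow.
BYTES_PER_TEXEL = 4  # uncompressed RGBA8, for simplicity

def material_mb(size, n_maps):
    """MiB for n_maps square textures of size x size (no mip chains)."""
    return size * size * BYTES_PER_TEXEL * n_maps / 2**20

print(material_mb(1024, 1))  # diffuse only at 1K:                  4.0 MB
print(material_mb(1024, 6))  # diffuse+normal+spec+rough+height+alpha: 24.0 MB
print(material_mb(2048, 6))  # the same six maps at 2K:             96.0 MB
```

Going from a lone 1K diffuse to a six-map 2K material is a 24x jump for one surface, which is exactly why material-based rendering pushed VRAM requirements up so sharply.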
     
    LimitbreakOr likes this.
