Review: Radeon RX 5600 XT 6GB (ASUS STRIX TOP, Sapphire PULSE OC and Gigabyte Gaming OC)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 21, 2020.

  1. Undying

    Undying Ancient Guru

    Messages:
    25,478
    Likes Received:
    12,884
    GPU:
    XFX RX6800XT 16GB
    The 2060S is a $400 GPU using DLSS at 50% resolution, and it was not maxed out because the game uses more than 8GB of VRAM. I also think it was 1440p, not 4K. Credit goes to the amazing id Tech engine optimization and its great performance in the first place.
     
  2. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Eh, disagree. At the midrange they are certainly not great, but newer titles are pushing performance up easily. In Wolfenstein, for example, the 2080 Ti can do 4K @ 60 without DLSS. It also remains to be seen, with consoles getting RT support, how that drives game performance, adoption, and techniques. Also, RTX cards have mesh shaders, VRS, and a few other features that AMD doesn't support at the moment. There is definitely value-add with RTX.

    Some other things:

    Nvidia's encoder is superior to RDNA's, enough to make an appreciable difference in image quality. Nvidia currently supports VRR, AMD does not.

    So outside of just sheer performance there is value with going Nvidia. How much are those things worth? Depends on the person.
     
    fantaskarsef and pharma like this.
  3. Glottiz

    Glottiz Ancient Guru

    Messages:
    1,949
    Likes Received:
    1,171
    GPU:
    TUF 3080 OC
    US dollar prices (they don't even include tax!) mean nothing to me, and I think to most of this website's community, because I'm pretty sure the big majority of this site's visitors are Europeans.

    For example in my country
    MSI RX 5700 GAMING X = 530 euro
    MSI RTX 2060 SUPER GAMING X = 550 euro.

    I would never recommend any of my friends to save a measly 20 euro and buy a GPU that lacks so many features. I would feel terrible later, talking to a friend who saw some cool ray-tracing gameplay only to learn he can't use that feature on his shiny new AMD GPU.
     
  4. Eastcoasthandle

    Eastcoasthandle Guest

    Messages:
    3,365
    Likes Received:
    727
    GPU:
    Nitro 5700 XT
    Aww, isn't this adorable. Looks like someone just got released from the Nv Emo Support Clinic.
    But I still don't understand why they're still charging a co-pay?
    o_O

    Shhh, I was trying to offer some hope. Perhaps late 4th quarter?:D
     

  5. Glottiz

    Glottiz Ancient Guru

    Messages:
    1,949
    Likes Received:
    1,171
    GPU:
    TUF 3080 OC
    [image]
     
  6. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,544
    Likes Received:
    18,856
    GPU:
    AMD | NVIDIA
    READ!

    The thread from this point onwards remains on topic and decent. Fight, and I WILL hit that ban button. Trust me, I don't care who you are or how long you've been here.
     
    Glottiz likes this.
  7. Eastcoasthandle

    Eastcoasthandle Guest

    Messages:
    3,365
    Likes Received:
    727
    GPU:
    Nitro 5700 XT
    It was already discussed.
     
  8. pharma

    pharma Ancient Guru

    Messages:
    2,496
    Likes Received:
    1,197
    GPU:
    Asus Strix GTX 1080
    Interesting tidbit from Anandtech:
     
  9. Serega_Mih

    Serega_Mih Member

    Messages:
    38
    Likes Received:
    12
    GPU:
    GainwardGTX1660S
    o_O What are the thermal pads on the Gigabyte Radeon RX 5600 XT Gaming OC for? There is no memory under them; only the ASUS has cooling fins there, and there is no memory underneath those either.
    [image]
     
  10. Kool64

    Kool64 Ancient Guru

    Messages:
    1,662
    Likes Received:
    788
    GPU:
    Gigabyte 4070
    I dunno, for someone coming from a sub-$250 card this might be a good choice, but anyone with a $300-400 card would probably want to opt for a 5700. It still feels weird that the cards in this price range only have 6GB of VRAM.
     
    Turanis and MonstroMart like this.

  11. anticupidon

    anticupidon Ancient Guru

    Messages:
    7,898
    Likes Received:
    4,149
    GPU:
    Polaris/Vega/Navi
    Let's be real. No surprise here: e-tailers are riding the new RX 5600 XT wagon, and it's business they are after.
    Let the dust settle, and little by little prices will come down a bit.
    What really "sells me" is the quietness at idle and the power consumption.
    I know what lengths I went to tweaking my RX 480 just to NOT hear it when merely browsing.
    But I'll wait a bit and then decide which version to get, or whether to get something else.
     
  12. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
    Assuming that number in your profile represents the number of times the ball of dust and iron spun around the other ball of very hot gas since you started observing the light from the 2nd ball, we are close in the aspect of inner g33k1|\|355 ;-)

    Lovely computers those were: C64, Spectrum, VIC... probably the first and last generation where home g33k5 understood how the computer moves every bit of data. After that it all became abstraction over abstraction over more abstraction.
    These days, unless you are a hardcore transistor engineer designing chips for Intel, AMD, nVidia, ARM, Samsung, or Apple, you are just working with some layer of abstraction.
    Actually, even those engineers are using software tools to design chips. Logic goes in, layers of silicon come out.

    We are slowly but surely getting to the point of "Computer, Make me a sandwich" level of knowledge needed to use one...
     
  13. Eastcoasthandle

    Eastcoasthandle Guest

    Messages:
    3,365
    Likes Received:
    727
    GPU:
    Nitro 5700 XT
    I'm guessing that for 1080p resolution, 6GB would be more than enough. I've been downscaling to 1080p nowadays.

    I don't know why, but if I play a game that lets you scale the render resolution, like BFV/MW, it looks better at 1440p @ 75% than at plain 1080p. And I still get all the performance I want.
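    For what it's worth, the pixel math on that setting checks out, assuming the render-scale slider in those games applies to each axis independently (a small illustrative helper, not actual game code):

    ```python
    # Effective internal render resolution for a given render scale,
    # assuming the scale factor applies to width and height separately.
    def effective_res(width: int, height: int, scale: float) -> tuple[int, int]:
        return round(width * scale), round(height * scale)

    # 1440p at a 75% render scale lands exactly on the 1080p pixel grid:
    print(effective_res(2560, 1440, 0.75))  # (1920, 1080)
    ```

    So 1440p @ 75% renders the same number of pixels as native 1080p; any perceived quality difference would come from the game's scaling and filtering rather than from extra rendering work.
    
    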
     
  14. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I kind of wonder how the 6GB of VRAM thing is going to play out with the Xbox One X having 12GB. I feel like a lot of this gen's midrange is going to be wiped out by next-gen games, especially ones that utilize RT.
     
    jbscotchman likes this.
  15. Kool64

    Kool64 Ancient Guru

    Messages:
    1,662
    Likes Received:
    788
    GPU:
    Gigabyte 4070
    Games are already taking more than 6GB of VRAM at 1080p. In Deus Ex: Mankind Divided I regularly use over 7GB on my 1070 with all the options turned on.
     
    Last edited: Jan 21, 2020

  16. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Prices are all over the place right now, upwards of $345 for the Strix. Just get a 5700 for the extra couple of bucks.
     
  17. Supertribble

    Supertribble Master Guru

    Messages:
    978
    Likes Received:
    174
    GPU:
    Noctua 3070/3080 FE
    Decent performance, but too expensive right now. This does nothing to bring down prices of Nvidia cards and help consumers. I get it, AMD wants higher margins, but they won't increase market share with this strategy.
     
  18. anticupidon

    anticupidon Ancient Guru

    Messages:
    7,898
    Likes Received:
    4,149
    GPU:
    Polaris/Vega/Navi
    Better to find a cheap 5700 and be done with it. I'm just pissed off, AMD.
     
  19. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
    That's also something that's been in the back of my mind. The next-gen consoles are gonna have plenty of CPU cores and VRAM to throw around, so unless there's good optimization, we could see next-gen PC ports with very high system requirements. Call of Duty 2 immediately comes to mind; it beat the crap out of my 128MB GeForce 6800 at the time.
     
  20. Mesab67

    Mesab67 Guest

    Messages:
    244
    Likes Received:
    85
    GPU:
    GTX 1080
    While I fully applaud Nvidia for introducing RT as a consumer option (though I instantly take that back on pricing), I would be very careful about bashing another company with generic statements. The reader/buyer needs to be told exactly which RT techniques (e.g. reflection, refraction, shadows) are being used in a game or piece of software, and which are not. They should also be fully aware of exactly how much is being used in the game: 5% of the scene? It's entirely possible, even inevitable, that optimisation stages include "how much can we reduce the RT burden while still being able to state 'RT-ON' on the box?" Leave that important piece of information out and you leave yourself open to the standard replies.
     
    AlmondMan likes this.