MMO New World (Closed Beta) is killing GeForce RTX 3090 Graphics Cards

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 22, 2021.

  1. Cave Waverider

    Cave Waverider Ancient Guru

    Messages:
    1,883
    Likes Received:
    667
    GPU:
    ASUS RTX 4090 TUF
    I hear that some people had their (EVGA) cards killed even with the game running at a capped 60 FPS, so capping the framerate may not be 100% safe either.

    One can set "Max Frame Rate" in the NVIDIA Control Panel. In addition, there is a "Background Application Max Frame Rate" setting that limits frames only for applications running in the background (when another application has focus).
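    A software frame cap like the ones these settings apply works by idling out whatever is left of each frame's time budget instead of rendering ahead. A minimal illustrative sketch (this is not NVIDIA's implementation, just the general technique):

    ```python
    import time

    def frame_budget(fps: float) -> float:
        """Seconds each frame may take under a given FPS cap."""
        return 1.0 / fps

    def run_capped(render_frame, fps: float = 60.0, frames: int = 3):
        """Render a few frames, sleeping out the unused part of each budget."""
        budget = frame_budget(fps)
        for _ in range(frames):
            start = time.perf_counter()
            render_frame()                      # the game's work for this frame
            elapsed = time.perf_counter() - start
            if elapsed < budget:
                time.sleep(budget - elapsed)    # idle instead of drawing another frame

    run_capped(lambda: None)  # placeholder render callback
    ```

    The point of the cap is that the GPU spends the slept-out time idle rather than drawing thousands of uncapped menu frames, which is what the New World reports described.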
     
    Last edited: Jul 24, 2021
    Undying likes this.
  2. metagamer

    metagamer Ancient Guru

    Messages:
    2,596
    Likes Received:
    1,165
    GPU:
    Asus Dual 4070 OC
    I'll do you one better, my 2080 for your 3090. I'll chuck in a slab of beer too.
     
    XenthorX likes this.
  3. TimmyP

    TimmyP Guest

    Messages:
    1,398
    Likes Received:
    250
    GPU:
    RTX 3070
    I hope this makes people look twice at EVGA now. They have always been garbage cards: inferior cooling, cheap plastic, thin fan blades (it used to be that way, at least). I had two 560 Tis go back in a single summer back in the day. Fan blades as thin as paper, and the card broke itself.

    Literally ANY other AIB is better.
     
  4. mackintosh

    mackintosh Maha Guru

    Messages:
    1,187
    Likes Received:
    1,094
    GPU:
    .
    I held EVGA in very high regard until two of my GTX 680s broke down within a couple of months of buying them. In fairness to them, the RMA process was swift and painless, and they replaced both cards in a matter of days. Nevertheless, I haven't bought another GPU from them since.
     

  5. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,665
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    I always set the NVIDIA Control Panel V-Sync control to "Use the 3D application setting", but I'm starting to think that's a bad idea atm. Maybe it's time to switch it to something else, like On or Fast, but I have no idea which is best to use.

    BTW, if you RMA a dead gfx card and blame it on the game, does that also renew your warranty, or do you only get the warranty you had left in the first place?
     
  6. metagamer

    metagamer Ancient Guru

    Messages:
    2,596
    Likes Received:
    1,165
    GPU:
    Asus Dual 4070 OC
    It doesn't renew the warranty. The warranty is tied to the purchase date of the original piece of hardware.
     
  7. DannyD

    DannyD Ancient Guru

    Messages:
    6,847
    Likes Received:
    3,805
    GPU:
    evga 2060
    I've always used adaptive v-sync for games.
     
  8. thesebastian

    thesebastian Member Guru

    Messages:
    173
    Likes Received:
    53
    GPU:
    RX 6800 Waterblock
    No matter the GPU, high-end or low-end, it shouldn't die because it's run at 100% load for a few hours non-stop.
    (Every heavy game, like Cyberpunk, puts my GPU at full load.)

    Most of these 3090s just faced their first stability test after probably being used at a lower capacity for a long time.

    This is a shame.

    Personally, I can't put my RX 6800 at 100% capacity in this game with a 1440p@144Hz monitor and max graphics settings. It's highly CPU-bound for me. And it runs in DX12 (edit: I meant to say DX11).

    So I guess the people who stressed a 3090 were playing at 4K, or 2K ultrawide on a high-refresh-rate monitor.
     
    Last edited: Jul 24, 2021
  9. Martin5000

    Martin5000 Master Guru

    Messages:
    307
    Likes Received:
    124
    GPU:
    8 gig
    You can't go blaming game devs. It shouldn't be possible to kill a card by running anything; the built-in protection isn't fit for purpose. This is a failure of the hardware, plain and simple.
     
    MonstroMart and itpro like this.
  10. Martin5000

    Martin5000 Master Guru

    Messages:
    307
    Likes Received:
    124
    GPU:
    8 gig
    Wow, this GPU is mega power-hungry. But why did the Vega 64 take such a hammering for its power usage when it pulled less power?
    Strange, that.
     

  11. DannyD

    DannyD Ancient Guru

    Messages:
    6,847
    Likes Received:
    3,805
    GPU:
    evga 2060
    You get an insanely better performance-per-watt ratio these days, especially since the 30 series and 6000 series. Not to mention the advances in undervolting.
     
  12. Krizby

    Krizby Ancient Guru

    Messages:
    3,103
    Likes Received:
    1,782
    GPU:
    Asus RTX 4090 TUF
    The problem was that the Vega 64 at 330W got beaten by the GTX 1080 at 180W; it's about efficiency, not raw power consumption (link).
    The 3090 and 6900 XT are pretty close in efficiency, but only in rasterization; when RT is used, the 3090 is way ahead (also DLSS).

    I run my 3090 in the 250-270W range by undervolting to 750mV (stock max voltage is 1093mV), which gives around 90% of the FPS of a 3090 at 375W, so 120W less for 10% less perf.
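    Plugging the poster's own figures (375W at 100% relative FPS stock, ~255W at 90% undervolted) into a quick perf-per-watt check shows why this trade is attractive:

    ```python
    # Poster's figures, treated as given: stock 3090 vs ~750 mV undervolt.
    stock_w, stock_fps = 375.0, 100.0   # stock power, relative FPS
    uv_w, uv_fps = 255.0, 90.0          # midpoint of the quoted 250-270 W range

    stock_eff = stock_fps / stock_w     # relative FPS per watt, stock
    uv_eff = uv_fps / uv_w              # relative FPS per watt, undervolted
    print(f"undervolt delivers {uv_eff / stock_eff:.2f}x the perf/W of stock")
    ```

    Roughly a third more performance per watt for a 10% FPS loss, using the numbers quoted above.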
     
    Last edited: Jul 24, 2021
    GoldenTiger and Noisiv like this.
  13. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    That's exactly how I would drive a 3090 if I had one :)
    Although I would expect the power to be closer to 1/2 instead of 2/3. It looks like the GDDR6X interface adds considerably to power draw.
     
    Krizby likes this.
  14. Krizby

    Krizby Ancient Guru

    Messages:
    3,103
    Likes Received:
    1,782
    GPU:
    Asus RTX 4090 TUF
    Well, there is under-voltage protection on desktop Ampere GPUs; the minimum voltage is 737mV. Clock speed will plunge while voltage stays at 737mV when you set the power limit too low. I tried a 220W PL and got a 14.5k Timespy graphics score, versus 17.8k with a 250W PL; that's a 23% perf drop.

    So yeah: GDDR6X, the under-voltage protection, and an overbuilt VRM (more phases, less efficiency at low load) make it moot to go below 2/3 of max power on GA102.
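    For reference, the gap between the two Timespy scores quoted above can be expressed two ways, depending on which score you take as the baseline; the ~23% figure is the gain relative to the lower score:

    ```python
    # Timespy graphics scores quoted above, taken as given.
    low_pl = 14_500    # score at 220 W power limit
    high_pl = 17_800   # score at 250 W power limit

    drop = (high_pl - low_pl) / high_pl * 100   # perf lost going 250 W -> 220 W
    gain = (high_pl - low_pl) / low_pl * 100    # perf gained going 220 W -> 250 W
    print(f"drop: {drop:.1f}%, gain: {gain:.1f}%")  # prints "drop: 18.5%, gain: 22.8%"
    ```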
     
  15. barbacot

    barbacot Maha Guru

    Messages:
    1,002
    Likes Received:
    982
    GPU:
    MSI 4090 SuprimX
    It seems this affects most of AMD's RX 6000 series lineup as well, according to JayzTwoCents.
    So no GDDR6X nonsense, no Nvidia power problems, etc.
    This is a badly written piece of software that can destroy hardware.


     
    Last edited: Jul 24, 2021

  16. ontelo

    ontelo Guest

    Messages:
    84
    Likes Received:
    16
    GPU:
    RTX 3080
    The real question is why the user was running medium details on a 3090. :D
     
  17. Zooke

    Zooke Master Guru

    Messages:
    584
    Likes Received:
    419
    GPU:
    3090FE + EK Block
    This is a classic product-recall scenario.
    A manufacturer knows there's an issue that can happen under certain conditions, but the cost of a recall far outweighs the cost of replacing the few cards that get RMA'd and paying any claims submitted.
    They figure the chance of a card failing and causing serious damage, injury, or death is low enough that they'll just keep quiet about it.
     
  18. Undying

    Undying Ancient Guru

    Messages:
    25,477
    Likes Received:
    12,883
    GPU:
    XFX RX6800XT 16GB
    10% less performance puts you at 3080 level. In that case you didn't need to spend a ridiculous amount on a flagship.
     
    chispy and fantaskarsef like this.
  19. itpro

    itpro Maha Guru

    Messages:
    1,364
    Likes Received:
    735
    GPU:
    AMD Testing
    Go finish university or college, please, you and Jayz2Nothing. Software shouldn't be able to do anything this bad; blame GPU firmware that isn't capable of limiting damage. You cannot directly compare a software dev with an electronics engineer. The latter should pamper the less technical one.
     
  20. Krizby

    Krizby Ancient Guru

    Messages:
    3,103
    Likes Received:
    1,782
    GPU:
    Asus RTX 4090 TUF
    Sure, but a 3080 would need to run at ~350W to match my 3090 at 250W; that's a 100W saving, plus 14GB more VRAM :D
    I also have a 1000W XOC BIOS on my 3090 for some overclocking fun, and got some crazy scores at 500W power consumption; there is no XOC BIOS for the 3080.
    All in all, I'd rather run my GPU at 90% of full tilt for efficiency and fan-noise reasons; if I had a 3080, I'd run it at 90% anyway.
     
    Last edited: Jul 24, 2021
    Noisiv likes this.
