3090 Owner's thread

Discussion in 'Videocards - NVIDIA GeForce' started by XenthorX, Sep 24, 2020.

  1. warrior-kid

    warrior-kid Member

    Messages:
    40
    Likes Received:
    1
    GPU:
    3090 RTX+Titan RTX
    Hi all,

Ventus 3X OC card, latest driver, max performance setting, GPU scheduling, and an 8K Dell UP3218K monitor -- but that likely should not matter much?

Need help figuring out a performance bottleneck of sorts: Port Royal runs slow with defaults (other 3DMark tests as well).

    I am getting the worst of the worst rating for my GPU+CPU pair (3090 RTX+7900X) of 12565 and that is with +115 core and +255 memory. This compares with around 13600 in the Guru3D review on all defaults.

    Any ideas?

Oh, and btw, starting the latest GPU-Z results in a blue screen of death.

Games are generally pretty solid; with some tweaks, butter-smooth 8K gaming.

    Thanks
     
    Last edited: Oct 31, 2020
  2. JaxMacFL

    JaxMacFL Ancient Guru

    Messages:
    1,708
    Likes Received:
    1,061
    GPU:
    EVGA 3090 FTW3 ULTR
This might help:

[video]
 
  3. warrior-kid

    warrior-kid Member

    Messages:
    40
    Likes Received:
    1
    GPU:
    3090 RTX+Titan RTX
Yeah, thanks, I watched that, but it does not seem to apply. I kept checking other stuff and found nothing of note running. But hey, the guy in the video only got to 13K+ via overclocking, so something may be going on with these cards specifically.

Does anyone have their non-overclocked Port Royal results to hand? Is it 12.5K or 13.5K?
     
  4. warrior-kid

    warrior-kid Member

    Messages:
    40
    Likes Received:
    1
    GPU:
    3090 RTX+Titan RTX
Right, this confirms what I suspected: all those high scores have a pretty high "Average clock frequency", 1950 MHz or more. Even overclocking my card, this is my best result so far, and it has an average clock frequency of only 1835 MHz: https://www.3dmark.com/3dm/52379954

Having said that, I still do not understand how the original Guru3D review got 13600 with a supposedly standard core frequency. Does anyone know?
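As a back-of-the-envelope check (assuming, naively, that the score scales linearly with average clock -- it doesn't exactly, since memory and CPU also contribute), the clock gap alone accounts for most of the difference:

```python
# Naive sanity check: scale the Guru3D review score by the ratio of
# average clocks. Linear clock scaling is an assumption for illustration;
# real Port Royal scores also depend on memory, CPU, and thermals.
review_score = 13600   # Guru3D review result at defaults
review_clock = 1950.0  # approximate average clock (MHz) seen in high scores
my_clock = 1835.0      # average clock reported for the 12565 run

predicted = review_score * my_clock / review_clock
print(round(predicted))  # ~12798, in the ballpark of the observed 12565
```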
     

  5. Danny_G13

    Danny_G13 Master Guru

    Messages:
    588
    Likes Received:
    60
    GPU:
    EVGA FTW 2080Ti OC
    A little 'review' of my KFA2 3090, and this probably applies to the majority of 3090s in truth:

This generation just isn't a good leap from Turing. The majority of titles see roughly a 20-30% increase, and that's coming from a 2080Ti. There is a tangible hike, yes, but not one that justifies the current £1700 price, and there are far too many technical issues at this stage to make it particularly enjoyable.

RDR2 doesn't run at all for me now, the fans are unbearable because they're no longer purely temperature based, and the fact that a £1700 video card can't get close to 60fps at 4K in ray-traced games without the aid of DLSS (and even with it in many games) is just ludicrous.

    This is not to say it's trash, it isn't - if you don't mind the waste of 50% of the outlay on it and can accept the improvement isn't going to be mindblowing, then it's acceptable. In places it's even impressive.

    But it's probably the worst video card generation I've seen for a long time, and doesn't hit the heights the 20xx series did over 10xx. Not even close.

But, that's just my personal call, and I don't claim to have a monopoly on the truth.
     
  6. JaxMacFL

    JaxMacFL Ancient Guru

    Messages:
    1,708
    Likes Received:
    1,061
    GPU:
    EVGA 3090 FTW3 ULTR
Just got an email from Amazon: the delivery date on my ROG Strix 3090 OC has moved up to 19-21 November. :>)
     
  7. JaxMacFL

    JaxMacFL Ancient Guru

    Messages:
    1,708
    Likes Received:
    1,061
    GPU:
    EVGA 3090 FTW3 ULTR
Agreed about the upgrade from Pascal to Ampere. From what I can gather, my Jedi Titan Xp FPS will almost double with my ROG 3090 OC @ 1440p. Now looking at upgrading my five-year-old 3440x1440 Acer Predator 34" monitor to an LG UltraGear 38GL950G-B 38" 3840x1600 to ease the bottleneck on my 10900KF. These cards have really raised the bar for PC systems; it takes more than just the card to fully utilize them. Plus they are PCIe 4.0 to boot -- another whole-system upgrade for Intel users when it becomes available.
     
    Tyrchlis likes this.
  8. Danny_G13

    Danny_G13 Master Guru

    Messages:
    588
    Likes Received:
    60
    GPU:
    EVGA FTW 2080Ti OC
    Calm down. I didn't claim a monopoly on the truth, just my subjective experience. Stop being so angry at someone else's opinion.
     
    Tyrchlis and TheSissyOfFremont like this.
  9. Danny_G13

    Danny_G13 Master Guru

    Messages:
    588
    Likes Received:
    60
    GPU:
    EVGA FTW 2080Ti OC
    My boost from Pascal to Turing was colossal. I must just be wrong there too.
     
  10. darrensimmons

    darrensimmons Ancient Guru

    Messages:
    1,614
    Likes Received:
    57
    GPU:
    GIGABYTE RTX3090 OC
    Hmm, I am getting around 40% gains over my 2080ti, in some games even more.
     
    Tyrchlis likes this.

  11. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,014
    Likes Received:
    1,026
    GPU:
    Rtx 3090 Strix OC
    Yes.

Going from a 1080 Ti to a 2080 Ti is a 30% performance increase, which is meager at best.

    That is unless you upgraded from a 1050 to a 2080 ti...
     
  12. Danny_G13

    Danny_G13 Master Guru

    Messages:
    588
    Likes Received:
    60
    GPU:
    EVGA FTW 2080Ti OC
    Apology accepted sir, sometimes tone and context get fuddled online - but remember capitals do mean shouting, not emphasis, which is why your repeated use of them made me feel yelled at!

I can put tangible numbers to my point. Assassin's Creed Syndicate went from 55fps to 59/60fps. WDL went from 20fps to 30fps (DLSS off, RTX max). RDR2 (the one time I could run it) went from 45fps to 55fps. And Control went from 25fps (RTX on, DLSS off) to 35fps. All at 4K.

    In your case, you've enjoyed a 40% hike, and more power to you, you've had the boost you want, and that's excellent. For me, the experience wasn't anywhere near as productive.

PS: as an aside, RDR2 is broken -- the Steam forums are littered with posts complaining about it not working, particularly on Ampere. It also didn't work for me half the time on Turing, locking up Windows on my 2080Ti.
     
  13. Danny_G13

    Danny_G13 Master Guru

    Messages:
    588
    Likes Received:
    60
    GPU:
    EVGA FTW 2080Ti OC
    Mate, 2080Ti matched 1080Ti SLI. Believe me, I tested. If that's piddling to you, fair enough.
     
  14. Danny_G13

    Danny_G13 Master Guru

    Messages:
    588
    Likes Received:
    60
    GPU:
    EVGA FTW 2080Ti OC
    I did exactly that - went from Voodoo Rush to V2 SLI. That was a lot of fun. Ah the good old days. First thing I ran on V2 SLI was Half Life and I was blown away by the upgrade.

    But in terms of other generations, it varies. You're right in some cases, but I definitely got a major boost from 1080Ti to 2080Ti.

To strengthen your case on the 3090 further, though: I just reinstalled, and where Crysis Remastered was 32fps on the 2080Ti, the 3090 has it at 52. I mean, still not the 60fps it should be, but much better. So that's roughly a 63% leap.

Watch Dogs Legion at 4K (RTX on, DLSS off) is 22fps at best on the 2080Ti and goes up to 32fps on the 3090. That's roughly a 45% leap. The game should be far better optimised. Watch Dogs 2 goes from 30fps (temporal filtering off) to about 55-60fps.

Is it value for the money? No, it's really not, and the noise from the (at this point) uncontrollable wattage-based fans on my particular card is a HUGE issue at present. It growls when I do absolutely anything.

    But I can't deny the horsepower compared to 2080Ti in certain applications. In quite a lot actually.

    It's just disappointing that a £1700 card still can't hit the 60fps it promised in everything.
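For reference, the leaps quoted above work out as follows (simple relative-increase arithmetic, nothing card-specific):

```python
def gain(old_fps, new_fps):
    """Relative FPS increase, as a percentage."""
    return (new_fps - old_fps) / old_fps * 100

# FPS figures quoted above (2080Ti -> 3090, both at 4K):
print(f"Crysis Remastered: {gain(32, 52):.1f}%")   # 62.5%
print(f"Watch Dogs Legion: {gain(22, 32):.1f}%")   # 45.5%
```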
     
    Tyrchlis likes this.
  15. ThirtyIR

    ThirtyIR Member

    Messages:
    41
    Likes Received:
    7
    GPU:
    NVIDIA RTX 3090 SLI
    finally set them up on Friday! :)

[image]
     
    Sycuss_MoO, Maddness, chispy and 2 others like this.

  16. Danny_G13

    Danny_G13 Master Guru

    Messages:
    588
    Likes Received:
    60
    GPU:
    EVGA FTW 2080Ti OC
I take your point about going to 1440P, but that's a hell of a trade-off. The difference going down from 4K to 1440P, when you're close to the screen, is absolutely huge.

    It's a question of what do you want more - treacle thick 4K or ray tracing. For me it's 4K. Although the great irony is you end up using DLSS which is basically 1440P anyway :D
     
    Tyrchlis likes this.
  17. Danny_G13

    Danny_G13 Master Guru

    Messages:
    588
    Likes Received:
    60
    GPU:
    EVGA FTW 2080Ti OC
    As yummy as that is, is there really a market still for SLI? Very little supports it?
     
  18. JaxMacFL

    JaxMacFL Ancient Guru

    Messages:
    1,708
    Likes Received:
    1,061
    GPU:
    EVGA 3090 FTW3 ULTR
Got me an LG 34GN850-B coming tomorrow. It's 144Hz native with the capability to be overclocked to 160Hz, plus 1ms Nano IPS, Nvidia G-SYNC Compatible, and AMD FreeSync Premium.
     
    Tyrchlis likes this.
  19. Martigen

    Martigen Master Guru

    Messages:
    444
    Likes Received:
    188
    GPU:
    GTX 1080Ti SLI
Official support ends in Jan 2021 with respect to Nvidia making SLI profiles -- developers will need to implement it natively via DX12/Vulkan, which some may still do (e.g. Shadow of the Tomb Raider -- some devs are happy to invest the time).

However, all existing SLI games will continue to work, as will those currently in development or coming out over the next few years if they use DX9/DX11 -- either through community-made SLI profiles or through existing ones that work with common middleware engines like Unity or Unreal (for which current profiles often work out of the box, or with minimal changes). So, in practice, not much will change for a while: many current and soon-to-be-released games will be playable with SLI. A few years down the track, as DX12/Vulkan become the norm, it will depend on developers implementing it.

And this wouldn't really be that big of a deal if Nvidia hadn't restricted NVLink to the 3090... that change is the final nail in the coffin for SLI, putting it out of reach of the majority of the market. If NVLink were available on all 30-series cards, it would have been plausible for larger AAA developers with the time and money to add it as a feature: it would, after all, help sell copies. But with that market restricted to 3090 owners, and the exorbitant cost of these cards... well. *sadface*
     
  20. OnnA

    OnnA Ancient Guru

    Messages:
    12,098
    Likes Received:
    2,902
    GPU:
    Vega XTX LiQuiD
     
    Maddness and chispy like this.

Share This Page