Guru3D Content Review: GeForce RTX 3090 Founders Edition (FE)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 24, 2020.

  1. Fediuld

    Fediuld Master Guru

    Messages:
    612
    Likes Received:
    314
    GPU:
    AMD 5700XT AE
    No. Nvidia said that developers would have to use the mGPU support provided by the DX12 and Vulkan APIs.
    Nothing to do with SLI profiles.
     
  2. BReal85

    BReal85 Master Guru

    Messages:
    485
    Likes Received:
    178
    GPU:
    Sapph RX 570 4G ITX
    Shame on NV for marketing it as an 8K gaming GPU. From that perspective, it is a pathetically performing product; the difference is close to 10% compared to the 3080. This is a shame. A big shame. As for efficiency, so far both the 3080 and the 3090 sit in a position not much better than the 2080 and 2080 Ti. The only positive thing so far in the 3000 series is the performance upgrade of the 3080, but it's still short of the 1080's performance gain, not to speak of only a third of its efficiency gain. And from what we have seen of the 3070 so far (Galax slides showing it well below the 2080 Ti, while NV slides claimed it was faster than the 2080 Ti), it isn't supposed to be a bomb either. From NV, my last hope is the 3060.

    "Of course, the GeForce RTX 3090 is a product that does not need to make sense and cannot disappoint in that department, albeit"

    I don't agree with this, as it was marketed as an 8K GAMING CARD. And in that sense, it is a huge disappointment/lie.
     
    JAMVA likes this.
  3. geogan

    geogan Master Guru

    Messages:
    859
    Likes Received:
    183
    GPU:
    3070 AORUS Master
    So the 3090 has 2 SMs disabled still? Didn't know that. How many SMs in total then on the full die, and how many disabled from full on 3080?

    EDIT: found it.

    A6000 = 84 SMs
    3090 = 82 SMs
    3080 = 68 SMs

    I'm confused, though, how it fits 84 SMs into 6 GPCs - from what I can see it would take 7 GPCs to fit them??
    Which is also an odd number - maybe there are actually 8 GPCs on the full die and they even have an entire GPC disabled on the biggest A6000?
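    For anyone else counting along, here's the quick back-of-envelope version. It assumes 12 SMs per GPC, which is how I understand GA102 is laid out - treat that assumption (and the GPC count it implies) as my reading, not something from the review:

```python
# Back-of-envelope SM/GPC arithmetic. The 12-SMs-per-GPC figure is an
# assumption about GA102, not something taken from this review.
SMS_PER_GPC = 12
FULL_DIE_SMS = 84  # the A6000 figure quoted above

cards = {
    "A6000": 84,
    "RTX 3090": 82,
    "RTX 3080": 68,
}

gpcs_needed = -(-FULL_DIE_SMS // SMS_PER_GPC)  # ceiling division -> 7
print(f"Full die: {FULL_DIE_SMS} SMs needs at least {gpcs_needed} GPCs of {SMS_PER_GPC} SMs")
for name, sms in cards.items():
    print(f"{name}: {sms} SMs ({FULL_DIE_SMS - sms} disabled vs full die)")
```

    If the 12-per-GPC assumption holds, the full die works out to 7 GPCs and the A6000 is the full part, with the 3090 dropping two SMs rather than a whole GPC.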
     
    Last edited: Sep 24, 2020
  4. SplashDown

    SplashDown Master Guru

    Messages:
    822
    Likes Received:
    177
    GPU:
    EVGA 980ti Classy
    It sucks the 3070 is only going to have 8GB of VRAM... I also want to see how it runs Cyberpunk. I wonder how Cyberpunk will run on my 980 Ti @ 1080p - or how badly, I should say. Sorry for going a bit off topic.
     

  5. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,925
    Likes Received:
    715
    GPU:
    Asus TUF 3080 OC
    If you play at 1080p, don't worry about the 8GB limit; it will be more than fine. Just as a reminder: when you see all your VRAM taken in a game, it's not actually in use - it's merely allocated, because games often ask for everything available even if they don't need it. For 1080p you'll definitely be fine with 8GB of G6X RAM. I have no idea why they call it GDDR6X when it sounds like it's quad data rate, so just GQDR6 makes more sense. Or... wait, does the 3070 not use G6X? Either way it won't matter. 8GB of GDDR6 or G6X will be plenty.
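    On the naming: as far as I know, GDDR6X gets its higher per-pin rate from PAM4 signalling (two bits per symbol) rather than a quad-pumped clock, which is presumably why it isn't badged as QDR anything. Either way, the number that matters is just bus width times effective data rate. A minimal sketch - the per-pin rates are the commonly quoted figures, and the 3070 line is an assumption since that card isn't out yet:

```python
# Memory bandwidth = bus width (bits) x effective data rate (Gbps) / 8 bits-per-byte.
# Per-pin rates are the commonly quoted figures, taken here as assumptions.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

cards = {
    "RTX 3090 (GDDR6X)": (384, 19.5),
    "RTX 3080 (GDDR6X)": (320, 19.0),
    "RTX 3070 (GDDR6, assumed)": (256, 14.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.0f} GB/s")
```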
     
  6. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    7,340
    Likes Received:
    1,519
    GPU:
    NVIDIA GTX 1070 8GB
    From what I've read and seen so far, Nvidia is playing a dirty game here.

    - The card is marketed as an 8K gaming GPU.
    - It's clearly intended for people doing 3D rendering and such.
    - Nvidia has disabled some Titan features on purpose.

    It's clearly not designed for gamers, even though it's marketed at them. Meanwhile, features are disabled on purpose, putting people doing heavy workloads in a hard spot, because there might be an "unlocked" Ampere out there which would be the true Titan replacement.

    If you get this as a gamer, it's only because you can afford it and want the cream of the crop.
     
  7. alanm

    alanm Ancient Guru

    Messages:
    10,473
    Likes Received:
    2,581
    GPU:
    Asus 2080 Dual OC
    I can see enthusiasts buying this card just to play around with it. I'll bet you can shave off up to 100W by undervolting it, like the 3080.

    Can also see the occasional fool buying it and returning it because it can't do Crysis Remastered on their 8K TV :D.
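    A very rough sketch of where that ~100W could come from, assuming dynamic power scales roughly with voltage squared at similar clocks (the textbook CMOS approximation). The voltages are illustrative numbers of the sort people report for Ampere undervolts, not measurements from this review:

```python
# Rough undervolt estimate: dynamic power ~ C * f * V^2, so at similar clocks
# P_uv / P_stock ~ (V_uv / V_stock)^2. All figures below are illustrative.
stock_power_w = 350.0   # RTX 3090 FE board power spec, used as a stand-in
v_stock = 1.06          # hypothetical stock load voltage
v_uv = 0.85             # hypothetical undervolted load voltage

est_power_w = stock_power_w * (v_uv / v_stock) ** 2
print(f"Estimated board power: ~{est_power_w:.0f} W "
      f"(~{stock_power_w - est_power_w:.0f} W saved)")
# In practice the saving is smaller, since memory and static power
# don't scale with core voltage like this.
```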
     
  8. H83

    H83 Ancient Guru

    Messages:
    3,542
    Likes Received:
    921
    GPU:
    MSI Duke GTX1080Ti
    So only 10% more performance than a 3080; I was expecting more from the 3090...

    The cooler is very effective, but its size and weight are simply ridiculous! It's too extreme for a cooling solution.

    As for the silly price, I don't know why people here are so worried; this card is for people with lots of money, not for average guys who have to think twice before buying a GPU...

    And about the misleading marketing around 8K, are people really upset about that? Absurd and deceiving marketing happens every day in all industries!

    Anyway, great review Hilbert! Oh, and I won't say no to a 3090 Christmas giveaway! ;):D
     
  9. Meathead01

    Meathead01 Member

    Messages:
    16
    Likes Received:
    3
    GPU:
    Nvidia 2080
    I find it funny how people say that this is a good workstation card; it's not... For the $1,500 AUD difference over a 3080, you could build another machine and get double if not triple the performance and speed for rendering, encoding, etc. Hell, we still use 8-core builds at 4.8GHz over 16-core builds. Another issue is that the 3090 does not have the features the Quadro cards have, so we can't really use it as we do in a typical work environment.

    This was directed at gamers who love Nvidia and are willing to throw money at them. Good for you if you can justify a 10% improvement for double the price. But anyone with half a brain knows that it's not worth the price tag. The only reason they are sold out is mostly limited stock and scalpers who reckon they can get more money for the cards.
     
  10. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    7,340
    Likes Received:
    1,519
    GPU:
    NVIDIA GTX 1070 8GB
    If you actually watched some videos, you would find out that 10GB isn't enough in some cases, and the card actually crashes during rendering because it runs out of VRAM.
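    For anyone who would rather watch the numbers than take a video's word for it, here's a minimal sketch for checking VRAM headroom during a render. It assumes the pynvml (nvidia-ml-py) bindings are installed and that GPU 0 is the render card; nvidia-smi shows the same numbers without any code:

```python
# Minimal VRAM headroom check via NVML (pip install pynvml).
# Note: this reports allocated memory, which is what matters for out-of-memory crashes.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # GPU 0, assumed to be the render card
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```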
     

  11. Meathead01

    Meathead01 Member

    Messages:
    16
    Likes Received:
    3
    GPU:
    Nvidia 2080
    We must have had some magic graphics cards then, for them to never crash despite having less than 10GB... Why do you claim that it's all about the VRAM? Clearly more than just that determines crashes.
     
  12. Meathead01

    Meathead01 Member

    Messages:
    16
    Likes Received:
    3
    GPU:
    Nvidia 2080
    What, salty because a company wants you to spend double for an extra 10% increase?

    That's just greed. It's like the Gov saying you owe an extra 2 months on your rego, but they're going to double it and force you to pay for those extra 2 months...

    Hell, we have Quadro cards, and yet Nvidia still forces us onto a subscription to utilize what the cards provide.

    It seems to me you're all talk. I'll sell you a car for $35k because I put an exhaust on it and it gave it an extra 10hp; however, we sell them for $25k standard. I'm sure you will buy my $35k model instead, haha.
     
  13. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,081
    Likes Received:
    2,419
    GPU:
    HIS R9 290
    I read most of the comments here, and it seems most of you decided to be disappointed/upset with this GPU before you even checked out reviews.

    Yes, it only has a 10% lead in 1080p games. Who buys a GPU like this for 1080p gaming in today's market? The performance advantage increases with resolution.
    Sure, it trails behind certain predecessors in workstation tasks, but Nvidia never called it a workstation GPU. They didn't call it a Titan either.
    The 8K marketing is misleading, but DLSS has come a long way, and from what I've heard, it does allow you to play in 8K.

    Is this GPU worth $1500? No, absolutely not - this GPU is a ripoff. I don't think it needed to take up 3 slots either, though the cooler does seem to perform well. However, this isn't a joke, shameful, or a lie like many of you claim it is. Stop being drama queens.
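    For context on that 8K point: as I understand it, the "8K gaming" pitch leans on DLSS Ultra Performance mode, which upscales by a factor of three per axis, so the assumed internal render resolution for an 8K output is 1440p. Quick arithmetic on that assumption:

```python
# Pixel-count arithmetic for "8K via DLSS Ultra Performance" (assumed 3x per-axis upscale).
output_px = 7680 * 4320      # 8K pixels actually displayed
internal_px = 2560 * 1440    # assumed internal render resolution

print(f"8K output:       {output_px:,} pixels")
print(f"Internal render: {internal_px:,} pixels -> {output_px / internal_px:.0f}x fewer pixels shaded")
```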
     
    Valken and Aura89 like this.
  14. metagamer

    metagamer Ancient Guru

    Messages:
    2,224
    Likes Received:
    895
    GPU:
    Palit GameRock 2080
    It's marketing. The PS4 Pro is marketed as a 4K console.
     
  15. Agonist

    Agonist Ancient Guru

    Messages:
    3,317
    Likes Received:
    544
    GPU:
    6800XT 16GB
    So Nvidia finally made the last push needed to turn into Intel. Come on AMD, bring these greedy **** back down, like you did with Intel.
     

  16. Aura89

    Aura89 Ancient Guru

    Messages:
    8,183
    Likes Received:
    1,285
    GPU:
    -
    Wtf does this even mean, other than troll bait?
     
    barbacot likes this.
  17. Valken

    Valken Ancient Guru

    Messages:
    1,847
    Likes Received:
    225
    GPU:
    Forsa 1060 3GB Temp GPU
    Great Review Hilbert.

    Technology-wise, wow! Finally a (baseline) 4K RTRT GPU!

    Price: eeekkk! We live in a (supposedly) free-market capitalist economy. Get used to getting shafted until competition arrives, but even then, don't assume there are no "pricing discussions".

    Edit - request - can we see a CPU scaling review across the 30x0 series? I mean in terms of CPU per-core frequency and number of cores. It would be good to know if the real minimum is 8 or 12 cores, and at what frequencies, to get the best out of it.

    ---
    A comment on the reviews; this applies mainly to the Nvidia 30x0 series.

    Could you consider keeping the main manufacturer review as is, but for the other vendors only highlighting the differences? For anything that is the same, just "refer" to the original review or enclose those parts in optional "spoiler" tags.

    PCB breakdown OK, temp readings OK, test results OK - and if the PCB, power circuitry, or cooling mechanism are different, then highlight those.

    I want to see and read the DIFFERENCES or juicy bits as I compare the brands, unless they are all the same but some OC a bit differently.

    I skip most of the wall of text, but do read at least one full review, usually of the main vendor's technology, to set my expectations.
    ---

    Kinda glad there's only a handful of games that have RT now... I am not in a rush to buy until a must-play game (to ME) that has RT comes out.

    Maybe something like GTA 6 (no, not you, Watch Dogs), ARMA 4, the next Doom / Quake or Duke Nukem, or a survival horror with MP co-op and Steam Workshop support.

    These are the types of games that I play for years, so only then would it sell me on RT GPUs, and it gives AMD time to catch up. Hopefully the market will have prices within reach by then.

    Or who knows, maybe Cyberpunk will be a total badass and we'll have to dip into our retirement to play it with RT ON.
     
    Last edited: Sep 25, 2020
  18. barbacot

    barbacot Master Guru

    Messages:
    556
    Likes Received:
    478
    GPU:
    Asus 3080 Strix OC
    Nvidia made this card because, well, they CAN.
    It is just a statement of what can be done at the max with the Ampere architecture.
    It isn't about price/performance ratio, efficiency, etc.
    They never expected to make a profit from selling them - anyway, the big money is in the low/midrange segment, and I'm sure Nvidia will flood that market soon with 3050/3060/3070 Ti/non-Ti versions.
     
  19. Thunk_It

    Thunk_It Master Guru

    Messages:
    288
    Likes Received:
    51
    GPU:
    Asus 2080ti Turbo
    Hilbert, it is amazing how busy you must have been over the past couple of weeks! Though you have been as "busy as a cat with nine tails in a room full of rocking chairs", you managed to produce an excellent review of this 3090 graphics card. Many thanks!
     
    Hilbert Hagedoorn likes this.
  20. StarvinMarvinDK

    StarvinMarvinDK Maha Guru

    Messages:
    1,362
    Likes Received:
    82
    GPU:
    GB Gaming OC 3080
    This is only in applications - not games.

    If I'm wrong, please do throw a link :D

    I know you wrote "during rendering" ;)
     
