Steam: Nvidia sold 11 GeForce RTX 3000 for every Radeon RX 6000 that AMD sold

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 5, 2021.

  1. IceVip

    IceVip Master Guru

    Messages:
    811
    Likes Received:
    151
    GPU:
    RTX 3080 Ti
    Actually got me wondering, so I headed over to the craptastic UserBenchmark, and it showed the same results.

    [IMG]
     
    alanm and nizzen like this.
  2. nizzen

    nizzen Ancient Guru

    Messages:
    1,619
    Likes Received:
    517
    GPU:
    3x3090/3060ti/2080t
    The 2080 Ti was one of my best investments. I bought it day one and sold it 2 years later, then bought a 3080 Strix + waterblock for less than I got for the 2080 Ti. Thanks, miners. Free upgrade!
     
  3. nizzen

    nizzen Ancient Guru

    Messages:
    1,619
    Likes Received:
    517
    GPU:
    3x3090/3060ti/2080t
    Same story on 3dmark :)
     
  4. Denial

    Denial Ancient Guru

    Messages:
    13,565
    Likes Received:
    3,117
    GPU:
    EVGA RTX 3080
    Yeah, I mean, the Steam survey is probably accurate when it comes to metrics like this. People can deny it all they want and complain about polling, but it works. AMD is probably just favoring CPU sales over GPU ones because they make more money on each sale and can actually do damage to Intel's market share, whereas with Nvidia it's going to be much tougher.

    @Hilbert Hagedoorn Is it possible to run a report of Guru3D members' hardware profiles?
     

  5. Fediuld

    Fediuld Master Guru

    Messages:
    650
    Likes Received:
    346
    GPU:
    AMD 5700XT AE
    The AMD Radeon HD 8800 is a MOBILE GPU that barely saw the light of day, let alone outsold the Nvidia lineup 2 to 1 for 8 years.
    You couldn't find a laptop with a new HD 8800 in 2013, and you can't even today, yet in the survey it has the biggest share of GPUs. How and why is that possible?

    I will answer for you: because most mobile and discrete AMD GPUs end up there due to a bug in the survey.
    My 5700 XT AE is marked as an HD 8800 in the survey, as was my Vega 64. You will find many AMD users who bother to run the survey saying the same.

    So clearly there is an issue with the survey itself. Also, do not forget it runs on public internet PCs too; people in China and other poor countries will run the survey against their account for that expensive 3090 without owning one.

    And Hilbert can do some digging here, as he knows better: the HD 8800 was a very rare laptop GPU back in 2013, which doesn't justify such a huge number of GPUs in the survey.

     
  6. Fediuld

    Fediuld Master Guru

    Messages:
    650
    Likes Received:
    346
    GPU:
    AMD 5700XT AE
    Which is something I posted above. The HD 8800 series was a laptop GPU in 2013, and while I was searching for one until 2015, I couldn't find any laptop that used it.
     
  7. IceVip

    IceVip Master Guru

    Messages:
    811
    Likes Received:
    151
    GPU:
    RTX 3080 Ti
    UserBenchmark is drunk af tho :D, how does this make sense?

    If the 3090's 64,441 samples make up 1.1% of the market,

    then how the hell do

    AMD's 4 RX cards at 49,851 samples make up 1.4% of the market?
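
    The mismatch above can be sanity-checked with a few lines of arithmetic: if both percentages were computed against the same pool, the total pool size implied by each card's sample count would have to agree. A quick sketch (numbers taken from the post; this is not UserBenchmark's actual methodology, which they don't publish in detail):

    ```python
    # Sanity-check the quoted shares: samples / share = implied size of the
    # total pool. If both shares came from the same pool, these should match.
    rtx_3090_samples = 64_441   # reported as 1.1% of the market
    rx_6000_samples = 49_851    # four RX 6000 cards, reported as 1.4%

    implied_pool_nv = rtx_3090_samples / 0.011
    implied_pool_amd = rx_6000_samples / 0.014

    print(f"Implied pool (Nvidia figure):  {implied_pool_nv:,.0f}")
    print(f"Implied pool (AMD figure):     {implied_pool_amd:,.0f}")
    ```

    The two implied pool sizes differ by well over a million entries, which supports the suspicion that the share figures are not computed from raw sample counts alone (e.g., repeat benchmark runs may be deduplicated, as guessed below in the thread).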
     
  8. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    1,566
    Likes Received:
    1,576
    GPU:
    RTX 2070 Super
    If we go to the Amazon site and check the best-selling graphics cards, we can see that the first RX 6800 comes in 53rd place.
    In a way, that corroborates what we see on Steam.
     
    GoldenTiger and rdmetz like this.
  9. IceVip

    IceVip Master Guru

    Messages:
    811
    Likes Received:
    151
    GPU:
    RTX 3080 Ti
    I'll answer myself with a wild guess and say that someone repeatedly benchmarked his card and generated a lot of result samples, which UserBenchmark probably knows about and doesn't add to the market share count.

    // which leads me to believe 3090 users benchmark the hell out of their cards in UserBench. My lord, I never launched it even once on my 3080 Ti; it's as useless as it gets.
     
  10. TimmyP

    TimmyP Master Guru

    Messages:
    752
    Likes Received:
    69
    GPU:
    RTX 3070
    A lot of fighting to remain ignorant. 20 years it's been this way. 20 years... Still fighting for that ignorance.
     
    GoldenTiger and rdmetz like this.

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,202
    Likes Received:
    2,516
    GPU:
    HIS R9 290
    Yeah right, you clearly haven't read my post, so how about stop jumping to conclusions and inserting your personal preferences as truth?
    You're also not woke because you have first-hand experience with it. Anyone can see the difference by watching [non-lossy] recordings of it, where you can actually compare side-by-side. Doesn't take an artist to see the differences.
    Again, re-read what I wrote. SOME implementations yield no noteworthy benefits. RT is critical to improving world detail but when all a dev uses it for is shadows and flat reflective surfaces, whatever minimal improvement you yield from RT is not worth the performance deficit. If you disagree, seems to me you're falling for a placebo or sunk cost fallacy.
    Seriously, what is up with your reading comprehension? The whole point of my post was to say that as of today, DXR is too resource-demanding, but in a short while (definitely less than 10 years), it will be optimized to the point that it's no big deal. Same sort of happened with just about every other major graphics technology. So for those of us who don't feel like downgrading the visual experience with DLSS or losing half our framerate so puddles have a minimal detail improvement, spending less while sticking with rasterization makes sense. Don't get butthurt just because you spent too much money on a watercooled 3090.
    In some circumstances, I'm sure that's true. Definitely not all.
     
  12. Denial

    Denial Ancient Guru

    Messages:
    13,565
    Likes Received:
    3,117
    GPU:
    EVGA RTX 3080
    So most mobile and discrete AMD GPUs get marked as an HD 8800, and yet they only make up 0.70% of the market? lol

    There might be some bug, but that number is so low it's basically irrelevant to the statistics of other generations, and it certainly isn't most.
     
  13. Mineria

    Mineria Ancient Guru

    Messages:
    4,989
    Likes Received:
    414
    GPU:
    Asus RTX 3080 Ti
    25%, so those prices are still scalper prices.
    Just take a peek at the list and sort by price; notice how none of the cards at the price tag they are supposed to have are actually available?
    And it's basically the same with every hardware shop over here.
     
  14. Mineria

    Mineria Ancient Guru

    Messages:
    4,989
    Likes Received:
    414
    GPU:
    Asus RTX 3080 Ti
    I also find RT too resource-demanding for what it adds; looking at the newer cards, the RT performance increases aren't that great either, especially considering that a proper upgrade would cost me at least twice what I paid for my current card.
    Still waiting for prices to come down to a normal level, and then I'll go for something that performs at least at a 6900 XT or 3090 level; one or two generations of newer GPUs will probably be available by then.
     
  15. Krizby

    Krizby Master Guru

    Messages:
    966
    Likes Received:
    213
    GPU:
    3090 Watercooled
    All you are really looking at is buying a current high-end GPU to play old games at high FPS; rasterization will continue to punish high-end GPUs forever (look at how the 3090/6900 XT handle Cyberpunk 2077 at 4K Ultra without RT). So no, you will never get to experience Ultra raster + RT without some sort of DLSS/FSR, but that's a good thing. Let me explain:
    Let's say RT improves visuals by 10% while costing 50% of the FPS, while DLSS takes away 5% of the visuals while doubling the FPS; the net outcome with RT + DLSS is roughly 105% of the visuals at 100% of the original FPS, isn't it? And that puts it lightly, given how transformative RT reflections are to a game.
    Like how RT transforms the visuals in The Ascent.

    Those reflections of the chairs and stone statues are next to impossible to imitate with screen-space reflections.

    I have been buying high-end GPUs for 19 years already, and I can tell you that any noticeable improvement to visuals has always cost a considerable chunk of performance, like when I played Crysis at sub-30 FPS with an ATI HD 4890, or The Witcher 3 at sub-60 FPS with a Titan X Maxwell. But why do you care, as long as you are getting a playable framerate, like when RT cuts the framerate from 120 to 60 and you are okay with 60 FPS anyway? FYI, I put a 60 FPS limit on games that don't require more.
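
    The trade-off described in that post works out if the percentage changes are treated as multiplicative factors. A back-of-envelope sketch (the 10%/50%/5%/2x figures are the post's illustrative numbers, not measurements; the post rounds the ~104.5% result to 105%):

    ```python
    # Start from a baseline of 100% visual quality at 100% framerate.
    visual, fps = 1.00, 1.00

    # Enable RT: ~10% better visuals at ~half the framerate (post's figures).
    visual *= 1.10
    fps *= 0.50

    # Enable DLSS: ~5% visual cost while roughly doubling framerate.
    visual *= 0.95
    fps *= 2.00

    # Net: ~104.5% visuals at 100% of the original framerate.
    print(f"visual: {visual:.1%}, fps: {fps:.0%}")
    ```

    The key design point of the argument is that the FPS cost of RT and the FPS gain of DLSS roughly cancel (0.5 x 2 = 1), so only the small net visual delta remains.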
     
    GoldenTiger likes this.

  16. DannyD

    DannyD Ancient Guru

    Messages:
    3,374
    Likes Received:
    2,170
    GPU:
    1080ti FE
    Can't we just buy whatever takes our fancy and be happy? Do we need to hate and belittle others who make different choices?
    How anyone can get pumped up for a company whose only interest is to take your money is beyond me.
    You like a particular product line? Great! Enjoy it and be happy!
     
    Maddness likes this.
  17. Krizby

    Krizby Master Guru

    Messages:
    966
    Likes Received:
    213
    GPU:
    3090 Watercooled
    Are you from a capitalist country? Be thankful, because you don't want to be in a country where companies don't want to take your money :D (imagine how state-owned companies operate).
     
    DannyD likes this.
  18. Airbud

    Airbud Master Guru

    Messages:
    926
    Likes Received:
    1,526
    GPU:
    PNY GTX 1060 XLR8
    Great idea! I'd bet the 1060/1070 would still be the most popular out of, say, 200 members...

    Pascal is so tough and dependable...(Airbud knocks on wood)
     
    DannyD likes this.
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,202
    Likes Received:
    2,516
    GPU:
    HIS R9 290
    Well yeah, and water is wet. As hardware gets better, detail levels increase. We're nearing uncanny valley levels of detail, so the demand for more graphical prowess has gone up. But the point at hand is that RT processing currently is not all that optimized. Tessellation was the same way - it took a while until that was worth widely implementing, where now mid-range GPUs can handle it no problem. It doesn't make sense for companies to heavily invest in a technology that, at the beginning, is niche and has an uncertain future. It has to build up over time. Case in point: the RTX 3000 series yielded a substantial performance increase over the 2000 series, because Nvidia successfully (I say that loosely) proved the value of their investment. A lot went right with DXR for the 3000 series, which basically helped solidify its future. But it's still an immature technology, and needs more time to grow. For most people who don't have the money to burn on a water cooled 3090, DXR currently (there's that keyword again) isn't worth it in most (but not all) cases.
    You say that as though that's universally true. It's not. You also say that as though everyone has a 3090. They don't. The Ascent is a good example of what you say, but games like Cyberpunk (which you brought up earlier) are not worth the performance penalty, especially when you account for mainstream hardware. But as I keep saying over and over again: the technology will mature. That's not just in terms of optimization, but also in implementation. It will reach a point where you don't need a flagship GPU to enjoy it. And that leads me to my original point:
    Most people don't want to spend extra for an immature feature. For such people, it makes more sense to pay less for a GPU with inferior DXR performance, and replace it at a time when you don't have to make so many sacrifices. If you honestly don't understand that, your head is too high in the clouds. Assuming there weren't pricing issues, AMD would have the better rasterization performance per dollar; Nvidia has the better DXR performance per dollar. So, people who want to wait for DXR to mature would save money on AMD. In today's market, there is no obvious choice, but if you're going to overpay, you might as well go Nvidia.
     
    Last edited: Aug 5, 2021
  20. rdmetz

    rdmetz Active Member

    Messages:
    58
    Likes Received:
    22
    GPU:
    2x EVGA GTX 780 6GB SC
    I read every page of these posts, and I just have to say the amount of excuses for AMD performing exactly as one should have expected is unbelievable!

    The simple truth is they couldn't deliver on their promise, and people quickly figured that out.

    What sounded good on launch day turned into no stock, much higher AIB prices than expected, and a lack of must-have features to make you work around the previous two.

    By week 2 I truly expected that their cards would not come close to Nvidia this gen (like almost every gen in the recent past), and the numbers are bearing that out while the excuses roll in.

    I get that people have brand preferences, but to deny the truth is the definition of fanboyism.

    The flak so many AMD fans gave Nvidia for things like cost, RT performance, how little DLSS does, or going with Samsung instead of TSMC has all literally blown up in their faces.

    Nvidia's features (DLSS vs FSR, Nvidia RT vs AMD RT, etc.) are objectively better, their pricing is better (at MSRP and under scalping, comparing what was promised at launch vs what you'll actually have to pay), and their Samsung chips are much more available and able to ACTUALLY be delivered.


    All of it paid off, and it's why we are where we are.

    Be a fanboy all you want; deny the facts and the general market itself. It won't change anything, and the surveys, sales charts, and benchmarks will still speak the truth.
     
    GoldenTiger likes this.
