55% of gamers own an Nvidia GPU and 27% of them have an RTX model, says JPR

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 19, 2020.

  1. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    no it isn't.
     
    carnivore and airbud7 like this.
  2. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    Wait, I never implied it would amount to anything. Do you think an optional survey run twice a year or so is really representative of the market? It's probably the best we have, but let's not pretend its margin of error is insignificant; it could easily be in the neighbourhood of 5%. 7% for Intel iGPUs looks very, very low to me. They are everywhere in portables and people game on them - maybe just Minecraft, Fortnite or LoL, but it's still market share.
     
  3. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,059
    Likes Received:
    3,439
    GPU:
    MSI 4090 Suprim X
    Reading the article, it's hard not to notice the emphasis on the number of survey participants. But 4,500 people isn't a small number at all, @Hilbert Hagedoorn.
    In European countries, most political surveys require a minimum of 1,000 participants to be considered representative of a country; it all depends on the ability to define a representative subset of people.

    Going from 4,447 people down to a representative subset of 1,000 in the worst case, they still had quite a margin to make it happen.
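    For a rough sense of scale, here is a back-of-envelope check of the sampling margin of error at those two sample sizes - a minimal sketch assuming a simple random sample and ~95% confidence, which an opt-in survey only approximates:

        import math

        def margin_of_error(n, p=0.5, z=1.96):
            # Worst-case (p = 0.5) sampling error for a simple random
            # sample of size n at ~95% confidence (z = 1.96).
            return z * math.sqrt(p * (1 - p) / n)

        for n in (1000, 4447):
            print(f"n = {n}: +/- {margin_of_error(n) * 100:.1f} percentage points")

        # n = 1000: +/- 3.1 percentage points
        # n = 4447: +/- 1.5 percentage points

    So even in the worst case, 4,447 respondents gives roughly a +/-1.5-point sampling error; the real question is whether those respondents are representative, not how many there are.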
     
  4. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    Noob, you jumped the gun with your delusions. Nobody is forcing anything on me, yet Novidia and M$ are desperate to show that their technology is used by the majority.
    If you had taken a moment to think before banging your keyboard with your low-IQ head, you wouldn't have made such a cringe comment.

    Thank you.
     

  5. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    The margin of error isn't that big. They resample at least once a month, and there are rarely large swings in the data unless a new region gets brought in. Steam does a good job of telling devs what hardware Steam users have.

    Again, it's hard to say what JPR considers a gamer. In the past they would make announcements like "there are 150m PC gamers" and include people who played Facebook and Flash games. I don't think anyone on this site would consider those people PC gamers, yet they might be included in the sample.
     
  6. Dribble

    Dribble Master Guru

    Messages:
    369
    Likes Received:
    140
    GPU:
    Geforce 1070
    Statistically, Steam should be accurate; the numbers are big enough that the margin of error will be small - that's just how statistics works. So anyone arguing that the survey is inaccurate because not everyone is surveyed consistently needs to go read up on statistics.

    That said, it still requires the survey to sample real people at random - all the "I only get surveyed when I have a card from GPU maker X in my machine" talk is BS, but they did get confused by Chinese internet cafes for a while, which clearly skewed the results.
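    To put numbers on "big enough": the worst-case sampling error shrinks with the square root of the sample size. A minimal sketch, under the same assumption of a genuinely random sample as above (which is exactly what the Chinese internet cafe episode violated):

        import math

        for n in (1_000, 10_000, 100_000, 1_000_000):
            moe = 1.96 * math.sqrt(0.25 / n)  # worst-case ~95% margin of error
            print(f"n = {n:,}: +/- {moe * 100:.2f} pp")

        # n = 1,000: +/- 3.10 pp
        # n = 10,000: +/- 0.98 pp
        # n = 100,000: +/- 0.31 pp
        # n = 1,000,000: +/- 0.10 pp

    At Steam's scale the sampling error is negligible; any real inaccuracy comes from who gets sampled, not from how many.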
     
  7. Twiddles

    Twiddles Maha Guru

    Messages:
    1,155
    Likes Received:
    11
    GPU:
    MSI 2080 2190-7550
    Okay, back to casually insulting people; nothing has changed in the two years I've been gone from here. I guess I forgot to stay out of the AMD vs. Nvidia and AMD vs. Intel threads.
     
  8. ruthan

    ruthan Master Guru

    Messages:
    573
    Likes Received:
    106
    GPU:
    G1070 MSI Gaming
    Why not AMD? Well, my first ATI card (before AMD bought them and folded them into their own brand) was an ATI Mach64 with 1 MB. Those were quite good 2D cards with great Windows/DOS compatibility, but after that ATI failed to jump quickly enough onto the 3D-card evolution, so I kept ATI cards just for 2D alongside Voodoo cards. When combined 2D/3D cards appeared, ATI products again came late and had more problems than the Riva 128/TNT and the Voodoo 3.
    Change came with the Radeon 7000 series, when ATI was the budget king: if you wanted performance for your buck they were the best, but they still had worse drivers (especially with OpenGL on Windows 98) than 3dfx and Nvidia cards; Matrox G200/G400 cards had similar problems. After 3dfx started failing with the Voodoo 4/5, Nvidia became the best option for everyone.
    I had them all - Radeon 7000, Radeon 9xxx, Radeon X600, X800, X1300, X1600, X1950 - and it was still the same story: great performance for the money, but more problems than GeForce. The Radeon HD 2000-3000 series even lost the performance edge; the Radeon 4000, 5000 and 7000 series (the best ones) were better, but had the same problems.
    Then came the Radeon R5/R7 2xx series, and from that period on AMD cards fell behind on process node, so they had higher TDPs and were noisier, and the cooling - because of the more budget-oriented positioning and weaker enforcement of standards on board partners - was often worse, with exceptions from brands like Sapphire. This AMD crisis lasted a very long time, right up to very recent products (Radeon RX 5xxx). I had lots of AMD cards in my hands during those years, but whenever I put one in my rig, I always felt it was noisier and had more compatibility problems.
    Linux support (my secondary OS) was also much worse than Nvidia's proprietary Linux drivers - again, more hassle - until the last year or two, when the open Mesa drivers finally became good.

    The recent AMD APUs also looked great; I bought one for my mother's/nephew's holiday gaming rig, but again I fell into the AMD driver hole. I had to upgrade from Win7 to Win10 solely because Win7 wasn't supported, and I still had more game compatibility problems (not as deep as in the past, but still some additional hassle). The whole AM4 platform wasn't mature enough either, so my DDR4, which runs at 3400 MHz on an Intel system, runs at 2666 on it.
    But OK, it was nice for that budget... until I tested Linux: lots of problems, freezing. After months they fixed 2400G support, but they didn't give a damn about the 2200G, so you can still find threads saying it is unstable with major new distros.

    So if you only play the newest games on the newest OS and need performance for the buck, you can be happy with an AMD card; otherwise Nvidia has been the better option for the last 15 years. It's similar to consoles: MS simply has a far better software team, so they can deliver things like backward compatibility instead of pathetic streaming. AMD has to be cheaper because of worse software, but that doesn't mean they are better. With the exception of, say, Mantle as the basis for Vulkan and DX12, Nvidia has always been ahead in innovation too. Yes, lots of things failed, but some didn't. PhysX, for example, is now a de facto standard: even when it's not the Nvidia GPU-accelerated PhysX but the CPU version, it still won the war against Intel's Havok, and it's used even on Apple mobile devices by most of the major game engines. Nvidia's Vulkan/DX implementation is now as good as or better than AMD's, despite Mantle. They are ahead in OpenCL, GPU scientific computing (CUDA), simulations, AI, datacenters, cars... even without the x86 license, which AMD has. The Nvidia Shield is still by far the best ARM Android TV/gaming device, and they rule gaming laptops.

    Yes, they tried ray tracing in the mainstream, probably too early, but they tried; AMD, Crytek, Intel, MS etc. are still only talking about it and about how it needs to be standardized. Because there is no real competitor, they dictate insane prices, as Intel did for years. Because they are protective of their know-how and have the best proprietary Linux drivers, they are not willing to release signed firmware to enable good open-source drivers. And because they wanted to show the difference between CPU and GPU PhysX, the CPU code wasn't good and was often only single-threaded. But again, they do it because they have no real competitor (Bullet physics engine - ha, ha...).

    AMD is now successful in CPUs because they made far better products than Intel; in GPUs that is not true at all, and GPU drivers are far more complex. Old Steam games still run on modern Nvidia cards because it's the dominant platform; even if AMD creates something much faster for new games, I doubt they will seriously aim to fix problems with old games. So unless an AMD card is, say, 50% faster for the same price, I'm not interested for my rigs. But for someone less hardcore - friends who just want a new gaming rig and want to spend less money - AMD is a good choice if their cards are 10%+ faster in modern games for the same money. That is how the world works.
     
    bernek likes this.
  9. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    Steam numbers are not accurate; you have to opt in to the survey, so they only cover the people who opt in - unless the opt-in is a lie and they see your system info either way, which would surprise me.

    Either way, I look forward (sarcastically) to seeing the prices of these new cards, and to seeing how outclassed my 1070 Ti - roughly $500 total when I stepped up to it - is compared to sub-$200 cards, though I don't think Nvidia will go that low just yet. $300 cards, sure.
     
  10. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,222
    Likes Received:
    1,540
    GPU:
    NVIDIA RTX 4080 FE
    I haven't bought an RTX graphics card yet for the same reason I held off upgrading my TV to a 4K one until late last year: simply that the technology was too expensive and the content that made use of said technology was limited in availability and quality. I finally bought a 4K OLED TV last August having accumulated a large number of 4K UHD BD discs (which came with standard BDs as well, although I could still watch 4K discs on my Xbox One X on my then-1080p Sony TV anyway) and finally getting a 420 Mbps fibre connection so I could also stream 4K content. The time was right and I am happy having made the upgrade.

    Looking back at the last 16 months since the release of the ridiculously pricey RTX 2080 Ti and seeing how little content there was makes me very glad I resisted the urge to upgrade - not that it was difficult at that price! I am happy to wait another year if need be before upgrading to an RTX card, as I personally do not think ray tracing will become mainstream until the PS5 and Xbox Series X have been released, since both will support it. By then AMD may have something competitive and I may even buy one of their cards rather than NVIDIA's, particularly as both new consoles will use AMD hardware.
     

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Well, JPR basically claims that nVidia managed to produce enough RTX cards for 27% of their entire user base to upgrade.
    On top of that, they claim that this (huge) number of users upgraded to $350+ cards... basically upper-midrange, high-end and enthusiast-class GPUs.
    Then we would have to add a proportionally larger group of people who actually upgrade in the $150~250 range, and they would end up with something like 60% of nVidia users having upgraded their card since Turing (around 15 months).

    How did nVidia produce that many GPUs? And where is the revenue jump?

    So, you are right, their data are bad. They likely have a disproportionately large group of people who update their systems quite often.
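    A back-of-envelope version of that extrapolation, with the tier ratio as a purely illustrative assumption (not a JPR figure):

        # If 27% of the installed base moved to $350+ RTX cards, and the
        # $150-250 tier typically moves somewhat more units than the upper
        # tiers (the 1.2x ratio below is hypothetical), the implied share
        # of the user base upgrading since Turing would be:
        rtx_share = 0.27                # JPR: share of Nvidia users on RTX
        budget_to_premium_ratio = 1.2   # assumed for illustration only
        implied_upgraders = rtx_share * (1 + budget_to_premium_ratio)
        print(f"{implied_upgraders:.0%} of the user base in ~15 months")
        # -> 59%, the implausibly high figure questioned above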
     
    carnivore and airbud7 like this.
  12. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Well they polled WCCF readers, so that's almost certainly the case.
     
  13. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Those numbers are not exclusive to add-in boards; the graph hasn't changed much since Q1 2019.

     
  14. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    Yes, units sold is what matters, but that's not what JPR uses. JPR doesn't get their numbers directly from nVidia or AMD; they're all estimates based on sales data from "some" retailers.

    Btw... these numbers are not for discrete GPU cards. JPR has nVidia at ~75% or higher, the last I looked.
     
  15. RTX / hardware-accelerated ray tracing is not comparable to PhysX. Writing that shows how uninformed you are.
     

  16. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    13,950
    Likes Received:
    7,771
    GPU:
    ASUS 3060 OC 12GB
    In a way it can be considered an extra that doesn't really affect gameplay.... unless it's something like a gameplay advantage in a multiplayer game where you can see a raytraced reflection that otherwise would not have been possible with a more traditional type of reflection.
     
  17. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,759
    Likes Received:
    9,647
    GPU:
    4090@H2O
    Arguably, with enough processing power, it could be a gameplay advantage to look around corners via reflective surfaces, or to have different lighting (catchy words like "visibility" in BF5 etc.).
    But as long as minimum detail ALWAYS gives better visibility (in some games the lowest setting means no fog at all - in multiplayer titles, wtf?), it's still a gimmick, just like PhysX doesn't do much in 99% of games/cases.

    But what I don't really get is how PhysX suddenly entered this discussion. Did I miss something?
     
  18. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    Because a lot of people view ray tracing the same way they view PhysX: a gimmick that slightly improves the visuals at a massive performance penalty.
     
  19. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    Ambient occlusion had a huge framerate impact when nVidia first introduced it, and everyone went on about it just like they do now about RTX. Yet it (AO in all its shapes and forms) got adopted, its performance hit became acceptable - even negligible on some systems - and RTX will too, maybe under a different name and not tensor-core specific, just like PhysX was eventually buried by plain software solutions (mostly because CPUs got far more powerful than they were back in 2001-2003).
     
  20. Sturmx

    Sturmx New Member

    Messages:
    6
    Likes Received:
    0
    GPU:
    Geforce RTX 2070
    RTX isn't going anywhere. When the PS5 and whatever the new Xbox is called come out, a huge number of games will start shipping with it.
     
