55% of gamers own an Nvidia GPU and 27% of them have an RTX model, says JPR

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 19, 2020.

  1. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,329
    Likes Received:
    178
    GPU:
    MSI GTX1070 GamingX
    It's accurate enough to give a general view of market share. There'd have to be an unprecedented increase in AMD users for the results to be viewed as inaccurate.

    Also, since the results are presented as percentages, how much do you honestly think you would've contributed to the stats? Let's get real here.
     
  2. Astyanax

    Astyanax Ancient Guru

    Messages:
    6,650
    Likes Received:
    2,088
    GPU:
    GTX 1080ti
    No, it isn't.
     
    carnivore and airbud7 like this.
  3. MonstroMart

    MonstroMart Master Guru

    Messages:
    680
    Likes Received:
    230
    GPU:
    GB 5700 XT GOC 8G
    Wait, I never implied I would amount to anything. Do you think an optional survey run twice a year or so is really representative of the market? It's probably the best we have, but let's not pretend its margin of error is insignificant. It could easily be in the neighbourhood of 5%. 7% for Intel iGPUs looks very low to me. They are everywhere on laptops, and people game on them. Maybe just Minecraft, Fortnite or LoL, but it's still market share.
     
  4. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,041
    Likes Received:
    935
    GPU:
    EVGA XCUltra 2080Ti
    Reading the article, it's hard not to notice the emphasis on the number of survey participants. But 4,500 people isn't a small number at all, @Hilbert Hagedoorn.
    In European countries, most political surveys require a minimum of 1,000 participants to be considered representative of a country; it all comes down to the ability to define a representative subset of people.

    Going from 4,447 people down to a representative subset of 1,000 in the worst case, they still had quite a margin to make it happen.
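
    For context, here's a minimal sketch (mine, not from the article) of the textbook margin-of-error formula for a simple random sample, using the two sample sizes mentioned above and assuming 95% confidence and the worst-case proportion p = 0.5:

    ```python
    import math

    def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
        """95% margin of error for a proportion p estimated from n random respondents."""
        return z * math.sqrt(p * (1 - p) / n)

    # Assumes simple random sampling, which a self-selected
    # online survey does not actually guarantee.
    for n in (1000, 4447):
        print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} percentage points")
    # n=1000: +/- 3.1 percentage points
    # n=4447: +/- 1.5 percentage points
    ```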
     

  5. sverek

    sverek Ancient Guru

    Messages:
    6,006
    Likes Received:
    2,838
    GPU:
    NOVIDIA -0.5GB
    Noob, you jumped the gun with your delusions. Nobody is forcing anything on me, yet Novidia and M$ are desperate to show that their technology is being used by the majority.
    If you took a moment to think before banging your keyboard with your low-IQ head, you wouldn't have made such a cringe comment.

    Thank you.
     
  6. Denial

    Denial Ancient Guru

    Messages:
    12,788
    Likes Received:
    2,053
    GPU:
    EVGA 1080Ti
    The margin of error isn't that big. They resample at least once a month, and there are rarely large swings in the data unless a new region gets brought in. Steam does a good job of telling devs what hardware Steam users have.

    Again, it's hard to say what JPR is considering a gamer. In the past they would make announcements like "there are 150m PC gamers" and include people who played Facebook and Flash games. I don't think anyone on this site would consider those people PC gamers, yet they might be included in the sample.
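
    Purely as an illustration (hypothetical numbers, not Steam or JPR data), the "no large swings" sanity check could look like this:

    ```python
    # Hypothetical monthly vendor-share figures, made up for illustration.
    shares = [0.742, 0.747, 0.751, 0.712]  # last month: a new region joins the sample

    NOISE = 0.02  # swings above ~2 points suggest a sample change, not random noise
    for prev, cur in zip(shares, shares[1:]):
        swing = cur - prev
        flag = "ok" if abs(swing) <= NOISE else "sample composition likely changed"
        print(f"{prev:.3f} -> {cur:.3f}  ({swing:+.3f})  {flag}")
    ```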
     
  7. Dribble

    Dribble Member Guru

    Messages:
    137
    Likes Received:
    57
    GPU:
    Geforce 1070
    Statistically, Steam should be accurate; the numbers are big enough that the margin of error will be small - that's just how statistics works. Anyone arguing that the survey is inaccurate because not everyone gets surveyed consistently needs to go read up on statistics.

    That said, it still requires the survey to sample real people randomly - all this "I only get surveyed when I have a card from GPU maker X in my machine" is BS, but the survey did get confused by Chinese internet cafés for a while, which clearly skewed the results.
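
    A toy sketch (made-up numbers, my own illustration) of why the café skew matters: non-random sampling biases the estimate no matter how large the sample gets.

    ```python
    import random

    random.seed(0)

    # Toy population: the "true" share of vendor A is 60%.
    population = ["A"] * 60_000 + ["B"] * 40_000

    def survey(pop, n, cafe_dup=0):
        """Sample n machines; cafe_dup extra draws come only from A-equipped
        café machines, mimicking the internet-café overcounting mentioned above."""
        sample = random.sample(pop, n) + ["A"] * cafe_dup
        return sample.count("A") / len(sample)

    print(f"random sample of 5000:  {survey(population, 5000):.3f}")        # ~0.60
    print(f"same + 1000 café dupes: {survey(population, 5000, 1000):.3f}")  # biased high
    ```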
     
  8. Twiddles

    Twiddles Maha Guru

    Messages:
    1,157
    Likes Received:
    10
    GPU:
    MSI 2080 2190-7550
    Okay, back to casually insulting people; nothing has changed in the two years I've been gone from here. I forgot to stay out of AMD vs. Nvidia and AMD vs. Intel threads, I guess.
     
  9. ruthan

    ruthan Master Guru

    Messages:
    365
    Likes Received:
    48
    GPU:
    G970/3.5G MSI
    Why not AMD? Well, my first ATI card (before AMD bought them and rebranded them) was an ATI Mach64 1 MB. Those were quite good 2D cards with great Windows/DOS compatibility, but after that ATI failed to jump fast enough into the 3D era, so I kept ATI cards just for 2D alongside Voodoo cards. When combined 2D/3D cards appeared, ATI products again came late and had more problems than the Riva 128/TNT and Voodoo 3.
    The change came with the Radeon 7000 series, when ATI was the budget king; if you wanted performance for your money they were the best, but they still had worse drivers (especially OpenGL under Windows 98) than 3dfx and Nvidia cards. The Matrox G200/G400 cards had similar problems. After 3dfx started failing with the Voodoo 4/5, Nvidia became the best option for everyone.
    I had them all: Radeon 7000, Radeon 9xxx, Radeon X600, X800, X1300, X1600, X1950. It was still the same story: great performance for the money, but more problems than GeForce. The Radeon HD 2000-3000 series even lost the performance edge; the Radeon 4000, 5000 and 7000 series (the best ones) were better, but had the same problems.
    Then came the Radeon R5/R7/R9 2xx series, and from that period on AMD cards fell behind on process technology, so they had higher TDPs and were noisier, and the cooling was often worse because of the budget focus and AMD enforcing fewer standards on board manufacturers, with exceptions like Sapphire. This AMD crisis period lasted a very long time, right up to very recent products (Radeon RX 5xxx). I had lots of AMD cards in my hands in those years, but whenever I put one into my rig, I always felt it was noisier and had more compatibility problems.
    Linux support (my secondary OS) was also much worse than Nvidia's proprietary Linux drivers, again more hassle, until the last year or two, when the open Mesa drivers finally became good.

    The recent AMD APUs also looked great; I bought one for my mother's/nephew's holiday gaming rig, but again I fell into the AMD driver hole. I had to upgrade from Win7 to Win10 solely because of it, since Win7 wasn't supported, and I still had more game compatibility problems (not as deep as in the past, but still some extra hassle). The whole AM4 platform wasn't mature enough either, so my DDR4, which can run at 3400 MHz on an Intel system, runs at 2666 in it.
    But OK, it was nice for that budget, until I tested Linux: lots of problems, freezing. After months they fixed 2400G support, but they don't give a damn about the 2200G, so you can still find threads saying it's unstable on major new distros.

    So if you only play the newest games on the newest OS and need performance per buck, you can be happy with an AMD card; otherwise Nvidia has been the better option for the last 15 years. It's similar to consoles: MS simply has a far better software team, so they can do things like backward compatibility instead of pathetic streaming. AMD has to be cheaper because of worse software, but that doesn't mean their cards are better. Apart from, say, Mantle as the base for Vulkan and DX12, Nvidia has always been ahead in innovation too. Yes, lots of things failed, but some didn't. For example, PhysX became the de facto standard; even though the winner wasn't the GPU PhysX Nvidia intended but the CPU version, it still won the war against Intel's Havok, and it's used even on Apple mobile devices by major game engines. Nvidia's Vulkan/DX implementation is now as good as or better than AMD's, despite Mantle. They are ahead in OpenCL, GPU scientific programming (CUDA), simulations, AI, datacenters, cars, even without the x86 license that AMD has. The Nvidia Shield is still by far the best ARM Android TV/gaming device, and they rule in gaming laptops.

    Yes, they tried to bring raytracing to the mainstream, probably too early, but they tried; AMD, Crytek, Intel, MS etc. are still only talking about it and about how it needs to be standardized. Because there is no real competitor, they are dictating insane prices, as Intel did for years. Because they are afraid for their know-how and have the best proprietary Linux drivers, they are not willing to release signed firmware that would enable good open-source drivers. And because they wanted to show the difference between CPU and GPU PhysX, the CPU code wasn't good and was often single-threaded only. But again, they do all this because they have no real competitor (Bullet physics engine, ha ha).

    AMD is now successful in CPUs because they made far better products than Intel; in GPUs that isn't true at all, and GPU drivers are far more complex. Old Steam games still run on modern Nvidia cards because it's the major platform; even if AMD creates something much faster for new games, I doubt they will really aim to fix problems with old games. So unless an AMD card is, let's say, 50% faster for the same price, I'm not interested for my rigs. But for someone less hardcore, like my friends who just want a new gaming rig and want to spend less money, AMD is a good choice if their cards are 10%+ faster in modern games for the same money. That is how the world works.
     
    bernek likes this.
  10. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,438
    Likes Received:
    568
    GPU:
    EVGA 1070Ti Black
    Steam numbers are not accurate; you have to opt in to the survey, so they only cover opt-in users, unless the opt-in is a lie and they see your system info either way, which would surprise me.

    Either way, I look forward (sarcastically) to seeing the prices of these new cards, and to seeing how outclassed my 1070 Ti, which was roughly $500 total when I stepped up to it, is compared to sub-$200 cards. Though I don't think Nvidia will do that just yet; $300 cards, sure.
     

  11. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    15,883
    Likes Received:
    420
    GPU:
    EVGA GTX 1080 Ti SC2
    I haven't bought an RTX graphics card yet for the same reason I held off upgrading my TV to a 4K one until late last year: simply that the technology was too expensive and the content that made use of said technology was limited in availability and quality. I finally bought a 4K OLED TV last August having accumulated a large number of 4K UHD BD discs (which came with standard BDs as well, although I could still watch 4K discs on my Xbox One X on my then-1080p Sony TV anyway) and finally getting a 420 Mbps fibre connection so I could also stream 4K content. The time was right and I am happy having made the upgrade.

    Looking back at the last 16 months since the release of the ridiculously pricey RTX 2080 Ti, seeing how little content there was makes me so glad I resisted the urge to upgrade, not that resisting was difficult at that price! I am happy to wait another year if need be before upgrading to an RTX card, as I personally do not think ray tracing will become mainstream until the PS5 and Xbox Series X have been released, since both of those will support it. By then AMD may have something competitive, and I may even buy one of their cards rather than NVIDIA's, particularly as both new consoles will use AMD hardware.
     
  12. Fox2232

    Fox2232 Ancient Guru

    Messages:
    10,350
    Likes Received:
    2,468
    GPU:
    5700XT+AW@240Hz
    Well, JPR basically claims that nVidia managed to produce enough RTX cards for 27% of their entire user base to upgrade.
    On top of that, they claim that this (huge) number of users upgraded to $350+ cards: basically upper-midrange, high-end and enthusiast-class GPUs.
    Then we would have to add the proportionally larger group of people who upgrade in the $150~250 range, and we would end up with something like 60% of nVidia users having upgraded their card since Turing launched (around 15 months).

    How did nVidia produce that many GPUs? And where is the revenue jump?

    So, you are right, their data are bad. They likely have a disproportionately large group of people who update their systems quite often.
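
    A rough back-of-the-envelope version of that argument (all inputs are illustrative assumptions, not JPR or nVidia figures; only the 27% comes from the article):

    ```python
    # Back-of-the-envelope check of the claim above.
    nvidia_users = 100_000_000  # assumed installed base, purely illustrative
    rtx_share    = 0.27         # JPR: 27% of nVidia owners have an RTX card
    budget_ratio = 1.2          # assumption: 1.2 budget ($150-250) buyers per $350+ buyer

    rtx_upgraders    = nvidia_users * rtx_share
    budget_upgraders = rtx_upgraders * budget_ratio
    total_upgraders  = rtx_upgraders + budget_upgraders

    print(f"Implied upgrades in ~15 months: {total_upgraders / nvidia_users:.0%}")
    # -> ~59% of the base under these assumptions: the implausibly high figure above
    ```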
     
    carnivore and airbud7 like this.
  13. Denial

    Denial Ancient Guru

    Messages:
    12,788
    Likes Received:
    2,053
    GPU:
    EVGA 1080Ti
    Well, they polled WCCF readers, so that's almost certainly the case.
     
  14. Mineria

    Mineria Ancient Guru

    Messages:
    4,049
    Likes Received:
    64
    GPU:
    Asus RTX 2080 Super
    Those numbers are not exclusive to add-in boards; the graph hasn't changed much since Q1 2019.

     
  15. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,144
    Likes Received:
    172
    GPU:
    EVGA GTX 1080@2,025
    Yes, units sold is what matters, but that's not what JPR uses. JPR doesn't get its numbers directly from nVidia or AMD; they're all estimates based on sales data from "some" retailers.

    Btw, these numbers are not for discrete GPU cards; for discrete, JPR had nVidia at ~75% or higher the last time I looked.
     

  16. LIGuitar77

    LIGuitar77 Master Guru

    Messages:
    627
    Likes Received:
    44
    GPU:
    ASUS 7790 2GB OC
    What many n00b saibots don't consider is that NVIDIA's drivers introduce higher system latency (system latency, not audio latency in and of itself) compared to AMD's, and that is why anyone who knows the first thing about it chooses an AMD video solution for a machine that will be used for audio production. You don't have to be an audio engineer to appreciate what I am saying. All you need to be is someone who owns a computer.
     
  17. K.S.

    K.S. Ancient Guru

    Messages:
    2,305
    Likes Received:
    622
    GPU:
    EVGA RTX 2080 Ti XC
    RTX / hardware-accelerated ray tracing is not comparable to PhysX. Writing that shows how uninformed you are.
     
  18. sneazzy95

    sneazzy95 Member

    Messages:
    12
    Likes Received:
    2
    GPU:
    500
    Last edited: Feb 27, 2020
  19. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    7,806
    Likes Received:
    1,938
    GPU:
    ASUS RX 470 Strix
    In a way it can be considered an extra that doesn't really affect gameplay... unless it provides something like a gameplay advantage in a multiplayer game, where you can see a ray-traced reflection that would not have been possible with a more traditional type of reflection.
     
  20. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,553
    Likes Received:
    3,514
    GPU:
    2080Ti @h2o
    Arguably, with enough processing power, it could be a gameplay advantage to look around corners via reflective surfaces, or to have different lighting (catchy words like "visibility" in BF5 etc.).
    But as long as minimal detail ALWAYS gives better visibility (in some games the lowest setting means no fog at all, in multiplayer titles, wtf?), it's still a gimmick. Just like PhysX doesn't do much in 99% of games/cases.

    But what I don't really get is how PhysX suddenly entered this discussion. Did I miss something?
     

Share This Page