NVIDIA GeForce Ampere 3DMark Time Spy Benchmarks show it to be 30 percent faster than RTX 2080 Ti

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 22, 2020.

  1. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    "B's driver devs try to follow the spec more closely than Vendor A, but in the end this tends to do them no good because most devs just use Vendor A's driver for development and when things don't work on Vendor B they blame the vendor, not the state of GL itself."

    Thanks for quoting someone you consider an authority.
     
  2. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    It's beautiful how you completely ignore the rest though. They couldn't even do texture fetching properly.

    Also no response about how their Linux OpenGL driver is actually good, while the Windows driver is crap. Meanwhile it's all an "Evil Nvidia Plot to corrupt our fingers while we code the OpenGL driver in Windows" :p

    Why do you feel the need to excuse them, when they're so obviously at fault? You pay them and they do a shitty job.

    As for someone "I consider an authority", just lel
    "Co-owner of Binomial LLC, game and open source developer, graphics programmer, lossless data and GPU texture compression specialist. Worked previously at SpaceX, Forgotten Empires, DICE, Microsoft Ensemble Studios, Valve, and Boss Fight Entertainment."

    What would he know right? It's like Tim Sweeney all over again. "What do all these Unreal Engine guys know about graphics architectures".
    Jesus Christ.
     
  3. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The issue here is that you are biased and look for evidence to support that bias, while the very people you quote say something else.
    And that quote is from 2014 on top of it. That was the beginning of GCN, a time when nobody sane even touched OGL outside of Linux.

    If you have looked at current DX, only an imbecile would consider OGL over it or Vulkan.
    I wrote this because you can't go and ignore half of something you intended to weaponize. So I can have the very same person tell you that you are biased.

    OGL was always like DX10.
     
  4. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    And this one needs separate addressing, because it shows that there is something very wrong with your views.

    AMD paid tons of money for ATi (they actually overpaid, as that was the only way), and that is the reason nVidia does not have a GPU monopoly.
    That was a big financial blow they took, for your sake, as a result!

    And for your information, AMD has intel on its left side and nVidia on its right.
    That's intel's revenue of $75B against AMD's CPU division, and nVidia's revenue of $11B against AMD's GPU division, while AMD's entire revenue is $7B.
    Go and tell me what a bad job they do in comparison to how they are "paid".

    And then go to intel and ask them about their GPU division, which is surely at least twice as big as AMD's entire CPU+GPU division and paid better.
    - - - -
    If AMD was not in the GPU business, you would be just fine. You would be sitting on a GTX 680, maybe the 4GB version, by now.
    But the rest of us would feel the damage from a business strategy similar to intel's. We would maybe be at GTX 680 + 50% performance on top of it.
    Instead we have GPUs 4 times as strong and more widely available... thanks to the fact that AMD "IS" competitive enough where it matters to move the industry this far in 6 years.
    - - - -
    If AMD did what you want them to do, so that you would praise them, we would be nowhere. OGL would still suck for the industry (and be irrelevant for gamers) no matter how well it ran on AMD's HW. On top of that, AMD's HW would suck in return on DX, where it actually matters for AMD's business.
    That would phase AMD out of the GPU business.

    Following the 5% was what got AMD into problems. It took a change of leadership and way of thinking to get them back on track. AMD is doing what matters, because they do not have the finances to pursue the 5%. And they never did, because when they had superior products, the industry prevented them from cashing in on them.
     
    TheSissyOfFremont likes this.

  5. TheSissyOfFremont

    TheSissyOfFremont Master Guru

    Messages:
    252
    Likes Received:
    112
    GPU:
    3090 FE
    I'm not sure I'd necessarily agree with that - I'm currently playing MGSV at 5120x2880 DSR on my 1440p-native monitor, and the image quality increase is enormous. It's incredibly impressive.
    It's something I'd happily prioritise over many other features. I think it depends entirely on the specific game you're playing. I've not seen 8K in real life, and I don't know where the real point of significantly diminishing returns (for people with perfect eyesight) is, but I think we're a way off.
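    For scale: 5120x2880 is exactly four times the pixel count of native 2560x1440 (2x per axis), i.e. 4.00x DSR. A quick illustrative check:

    ```python
    # Pixel-count ratio of 5120x2880 DSR over a 2560x1440 native panel.
    render = 5120 * 2880    # 14,745,600 pixels
    native = 2560 * 1440    #  3,686,400 pixels
    print(render / native)  # 4.0 -> 2x scale factor per axis
    ```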

    I completely agree re: rasterisation performance though - non-RT benchmarks are redundant going forward. I think within 5 years you'll see almost all games use some degree of RT, and at least a small minority of games use it heavily (way beyond what we've seen thus far).
     
  6. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    Imo it is a gimmick in line with PhysX - even in Control, which everyone says is the poster child for RTX, it only looks marginally better than screen-space reflections, and at an absurd performance hit. DLSS is just trash, pure and simple... no one buys a 4K screen to play at sub-4K image quality. Native res or downsampling; anything else is fail.
     
  7. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,413
    Likes Received:
    3,079
    GPU:
    PNY RTX4090
    I agree with everything you said about AMD needing to change, on every point you made.

    Trust me, I am no AMD fanboy. I want them to do well so it benefits all of us, and in the end we get better products at better prices. No one likes monopolies except the companies that run them.

    Why wouldn't anyone want AMD to do to Nvidia what AMD did to Intel? Hope is all we have after being let down so many times before, and even that is wearing thin.

    EDIT: Oh and the job thing was a joke.... maybe you didn't get it.....
     
  8. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,017
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
    Your delusion is on full display here.

    OpenGL is preferred over Vulkan in a number of cases because Vulkan is far more restrictive in capability.

    ATI was doing just fine until AMD swooped in because they wanted to be an every-chip vendor, screwing the engineering division and losing half of the software division to IBM.

    PS: If AMD had agreed to Jensen's terms on a merger, they wouldn't have had a decade of obscurity due to poor chip design.
     
  9. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Ad 1st point) AMD has not done that with RDNA, and RDNA was a big improvement over GCN on more levels than most people here can even comprehend.
    Ad 2nd point) https://www.guru3d.com/news_story/c..._threads_it_all_doesnt_matter_says_intel.html (Not the 1st and not the last.)
    Ad 3rd) It's not the only retailer. And it is not as if this shop refuses to sell intel or does not have them available. Their sales data are as valid as anyone else's, because they do not play favorites.

    Ad 4) Who prevented you or anyone else from speaking here?
    - - - -
    The only thing I see is that you got yourself triggered by someone's personal experience and a joke.
    And I would have completely ignored it, were it not for points 1 and 3 not applying to the current situation, plus 2 and 4 being false.
     
  10. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Didn't they have revenue of $2B a year ($500M per quarter) while taking a $100M loss per quarter at that time? In Q1 2006 they made more marketing announcements than they had in the entire year before, just to keep their stock price from crashing fast. So they were already about to sell.
    Jensen, who committed fraud to keep nVidia afloat in bad times. And while it enabled him to get a very powerful GPU out of the house, it was far from a good one in terms of the industry standard. And then he went around and got game developers to bend and not follow the industry standard, so that games would actually run on those cards.
    That resulted in a simple thing: the death of most of the other GPU manufacturers, who did follow the standard and could not benefit from implementing the given features.
    (They invested transistors into now-useless features, and were beaten by nVidia, who invested transistors into doubling the performance of the few features they did have.)

    Jensen Huang even admitted it while painting himself as a hero. He was daring; he bet everything he had (and even what he did not have) and won. But he is not someone I would trust. And without trust, there is no cooperation.

    All those people who complain about the lack of GPU competition and praise nVidia at the same time...
     

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    This is so wrong. OpenGL is the entry point for graphics programming at most universities in the world. It's fine as an API, and it isn't going anywhere. Of course I'm biased. It runs like dogshit. It always has. The same hardware on Linux runs fine. I still don't understand how you explain that.

    It wasn't for either my sake or yours. Hector Ruiz saw that unless AMD had two sources of income, and unless they could compete with Intel, which was making its own GPUs at the time (the ones we know as integrated graphics now), AMD would be dead. That's why they overpaid: so that they wouldn't be irrelevant in five years. No company does you favors.

    Sure. So if your company is smaller and surrounded by cutthroat giants, the solution is to mess up your driver releases so that the unpopular OS gets the good driver? If anything, because AMD is in that situation, decisions like these are even worse for them as a company.

    Probably a bad example, but look at it another way: a film at 1080p still looks much more real than any game, at any resolution. If you can have processing that enables that kind of fidelity per pixel, you don't need as many pixels.

    Fraud is a big word for what every company does in its growth cycle, which is touting its projected value. That's completely normal, and every investor in the world knows it. What do you mean by "industry standard"? What did they do?

    This all sounds very romantic and very unspecific. Nvidia is here because 3dfx killed themselves with the 3000 series, and because of Maxwell. And they won't be going anywhere, because of CUDA. They were lucky once and then competent twice. AMD has always been either very unlucky, bullied, or shortsighted.
     
    alanm and TheSissyOfFremont like this.
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The same thing you accuse AMD's OGL drivers of. If you used the standard features of the API, it would not run on their HW while it would on everyone else's.
    This is one thing everyone should understand. That's what raytracing is about.
    Sadly, not with the current RTX cards, nor with the next ones or RDNA2.
    I've seen a tech talk (from AMD) about current good practices for distributing raytracing power.
    Basically one AO/GI ray per pixel and one shadow ray per pixel (the showcase had one light source),
    and then 1/16th to 1 reflection ray per pixel (see the sketch at the end of this post).

    That's why AMD had that ugly-looking all-reflective demo: to show that they can do everything that's being done via raytracing on every single pixel of the screen (not just on some particular parts).

    But even that was not enough for proper high-quality photorealistic raytracing.
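    A minimal sketch of that kind of per-pixel ray budget (Python; the function shape and the 4x4 tile rotation are my own illustrative assumptions, not from the AMD talk):

    ```python
    # Hedged sketch: a hybrid renderer's per-pixel ray budget.
    # Every pixel gets 1 AO/GI ray and 1 shadow ray (single light source);
    # reflections get between 1/16 and 1 ray per pixel depending on how
    # mirror-like the surface is.

    def ray_budget(x, y, frame, reflectivity):
        budget = {"ao_gi": 1, "shadow": 1}   # fixed cost, every frame
        if reflectivity > 0.9:
            budget["reflection"] = 1         # mirror-like: every pixel
        else:
            # Rough surfaces: 1 reflection ray per 4x4 pixel tile, the
            # position rotating over 16 frames -> 1/16 ray per pixel.
            phase = (x % 4) + 4 * (y % 4)
            budget["reflection"] = 1 if phase == frame % 16 else 0
        return budget
    ```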
     
    Last edited: Jun 25, 2020
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Yeah, it works for all the others. That's the thing. OpenGL allows vendor extensions. They all have them. Products can support them. ATi/AMD has a ton of extensions published, same as NVidia.
    Here is the registry. Anybody can contribute.
    That's how OpenGL works.
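    For illustration, a minimal sketch of how an app can enumerate those vendor extensions at runtime (Python with PyOpenGL; assumes a current OpenGL 3.0+ context already exists, e.g. via GLFW - context setup omitted):

    ```python
    # Hedged sketch: list vendor-specific GL extensions on a live context.
    from OpenGL.GL import (GL_EXTENSIONS, GL_NUM_EXTENSIONS,
                           glGetIntegerv, glGetStringi)

    def vendor_extensions(prefixes=("GL_AMD_", "GL_ATI_", "GL_NV_")):
        count = glGetIntegerv(GL_NUM_EXTENSIONS)
        names = (glGetStringi(GL_EXTENSIONS, i).decode() for i in range(count))
        return [n for n in names if n.startswith(prefixes)]
    ```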

    You still haven't answered how you explain your EVIL NVIDIA theory when the AMD OpenGL Linux driver is actually better than NVIDIA's.

    No, it isn't. It's not a replacement for raster techniques; that would be unfathomably stupid. But it can do things that they cannot.

    Again, why do you excuse their obvious incompetence?
     
  14. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Except that the situation with nVidia's HW design was that their HW was not even capable of doing those things. (The card which saved nVidia was the Riva 128, and it had a HW limit of 8 of the 32 blend modes that DX mandated. That's a lot, in the era of fixed-function HW, in terms of texture manipulation capability.)
    Whose incompetence? What's your problem? I am not even mentioning anything that can be read as an excuse. It is a general raytracing post about image quality, and it does not even say anything about rasterization.
    Are you, in your head, reading something other than what I am writing?
     
  15. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Apparently you're blind, since Intel has mentioned AMD quite a bit over the last couple of years.
    Pot, meet Kettle.....
    You and many others regularly attempt to turn every AMD-related thread into an Intel/NVidia "circle jerk".....
     
    HandR and carnivore like this.

  16. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,017
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
  17. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    If that TFLOPS number is true, that is an increase of 55% over the 2080 Ti... that would be very nice indeed!
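    The post doesn't quote the rumoured figure, but working backwards from the 2080 Ti's known FP32 specs, +55% would land around 21 TFLOPS (back-of-envelope only):

    ```python
    # FP32 throughput = cores x 2 ops/clock (FMA) x clock.
    cores, boost_ghz = 4352, 1.545          # RTX 2080 Ti reference boost
    tflops = cores * 2 * boost_ghz / 1000
    print(round(tflops, 1))                 # ~13.4 TFLOPS
    print(round(tflops * 1.55, 1))          # ~20.8 TFLOPS at +55%
    ```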
     
  18. alanm

    alanm Ancient Guru

    Messages:
    12,236
    Likes Received:
    4,437
    GPU:
    RTX 4080
    The same source that came up with the fake SK Hynix Big Navi rumour.
     
  19. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    How is this a "leak"? That's basically the Tesla cards but with a gaming config :p

    Can I become a "leaker" too please?
     
