NVIDIA GeForce Ampere 3DMark Time Spy Benchmarks show it to be 30 percent faster than RTX 2080 Ti

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 22, 2020.

  1. Venix

    Venix Ancient Guru

    Messages:
    1,689
    Likes Received:
    660
    GPU:
    Palit 1060 6gb
@CPC_RedDawn I hear you and I agree. Looking at history, though, every time they had clearly better cards than NVIDIA, NVIDIA still outsold them 4:1 or 3:1. The 4850 and 4870 were much cheaper, a bit faster, and much more efficient than their GTX 260 and GTX 280 counterparts, and NVIDIA still outsold them.... Anyway, I really hope I am wrong, I would love to be wrong!

@Denial There is a big point to fixing OpenGL: there are emulators and online games built on OpenGL, and if you are into those, NVIDIA's better OpenGL performance makes AMD not even a runner-up option.
     
    Last edited: Jun 23, 2020
    CPC_RedDawn likes this.
  2. geogan

    geogan Master Guru

    Messages:
    825
    Likes Received:
    163
    GPU:
    3070 AORUS Master
    This is what happens when you do nothing to the actual GPU architecture apart from a process die shrink.

You get a ~30% performance increase for free when you do this.

Take the same GPU design, go from 12nm to 7nm, and you get 20-30%.

    By doing nothing else.
     
  3. LEEc337

    LEEc337 Active Member

    Messages:
    98
    Likes Received:
    21
    GPU:
    Team Green 960
The Goose has a good point: we've all assumed this is top-tier Ampere, but what if it's a 3070? I'm on lunch atm so I haven't read all 16 pages of comments, but NVIDIA promised a big bump in performance. I think the threat of RDNA2 and Intel's Xe has NVIDIA needing to make sure it brings that performance at the right price, or lose market share. I wish them all good luck, and let's enjoy the competition as they all show their goods.
     
  4. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,591
    Likes Received:
    919
    GPU:
    6800XT Nitro+ SE
This is what I mean: they need to get the name/brand out there more. When you see events, you need to see more AMD logos, more Radeon logos. If I were them I would be splashing out some cash to get more sponsorships on YouTube videos, Facebook Gaming, Twitch streamers, in-game events, esports tournaments. I mean, to this day, when you see an esports event it's always covered with Intel logos, and when you see prebuilt PCs it's always Intel and NVIDIA... I know some of the shady tactics both Intel and NVIDIA have used in the past to keep a stranglehold on the prebuilt market and the laptop market, but hopefully that will change.

When I heard years ago that Apple would be using AMD GPUs I was quite surprised and felt it was a big win for them, but you never see it in the marketing, and if you do it's small and easily overlooked. They need to barter more in these negotiations for more marketing exposure.

Even asking for a small AMD Radeon Graphics logo in the boot sequence of the new PS5 and Xbox Series X could go a long way toward promoting their brand and image. I remember the GameCube actually had an ATi Graphics sticker on the front of the console itself; would have helped if the console had actually sold well though, lol. I think you get my drift......

    PS/ AMD I'm open to hiring offers.... please?? Anyone?? Anyone?? Bueller....?? Bueller....?
     
    Venix likes this.

  5. mikeysg

    mikeysg Ancient Guru

    Messages:
    2,726
    Likes Received:
    339
    GPU:
    Nitro+ RX 6900 XT
I'm hoping and praying that AMD's Big Navi will do to NVIDIA what Ryzen did to Intel, a total shake-up. Not holding my breath though.
     
  6. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,728
    Likes Received:
    1,713
    GPU:
    3090 Gaming X Trio
Two words: wishful thinking.

The more I think about it, the more I believe it will be the 3080, positioned just above AMD's Big Navi, which is rumored to beat the 2080 Ti by 10-15%.
That way, NVIDIA would place at least two products above AMD's GPUs, asserting its dominance once more.

Overall, the 2080 Ti will have remained an unbeatable beast for two years. Really happy with this card.
     
    CPC_RedDawn likes this.
  7. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,591
    Likes Received:
    919
    GPU:
    6800XT Nitro+ SE
Yeah, I have to agree. As much as I want AMD to fight back like they have done on the CPU front, I would never count NVIDIA out of the race in any shape or form.

I just hate monopolies, as we are the ones who get burnt in the end, and NVIDIA's shareholders know it.
     
    XenthorX likes this.
  8. itpro

    itpro Maha Guru

    Messages:
    1,018
    Likes Received:
    537
    GPU:
    Radeon Technologies
You cannot believe a GPU with the same number of cores and amount of VRAM will beat it by at least 30%, right? You must really need a cup of coffee.
     
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    10,549
    Likes Received:
    3,843
    GPU:
    GTX 1080ti
The simple act of shrinking the process provides a performance increase.
     
    PrMinisterGR likes this.
  10. itpro

    itpro Maha Guru

    Messages:
    1,018
    Likes Received:
    537
    GPU:
    Radeon Technologies
Yeah, yeah, right. Give me a recent example of that exact scenario as described above, and then I will agree with you. Until then, that FUBAR with a shrunken die tells me nothing. Theories.
     

  11. Aura89

    Aura89 Ancient Guru

    Messages:
    8,169
    Likes Received:
    1,278
    GPU:
    -
    Delete
     
    Last edited: Jun 24, 2020
    Venix likes this.
  12. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,215
    Likes Received:
    4,391
    GPU:
    2080Ti @h2o
So, letting this sink in for a few days, I actually believe this is what the 3080 Ti could be. Matching an overclocked 2080 Ti (a massive OC, though) with the follow-up product sounds about right for NVIDIA, so the "leak" is plausible to me.

Whether it would be good for things to work out that way, I don't really agree... a bigger jump would be nice, but with a 20% jump and a price reduction of about 25% they'd sell a lot more cards than with the same price and a 50% performance increase.

But we will only know for sure once the cards are released and properly reviewed; the rest is just guessing and projecting one's own hopes and wishes, a psychologically questionable thing to do.
     
    The Goose likes this.
  13. The Goose

    The Goose Ancient Guru

    Messages:
    2,739
    Likes Received:
    194
    GPU:
    MSIrtx3070 gaming X
If I can run The Division 2 at 4K ultra and get 60fps, instead of slightly under ultra, I'll give NVIDIA no more than £700 for the 3070... if not, well, I still have my MSI RTX 2080 Super Ventus XC at slightly less than ultra.
     
  14. alanm

    alanm Ancient Guru

    Messages:
    10,212
    Likes Received:
    2,369
    GPU:
    Asus 2080 Dual OC
What do you think the point of a die shrink is? Increased efficiency, less resistance, less power needed to drive the shrunken transistors = higher clocks. Most of the perf gains come from that: higher clocks. Things are getting murkier as processes go under 10nm, though, since increased density in a smaller area concentrates more heat.
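As a rough back-of-the-envelope illustration of the "higher clocks" point, here is a minimal sketch; the clock figures below are hypothetical placeholders, not leaked specs:

```python
# Rough sketch: if a die shrink mostly buys higher clocks, first-order
# performance scales roughly with clock speed (ignoring memory bandwidth,
# IPC changes, and power limits). All numbers here are hypothetical.

old_clock_mhz = 1545.0   # hypothetical boost clock on the older process
new_clock_mhz = 1950.0   # hypothetical boost clock after the shrink

clock_uplift = new_clock_mhz / old_clock_mhz - 1
print(f"Clock uplift (rough perf uplift): {clock_uplift:.0%}")  # ~26%, in the 20-30% ballpark
```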
     
    itpro likes this.
  15. itpro

    itpro Maha Guru

    Messages:
    1,018
    Likes Received:
    537
    GPU:
    Radeon Technologies
I am talking about that performance-jump number. :) People are already expecting the 3080 Ti to be ~50% faster in rasterization and ~250-500% faster in ray tracing than the 2080 Ti. Dreaming is healthy, but.........
     

  16. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,191
    Likes Received:
    196
    GPU:
    EVGA GTX 1080@2,025

Exactly. I'm upgrading from a GTX 1080 and will likely buy the RTX 3070 or RTX 3080, since right now my i7-10700K is bottlenecked to hell by my GPU.
     
  17. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,191
    Likes Received:
    196
    GPU:
    EVGA GTX 1080@2,025
I wouldn't call exclusive features that can, in many instances, dramatically improve image quality and realism, or increase frame rates at higher resolutions via upscaling, "gimmicks". AMD fanboys always used to say that about PhysX... up until it was made open a few years ago. But from their perspective, I'd probably be a bit bitter too, seeing tech I can't use because I saved a few bucks and bought a competitor's product.
     
    PlatinumPanther likes this.
  18. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
Sorry, no. NVIDIA does not support OpenGL "just fine"; NVIDIA is the reason OpenGL is in such a bad state. The whole approach of vendor-specific hacks via extensions here and there (everywhere) was wrong. You cannot fix problems in code built around extensions your driver does not implement well, because they end up more or less emulated and the developer never bothered to include your equivalent.
You could write code that ran better on ATi than on NVIDIA, but OpenGL development was always NVIDIA-first.
     
    carnivore likes this.
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
That gimmick is parameter-based: increase rays per pixel by a factor of 2 and performance drops to roughly 1/2. It will be enough for already existing games (unless they get patched).
But it will never be enough for cutting-edge games, unless the game developer decides it is enough. (And who are they sponsored by?)
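A minimal sketch of that linear scaling argument; the baseline frame rate and ray counts below are hypothetical, purely to show the proportionality being described:

```python
# Rough sketch: if ray-tracing cost scales roughly linearly with rays per
# pixel, doubling the rays per pixel roughly halves the ray-traced frame
# rate. All numbers below are hypothetical, for illustration only.

base_fps = 60.0            # hypothetical frame rate at 1 ray per pixel
base_rays_per_pixel = 1

for rays in (1, 2, 4):
    fps = base_fps * base_rays_per_pixel / rays   # linear-cost assumption
    print(f"{rays} ray(s)/pixel -> ~{fps:.0f} fps")
```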
     
  20. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,191
    Likes Received:
    196
    GPU:
    EVGA GTX 1080@2,025
    How about you go make a thread about AMD's shitty marketing decisions and how to improve them instead of doing that here?

    oh ffs... you got me to bite.

    You know what would help AMD's image in the eyes of consumers?

    - Stop making huge performance claims that fall far short upon release.
    - Stop mentioning Intel or NVIDIA in your advertising. Ever notice how they never mention AMD?
    - Stop promoting deceptive news articles with headlines about how you're crushing Intel's sales, etc., when it turns out those numbers came from a very AMD-focused retailer in Germany. Such articles are flat-out lies, and people hate being lied to.

    And most importantly....

    - Stop turning every damn Intel/NVIDIA thread into an AMD circle jerk. AMD fans have become the annoying protester with a megaphone trying to disrupt someone speaking. Yeah, your like-minded friends will high-five you, but everyone else thinks you're being an idiot.
     
