NVIDIA Announces GeForce RTX 3070, 3080 and 3090 (With Even More Shader processors)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 1, 2020.

  1. alanm

    alanm Ancient Guru

    Messages:
    12,235
    Likes Received:
    4,437
    GPU:
    RTX 4080
    He "suspects" that it maybe due to vram whenever he finds some titles fps a little more in 4k. Not an absolute, only an assumption. Many variables can affect performance aside from vram capacity. How do you explain 2080S with same vram beating 1080ti in those benches? The ONLY way you will truly know is to use the same exact GPU with different vram capacities, not different GPUs with different arches. When the 3080 10gb and 20gb versions are released and compared, only then you will have proper comparison criteria.

    Also, other sites' benchmarks show different results from Techspot's:
    [image: benchmark comparison]
     
    pharma likes this.
  2. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
    It's not a gamer GPU, it's a prosumer GPU.
     
    itpro likes this.
  3. sbacchetta

    sbacchetta Member Guru

    Messages:
    141
    Likes Received:
    66
    GPU:
    GTX 1080ti
    With the number of geometry and texture units kept around the same as Turing (especially compared to the huge increase in FP32 units), it will be interesting to see how much variation there is between the 3070 (shader-heavy) and the 2080 Ti in different scenarios...
    If I had to take a guess, The Witcher 3 in Novigrad or the recent Flight Simulator (when flying over a city) might see the 2080 Ti pull ahead...
     
  4. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    There is not much to suspect when a couple of games show the same results.
     

  5. alanm

    alanm Ancient Guru

    Messages:
    12,235
    Likes Received:
    4,437
    GPU:
    RTX 4080
    "Suspect" is the word he used to qualify his technically deficient analysis. He never used a 2080S alongside his flawed methodology which would have shot his argument down. Furthermore, the Doom benchmark he used is way off from the Techpowerup, GameGPU and other sites which show the 2080 FAR ahead of the 1080ti @ 4k as would be the case in Vulkan benchmarks. Other benches he used differ from Guru3d findings as well. I didnt think much of HU/techspot before, but thanks to you now I know they are not a technically competent site.

    I should add, I am only referring to the 8GB vs 11GB VRAM aspect of the discussion. The 2080, imo, may be a failure in other aspects (gen-to-gen improvements), but its 8GB has not yet been proven to be its limiting factor.
     
    Last edited: Sep 5, 2020
    Aura89 and pharma like this.
  6. pharma

    pharma Ancient Guru

    Messages:
    2,485
    Likes Received:
    1,180
    GPU:
    Asus Strix GTX 1080
    That's why you never take the results from one site for granted, especially HU.
    As suggested above, check some competent sites instead of click-bait ones.
     
    alanm likes this.
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    I think you misread numbers in card names I wrote.
     
  8. alanm

    alanm Ancient Guru

    Messages:
    12,235
    Likes Received:
    4,437
    GPU:
    RTX 4080
    I think we are beginning to see the limits of 1080p performance with these cards. So unless we get new, better CPUs that stop limiting 1080p performance, the big differences will mostly be at high resolutions. I think 1080p performance will become almost irrelevant with next-gen GPUs. Notice how Nvidia is focusing on 4K in its presentation?
     
  9. ACEB

    ACEB Member Guru

    Messages:
    129
    Likes Received:
    69
    GPU:
    2
    1080, 10 Gbps 8GB
    1080ti, 11 Gbps 11GB
    2080, 14 Gbps 8GB
    2080s, 15.5 Gbps 8GB
    2080ti, 14 Gbps 11GB
    3080, 19 Gbps 10GB

    The 1080 Ti might have a decent amount of memory, but the speeds are massively down. That's also a good reason for Nvidia to only be showing 4K results. Having watched the HU video, I felt it was a great oversight not to include how much memory was being used; if a memory issue is suspected, then surely showing it to the viewers should have been a focus of the test. I know that when I overclocked my 2080S's memory, less memory was in use, so theoretically, even though the 3080 has 1GB less than the 1080 Ti, being much quicker probably offsets that loss.
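To put the per-pin speeds listed above into perspective, total bandwidth works out to per-pin speed (Gbps) times bus width (bits) divided by 8. A quick sketch (the bus widths below are not in the post; they are the cards' standard configurations, added here as an assumption):

```python
def bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate * bus width / 8 bits per byte."""
    return pin_speed_gbps * bus_width_bits / 8

# (per-pin speed in Gbps, memory bus width in bits)
cards = {
    "1080":   (10.0, 256),
    "1080ti": (11.0, 352),
    "2080":   (14.0, 256),
    "2080s":  (15.5, 256),
    "2080ti": (14.0, 352),
    "3080":   (19.0, 320),
}

for name, (speed, bus) in cards.items():
    print(f"{name:7s} {bandwidth_gbs(speed, bus):6.1f} GB/s")
```

This gives roughly 484 GB/s for the 1080 Ti versus 760 GB/s for the 3080, which backs up the point that the 3080's faster memory more than offsets its 1GB capacity deficit.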

    I'm mega interested in the 3060: will it have G6? Will it be 8GB?
     
  10. H83

    H83 Ancient Guru

    Messages:
    5,465
    Likes Received:
    3,003
    GPU:
    XFX Black 6950XT
    Here in Portugal all the stores have the 2000 series on sale, with discounts as high as 25%, but even with those price cuts they are still too expensive... I wonder what they are going to do with the last units they didn't sell before the announcement of the 3000 series...

    Not to mention all the guys trying to sell their own cards online...
     

  11. Shun the non-believer :mad::mad::mad:
     
  12. southamptonfc

    southamptonfc Ancient Guru

    Messages:
    2,620
    Likes Received:
    646
    GPU:
    Zotac 4090 OC
    I take a bit of exception to being called confused. You yourself wrote that Nvidia changed the release process with the last series of cards, so I'm not sure why you're so confident about how it "always works". I'll point out that there was also no 2090.

    And while a 3080 Ti will surely be released, neither you nor I have any idea whether it will be faster than the 3090.
     
  13. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    The price and specs of the 3090 suggest it fulfills the role of the previous Titan GPU models, although there could still be room for an actual Ampere Titan with even more memory, outside of the workstation-specific cards, depending on what NVIDIA does.
    (It would perhaps be a bit extreme for most users, but for work and application purposes, and outside of full-on workstation cards and pricing, it could be a thing: a few thousand cheaper but still capable.)

    The 3080 is the enthusiast offering, so a replacement for the Ti model, but with customization options like a potentially higher amount of memory; the price gap between it and the 3090 leaves a position for a 3080 Ti, or perhaps a Super model, too.
    The 3070 is the price/performance gamer choice, yet still high-end: on par with, if not better than, the current 2080 Ti, which is incredibly impressive for its price range, and again there is enough room for a potential 3070 Ti or Super.

    I expect a 3060 or 3050 later on for the mid-range segments, but that could be a few months out, along with alternate memory variants of the 3070 and 3080, since the bus allows configuring the memory amount. (320-bit and 384-bit, I think.)

    This is just relative performance and somewhat comparable pricing to the older GPU models, though, whereas this is a bit of a new process. The older '90' cards were the dual-GPU models, I believe, and I doubt those will ever return. :D
     
    Last edited: Sep 5, 2020
  14. itpro

    itpro Maha Guru

    Messages:
    1,364
    Likes Received:
    735
    GPU:
    AMD Testing
    Do not post BS. In all the reviews from our trusted Guru3D, there isn't a single instance of the 2080 losing to the 1080 Ti at 4K. Stop this spam. Raw compute strength is much more important than temporary storage capacity.
     
  15. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    While that's possible, shaders can also become more complex, providing more realistic results, and in that event the Ampere 3070 will do better than the Turing 2080 Ti at 1080p.
     

  16. alanm

    alanm Ancient Guru

    Messages:
    12,235
    Likes Received:
    4,437
    GPU:
    RTX 4080
    Actually, there are a few titles where the 2080 may be a couple of FPS behind at 4K, but there is no real evidence it's due to VRAM limits. The 1080 Ti has a 352-bit bus vs 256-bit for the 2080, which may be a factor along with other variables. But throw the 2080S in there and it pulls ahead. So the 8GB theory falls flat in this case. ;)
     
  17. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Also, these new Ampere GPUs have newer VRAM compression, so they will use less VRAM regardless.

    And you can't really compare the 2080 vs the 1080 Ti and call it a VRAM limit at 4K; I think it's only the pixel and texture fillrate difference, plus the higher ROP count on the 1080 Ti.

    Now, I saw the 3080 has 168 Gpixel/s and a whopping 480 Gtexel/s, which should help too.

    Imo this 10GB limit is overblown by users.
     
    Aura89, Corbus and alanm like this.
  18. alanm

    alanm Ancient Guru

    Messages:
    12,235
    Likes Received:
    4,437
    GPU:
    RTX 4080
    I agree. Both GPU manufacturers and game developers have to find ways to satisfy 1080p gamers, since it's still the mainstream. So yeah, there are bound to be a few titles that keep it alive, like RDR2.
     
  19. AlexM

    AlexM Member

    Messages:
    21
    Likes Received:
    20
    GPU:
    rtx 3080 Eagle OC
    Hi there! I think there is a mistake in the 3070 specs: a 14 Gbps memory clock and a 256-bit memory bus cannot generate 512 GB/s of bandwidth; it's only 448 GB/s.
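The arithmetic behind that correction: peak bandwidth is the per-pin data rate times the bus width, divided by 8 bits per byte. A minimal sketch:

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s = per-pin rate (Gbps) * bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 3070: 14 Gbps GDDR6 on a 256-bit bus
print(memory_bandwidth_gbs(14, 256))  # 448.0, not 512
```

For 512 GB/s you would need roughly 16 Gbps chips on the same 256-bit bus.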
     
    Fediuld likes this.
  20. half_empty_soul

    half_empty_soul Active Member

    Messages:
    62
    Likes Received:
    33
    GPU:
    RTX 3080 ti
    blind clueless noob
     
    Fediuld likes this.

Share This Page