Nvidia reportedly plans to release GeForce RTX 2060 variant with 12GB memory

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 13, 2021.

  1. Krizby

    Krizby Master Guru

    Messages:
    825
    Likes Received:
    121
    GPU:
    3090 Watercooled
    Radeon VII with its 16GB HBM2 certainly aged well in mining farms :D
     
  2. DannyD

    DannyD Ancient Guru

    Messages:
    3,019
    Likes Received:
    2,020
    GPU:
    EVGA 2060
    Except it has zero efficiency.
     
  3. Undying

    Undying Ancient Guru

    Messages:
    16,463
    Likes Received:
    5,405
    GPU:
    Aorus RX580 XTR 8GB
Certainly. Radeon VII costs as much as a 6900 XT these days.
     
  4. Krizby

    Krizby Master Guru

    Messages:
    825
    Likes Received:
    121
    GPU:
    3090 Watercooled
For gaming the Radeon VII is pretty much a dead card, even driver support was abandoned o_O. Even Maxwell GPUs can still play all the latest AAA games with no problem, albeit at low settings.
So yeah, AMD GPUs and "longevity" don't go together.
     

  5. Undying

    Undying Ancient Guru

    Messages:
    16,463
    Likes Received:
    5,405
    GPU:
    Aorus RX580 XTR 8GB
Radeon VII still gets the latest drivers, idk what you're talking about, and it still outperforms the GTX 1080 as well as Vega 64.
     
    DannyD likes this.
  6. DannyD

    DannyD Ancient Guru

    Messages:
    3,019
    Likes Received:
    2,020
    GPU:
    EVGA 2060
    Undying likes this.
  7. Undying

    Undying Ancient Guru

    Messages:
    16,463
    Likes Received:
    5,405
    GPU:
    Aorus RX580 XTR 8GB
    DannyD likes this.
  8. DannyD

    DannyD Ancient Guru

    Messages:
    3,019
    Likes Received:
    2,020
    GPU:
    EVGA 2060
It's indeed incredible at mining too, but like I mentioned, it's very power hungry while doing so, unlike the newer cards.
What an incredible lifespan it's had tho.
     
  9. Krizby

    Krizby Master Guru

    Messages:
    825
    Likes Received:
    121
    GPU:
    3090 Watercooled
Radeon VII: release date Feb 2019, EOL Aug 2019 - MSRP 700usd
GTX 1080 Ti: release date March 2017 - MSRP 700usd
GTX 1080: release date June 2016 - MSRP 600usd

The Radeon VII has been so buggy since launch that even the most die-hard AMD fans like AdoredTV ripped AMD a new one for terrible driver support; not sure if AMD has even fixed any bugs since then.
     
  10. Kaarme

    Kaarme Ancient Guru

    Messages:
    2,565
    Likes Received:
    1,222
    GPU:
    Sapphire 390
Nvidia tried to set a trend where everybody must buy an Nvidia card and pay 150 euros extra for a screen to get adaptive synchronization. That trend failed, and now Nvidia supports the more hardware-agnostic approach as well, the one pioneered by AMD.
     

  11. Krizby

    Krizby Master Guru

    Messages:
    825
    Likes Received:
    121
    GPU:
    3090 Watercooled
huh, Variable Refresh Rate is a VESA standard which came out after G-Sync, definitely not "pioneered" by AMD LMAO. Also, G-Sync monitors are capable of ULMB, which was not possible on a VRR monitor without the G-Sync module until now.

The fact that almost everyone now owns a VRR monitor means that Nvidia is a trendsetter LOL, even if their own solution cost a little too much for people's taste. Well, next up is real-time RT and AI upscaling, which AMD and Intel will also have to copy.
     
    Last edited: Sep 15, 2021
  12. Kaarme

    Kaarme Ancient Guru

    Messages:
    2,565
    Likes Received:
    1,222
    GPU:
    Sapphire 390
FreeSync was AMD's competitor to G-Sync. It's much closer to VESA Adaptive-Sync. It doesn't really matter whether G-Sync appeared sooner. It lost the race, so Nvidia didn't manage to set a trend. Before Nvidia bowed its head and adopted non-G-Sync adaptive sync, it was funny to read ad brochures: plenty of screens were advertising AMD FreeSync, only one or two screens G-Sync, yet Nvidia's GPU market share was 75-80%. Nvidia hit itself in the knee with an axe by charging an arm and a leg for the G-Sync module/license. Well, you are of course free to have your own opinion on the matter. I won't change mine, though.
     
  13. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    2,568
    Likes Received:
    1,272
    GPU:
    107001070
290X had 4GB
     
  14. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    2,568
    Likes Received:
    1,272
    GPU:
    107001070
that's what people forget
when I bought a 2716DG 1440p 144Hz ULMB monitor with a 980 Ti in July 2015, how long did it take for AMD to give me the same options?
a card capable of an overclocked 980 Ti's performance wasn't even available until Vega 56 in 2017, and neither was strobing on the first FreeSync monitors; and when it became an option, you still couldn't find adaptive v-sync in the drivers. Not to mention the Hz range was a joke, a 144Hz monitor capped at 90Hz in FreeSync mode.
that is the part of the story that often gets conveniently forgotten.
at the time I was playing Dying Light at 120fps 1440p with ULMB on, while all that AMD owners could do was get frustrated with their Fury X's
tables have turned now and no one needs the module, but it took years.
     
    Last edited: Sep 15, 2021
    Krizby likes this.
  15. Krizby

    Krizby Master Guru

    Messages:
    825
    Likes Received:
    121
    GPU:
    3090 Watercooled
What a world we live in that copycats get the credit :D, Chinese companies must be living in joy knowing people appreciate their effort of stealing Western IPs
     

  16. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    2,568
    Likes Received:
    1,272
    GPU:
    107001070
it was inevitable that amd would get all the cool features themselves.
so as much as people deny that nvidia sets trends, and that includes me being sceptical about that statement, it's true they try to push a lot of cool stuff first.

the case of g-sync/ulmb 1440p 144hz displays, with a $650 beast of a card to push them, is probably the best example. took amd 2 years to catch up.

same with rtx, and even now the current rx6000 cards can't provide the same experience, since they not only lack rt performance but also a dlss 2.0 equivalent that helps in ray-traced games specifically.

gameworks was hit or miss, but I still enjoy some of those features a lot, and if I don't I just turn them off. VXAO in rotr was probably the best-looking AO implementation until ray-traced ao became a thing. So was HFTS in WD2, looked incredible. Im playing Mirror's Edge Catalyst atm, and Ansel is so much fun that at times I prefer exploring the environment more than actually playing.

so as much as people hate to give nvidia credit, not giving it to them for some of the things they tried (with various degrees of success) is just denial. I recently watched HUB's review of the 2060S vs 5700, and they completely shrugged off RT and DLSS, or the fact that one card is missing a lot of modern features completely. They did however cover RT in their 6700xt review. Pretty ironic considering that not only are the 2060S and 6700XT similar in RT performance, the 2060S has DLSS available for almost every ray-traced game there is and is gonna provide an overall smoother experience thanks to it.

so shrug nvidia's tech off all you want if you don't like them personally, but that has a name - it's called living in a bubble.
     
    Krizby likes this.
  17. Krizby

    Krizby Master Guru

    Messages:
    825
    Likes Received:
    121
    GPU:
    3090 Watercooled
Yeah I bought the Asus PG278Q, which was one of the first 1440p 144Hz screens back in 2014, and I was totally owning in games like DOTA 2 and CSGO :cool: (top 1% ranked)

Idk why HUB became so obsessed with the money aspect; they were more enthusiast-like back then, but now all they care about is price/perf, which is funny because they are not interested in the best price/perf GPU anyways (that would be the RX570 LOL). Looks like a double standard to me.

Also, HUB is blind AF; they were dissing DLSS 2.0 before, saying it's limited to only a few games, but everyone else saw the huge potential of DLSS 2.0 (which is needed for 4K 120Hz gaming).
     
    Last edited: Sep 15, 2021
  18. kapu

    kapu Ancient Guru

    Messages:
    4,794
    Likes Received:
    443
    GPU:
    Radeon 6800
I don't like raytracing. For me, 1440p raster performance is the main selling factor, and at that the 6800 is closer to the 3080. And 8 gigs of vram is a big no-no.
When I got my 6800, the price was 1:1 with the 3070, so for me it was a no-brainer. But I guess it's a matter of preference also.
     
  19. Krizby

    Krizby Master Guru

    Messages:
    825
    Likes Received:
    121
    GPU:
    3090 Watercooled
Yeah, let's just disable RT and dial Texture Quality to 11, so that I can feel good about my 16GB VRAM GPU :D.
Can I interest you in a blind test between Ultra and High Texture Quality?
     
  20. Undying

    Undying Ancient Guru

    Messages:
    16,463
    Likes Received:
    5,405
    GPU:
    Aorus RX580 XTR 8GB
Textures don't take a performance hit, you only have to have enough vram. I see you would rather enjoy the rt slideshow. :p
     