Nvidia Turing: Mobile RTX graphics chips 2050/2060/2070 announced during CES

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 10, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,546
    Likes Received:
    18,859
    GPU:
    AMD | NVIDIA
  2. FrostNixon

    FrostNixon Master Guru

    Messages:
    276
    Likes Received:
    57
    GPU:
    RX 5700 XT
    Let's see if the 30xx series from AMD can compete with the 2050/60; that's the real question for me. I don't want to drop FreeSync or pay a few hundred extra just to get G-Sync.
     
  3. alanm

    alanm Ancient Guru

    Messages:
    12,274
    Likes Received:
    4,479
    GPU:
    RTX 4080
    RTX 2050? Why would they put tensor and RT cores on a budget card like that? I guess it will most likely be a GTX card.
     
    fantaskarsef likes this.
  4. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    7,270
    Likes Received:
    600
    GPU:
    Geforce RTX 3090 FE
    Oh neat, € 4000 laptops...
     
    fantaskarsef likes this.

  5. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    I bet they will cost less than a 15'' Apple MacBook Pro with the Radeon Pro Vega 16 upgrade (US$3,050) or Radeon Pro Vega 20 (US$3,150), and will perform better for gaming.
     
  6. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
    I thought the 2070 was supposed to be the lowest-end raytracing-capable GPU? If so, I think having the RTX 2060 is a bit confusing/misleading. As far as I'm concerned, non-RT GPUs should stick with GT(X).
     
  7. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,461
    Likes Received:
    3,086
    GPU:
    7900xtx/7900xt
    But there's more...
    As I teased last week re: Alienware's upcoming "M" series, you'll be able to choose/upgrade to full-fat Turing, or go Max-Q with enhanced cooling (also an option).
    But yeah, vendors are excited to have current CPU/GPU combos for sale again. The 10xx series lasted so long (three Intel generations for mobile) that vendors had seen sales dip at the high-margin end (enthusiast mobile).
     
  8. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,461
    Likes Received:
    3,086
    GPU:
    7900xtx/7900xt
    Then you're in luck; you'll be happy.
     
  9. H83

    H83 Ancient Guru

    Messages:
    5,512
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    Don't tell me you don't like Nvidia's misleading marketing regarding mobile chips... Of course there's a (small) chance that the 2060 is just a cut-down 2070 with reduced RT capacity.
     
  10. tensai28

    tensai28 Ancient Guru

    Messages:
    1,557
    Likes Received:
    420
    GPU:
    rtx 4080 super
    Just a repeat of what I've been saying ever since the launch of the RTX series: the only card that makes any sense this new generation is the 2080 Ti.
     

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
    Huh? The 2080 Ti is the one that makes the least sense so far (well, excluding Titans, but they have never made sense). The 2080 Ti is disproportionately expensive and it doesn't handle ray tracing a whole lot better than the 2080.
     
  12. tensai28

    tensai28 Ancient Guru

    Messages:
    1,557
    Likes Received:
    420
    GPU:
    rtx 4080 super
    I'm not talking about price. It outperforms the highest Pascal card and has ridiculously good VR performance. Can't say the same about anything lower.
     
    BangTail likes this.
  13. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,398
    GPU:
    Asrock 7700XT
    Well if you ignore price then the RTX Titan makes more sense.
     
  14. tensai28

    tensai28 Ancient Guru

    Messages:
    1,557
    Likes Received:
    420
    GPU:
    rtx 4080 super
    Check my edit above. Not going to turn this into a price discussion.
     
    BangTail likes this.
  15. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    I could see Nvidia pulling another GeForce 4 MX stunt with regard to RTX. It wouldn't be the first time that they've grossly misled consumers on GPU technology.

    I agree that the 2080 Ti makes the least sense. The 1080 and 1080 Ti are EOL and the RTX 2070 and RTX 2080 make for decent replacements, respectively. Can't say that about the 2080 Ti.
     
    schmidtbag likes this.

  16. alanm

    alanm Ancient Guru

    Messages:
    12,274
    Likes Received:
    4,479
    GPU:
    RTX 4080
    Doubt that. Things are a bit different now. The slightest deviation from a 'proper card' and they will be eaten alive by the tech press. They know everyone is hungry for a good scandal involving Nvidia, so they should tread carefully now. The last thing they need is another PR disaster.
     
  17. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    The tech community will cry foul, but I doubt it will matter - we also cried foul about the GeForce 4 MX, but that didn't stop it from selling like hotcakes. If Nvidia releases an RTX 2060 at the same price as the GTX 1060, it will undoubtedly become the best-selling GPU (lack of competition and Nvidia's strong brand recognition will ensure this).

    Of course doing this will basically kill off ray-tracing for this generation (like the MX held back DX8 development), but that might happen regardless - it's not like they're going to cram expensive RT hardware inside budget GPUs. At any rate, RTX 2060 sounds better than GTX 2060 so they might just go with that to spur sales. It'll be misleading as hell but it'll be more money in Nvidia's coffers - and that's what matters for them and their shareholders (like me :D).
     
    carnivore likes this.
  18. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    They could put ray tracing cores in lower-tier cards than the RTX 2070, because the ray tracing part of the GPU is currently what holds back the 'traditional rendering' CUDA cores. In Battlefield 5, the RTX 2070 through RTX 2080 Ti show low GPU usage when RT is enabled, because the bottleneck is the ray tracing cores. So it would actually make a fair bit of sense to pack ray tracing alongside less powerful GPU cores: the weaker CUDA cores of those cards would be in balance with the ray tracing cores rather than being held back by them, as long as Nvidia didn't cut the number of ray tracing cores below the RTX 2070's count. After all, you need a minimum amount of ray tracing hardware for it to make sense at all, and the RTX 2070's ray tracing power has got to be close to that bare minimum. I could see them pairing RTX 2070-level ray tracing with, say, an RTX 2060 or RTX 2050 Ti, but I can't see them going lower than a 2050 or 2050 Ti. The other plus of putting a useful amount of ray tracing on all mainstream cards is that it would seriously push development of ray-traced games, because there would be a big installed base of ray tracing hardware ready to be used. If Nvidia does this now, they can start capitalising on their new feature well ahead of AMD.
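    A rough way to see the 'balance' argument above: treat the frame as limited by whichever pipeline (shading or ray tracing) finishes last. The sketch below is a toy model in Python with made-up per-frame numbers, not measured Battlefield V data; it only illustrates why a faster shader engine sits idle when paired with a fixed amount of RT hardware.

        # Toy model: frame time = max(shading time, RT time).
        # All numbers are illustrative assumptions, not benchmarks.

        def shader_utilisation(shade_ms, rt_ms):
            """Fraction of the frame the CUDA cores are busy when the frame
            time is set by the slower of the two pipelines."""
            return shade_ms / max(shade_ms, rt_ms)

        # Hypothetical per-frame costs (ms) with the same RT workload on each tier.
        tiers = [
            ("big die, fast shaders", 6.0, 10.0),       # RT-bound: shaders wait
            ("mid die, slower shaders", 9.0, 10.0),     # nearly balanced
            ("small die, slowest shaders", 12.0, 10.0), # shader-bound again
        ]

        for name, shade_ms, rt_ms in tiers:
            print(f"{name:<28} shader utilisation ~{shader_utilisation(shade_ms, rt_ms):.0%}")

    Under those made-up numbers the fast-shader card is only ~60% busy while the mid tier is ~90%, which is the gist of the "weaker CUDA cores balance the RT cores" point.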
     
  19. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    The problem with that is the die size. Including the full RT (and Tensor) cores from the 2070 in the 2060 might make the die prohibitively large for a mainstream product, and I doubt Nvidia would eat into their margins to provide ray tracing to the masses. It may well be feasible from a technical POV, but probably not from a financial one.
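    The die-size point can be made concrete with a standard dies-per-wafer and yield estimate. The Python sketch below uses illustrative die areas and defect density, not Nvidia's actual figures; it only shows how quickly a bigger die eats into the number of good dies per wafer.

        import math

        # Illustrative assumptions, not real process data.
        WAFER_DIAMETER_MM = 300.0
        DEFECTS_PER_CM2 = 0.1

        def gross_dies_per_wafer(die_area_mm2):
            """Common approximation for how many dies fit on a circular wafer."""
            r = WAFER_DIAMETER_MM / 2.0
            return (math.pi * r ** 2 / die_area_mm2
                    - math.pi * WAFER_DIAMETER_MM / math.sqrt(2.0 * die_area_mm2))

        def good_dies_per_wafer(die_area_mm2):
            """Poisson yield model: larger dies are hit harder by random defects."""
            yield_fraction = math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100.0)
            return gross_dies_per_wafer(die_area_mm2) * yield_fraction

        # Hypothetical mainstream die without RT/Tensor blocks vs. one that keeps them.
        for label, area_mm2 in [("mainstream, no RT", 200), ("mainstream + RT/Tensor", 350)]:
            print(f"{label:>24}: ~{good_dies_per_wafer(area_mm2):.0f} good dies per wafer")

    With those placeholder numbers the bigger die yields roughly half as many good dies per wafer, which is the margin problem in a nutshell.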
     
  20. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Well, it makes sense that putting ray tracing on mainstream cards would definitely cost more, even more so if they keep the number of ray tracing cores at the RTX 2070's bare-minimum level like I suggested. But I think the future benefits of that investment would outweigh the initial production cost: getting developers on board for ray-traced games and then, as the only player in ray tracing, really capitalising on that, meaning more market share and more profit both now and further down the line. I could see it happening, but we'll have to see come 2019.
     
