GeForce RTX 2070 in November with 2304 shader cores

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 20, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    35,293
    Likes Received:
    4,433
    GPU:
    AMD | NVIDIA
  2. Kaarme

    Kaarme Ancient Guru

    Messages:
    1,506
    Likes Received:
    398
    GPU:
    Sapphire 390
    It would seem to me Nvidia would kill the chances of studios really bothering to put the HW ray tracing to much use if they removed the support from the 2070. Aren't that and the 2060 going to be the most popular GPUs among gamers at large, unlike the more expensive, more high-end 2080 and 2080 Ti? I don't know how easy or difficult ray tracing is to implement in an engine, but considering games are typically released with glaring bugs, it's quite obvious few studios want to spend a single second of extra time on software. Would the game makers go through the trouble if only a small percentage of players could benefit from it, yet all game buyers would still need to pay for the feature, regardless of whether they can actually use it?
     
  3. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    2,442
    Likes Received:
    1,541
    GPU:
    GTX 1080Ti
    It will be part of Nvidia's GameWorks, which is designed to be easily implemented. In games you will have an RT toggle, so you can turn the technology on or off in the same way you can turn PhysX on or off.

    Will we see widespread implementation of it? Potentially yes, as it's the next logical advancement in PC graphics; in what form, however, will be the main question here.

    I'm sure AMD will catch up. Personally, what I don't want to see is a stupid "manufacturer only" type of technology where you will have RayTracing for Nvidia and some form of BobTracing for AMD, resulting in incompatibilities and one-sided titles.
     
    carnivore likes this.
  4. Lucifer

    Lucifer Master Guru

    Messages:
    201
    Likes Received:
    4
    GPU:
    Colorful GTX 1060
    Knowing Nvidia, most likely a 2070 Ti will follow, and perhaps a 2060 Ti will see the light of day this time around.
     

  5. Kaarme

    Kaarme Ancient Guru

    Messages:
    1,506
    Likes Received:
    398
    GPU:
    Sapphire 390
    Wasn't the 1070 Ti only released because the Vega 56 outperformed the 1070? Considering AMD hasn't apparently got anything to release in the foreseeable future, Nvidia wouldn't necessarily see the need to complicate their GPU portfolio. They might, of course, do it a year from now just to have something fresh to offer to the market.
     
    tunejunky likes this.
  6. Denial

    Denial Ancient Guru

    Messages:
    12,117
    Likes Received:
    1,261
    GPU:
    EVGA 1080Ti
    The raytracing is implemented via Microsoft's DXR - RTX accelerates DXR. AMD will have its own acceleration method.
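    As a quick illustrative aside (an editorial sketch, not part of the post above): DXR support is queried through Direct3D 12's feature-support API, and RTX hardware accelerates whatever the runtime reports as available. The helper name DeviceSupportsDXR and the stripped-down error handling are assumptions made for brevity; link against d3d12.lib to build.

        // Minimal sketch: ask the D3D12 runtime whether DXR is exposed at all.
        #include <d3d12.h>
        #include <wrl/client.h>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        bool DeviceSupportsDXR()
        {
            // Create a device on the default adapter (no adapter enumeration here).
            ComPtr<ID3D12Device> device;
            if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                         IID_PPV_ARGS(&device))))
                return false;

            // OPTIONS5 carries the ray tracing tier reported by the runtime/driver.
            D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
            if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                   &options5, sizeof(options5))))
                return false;

            // TIER_1_0 or higher means DXR is exposed; whether it is
            // hardware-accelerated depends on the GPU and driver.
            return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
        }

        int main()
        {
            std::printf("DXR supported: %s\n", DeviceSupportsDXR() ? "yes" : "no");
            return 0;
        }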
     
    cryohellinc likes this.
  7. Ziggymac

    Ziggymac Member Guru

    Messages:
    108
    Likes Received:
    35
    GPU:
    Asus 970 Strix SLi
    I doubt it will matter for the first generation of RTX cards, as even the 2080 & 2080 Ti will not be powerful enough to do real-time 'in game' ray tracing except in very limited amounts on small surfaces.

    By the time GPUs are fast enough to do real-time ray tracing much more widely in game, the technology will have gotten cheaper and Nvidia will probably expand the features to their lesser cards.
     
  8. SSD_PRO

    SSD_PRO Member Guru

    Messages:
    166
    Likes Received:
    20
    GPU:
    EVGA GTX 1070
    I am usually pretty positive on Nvidia, but doesn't this info seem a little lackluster? A 2070 with 2304 shaders vs. the 1070 with 1920? That's only about 20% more after 2.5 years. The higher memory bandwidth will help some but won't provide major gains with the same 8 GB. This jump seems a lot more 1070>1170 (haha) than 1070>2070.
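    For reference (an editorial aside), the shader-count arithmetic from the figures quoted above works out as follows:

        \[ \frac{2304 - 1920}{1920} = \frac{384}{1920} = 0.20 \approx 20\% \]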
     
  9. Paulo Narciso

    Paulo Narciso Maha Guru

    Messages:
    1,207
    Likes Received:
    27
    GPU:
    ASUS Strix GTX 1080 Ti
    The node reduction is not drastic, the die is also bigger, and there's not much margin to increase MHz without exceeding 250 W.
    Also, Intel has been doing it for years and people still buy their CPUs. :)
     
    Silva likes this.
  10. NiColaoS

    NiColaoS Master Guru

    Messages:
    557
    Likes Received:
    37
    GPU:
    1060 6GB Armor OC
    Does anyone know if the RTX 2070 and especially the RTX 2060 will have lower power consumption than the GTX 1070 and GTX 1060 respectively?
     

  11. Sylencer

    Sylencer Member

    Messages:
    19
    Likes Received:
    0
    GPU:
    2080 Asus Strix OC
    Not even sure if that could be considered an upgrade from my 1070...
     
  12. Embra

    Embra Master Guru

    Messages:
    836
    Likes Received:
    165
    GPU:
    Vega 64 Nitro+LE
    If you consider a 1080 an upgrade, maybe. Looks to be right around the level of a 1080... maybe a bit more.
     
  13. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,591
    Likes Received:
    451
    GPU:
    RTX 2070 Strix
    They won't.
    We're fked when it comes to power consumption because all perf/Watt savings will go toward higher perf.

    Luckily Nvidia knows their job, and since they are in a good place at the moment, instead of a MOAR CORES approach, which is bound to run into the power and scaling wall, they chose this release to revolutionize their entire GPU rendering strategy.
     
  14. tunejunky

    tunejunky Master Guru

    Messages:
    875
    Likes Received:
    313
    GPU:
    RadeonVII gtx1080ti
    Yes, the 1070 Ti was a stopgap, but one that became popular despite cannibalizing 1080 sales.
    It is extremely doubtful they will put anything between the 2070 and the 2080, as they started with the 2080 Ti at launch.
     
  15. tunejunky

    tunejunky Master Guru

    Messages:
    875
    Likes Received:
    313
    GPU:
    RadeonVII gtx1080ti

    Lolz if you think Nvidia doesn't have "moar cores" in your future. There is no problem with scaling, and power is reduced by the process.
    Nvidia has been working on this since AMD's original design with Vega... which is fully scalable.
    And don't get all fanboy on the last statement; it is fact.
     

  16. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,591
    Likes Received:
    451
    GPU:
    RTX 2070 Strix

    Actually, I know for a fact that Nvidia already has fewer traditional shader cores in their RTX lineup than it could have.
    And that is because they chose to add a ray-tracing-specific ASIC, which eats into both power and area, although it's an order of magnitude more efficient in RT-specific collision calculations than a general-purpose CUDA core.

    So yeah, a forward-looking, IQ-bettering, RT-specialized architecture combining GP shader + tensor cores, instead of brute-forcing MOAR CORES.
    At a time when they can afford to dabble in new techniques. I call that smart. And it's exactly the opposite of what AMD did with Vega, investing in future techniques that either do not work or are not supported, at a time when they had been lagging to begin with.
     
  17. tunejunky

    tunejunky Master Guru

    Messages:
    875
    Likes Received:
    313
    GPU:
    RadeonVII gtx1080ti
    All of which is true, but misses my point entirely.
    The name of the game is Yield, simply because that is directly related to profit, cost, and the ability to sell.
    Larger chips have lower yields - that is just the way manufacturing works, as the silicon wafers are all the same size to start with, and it's why jumping to a smaller process always increases yield (a rough sketch of the arithmetic follows at the end of this post).
    Nvidia is "tick-tocking" at the moment: RT (2080/2080 Ti Turing, at least) is a new architecture on the most refined process (but not the smallest).
    And they're jumping the gun a little bit to quiet the waters and have their next gen available (in the market) by the time AMD releases Navi/whatever the hell they want to call it.

    And btw... Nvidia IS going for scalable modules; they're just far behind in that area.
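    An editorial sketch (not from the original post): a back-of-the-envelope comparison of why a larger die yields fewer good chips from the same fixed-size wafer. The defect density and die areas below are made-up illustrative numbers, and the Poisson yield model Y = exp(-D*A) plus the simple gross-die approximation are common first-order estimates, not Nvidia's actual figures.

        // Rough die-per-wafer and yield comparison for two hypothetical die sizes.
        #include <cmath>
        #include <cstdio>

        int main()
        {
            const double kPi            = 3.14159265358979323846;
            const double wafer_diameter = 300.0;            // mm, standard wafer size (fixed)
            const double defect_density = 0.001;            // defects per mm^2 (assumed)
            const double die_areas[]    = {314.0, 754.0};   // mm^2, mid-size vs. large die (assumed)

            const double wafer_area = kPi * (wafer_diameter / 2.0) * (wafer_diameter / 2.0);

            for (double area : die_areas) {
                // Gross dies per wafer: wafer area divided by die area, minus an edge-loss term.
                double gross = wafer_area / area
                             - (kPi * wafer_diameter) / std::sqrt(2.0 * area);
                // Poisson yield: a larger die is more likely to contain at least one defect.
                double yield = std::exp(-defect_density * area);
                std::printf("die %4.0f mm^2: ~%3.0f gross dies, ~%2.0f%% yield, ~%3.0f good dies\n",
                            area, gross, yield * 100.0, gross * yield);
            }
            return 0;
        }

    With the same wafer and defect density, the larger die ends up with both fewer candidate dies per wafer and a lower fraction of defect-free ones.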
     
  18. Denial

    Denial Ancient Guru

    Messages:
    12,117
    Likes Received:
    1,261
    GPU:
    EVGA 1080Ti
    How are they far behind?
     
  19. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,591
    Likes Received:
    451
    GPU:
    RTX 2070 Strix
    @tunejunky
    The name of the game is not Yield. At all.
    Case in point: Volta's V100, a GPU which is barely producible, yet has been raking in profits and lifting the company's value ever since its creation.
    If the name of the game were Yield, AMD would be AT THE VERY LEAST equal to Nvidia.

    The name of the game is Profit = addressing as wide a market as possible, with products as competitive as possible, at margins as high as possible.
    And margins are only partially influenced by yield, because the BoM is only a tiny fraction of Nvidia's spending, being dwarfed by R&D and salaries.
     
  20. illrigger

    illrigger Member Guru

    Messages:
    120
    Likes Received:
    32
    GPU:
    GTX 1080 Ti
    If you think about it, they didn't actually change much. They couldn't launch a new Titan because the current Titan V would still be more powerful and more expensive. So they launched the Ti at the OG Titan's launch price point. We will probably see a 2090 or some such launch in the slot where the Ti currently sits.

    I'll be very interested in the performance of these cards. Various reviewers were saying that the 2080 Ti was having issues hitting smooth frame rates at 1080p in the hands-on demos at the event, with noticeable dropped frames. Things will likely improve with drivers, but it's concerning.

    I'm also disappointed that they are relying on studios to implement Tensor-assisted anti-aliasing (the feature being implemented in the vast majority of "RTX Ready" titles) rather than putting it into the drivers. It seems like low-hanging fruit that could have given a universal boost to performance and image quality, but apparently it's not as easy to do as it seems.
     
