Cyberpunk 2077: PC version with raytracing, first screenshots

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 12, 2019.

  1. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,419
    Likes Received:
    659
    GPU:
    1080 Gaming X SLI
I would get the GTX version in any scenario, as I would never use raytracing in the first place... it's a huge waste of performance, and better performance would be the reason I wanted to upgrade.
     
  2. Undying

    Undying Ancient Guru

    Messages:
    11,868
    Likes Received:
    1,542
    GPU:
    Aorus RX580 XTR 8GB
That was a Witcher-only thing :D
     
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,755
    Likes Received:
    2,203
    GPU:
    5700XT+AW@240Hz
There is only one issue: 2070 vs. 1660 Ti.
The 2070 has 64% more transistors and is in general some 37% faster. You can compare the SMs of both GPUs to find out where they differ and what kind of transistor investment went into those differences.
Or you can just take this: Turing that is enabled to do RT has some 16% worse performance per transistor than Turing that is not enabled to do RT (1.37× the performance for 1.64× the transistors works out to roughly 0.84×; see the quick calculation below).

Now, imagine there were no GTX 1660 (Ti) or 1650 (Ti), but instead RTX variants with the same transistor count and the same price as a result... all 16% slower.
(Too weak at RT and slower at older techniques than needed.)

Apparently, if nVidia manages to double RT capability without drastically increasing the transistor count of the SM, then even a GPU that can do some 75~80 fps on average before enabling RT could be usable for RT. (Even though "average" is the horrid keyword here, as fps dips would go into the 40s...)
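
To make the numbers above concrete, here is a minimal Python sanity check of the perf-per-transistor claim. It assumes only the figures quoted in the post (1.64× the transistors, ~1.37× the performance); exact transistor counts and benchmark averages vary by source.

    # Perf-per-transistor sanity check, using only the ratios quoted above.
    transistor_ratio = 1.64  # RTX 2070 vs GTX 1660 Ti: "64% more transistors"
    perf_ratio = 1.37        # "some 37% faster" on average

    perf_per_transistor = perf_ratio / transistor_ratio
    deficit = 1.0 - perf_per_transistor

    print(f"perf per transistor: {perf_per_transistor:.3f}")  # ~0.835
    print(f"deficit: {deficit:.1%}")  # ~16.5%, matching the ~16% figure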
     
    Dragam1337 likes this.
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    10,953
    Likes Received:
    3,079
    GPU:
    2080Ti @h2o
Dude, I so know what you're talking about. I suffered through just this kind of thing back then, and I was instantly reminded of it reading those lines.
One thing I can say for sure: since going with only a single GPU, I have not had issues with drivers in the last three to four years. Not a single driver really messed things up; if one was doing weird things, I simply switched back a gen and waited for the next. Resolving an issue is a matter of 5 minutes this way.
     

  5. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,215
    Likes Received:
    146
    GPU:
    MSI GTX1070 GamingX
I doubt it will do this if you're not actually running RTX features, so no, this scenario doesn't happen.
     
  6. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,419
    Likes Received:
    659
    GPU:
    1080 Gaming X SLI
If that were true, then a chip would consume 0 watts at idle... which it doesn't, so it isn't.
     
  7. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,215
    Likes Received:
    146
    GPU:
    MSI GTX1070 GamingX
Actually, what you're assuming is that the tensor cores will still contribute extra power usage and heat when not being used. I don't think so, at least not in the way you made it out to be. In absolute terms, yes, there will be some insignificant power draw.

In practical terms, we expect a card running RTX to use more power and produce more heat vs. RTX off.
     
  8. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,704
    Likes Received:
    1,007
    GPU:
    GTX 1080ti
The 16xx series will be a one-off.

Um, no. The framebuffer is still projecting an image to the screen; you demonstrate a classic case of not knowing how any of it works.

All modern GPUs employ tech to selectively turn off parts of the core, but you cannot get to absolute zero power usage, ever.

Have a read:
https://hal.archives-ouvertes.fr/hal-00853849/document
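
If you want to see the non-zero idle draw for yourself, here is a small Python sketch (an illustration, not something from the linked paper) that polls the board power reported by nvidia-smi; it assumes an NVIDIA GPU with nvidia-smi available on the PATH.

    # Poll the GPU's reported board power to show it never reads 0 W, even at idle.
    # Assumes an NVIDIA GPU and nvidia-smi on the PATH.
    import subprocess
    import time

    def power_draw_watts() -> float:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        return float(out.strip().splitlines()[0])  # first GPU only

    for _ in range(5):
        print(f"{power_draw_watts():.1f} W")  # an idle card still reports several watts
        time.sleep(1)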
     
    Last edited: Jul 4, 2019
