Cyberpunk 2077: PC version with raytracing, first screenshots

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 12, 2019.

  1. Denial

    Denial Ancient Guru

    Messages:
    12,415
    Likes Received:
    1,655
    GPU:
    EVGA 1080Ti
    The Nvidia 600 series comes to mind, honestly - Kepler was a pretty radical architecture shift, and they went from 40nm to 28nm in the same jump. I think as the hardware development tools improve, and as Nvidia's R&D budget increases and they can do more engineering sampling, it becomes less of an issue to transition to a smaller node alongside an architecture change.
     
  2. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,233
    Likes Received:
    149
    GPU:
    MSI GTX1070 GamingX
    Nvidia needs two or more SKUs to meet mid/high-end gaming market demands.

    A lot of people don't want RTX, so it's obvious that GTX will be the cheaper option. Meanwhile RTX will be the new high-end with all the bells and whistles.

    For me there are two questions:

    1) Will Nvidia release both a GTX3080 and RTX3080?
    2) If so, will they actually have the same performance, minus RTX stuff?

    I would prefer for them to make two separate SKUs, rather than rely on disabling the RTX portions of the chip for GTX.

    What's interesting to me is that, in general, retailer sites' specs for the RTX2080Ti down to the RTX2060 still don't list the number of RT cores; there's only a mention of "Real-Time Ray Tracing". The number of CUDA cores, RAM and GPU clock speeds are still the best-known and most important specs that people care about. I wonder how Nvidia will market RTX performance to the public with meaningful numbers that consumers can understand, especially in comparison to the previous (Turing) generation of cards.
     
    Dragam1337 likes this.
  3. XenthorX

    XenthorX Ancient Guru

    Messages:
    2,814
    Likes Received:
    745
    GPU:
    EVGA XCUltra 2080Ti
    80GB? I'll even remove Windows to make room for CP2077 if necessary
     
    mohiuddin, Embra and Maddness like this.
  4. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,477
    Likes Received:
    695
    GPU:
    1080 Gaming X SLI
    Why didn't you just buy a better GPU in the first place, then?
     

  5. XenthorX

    XenthorX Ancient Guru

    Messages:
    2,814
    Likes Received:
    745
    GPU:
    EVGA XCUltra 2080Ti
    Since the topic in this thread has been GPUs lately, I personally wonder what the impact of the SUPER line-up release will be on the Ampere release.
    My guess is a slight 'delay' compared to what we expected?
     
  6. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    1,527
    Likes Received:
    354
    GPU:
    RTX 2080 Ti FE
    The Super line-up just spells the end of Turing. Premium stuff always comes last.
     
  7. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,052
    Likes Received:
    3,141
    GPU:
    2080Ti @h2o
    While I do trust your educated judgement on this, I just thought they'd try to keep Turing around to earn more with it. On second thought, I have to agree Nvidia might have already made the money they wanted and is upping the pace now for steadier revenue. I'm not worried that they couldn't do it; I just thought it might leave them with too little income from Turing to meet their targets, since Jensen was under fire for low sales numbers.
    From a technological point of view, I honestly have no doubt Nvidia could blast us all away with performance, RTX or not; it's just that they didn't see the need, or a chance to make lots of money, if they dropped Turing after such a short time.


    Yes, I'm wondering about similar things. While I don't expect them to do more than they already did (separating RTX as the top line and GTX, like the 1660Ti, as the mid line), I don't think they will produce an extra SKU like a GTX3080. I could be off, but I think they'd more likely use the fab capacity to try to get out as many big, full chips as possible, and if a chip doesn't make the cut, they laser it down. Which brings us the RTX3070, maybe a Ti there, 3060 etc., but I somehow doubt they'd go through the trouble of effectively doubling their lineup. But as I said, I've been wrong in the past as well; they might just do something like this.
     
  8. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,477
    Likes Received:
    695
    GPU:
    1080 Gaming X SLI
    Yeah, it probably means it will be another year before we see Ampere... At this rate I'm going to have had my 1080s far longer than any other GPUs. I've already had them for three years.
     
  9. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,052
    Likes Received:
    3,141
    GPU:
    2080Ti @h2o
    Value purchase then? :D
     
  10. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,477
    Likes Received:
    695
    GPU:
    1080 Gaming X SLI
    Absolutely! They are still 10-15% faster than a 2080 Ti, and the pair cost the same as a 2080 Ti does now. Which is also the reason I couldn't in any way justify buying a 2080 Ti. But of course it sucks that more and more games don't support SLI. Luckily there is still a dedicated bunch working on unofficial SLI support, which still makes SLI work in most games :)

    https://www.forum-3dcenter.org/vbulletin/showthread.php?t=509912&page=173
     
    Maddness and fantaskarsef like this.

  11. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,477
    Likes Received:
    695
    GPU:
    1080 Gaming X SLI
    I would totally buy it if they made a new GPU that had a huge improvement in traditional rasterization performance and no RT cores.
     
  12. Picolete

    Picolete Master Guru

    Messages:
    277
    Likes Received:
    68
    GPU:
    R9 290 Sapphire Tri-x
    Will this have 64x tessellation hair again?
     
  13. torsoreaper

    torsoreaper New Member

    Messages:
    9
    Likes Received:
    1
    GPU:
    2080
    Because 2x 2080 are about 40% faster in 4K than a 2080 Ti, for practically the same price.
     
  14. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,477
    Likes Received:
    695
    GPU:
    1080 Gaming X SLI
    True, but as a current and long-time SLI user, I know that SLI is anything but a smooth ride at this point.

    As I wrote further up, many games don't support SLI now, so you have to spend quite some time messing with custom SLI profiles to make it work. And when you do make it work, it usually has quirks, such as particles and weather effects not working with SLI enabled in BF5.

    I had sworn to stick with SLI, but Nvidia has worn me down by completely dropping official SLI support (undoubtedly their intention, in order to force SLI users onto overpriced single GPUs such as the 2080 Ti)... so I am changing to the fastest single Ampere GPU when it launches at some point.
     
    fantaskarsef likes this.
  15. torsoreaper

    torsoreaper New Member

    Messages:
    9
    Likes Received:
    1
    GPU:
    2080
    There's nothing wrong with that; I just felt the 2080 Ti was a bit of a rip-off at its pricing, and as a new dad I'm trying to be responsible and live on a budget.

    As far as SLI support goes, it seems like the community has done a really good job of keeping up the manual profiles, and given the popularity of CP2077, I'm hoping people will do a good job figuring out the compatibility bits. I got my first 2080 for about $700 and I see them on sale for as low as $650 right now; maybe by the time the game comes out I can grab a second for around $550. That's not bad compared to current 2080 Ti pricing of about $1200 to $1300 (rough math sketched below).

    If nobody can figure it out, I guess I will sell my 2080 and buy a 2080 Ti....
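    Purely for illustration, here's a minimal sketch of that back-of-the-envelope comparison; the prices are just the ones quoted above (street prices, not official figures), and the $550 second card is a hoped-for deal, not a listed price:

    # Rough cost comparison using the prices mentioned above (assumptions only;
    # street prices vary by retailer and over time).
    first_2080 = 700               # what I paid for the first RTX 2080
    second_2080 = 550              # hoped-for deal price on a second 2080 later
    ti_low, ti_high = 1200, 1300   # current RTX 2080 Ti street pricing

    sli_total = first_2080 + second_2080
    print(f"2x RTX 2080 (SLI): ${sli_total}")               # $1250
    print(f"1x RTX 2080 Ti:    ${ti_low}-${ti_high}")        # $1200-$1300
    print(f"Delta vs cheapest Ti: ${sli_total - ti_low}")    # $50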
     

  16. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,477
    Likes Received:
    695
    GPU:
    1080 Gaming X SLI
    Can't disagree with that... the price increase from the 1080 Ti to the 2080 Ti was bigger than the performance increase!

    Yeah, there is still a fairly large SLI community that depends on the custom SLI profiles... but sadly they only work with DX11... so if they go DX12-only with Cyberpunk (which they might, to appease Nvidia with their DXR crap), then SLI is off the table :(

    I hope that Ampere will be out by the time Cyberpunk is released... and gives a massive performance increase over Turing! (Not counting DXR... they can cut that crap off the chip for all I care)
     
  17. torsoreaper

    torsoreaper New Member

    Messages:
    9
    Likes Received:
    1
    GPU:
    2080
    Didn't realize that, bummer!! Well I guess I will just cross my fingers.
     
  18. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,233
    Likes Received:
    149
    GPU:
    MSI GTX1070 GamingX
    I think the market for "GTX" is huge.

    However, would you still get the GTX3080 over an RTX3080 if the RTX version were only, say, $200 more? I'm going to assume that they'd also increase the tensor cores enough to make RTX work at 1440p 60fps+.

    For me personally, in this scenario I would have to go with the RTX3080, as it looks like devs like RTX and I foresee many more games supporting it. RTX isn't going to be a fad; it's going to become part of the "standard".
     
  19. Aura89

    Aura89 Ancient Guru

    Messages:
    7,716
    Likes Received:
    955
    GPU:
    -
    It's sad that people actually want to have a company remove technology and stunt the growth of visual improvements.

    Hopefully the next generation has no GPUs in it that do not have RTX, so this idea that we should stop trying to improve is gone.

    If you want a GPU without RTX features, go buy a Navi or Vega 2. Get the same rasterization performance for the same dollar amount, and fewer features.
     
    Maddness and Stormyandcold like this.
  20. Dragam1337

    Dragam1337 Maha Guru

    Messages:
    1,477
    Likes Received:
    695
    GPU:
    1080 Gaming X SLI
    Anyone who plays competitive online games (i.e. the vast majority of users) only wants/needs the best possible performance they can get... raytracing drastically reduces performance, so anyone who wishes to remain competitive ought to have raytracing turned off... meaning the RT cores become completely useless and do nothing but add to the chip size and complexity, thus increasing the price of the chip significantly. Not to mention they make the chip draw more power and run hotter.

    So the vast majority of gamers would be better off with a smaller, simpler, cheaper GPU without RT cores etc. that excels in traditional rasterization performance.
     
