
Cyberpunk 2077: PC version with raytracing, first screenshots

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 12, 2019.

  1. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,730
    Likes Received:
    2,188
    GPU:
    5700XT+AW@240Hz
    In the articles you read, did any of them actually bother to do the simplest math? How many intersection/hit checks can even RTX GPUs do in reality, versus the marketing numbers?
    How many rays do you need for each type of ray use (reflections/refractions/lighting/...)?
    And how does that change when you combine several of those techniques and are forced to add one more bounce?

    Realistically speaking, suppose a game does not try to be a DXR showcase and uses minimal, real-world-like lighting for all objects. It keeps reflective objects to a minimum, since in the real world we do not actually have that much glass and gloss on cars, and it adds refractions only where there is mostly glass and water.
    Such a game may run reasonably well at 1080p on an RTX 2080 Ti. And it will have to be done very, very carefully.

    Instead you have a game like Control, shown in another thread, that uses DXR on something as flat as a floor, where a planar reflection with a proper shader can deliver exactly the same result at a fraction of the performance cost.
    Even the SSR implementation they show in the comparison could have looked the same as DXR with a proper shader... except at the left and right edges of the screen.
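    The back-of-the-envelope math Fox2232 is asking for can be sketched quickly. This is a rough illustration only: the 10 Gigarays/s number is Nvidia's marketing figure for the RTX 2080 Ti, and sustained real-world throughput depends heavily on scene, shading, and divergence.

    ```python
    # Rough ray-budget arithmetic for hybrid ray tracing (illustrative numbers only).

    def rays_per_second(width, height, fps, rays_per_pixel):
        """Total rays per second for a given resolution, frame rate and ray count."""
        return width * height * fps * rays_per_pixel

    # One effect at one ray per pixel (e.g. reflections only), no extra bounces:
    single_effect = rays_per_second(1920, 1080, 60, 1)   # ~124 million rays/s

    # Reflections + shadows + GI combined, each forced to one extra bounce
    # (2 rays per effect, as in Fox2232's "add one more bounce" scenario):
    combined = rays_per_second(1920, 1080, 60, 3 * 2)    # ~746 million rays/s

    marketing_budget = 10e9   # Nvidia's "10 Gigarays/s" marketing claim for the 2080 Ti
    print(f"single effect : {single_effect / 1e6:.0f} Mrays/s")
    print(f"combined      : {combined / 1e6:.0f} Mrays/s "
          f"({combined / marketing_budget:.0%} of the marketing budget)")
    ```

    Even this toy model shows why combining effects and bounces eats the theoretical budget fast, before any shading cost is counted.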
     
  2. Grumpymangrumbling2019

    Grumpymangrumbling2019 Active Member

    Messages:
    63
    Likes Received:
    22
    GPU:
    vega
    They dump all over the last 25 years of rendering techniques.
    There is no question it is better, that goes without saying, but fair, honest comparisons are non-existent.
    Honesty and openness go a long way.
     
  3. msotirov

    msotirov Member

    Messages:
    48
    Likes Received:
    28
    GPU:
    rx 480
    Well, that's a shame. Another title buying into Nvidia's proprietary bullshit. We've seen this story so many times already over the last two decades.
     
  4. TestDriver

    TestDriver Member Guru

    Messages:
    106
    Likes Received:
    6
    GPU:
    1080 Ti, C49HG90
    Well, they did the same with The Witcher 3 and Nvidia's proprietary tech, and we all remember how "well" that game performed before patches.
    Let's just hope it works better out of the box on different hardware (AMD and non-RT Nvidia) this time.
     

  5. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,237
    Likes Received:
    836
    GPU:
    GTX 1080ti
     
    Solfaur and Undying like this.
  6. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    10,706
    Likes Received:
    2,880
    GPU:
    2080Ti @h2o
    In reply to some of the people complaining about "Nvidia's proprietary RTX BS":

    It's DXR, guys. With a little adaptation, any AMD card supporting DXR should be able to do this as well, right? And they have the brute force to compute it too, if I'm not mistaken?

    Nvidia put the money into R&D to try to bring out something new and justify their overpriced cards (and yes @Denial, I do share your opinion on the RTX 20xx generation, its overpricing and underdelivering on RTX, and its current state). Turing 20xx RTX, of which I'm an owner too. But that doesn't mean that:
    - you can't play this game without Nvidia RT hardware (deactivate RT?)
    - AMD can't simply adopt DXR as an already established standard
    - the game will be unplayable without RT activated (remember, it will hit consoles as well, and there's no RT there)

    Nvidia has ALWAYS done this: offer devs "an easy solution" by packing in GameWorks. It saves them time (= money) and adds something to their game that appeals, down the line, to most gamers, since most Nvidia cards will be able to do RT in at least some minor way, if with lacking performance. But it's marketing; remember, it always has been marketing, even when certain GameWorks effects made Nvidia's own cards choke as well. And Nvidia still has the bigger market share, so of course a dev is easily pulled in by an offer of "more" aimed at the majority of GPU owners. That's the basis of Nvidia's GameWorks anyway.

    With RT in particular, this did NOT work out, in my opinion. It will take years until games support it properly and until the quality truly is something to behold. Right now I find any RTX I've seen in person lackluster; honestly, it makes no big difference at all, hence I don't use it. But guess what, I can still play BF5, because you don't have to turn it on. No game is RTX-only as far as I know, except maybe some minor experiments from studios toying with new tech, the way some VR games are VR-only because of how they're designed to be played. CP2077 is not one of those RTX-only games, though.


    Honestly, as soon as AMD adopts the DXR standard (which is DirectX, not Nvidia), they're all set to offer their customers and gamers the same things Nvidia can. I can even imagine they'll match RTX ray-tracing performance as soon as an RDNA update, so it's within reach. I expect AMD is not as far behind on RT as Nvidia would want us to think.
    But they just have to do it, and I'm certain AMD knows it doesn't have to in order to make good money, so they concentrate on what they know how to do (every market segment besides enthusiast, top-notch, four-digit-cost hardware). And that's good.

    So, my personal conclusions:
    Good that they offer something new. It might be worth checking out with the card I already have, and if RTX sucks, I'll just play the game without it when it's released.
    People without RTX cards can simply play without RTX, so I don't understand the backlash when the game isn't even close to released.
    Have some trust in CDPR not to repeat the mistake they made with W3 (HairWorks); they are one of the few companies I trust to actually have a sense for what people want.
    Wait for reviews of the game before boycotting it; nobody is forced to preorder.
    And don't bash each other's heads over the addition of RTX to a game. Concentrate on what the game could actually offer us as gamers: a great action-RPG experience, no matter what resolution, lighting, or API you play on.
     
    alanm, Ricardo, Solfaur and 1 other person like this.
  7. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    3,093
    Likes Received:
    447
    GPU:
    EVGA 1070 FTW
    Some fair points, but this stuck out to me - was there some hoo-ha about this? I don't remember that.
     
  8. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,730
    Likes Received:
    2,188
    GPU:
    5700XT+AW@240Hz
    DXR is not proprietary. It is built on proprietary hardware which Nvidia happened to have available, knowing AMD did not. But DXR itself is not.

    The real issue was, is, and will be for quite some time feasibility: image quality versus performance.
    A complex modern game doing hybrid rasterization with DXR, using ray-traced lighting, reflections, and refractions only where reasonably applicable, would still maybe run well at 1024x768.
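    Fox2232's resolution point follows directly from ray cost scaling roughly linearly with pixel count. A quick comparison (illustrative only; real frame cost also includes BVH traversal, shading, and denoising):

    ```python
    # Ray-tracing cost scales roughly linearly with pixel count, which is why
    # dropping resolution is the blunt instrument for making hybrid DXR feasible.

    def pixels(width, height):
        """Pixel count for a given resolution."""
        return width * height

    res_1080p = pixels(1920, 1080)   # 2,073,600 px
    res_xga   = pixels(1024, 768)    #   786,432 px

    print(f"1080p -> 1024x768 cuts the per-frame ray budget by "
          f"{res_1080p / res_xga:.2f}x")   # ~2.64x fewer rays to trace
    ```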
     
    Maddness likes this.
  9. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    10,706
    Likes Received:
    2,880
    GPU:
    2080Ti @h2o
    IIRC there was a big debate about the game being downgraded in general, and about how HairWorks crippled AMD hardware specifically... then they introduced a way to tone down and turn off HairWorks, and AMD users found ways to reduce the tessellation on hair (which back then was what was crippling AMD's hardware). The first workaround took 48 hours to appear, and AMD users could play the game without HairWorks just the same.

    Sure it's annoying, and that's probably Nvidia's fault, but the game is still the same no matter how good or bad Geralt's beard looks... that's my point: turn off the proprietary options, and support the dev so they get paid for a good game (if CP2077 turns out to be a good, enjoyable game, which I expect).
     
    Maddness and Loobyluggs like this.
  10. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    3,093
    Likes Received:
    447
    GPU:
    EVGA 1070 FTW
    Thanks - I didn't know about that. I came to W3 later than most and got the GOTY edition with everything directly from GOG, at a very good price of about $15 on offer.

    I am slightly surprised about tessellation; it's kind of de facto for game engines now, just as OpenSubdiv is something all engines can do regardless of target platform - and it's open source.
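    For context on why the tessellation workaround above helped so much: with uniform tessellation, the number of generated triangles grows roughly with the square of the tessellation factor, so capping HairWorks' 64x factor at 16x in the driver cuts the geometry load dramatically. A rough sketch, assuming a simple uniform triangle-patch model:

    ```python
    # Rough model: uniformly tessellating a triangle patch with factor f
    # produces about f**2 sub-triangles, so geometry load grows quadratically.

    def sub_triangles(tess_factor):
        """Approximate sub-triangle count for one uniformly tessellated patch."""
        return tess_factor ** 2

    hairworks_default = sub_triangles(64)   # 4096 triangles per patch
    driver_capped     = sub_triangles(16)   #  256 triangles per patch

    print(f"64x -> 16x cap: {hairworks_default // driver_capped}x fewer triangles")
    ```

    That quadratic growth is why a factor cap, rather than disabling the effect entirely, was enough to make the hair playable on hardware with weaker tessellation throughput.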
     
    fantaskarsef likes this.

  11. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    10,706
    Likes Received:
    2,880
    GPU:
    2080Ti @h2o
    Well, that was back in the day, right around the W3 release, IIRC. The main problem of course was HairWorks, no doubt about it. I too later bought the GOTY edition on GOG and played it with my 1080 Ti. I deprived myself of that fun for way too long; it's a great game... I haven't had such fun with an RPG since Baldur's Gate 2, which is what opened me up to the RPG genre in the first place.
     
    Loobyluggs likes this.
  12. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    7,681
    Likes Received:
    199
    GPU:
    Zotac GTX1080Ti AMP
    Until that sweet, sweet Nvidia money comes flying their way. If they were such a trustworthy developer, why include something only 0.01% of players will be able to enable? Why include something that dramatically changes the look of the game when, again, only 0.01% of players will get to see it? It alienates the other 99.9%. Now, if RT were more widely supported in GPUs and a staple of graphics tech that the majority of the community could experience, then sure, add it in.

    It's also not like CDPR to downgrade their games before launch, is it? **cough**Witcher3**cough**
    It's not like them to include settings that crumble performance and say nothing for months, either. **cough**Witcher2SSAA**cough**
     
  13. Denial

    Denial Ancient Guru

    Messages:
    12,340
    Likes Received:
    1,526
    GPU:
    EVGA 1080Ti
    More than 0.01% of Steam gamers own an RTX card, and by 2021 another generation of Nvidia and AMD cards will probably have DXR support. Nvidia doesn't pay people to use GameWorks; they pay their own engineers to help integrate it, significantly lowering the cost of including it. Games include it because it's a nice value-add checkbox they can tick and drum up advertising about.

    Also, your earlier post claiming RT integration requires 35+ jobs - network engineers, producers, concept artists, etc. - is ridiculous. Level designers? Maybe you need to update some area lights. You don't need to change animations for RT. You don't need a network engineer for RT. You don't need game designers for RT. Not to mention you can just go to their Facebook page and see that they are still deep in development and have said for over six months that the game was likely to slip out of 2019 - even at the gameplay reveal last year they said it was still early in development.
     
    pharma, Maddness, XenthorX and 4 others like this.
  14. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    3,093
    Likes Received:
    447
    GPU:
    EVGA 1070 FTW
    It's really a way of having a series of solvers figure all the stuff out so you don't have to. And yes, all of the tools are free of charge to download.

    I don't know what developer tools AMD/ATI have.
     
  15. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,376
    Likes Received:
    231
    GPU:
    .
    Not all NVIDIA stuff is 100% free, and a lot of it still has closed-source core components (like some core parts of PhysX GPU acceleration)... But yes, that's a lot of stuff, and it runs nicely mostly on GeForce GPUs, since NVIDIA has never published deep documentation of their ISA and architectures, relegating everything to that piece of crap called the CUDA VM...

    https://gpuopen.com/
    https://github.com/GPUOpen-LibrariesAndSDKs
    https://github.com/GPUOpen-Tools
    https://github.com/GPUOpen-Effects
    https://github.com/GPUOpen-ProfessionalCompute-Libraries
    https://github.com/GPUOpen-Drivers
    https://developer.amd.com/

    Yes, typical AMD, having everything dispersed around the net... Their community sites and forums are a mess too.
     
    Last edited: Jun 14, 2019
    jura11 likes this.

  16. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,198
    Likes Received:
    137
    GPU:
    MSI GTX1070 GamingX
    Really looking forward to this game. RTX just future-proofs it for me, tbh. Hopefully it'll be the kind of game we play for years after release - one of those where we go "oh yeah, I can turn that s*** up now!"
     
    AuerX and Maddness like this.
  17. XenthorX

    XenthorX Ancient Guru

    Messages:
    2,697
    Likes Received:
    639
    GPU:
    EVGA XCUltra 2080Ti
    Something to take with a grain of salt, from what I've heard, but the Steam survey shows 2.5% of the user base owns an RTX card. I thought it would be less.

    From the screenshots they showed, I'm not convinced they're using RT for reflections; more likely shadows/ambient occlusion.

    As reference:
     
    Last edited: Jun 14, 2019
    Maddness likes this.
  18. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    3,093
    Likes Received:
    447
    GPU:
    EVGA 1070 FTW
    Those GitHub links do not look like official 'AMD released' pages, but they do appear to contain the code... I agree with you; AMD should tidy up their developer relations with a decent interface for the downloads.

    As for PhysX 4.1, it's incorporated into UE4 and Unity right now, and other game engines, or bespoke 3D engines, can use it. A mention must also go to Nvidia for developing the source code and plugin tools for Maya and Max. As for the cost, I guess my point above is that there is no cost, because it saves cost.

    If you do not have to hire someone to develop the code, it saves you time-to-market and money, like all middleware - except Nvidia does not charge you.
     
  19. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,376
    Likes Received:
    231
    GPU:
    .
    They are actually developed by AMD; you can find all the links if you go over the GPUOpen site with a fine-tooth comb.
    Just because samples and SDKs are easily accessible to almost everyone with a simple registration on the developer portal doesn't mean it's a free time-and-money deal.
    It saves time and money only if you accept it as it is, nothing more.
    As for the binaries, which contain the closed core implementations of the GameWorks and PhysX middleware, you cannot modify them:
    https://developer.nvidia.com/gameworks-sdk-eula
    As for the source code, you can only make limited changes:
    https://developer.nvidia.com/gameworks-source-sdk-eula

    All this means you must accept everything as it is (or change just enough to make the code compile and link with your product), and it runs like crap on other hardware, or doesn't run at all. You cannot even change it to fit your product better.
    If you want anything beyond copy-and-pasting binaries, you must sign a closed agreement with NVIDIA and pay, which translates to: "you pay NVIDIA, NVIDIA sends some guys to you (figuratively; you will never meet them), those guys do the work the way they want while 'hearing your feedback', you are not allowed to do anything about it, and you may also have to sign deals precluding collaboration with other IHVs to make those closed pieces of software work, or work better, on their hardware".
     
    yasamoka likes this.
  20. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    3,093
    Likes Received:
    447
    GPU:
    EVGA 1070 FTW
    And AMD doesn't do anything like that?
     
