Metro Exodus - Official gameplay video

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 19, 2018.

  1. Rx4speed

    Rx4speed Member Guru

    Messages:
    146
    Likes Received:
    66
    GPU:
    R9 295x2
     
  2. asturur

    asturur Maha Guru

    Messages:
    1,376
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    If you stop bashing each other for a moment, it's clear what is going on.
    AMD has nothing on the plate.
    If they wanted to give you another top card for 700 dollars, they would do the equivalent of a 1080 Ti with no RTX and no Turing.

    There are two new technologies here and they are putting a premium price on them.

    At $1,300 this is still the fastest card you can buy; I believe it can still push more FPS than a 1080 Ti at 4K (we will find out shortly from the reviews). Whether it is worth the price all depends on your economic situation and willingness to spend.
     
  3. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Well, AMD doesn't exactly have tensor-core-like performance, but it can do half precision no problem, and I think INT8 too; NVIDIA goes as low as INT4 if I'm not mistaken. That said, AMD can do hardware-rendered raytracing as well, not as fast as NVIDIA though, or maybe just as fast. We will see when we actually have a working RTX game.
     
  4. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
    [Screenshot from the video: raytraced lighting with pixelated artifacts, marked with orange arrows]

    GOTCHA!

    Typical artifacts of partial raytracing, visible in any existing RT software until the ray density reaches a certain point.

    Nvidia is not doing "Realtime Raytracing", it's only doing a small sampling of rays per frame, then probably blurring the output with "DLSS" and merging it with the rest of the frame.
    However, in very high contrast areas this process fails, resulting in the pixelated visual artifacts (marked by my orange arrows).

    What's also visible in this screenshot is that the raytracing calculations are done at a lower resolution than the rendering calculations, so not even 1080p!
    Those points in that light/shadow are about twice the size of a pixel on the screen and have a soft blur, from which I deduce that the raytracing calculation is done at half the resolution of normal rendering, in this case at 960x540, with the result upscaled.

    (Edit: And I think it's only doing raytracing once per two frames, from what my eyes can tell after watching that part of the video a few times. The raytraced light pixels "linger" a bit behind. So half the resolution, half the framerate... of an already low resolution of 1080p and only 60 fps.
    This tech is definitely not ready for prime time; even the highest-end GPU, the 2080 Ti, is still WAY too slow for proper realtime raytracing.
    Jensen was correct about one thing in his hour-long spewing of lies: Realtime raytracing is still 10 years away. YES CEO dude, it is. And all you're selling us now is bulls**t fakes.)

    For those that want to see with their own eyes how raytracing actually builds a frame in a scene, get this demo (from Guru3D):
    http://downloads.guru3d.com/Frybench-download-2709.html

    Frybench draws every rendered pixel live to the framebuffer, meaning you can see the initial pixelated result, which then gets better as more and more rays are added to the final image.
    To slow down the process and make it more obvious, restrict it to one thread only.
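
    You can reproduce the same "noisy first, clean later" behavior numerically. Here's a toy sketch in Python (nothing to do with Frybench's actual code, just the same principle): each fake "ray" returns a noisy brightness sample, and the pixel estimate converges as samples accumulate:

    ```python
    import random

    # Toy model of progressive refinement: each "ray" is a noisy sample of
    # the pixel's true brightness; averaging more of them reduces the noise.
    random.seed(42)
    TRUE_BRIGHTNESS = 0.5

    def shoot_ray():
        # Stand-in for a real ray: the true value plus sampling noise.
        return TRUE_BRIGHTNESS + random.uniform(-0.5, 0.5)

    total, count = 0.0, 0
    for target in [1, 4, 16, 64, 256, 1024]:
        while count < target:
            total += shoot_ray()
            count += 1
        estimate = total / count
        print(f"{count:5d} rays -> pixel = {estimate:.3f} "
              f"(error {abs(estimate - TRUE_BRIGHTNESS):.3f})")
    ```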
     
    Last edited: Sep 19, 2018
    FranciscoCL and -Tj- like this.

  5. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,059
    Likes Received:
    3,439
    GPU:
    MSI 4090 Suprim X
    I'll start by saying that jumping to conclusions based on a single developer's early attempt at using this tech makes little sense.

    "Nvidia is not doing "Realtime Raytracing", it's only doing a small sampling of rays per frame" - they're doing real-time raytracing with bounding volume hierarchies; everything is in the name, really.

    1080p? About 2 million pixels, and thanks to the bounding volumes you don't have to update them all each frame.
    *edited millions for billions*
    With the RTX 2080 you have 8 billion rays per second. We can do the math, really.
    Roughly 133,333,333 rays per frame at 60 FPS.

    If you want to fire 2 million rays, 60 times a second, it's... 120 million rays (no bounces).

    Thing is, you need bounces to make a ray relevant, otherwise it won't fetch the lighting information. At 240 million rays you get one reflection for each primary ray, etc.

    That's an upper bound; with optimization you don't need that many rays.

    Three bounces per pixel and we'll have CGI-level quality in real time at 1080p. Just incredible. In 5-6 years, maybe, individuals will be able to produce top CGI-quality visuals from home.
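
    The arithmetic is easy to check; a quick back-of-the-envelope in Python, taking NVIDIA's quoted 8 gigarays/s figure at face value:

    ```python
    # Back-of-the-envelope ray budget at the quoted Turing throughput.
    rays_per_second = 8_000_000_000        # 8 gigarays/s (marketing figure)
    fps = 60
    pixels_1080p = 1920 * 1080             # 2,073,600 pixels

    print(rays_per_second // fps)          # 133,333,333 rays per frame

    # One primary ray per pixel, no bounces, for a whole second:
    print(pixels_1080p * fps)              # 124,416,000 rays/s needed

    # Budget left per pixel per frame (primary ray plus bounces):
    print(rays_per_second / (pixels_1080p * fps))   # ~64.3 rays
    ```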

    Can't begin to grasp the repercussions for the movie/TV industry; this is incredible.

    It's still early days, early products, and early implementations. If you look at how developers managed to keep improving their games' visuals over a single console generation's lifetime, you understand that everything is about optimization.

    Again, people buying those GPUs are early adopters, like the people who bought first-generation VR headsets.
     
    Last edited: Sep 19, 2018
    Maddness likes this.
  6. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
    Gigarays = BILLION rays/second.

    You said "8 million", so your calculations are totally off.
    If it were indeed capable of shooting 8 billion rays/second, that would be 64 rays per pixel at 1080p/60 fps, more than enough to get several reflections and refractions for EVERY pixel in the frame.
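
    That 64-rays-per-pixel figure is a one-liner to verify (again taking the 8 gigarays/s claim at face value), and it shows how many bounced paths it could fund:

    ```python
    # 8 billion rays/s spread over every 1080p pixel, 60 times a second:
    rays_per_pixel = 8_000_000_000 / (1920 * 1080 * 60)
    print(rays_per_pixel)   # ~64.3

    # Each path segment (primary ray or bounce) costs one ray, so 64
    # rays/pixel could fund, say, 16 paths of 4 segments each per frame.
    print(int(rays_per_pixel) // 4)   # 16
    ```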

    But the 6 or 8 or 10 "gigarays", or billion rays/second, are a total lie, and I'm 100% sure it will be proven a blatant lie by the people who actually understand how raytracing works, once they get their hands on the Turing cards.
     
  7. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Wavetrex - I think most enthusiasts here understand that DX12 RT is merely a "slice" (this is how I describe it) of RT. No one is under any illusion that what we're getting in real-time games is anywhere near the quality of the Star Wars demo or of movies. However, I still feel that this "slice" is better than all the other lighting methods produced by rasterization.
     
    Maddness likes this.
  8. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,059
    Likes Received:
    3,439
    GPU:
    MSI 4090 Suprim X
    My bad, updated the post. So for 8 billion rays you have 133 million rays per frame at 60 frames per second. Thing is, you need ray bounces to start getting information: you trace from the screen into the bounding volumes, and from each hit you trace in multiple directions; if a ray finds emissive information it keeps tracing new rays, otherwise it stops tracing, etc. A rough sketch of that bounce logic is below.
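
    Here's a minimal toy version in Python of the trace-bounce-stop loop just described (hypothetical stand-ins throughout; real RTX code goes through DXR and hardware BVH traversal, not this):

    ```python
    import random

    # Toy version of the bounce logic: trace a ray; stop if it hits a light
    # (emissive) or escapes; otherwise scatter a new ray and recurse.
    random.seed(1)
    MAX_BOUNCES = 3

    def intersect(ray):
        # Stand-in for BVH traversal: randomly hit a light, a diffuse
        # surface, or nothing at all.
        r = random.random()
        if r < 0.3:
            return "emissive"
        if r < 0.8:
            return "diffuse"
        return None

    def trace(ray, depth=0):
        hit = intersect(ray)
        if hit is None:
            return 0.0                   # ray escaped the scene
        if hit == "emissive":
            return 1.0                   # found a light: stop tracing
        if depth >= MAX_BOUNCES:
            return 0.0                   # bounce budget exhausted
        return 0.8 * trace(ray, depth + 1)   # diffuse: lose energy, bounce on

    # Average many paths for one pixel, as in the ray budgets above.
    samples = [trace(ray=0) for _ in range(10_000)]
    print(sum(samples) / len(samples))
    ```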

    We can already do incredible things; the Star Wars demo Hilbert ran for his Turing review, at 4K no less, speaks for itself.
     
  9. Toadstool

    Toadstool Member Guru

    Messages:
    119
    Likes Received:
    54
    GPU:
    Vega 64
    Ray tracing looked really good at some points and I'm excited to see how it pans out over the coming years. Overall, though, I'm not all that impressed; it looks more like cranking up the contrast than anything game-changing. The improvements to shadows and reflections, while cool, are the type of thing that's easy to miss, especially in a fast-paced game.
     
  10. milamber

    milamber Maha Guru

    Messages:
    1,273
    Likes Received:
    55
    GPU:
    MSI 4080 Suprim X
    The lighting looks nice with RTX but it also looks overlit. That might not actually be the fault of RTX, though. My monitor isn't the greatest and developers seem to default to washed-out contrast/gamma so I always have to use ReShade to correct it.
     

  11. Alvin Widiawan

    Alvin Widiawan Guest

    Messages:
    5
    Likes Received:
    2
    GPU:
    GTX 1050 2GB
    Ray tracing looks good, but I see performance drop significantly when RTX is enabled in the video; maybe we need a few more years to create far more powerful GPUs than we have now.
     
  12. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    The 2080 Ti starts at $999, the 2080 at $699, and the 2070 at $499. Get your facts straight before calling people stupid in the future.

    Your post is a perfect example of projection. If you can't see the difference, you're clearly blind, stupid, or as your name implies, a paid shill.

    Also, the RTX 2080 Ti is taking the place that the GTX Titan has held, and the Titan is now in its own category of non-gaming cards with its own Titan-family drivers. So in actuality, the price hasn't really gone up; they just shifted the product lines, so instead of getting a cut-down card first followed by the Ti a year later, they now release them all together, including the full TU102 (2080 Ti). It's kind of amazing how so many of you criticized nVidia for that stupid and confusing way of releasing cards, yet when they simplify the system to basically how people wanted it, the s**t storm continues to be flung.
     
  13. StewieTech

    StewieTech Chuck Norris

    Messages:
    2,537
    Likes Received:
    934
    GPU:
    MSI gtx 960 Gaming
    Damn! Games are getting pretty realistic. :eek:
     
  14. AMDfan

    AMDfan Guest

    Messages:
    48
    Likes Received:
    5
    GPU:
    280X

    It is pretty stupid to pre-order cards and CPUs before the NDA lifts...; your post is the perfect example of a biased idiot.
    You are yourself blind and stupid to overpay for hardware; I just try to warn people... And I am not alone: never, ever pre-order before the independent reviews are out!!! The 2080 Ti turned out to be only 25% faster than a 1080 Ti, and the prices here in Holland START at €1,400!!!
     
