Unreal Engine 4 - (2018) Realistic Looking Characters

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 3, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,512
    Likes Received:
    18,814
    GPU:
    AMD | NVIDIA
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,009
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    So - is this with or without RTX? Because if this is without, it really goes to show how much that technology is over-hyped and unnecessary.
     
  3. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I'm not sure what you mean? RTX isn't a graphical effect, it's a hardware-accelerated denoiser for raytracing. In this particular video I know the Star Wars scene is raytraced on RTX, but the only benefit RTX provides is a performance one, not a visual one - the visual aspect comes from raytracing itself. RTX just allows a raytraced scene to be completed with a minimal number of rays. The developer has control over the number of rays they want cast, along with which specific effects they want raytraced. For example, in that Star Wars demo Epic said they were using an extreme number of rays and the entire scene was raytraced - which is why it required a DGX-1 supercomputer to run it in real time. In the upcoming Metro game, only the AO/GI are raytraced and the ray count will probably be significantly lower, impacting the overall accuracy but allowing it to be completed on a single GPU in real time.
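    A rough illustration of why the ray count is the whole ballgame (toy C++ of my own, not NVIDIA's or Epic's code): a path tracer estimates each pixel by averaging random ray samples, and the noise only shrinks with the square root of the sample count - which is exactly the gap a denoiser papers over so you can get away with far fewer rays. The "incoming light" function below is a made-up stand-in; only the noise behaviour matters.

    // Monte Carlo pixel estimate: noise vs. rays per pixel (illustrative only).
    #include <cmath>
    #include <cstdio>
    #include <random>

    int main() {
        std::mt19937 rng(1);
        std::uniform_real_distribution<double> u(0.0, 1.0);
        // Stand-in "incoming light" along a random ray; its true mean is 1.0.
        auto sample = [&] { double x = u(rng); return 3.0 * x * x; };

        const int counts[] = {1, 4, 16, 64, 256};
        for (int n : counts) {
            // Estimate the same pixel many times with n rays each, measure the spread.
            const int trials = 10000;
            double var = 0.0;
            for (int t = 0; t < trials; ++t) {
                double est = 0.0;
                for (int i = 0; i < n; ++i) est += sample();
                est /= n;
                var += (est - 1.0) * (est - 1.0);
            }
            std::printf("%4d rays/pixel -> noise (std dev) ~ %.3f\n",
                        n, std::sqrt(var / trials));
        }
    }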
     
  4. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,665
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    10 more years and we'll have true photo realistic games.
     

  5. jorimt

    jorimt Active Member

    Messages:
    73
    Likes Received:
    69
    GPU:
    GIGABYTE RTX 4090
    ^ To be fair, we all said that same thing 10 years ago, and yet here we are saying it again.

    I'm guessing for actual affordable, mainstream, fully real-time "photo-realistic" graphics (and that's focusing solely on traditional 3D gaming on a 2D screen; barring whatever changes VR or yet to be known technology does or doesn't bring), we're probably looking at closer to 20-25 years.

    Time has the final say though, so until then...
     
  6. H83

    H83 Ancient Guru

    Messages:
    5,510
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    The skin continues to be too perfect and too shiny, so I think there's still a long way to go before we reach photo-realistic graphics...
     
  7. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Imho, it's necessary to get the most performance out of the hardware. Everyone is working together on this goal. AMD will also have their equivalent solution; will you say their solution is unnecessary too?

    The cost of the hardware to run most of the demos shown is completely out of consumers' range. For studios, however, these systems are very cost-effective, especially as more tools can now be used in real time. It means that whole rooms dedicated to rendering can be reduced not only in size but also in power consumption. Everything from editing to the time it takes to produce films is affected as this technology matures and becomes more efficient.

    We need to see this kind of progress years before this tech is ever deployed to consumers. I really don't see what the problem is. You won't be seeing these demos run in real time on your PC anytime soon. The best-case scenario is Metro, which is nowhere near the same quality as those demos, but it'll be a positive first step for consumers to get their hands on a cut-down version of this technology.
     
  8. Crazy Serb

    Crazy Serb Master Guru

    Messages:
    270
    Likes Received:
    69
    GPU:
    270X Hawk 1200-1302
    Nah, that is just FXAA removing all the irregularities skin would naturally have, because UE does not have native support for modern AA methods.
     
  9. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Looks good. And still waiting... -_-
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,009
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    That's assuming there will actually be a performance improvement. It's not a whole lot different than PhysX: it was more efficient than running particle calculations on a CPU, but you're still adding intensive computations.
    If their solution is proprietary or will effectively only be used by them, then yes, I will say it's unnecessary. AMD is not immune to my criticisms. I felt TressFX, for example, was a stupid idea too, despite it being potentially useful.
    In essence, I agree with what you're saying, but as pointed out by others, UE4 isn't quite good enough to replace the slower renderers you speak of.
    My gripe is how this is proprietary technology for a very minimal improvement. If it weren't proprietary, I'd still question the true value of it, but I wouldn't care about its existence.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I wouldn't call it a minimal improvement; raytracing is essentially the end game for lighting physics. Real-time reflections, caustics, GI, etc. - it's all way easier to implement with raytracing than trying to fake it with rasterization techniques, and as long as you sample enough rays per pixel + bounce, it's essentially 100% accurate to the way light would behave in real life. The raytracing itself is part of DX12 now. All RTX does is speed up that rendering process by allowing fewer rays to be used and denoising the image - that's it - and it also works across every OptiX-supported application, so Arnold Renderer, VRay, Optis, Pixar's Renderman, SW Visualize - all of the OptiX-supported applications automatically get sped up by RTX. It's a 3x performance increase on Volta for any given quality level. So even if it doesn't make some applications real time, it's a massive performance improvement.

    We're at a point now where there are going to be diminishing returns on quality and performance. Rasterization can only take you so far, and honestly, if you watch the GDC Unreal demos for volumetric lightmaps/fog/capsule shadows/etc., it's pretty clear that developers are hitting a level where the amount of manual work you have to put in to fake all these effects in real time is too costly. There are so many exceptions to all those "workarounds" for performance: certain objects that need to be manually tuned for soft shadows, certain effects like the light capsules that get buggy around other objects, light maps for AO that work in one scene but not another, etc. It's a giant mess that's mostly fixed outright by raytracing, which also adds accuracy, and thanks to denoising techniques like RTX it can now be done in real time.
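    To make the "one mechanism instead of a pile of workarounds" point concrete, here's a toy C++ sketch (my own, nothing to do with UE4 or DXR): a single recursive trace() call produces a hard shadow, a mirror reflection and a crude one-bounce GI term from the same ray-casting mechanism, for a hard-coded sphere-over-a-floor scene.

    // Illustrative mini tracer: shadows, reflections and rough GI from one recursion.
    #include <cmath>
    #include <cstdio>
    #include <random>

    struct Vec { double x, y, z; };
    Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
    double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    Vec norm(Vec a) { return a * (1.0 / std::sqrt(dot(a, a))); }

    const Vec sphereC = {0, 1, 0}; const double sphereR = 1.0;  // mirror ball on the floor
    const Vec lightP  = {3, 4, -2};                             // single point light

    // Hit distance along the ray, or -1 on a miss.
    double hitSphere(Vec o, Vec d) {
        Vec oc = o - sphereC;
        double b = dot(oc, d), c = dot(oc, oc) - sphereR * sphereR;
        double disc = b * b - c;
        if (disc < 0) return -1;
        double t = -b - std::sqrt(disc);
        return t > 1e-4 ? t : -1;
    }
    double hitPlane(Vec o, Vec d) {                             // ground plane y = 0
        if (std::fabs(d.y) < 1e-9) return -1;
        double t = -o.y / d.y;
        return t > 1e-4 ? t : -1;
    }

    std::mt19937 rng(42);
    Vec randomHemisphere(Vec n) {                               // random bounce direction around n
        std::uniform_real_distribution<double> u(-1, 1);
        Vec d;
        do { d = {u(rng), u(rng), u(rng)}; } while (dot(d, d) > 1 || dot(d, d) < 1e-6);
        d = norm(d);
        return dot(d, n) < 0 ? d * -1.0 : d;
    }

    // One function, three "effects": direct light via a shadow ray, a mirror
    // bounce on the sphere, and one diffuse bounce on the floor (crude GI).
    double trace(Vec o, Vec d, int depth) {
        if (depth > 2) return 0;
        double ts = hitSphere(o, d), tp = hitPlane(o, d);
        bool sphere = ts > 0 && (tp < 0 || ts < tp);
        double t = sphere ? ts : tp;
        if (t < 0) return 0.1;                                  // dim sky
        Vec p = o + d * t;
        Vec n = sphere ? norm(p - sphereC) : Vec{0, 1, 0};

        Vec toL = norm(lightP - p);                             // shadow ray: can p see the light?
        bool shadowed = hitSphere(p, toL) > 0;
        double direct = shadowed ? 0 : std::fmax(0.0, dot(n, toL));

        if (sphere) {                                           // mirror: reflect and recurse
            Vec r = d - n * (2 * dot(d, n));
            return 0.2 * direct + 0.8 * trace(p, norm(r), depth + 1);
        }
        return 0.6 * direct + 0.4 * trace(p, randomHemisphere(n), depth + 1);
    }

    int main() {
        // Average primary rays aimed at the floor near the sphere; the random
        // GI bounce is what makes multiple samples (and denoising) matter.
        Vec camO = {0, 1, -5}, camD = norm(Vec{0, -1, 2});
        double sum = 0; const int N = 64;
        for (int i = 0; i < N; ++i) sum += trace(camO, camD, 0);
        std::printf("average radiance at that pixel: %.3f\n", sum / N);
    }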
     
    Stormyandcold likes this.
  12. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,009
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    I'm well aware of how important raytracing is. I have absolutely nothing against the feature in and of itself. But so far, the effects of RTX do not show anything especially compelling that currently available live renderers couldn't already accomplish. I am inclined to believe RTX does in fact bring improvements, but I still feel the difference it makes is too minimal to warrant a proprietary technology.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Idk, I still feel like you don't grasp what RTX is - you say raytracing is important, but then you say RTX does not show anything compelling... RTX just accelerates raytracing through hardware, that's it. That's all it does. When AMD added a primitive discard engine to its architecture, no one said "that's proprietary" or "it doesn't show anything compelling that we couldn't already do" - this is no different than that. Unreal, for example, will add DX12 raytracing to its engine in 4.20 - engine goes "hey DX12, shoot some rays out of this light, make it 50,000", DX12 goes "yo Unreal, we're running on Volta RTX bro, we only need 15,000 rays for the same quality", engine goes "free performance? thanks mate".

    You can invoke it through DX12/OptiX and soon Vulkan. You don't need to use their GameWorks library; that's just a preconfigured lighting system - DX12 has a standardized raytracing system now and RTX can accelerate that system. It's no different than Nvidia accelerating tessellation in hardware, or geometry culling, or whatever - it's just an intrinsic part of the hardware, and no one calls those systems "proprietary"; it literally is the hardware. AMD will have its own raytracing acceleration in hardware. There is no way to make these not proprietary, unless Microsoft/Khronos or whoever starts telling Nvidia/AMD how to build their hardware, or these companies just start freely opening up their hardware designs.
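    To make that concrete (a minimal sketch of my own, not engine code): the only thing an engine really has to ask is whether the D3D12 device reports raytracing support at all; whether the driver then satisfies DXR with dedicated hardware like RTX, with compute, or with a software fallback is invisible at this level. The feature query is the standard D3D12 one; error handling is stripped to the bone.

    // Minimal DXR capability check on the default adapter (Windows 10 SDK 1809+).
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        Microsoft::WRL::ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::printf("no D3D12 device\n");
            return 1;
        }
        // Ask the runtime/driver whether DXR is exposed; how it is implemented
        // underneath (RTX hardware, compute, software fallback) is not visible here.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        bool dxr = SUCCEEDED(device->CheckFeatureSupport(
                       D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5)))
                   && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
        std::printf("DXR raytracing supported: %s\n", dxr ? "yes" : "no");
    }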
     
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,009
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    @Denial
    Before I point out the potential flaws in some of the things you said, I guess I need something cleared up before I look like [more of?] an idiot:
    I got the impression that the raytracing system in DX12 depended on RTX, where developers have to go out of their way to intentionally implement RTX. The way you phrased a lot of what you said suggests RTX instead complements DX12 and functions automatically wherever it is available.

    In other words, you still get the benefits of DX12's raytracing regardless of whether you have RTX or not, but if you do have it, you get a performance boost due to the hardware-specific optimization; RTX is not something devs have to intentionally opt into. Is this a correct assessment? If so, then I have no problem at all with RTX or its proprietary nature (and in turn, I see no flaws in what you said).
     
  15. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Yes, that is correct. You can utilize DX12's raytracing (DXR) without RTX. DXR will actually work on all current-generation hardware; it's just accelerated by RTX when RTX is detected, and similarly by AMD's implementation. It also seems like not only can hardware vendors accelerate this, but software developers can come up with their own CPU/DirectCompute-based denoising algorithms and accelerate it as well. Here are some relevant sections from Anandtech's breakdown; I bolded the parts I find important:

    Basically, Microsoft knew that the vendors would have to integrate the hardware side of this differently, so they intentionally left the acceleration part open - RTX, for example, taps Volta's Tensor cores to do the denoising because they are significantly faster at matrix operations. AMD, on the other hand, may use its FP16 cores to accelerate its variant. But both will use DXR as the 'base'.

    It's also not just for gaming. One of the things I keep seeing people say all over is that Epic's Star Wars demo needed 8 GV100 chips to get 24fps - but the raytracing resolution there is way higher. If you watch the full GDC presentation (I'm going to paraphrase it here and perhaps get some of the numbers wrong), for games they said they have a budget of about 1 ray per pixel per light for real time WITH RTX. Some of the effects used in that Star Wars demo required a resolution of 4 rays per pixel per light just to get the effect working, and the entire scene was traced, as opposed to just AO like in the upcoming Metro. Epic is trying to break into pre-production in movies with their engine, and DXR (in combination with the performance benefits of RTX) is accurate/fast enough that they can get the quality of big-budget renderers like Arnold, which typically require hours of offline rendering, into a real-time product like Unreal running on a DGX-1.
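    For a feel of what those budgets mean (the 1 and 4 rays/pixel/light figures are my paraphrase of the talk, and the resolution/light count below are just assumptions to make the arithmetic concrete), a quick back-of-the-envelope in C++:

    #include <cstdio>

    int main() {
        const double width = 1920, height = 1080;  // assumed game resolution
        const double lights = 4;                   // assumed number of traced lights
        const double fps = 60;                     // assumed target frame rate

        const double budgets[] = {1.0, 4.0};       // rays per pixel per light
        for (double rppl : budgets) {
            double raysPerFrame  = width * height * lights * rppl;
            double raysPerSecond = raysPerFrame * fps;
            std::printf("%.0f ray(s)/pixel/light -> %.1f M rays/frame, %.2f G rays/s\n",
                        rppl, raysPerFrame / 1e6, raysPerSecond / 1e9);
        }
    }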
     
    schmidtbag likes this.

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,009
    Likes Received:
    4,383
    GPU:
    Asrock 7700XT
    Well, that explains a lot. Thanks for going in-depth on all of this - RTX isn't as stupid as I thought it was. The way Nvidia has been pushing it (along with DXR) made it seem like all of it was their plan, their technology, and exclusive to them. I guess that makes sense - they want people to think that way - but it misled me.
     
  17. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    At some point, there's going to be a film related game released that will boldly declare that it's using the same effects as used in the actual movie.
     
