Epic Games Demonstrates Real-Time Ray Tracing With Star Wars

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 22, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,317
    Likes Received:
    18,405
    GPU:
    AMD | NVIDIA
    Lilith likes this.
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,636
    Likes Received:
    9,512
    GPU:
    4090@H2O
    Looks cool
     
    Lilith likes this.
  3. Maddness

    Maddness Ancient Guru

    Messages:
    2,434
    Likes Received:
    1,729
    GPU:
    3080 Aorus Xtreme
    Looks very cool indeed.
     
  4. XP-200

    XP-200 Ancient Guru

    Messages:
    6,383
    Likes Received:
    1,762
    GPU:
    MSI Radeon RX 6400
    So what is it going to cost us to use this fancy new tech? Let me guess: a new video card by any chance, in the region of, say, £1000. lol

    Yes, I have become somewhat cynical of it all. lol
     
    fantaskarsef likes this.

  5. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    13,784
    Likes Received:
    7,523
    GPU:
    ASUS 3060 OC 12GB
    Phasma is a pretty perfect example of a Star Wars character to test ray tracing on hahaha...

    I can't help but take this as shots fired at EA.
     
  6. FM57

    FM57 Master Guru

    Messages:
    221
    Likes Received:
    94
    GPU:
    Palit RTX 2070
    I find the special effects and the scenario much better than in the latest movies.
     
  7. tensai28

    tensai28 Ancient Guru

    Messages:
    1,539
    Likes Received:
    411
    GPU:
    rtx 4080 super
    Today's video cards can't even handle 4K 60fps ultra yet (at least not in all titles), so I'd say we're quite a few years away from experiencing 4K 60fps ultra settings with ray tracing.
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    4K was and still is overrated.
    You can see it in all those CGI movies where 10% of the content is real and the rest is CGI: you cannot tell it is not the real world. And at 1080p that content looks better than any game ever did, or does, at 4K.

    So if the choice is 4K with old rendering versus 1080p with high-quality ray tracing, I'll keep advocating for better-quality 1080p to push the technology forward.

    What do you need 4K for anyway?
    Because you run TAA/TXAA, which blurs a great many adjacent pixels, and 4K minimizes its sharpness-crippling effect?
    Because you prefer high-fidelity textures over high-complexity, high-density objects on screen?
    Why? You could have 8K resolution and it would bring nothing new to the table.
    If you had a supercomputer at hand, you would be playing 1080p games with movie-class photo-realistic image quality.
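
    Purely to illustrate the TAA point above: a minimal sketch of the temporal blend at the heart of TAA, assuming plain exponential history accumulation with no reprojection or neighbourhood clamping (real implementations add both). The function and the toy scanline are hypothetical, not taken from any engine.

    Code:
    import numpy as np

    def taa_blend(history, current, alpha=0.1):
        """Blend the current frame into the accumulated history.

        alpha is the weight of the *new* frame: a small alpha (e.g. 0.1)
        suppresses shimmer and jaggies, but smears detail across frames,
        which is the softness a higher output resolution then has to hide.
        """
        return (1.0 - alpha) * history + alpha * current

    # Toy usage: a hard one-pixel edge gets spread over its neighbours.
    frame = np.array([0.0, 0.0, 1.0, 1.0])   # hypothetical scanline
    history = frame.copy()
    for _ in range(8):
        jittered = np.roll(frame, 1)          # crude stand-in for sub-pixel jitter
        history = taa_blend(history, jittered)
    print(history)                            # the edge is no longer a clean step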
     
  9. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    To watch 4K movies that have photo-realistic CGI anyway. To be able to choose between more/higher-quality objects and/or resolution depending on the game. Productivity reasons. Etc. There are lots of reasons for 4K to exist.
     
  10. Vipu2

    Vipu2 Guest

    Messages:
    553
    Likes Received:
    8
    GPU:
    1070ti
    Nvidia said they used four NVLinked Volta V100s to run that at 1080p 24fps.
    So yeah, there is no way this is getting into games until they find some cheaper solution, or until GPUs have 10x or more the power they have currently.
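
    For a sense of scale, a rough back-of-envelope sketch, assuming the figures above and that cost scales roughly linearly with pixels per second (a big simplification, but it shows where a "10x or more" estimate comes from):

    Code:
    # Quoted demo setup: four NVLinked V100s at 1080p / 24 fps.
    demo_rate = 1920 * 1080 * 24        # pixels per second the 4-GPU rig managed
    target_rate = 3840 * 2160 * 60      # a hypothetical 4K / 60 fps target

    scale = target_rate / demo_rate
    print(f"Speedup needed vs the 4-GPU rig: {scale:.0f}x")      # 10x
    print(f"Speedup needed vs a single V100: {4 * scale:.0f}x")  # 40x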
     

  11. tensai28

    tensai28 Ancient Guru

    Messages:
    1,539
    Likes Received:
    411
    GPU:
    rtx 4080 super
    No, I am sorry, but I enjoy 4K. That's all your opinion. I'm also guessing you don't have a 4K screen.
     
    Dragam1337 and XP-200 like this.
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    No, I do not. They "fix" artificially created problems and bring other disadvantages. I would rather have 3x 1080p.
    @Denial : 4K vs. multiple screens for productivity... multi-screen always wins on flexibility and comfort.
    A screen used for coding can be turned 90 degrees, and each screen keeps its own window arrangement. I started using multi-screen setups 15+ years ago and there is nothing better.

    And note that I did not write about watching movies, but about playing games. Movies are mentioned so everyone understands that they are the product of a computer. I wrote about playing a CGI-quality 1080p game vs. a 4K game with traditional rendering.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    I know what you meant; I'm just saying people have different use cases. I used to have three 27" monitors; I'm down to two now and will probably go down to one in the near future for aesthetic reasons (my computer is in my living room now, where I used to have it hidden away). I'd rather have a single 4K ~32/34" monitor. Most of the games I play, like SC2, League of Legends, and Overwatch, run at 4K 60+ FPS with no issue. For newer games I just lower the resolution; the scaling is good enough, and when I get a new GPU I can drop it in and suddenly bump the resolution back up in a bunch of older games. I also occasionally watch movies at my desk while working, so having 4K movie support is nice. I don't think 4K is overrated, and I don't think it's underrated either; it's just rated lol. If I had to choose between 4K and other features like adaptive sync/HDR, then yeah, I'd probably drop 4K... but I don't have to - I can get everything in one package.
     
  14. XP-200

    XP-200 Ancient Guru

    Messages:
    6,383
    Likes Received:
    1,762
    GPU:
    MSI Radeon RX 6400
    I was a 4K skeptic until I tried it, and I was blown away; I still am. I just bought Titanfall 2 on the XBX and the game is just breathtaking in 4K, just stunning. Now that I have let the 4K genie out of the bottle, I cannot put it back in, and it is painfully obvious when I go back to 1080p games. 4K is just night and day - well, for me anyway.
     
  15. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,689
    Likes Received:
    960
    GPU:
    GTX 1070
    I'm not a fan of 4K until you get a monitor of at least 32 inches. I do like 2K (1440p) on 27 to 30 inch screens, as the difference between 2K and 4K at that size and normal desk viewing distances is minimal, plus there's no tweaking resolutions to get games to play. The higher resolution also makes text nice and crisp for everyday use. This is how I plan to go until GPUs can easily do 4K at 60fps and higher, which will take a few more years. Hopefully by then we will have some MicroLED monitors that do proper HDR without burn-in issues (one can dream).
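
    As a quick sanity check on the viewing-distance point, a small pixel-density sketch. The panel sizes are from the post above; the ~110 PPI "hard to see" threshold at desk distance is a common rule of thumb, not a measured figure.

    Code:
    import math

    def ppi(h_pixels, v_pixels, diagonal_inches):
        """Pixels per inch of a panel from its resolution and diagonal."""
        return math.hypot(h_pixels, v_pixels) / diagonal_inches

    for name, w, h, diag in [("27in 1440p", 2560, 1440, 27),
                             ("27in 4K",    3840, 2160, 27),
                             ("32in 4K",    3840, 2160, 32)]:
        print(f"{name}: {ppi(w, h, diag):.0f} PPI")
    # 27in 1440p: 109 PPI, 27in 4K: 163 PPI, 32in 4K: 138 PPI.
    # Past roughly 110 PPI at normal desk distance the extra density is
    # hard to see, which is the "minimal difference" being made above.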
     
    fantaskarsef likes this.

  16. tensai28

    tensai28 Ancient Guru

    Messages:
    1,539
    Likes Received:
    411
    GPU:
    rtx 4080 super
    Yeah, I'm gaming on a 43 inch 4K TV as a monitor. Setting my games to 1080p is painful to my eyes after getting used to 4K.
     
  17. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
    I'll believe the hype when these guys start releasing *demos* instead of pre-rendered video clips. Every couple of years "real-time ray tracing" is resurrected and hits the publicity circuits, along with a cache of proof-of-concept video clips released to demonstrate the kinds of effects being hyped--and then it all dies and returns to hibernation--waiting for the next brief wake cycle! It is all rather long in the tooth. There is something they sense about the public's gullibility for the notion of "real-time ray tracing", I suppose (maybe the "cool" and ubiquitous reflective chrome sphere?), that causes them to beat this old dead horse one more time. They (the marketers) know how well the public likes shiny objects.

    Remember Larrabee? The irony was that Intel *never* pushed that soon-to-be-defunct CPU design (Larrabee was cancelled before it was ever produced as a prototype) as "real-time ray tracing." In fact, there were many interviews with Intel employees giving demos of *simulated* Larrabee concepts (because there was no Larrabee silicon, ever) who plainly stated that the concept was *not* real-time ray tracing. It made no difference to well-known personalities at certain sites (not HH!) who kept insisting over and over again that Larrabee was indeed a real-time ray-tracing GPU--these personalities had written numerous articles on the glowing future of "RTRT"--and they *would not back down* from their assertions no matter what Intel said. I almost felt sorry for Intel at that point--and so I was not surprised when, without fanfare or apology, Intel cancelled the Larrabee project before it got beyond the concept stage. Of course they cancelled it--the ludicrous expectations manufactured by the so-called "pundits" could never have been met by anyone in the GPU business--and Intel, especially, has never been known for its GPU acumen and capability *cough*...;) Had it ever been produced, Larrabee would have been a massive disappointment to all the people putting impossible expectations on it!

    Well, as we can see, Larrabee is dead--but the idea of "RTRT" is anything but...;) (The entire idea behind rasterization since the V1 is that it *simulates* ray tracing, and in "real time!" Why? Because ray tracing is so incredibly heavy, computationally speaking, that it could *never* be done in real time--so rasterization was born to give us the visual benefits of ray tracing without the computational overhead--and every year rasterization inches closer to that laudable goal. But rasterization is not ray tracing.)
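
    To put a number on "incredibly heavy": a rough, illustrative ray-budget sketch. The per-pixel figures are assumptions, not from the Epic demo, but even a modest setup lands in the billions of rays per second, each one a scene traversal plus shading.

    Code:
    width, height, fps = 1920, 1080, 60
    primary_rays = 1      # one camera ray per pixel
    bounces = 2           # say, one reflection ray plus one shadow ray
    samples = 4           # per-pixel samples to keep noise down

    rays_per_frame = width * height * primary_rays * (1 + bounces) * samples
    print(f"{rays_per_frame * fps / 1e9:.1f} billion rays/sec")  # ~1.5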
     
    Stormyandcold likes this.
  18. Elder III

    Elder III Guest

    Messages:
    3,737
    Likes Received:
    335
    GPU:
    6900 XT Nitro+ 16GB
    I used to have 3 24" 1080P screens; now I have a single 28" 4K screen. I would not go back for anything entertainment related personally. For productivity there are some benefits to having a secondary screen, although many 4K screens have a splitscreen capability that is very nice on a larger 30/32" monitor.
     
  19. Pinscher

    Pinscher Guest

    Messages:
    66
    Likes Received:
    6
    GPU:
    1080 ti Duke
    So Star Wars named this new technology after the Rey character in the films? That's really cool. I hope other games can use this technology too.
     
  20. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080


    Another cool demo from Epic -- not using ray tracing, so it's kind of off topic, but I don't know if it deserves its own thread, so I figured I'd tag it onto here.
     
