
NVIDIA is listing 21 Games with RTX Support

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 21, 2018.

  1. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,008
    Likes Received:
    119
    GPU:
    EVGA GTX 1080@2,025
    Many websites have been reporting that they played a number of recent games at 4K with ultra settings and were getting around 100 fps. The people who were there but didn't mention that in their articles were the ones omitting important info. It sucks that the NDA kept them from stating which games were running that well.


    Interesting how people were saying the exact same thing just prior to Pascal's release.
     
  2. alanm

    alanm Ancient Guru

    Messages:
    8,643
    Likes Received:
    1,085
    GPU:
    Asus 2080 Dual OC
    Really? Going from 28nm to 16nm, I recall most people had high expectations. The specs back then supported a big perf gain and those expectations; Turing's 12nm specs do not. Again, I would like to see the magic unveiled when reviews are out. I repeat, I could be very wrong, but I think it would require some cool tricks up Nvidia's sleeve.
     
  3. Kool64

    Kool64 Member Guru

    Messages:
    120
    Likes Received:
    47
    GPU:
    Gigabyte GTX 1070
    Perhaps this technology won't fail to launch like PhysX did. Years ago I built a computer with quad SLI and a dedicated PhysX GPU. Now I'm just using a single card.
     
  4. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,152
    Likes Received:
    118
    GPU:
    MSI GTX1070 GamingX
    The way I remember it, no one expected anything until the end of 2016; then, BAM, Nvidia announced Pascal with availability within a month. The only difference with Turing is that we expected it this time, but availability is still just as quick after the announcement as it was with Pascal.
     

  5. SpookySkeleton

    SpookySkeleton Member Guru

    Messages:
    126
    Likes Received:
    18
    GPU:
    GTX 1080
    Why not? Like, really, I have an 11-year-old 60Hz 1080p monitor, the Samsung SyncMaster SW2333, which still works flawlessly without a single dead pixel and is very well calibrated. Yes, I know there are better 1080p monitors with better contrast and colors, but they are still 1080p; I would get used to the new pixels easily if it's the same resolution, even at 240Hz. I recently bought a Samsung 4K TV, the QN55Q6 to be precise, and holy crap, it blows even the best 1080p monitor back to the stone age in terms of image quality. So really, if these cards don't give us smooth 4K at 75 fps at least, I'm not changing my 1080, because why change my GPU if I can't use it on my new TV or on a new monitor to replace my current one? Why change my already 11-year-old 1080p monitor if 1080p is still going to be the norm?

    I really don't understand.

    Probably with reworked shaders/CUDA cores, tessellation, and of course the new RTX tech, like the jump from Kepler to Maxwell: both were 28nm, but Maxwell had an architecture reworked from the ground up to be more efficient and cooler, with better speeds and overclocks.
     
    Last edited: Aug 22, 2018
  6. Mesab67

    Mesab67 Active Member

    Messages:
    65
    Likes Received:
    28
    GPU:
    GTX 1080
    Thinking about NVIDIA's presentation... no company would ever be in a position to release standard benchmark data (RT off) against existing cards while a major drive was underway to give stores worldwide the chance to sell off very significant remaining stock. That's reasonable. Remember also that they stated prices would start at $499 for the lower 2070 model - a model they also stated is 'more powerful' than a Titan Xp (i.e. > 1080Ti) - so you can imagine where the price of a 1080Ti will end up. I also think we shouldn't lose sight of where they chose to hold the presentation - Gamescom - and then factor in what that crowd would naturally understand by the term 'more powerful'.
     
  7. nevcairiel

    nevcairiel Master Guru

    Messages:
    559
    Likes Received:
    173
    GPU:
    MSI 1080 Gaming X
    Honestly, the Pascal announcement was similar. It was all "2x the performance" (in VR workloads), "3x the efficiency" (in VR workloads), and things like that. They don't really present non-marketing numbers at those events.
    Granted, Pascal probably had a bigger FLOP advantage over the old gen at that time, but between the increase in CUDA cores, GDDR6, and architectural improvements, I still expect healthy growth in performance.
     
  8. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    2,893
    Likes Received:
    362
    GPU:
    EVGA 1070 FTW
    Thanks for the correction. For a second I thought you meant UE4 didn't have DX12 shader support...
     
  9. MarkyG

    MarkyG Master Guru

    Messages:
    803
    Likes Received:
    53
    GPU:
    Nvidia GTX 970
    ARK + RTX - HAHA! Haven't laughed this hard since... well, ever! They can't even get the core game running well, let alone add another layer of potentially performance-crippling features! What a joke.
     
  10. H83

    H83 Ancient Guru

    Messages:
    2,643
    Likes Received:
    343
    GPU:
    Asus ROG Strix GTX1070
    So I've just read AnandTech's hands-on with the 2080 Ti, and they are saying that the card struggles to hit the 60 fps mark with RT enabled at 1080p resolution... Performance can't be this bad, right???
     

  11. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,093
    Likes Received:
    1,862
    GPU:
    -NDA +AW@240Hz
    I was helping a friend with running a dedicated server. The ARK server's main problem is AI, and the range from players at which full AI is enabled for each dino.

    If you had 100 players in one base and everyone went out as one big group, servers would perform well. But because players like to have separate bases, they end up making almost all dinos fully active.
    I made a mod package that diagnosed CPU times and improved performance by reducing activation ranges (a rough sketch of the idea is below). Still, it would be better if the developers optimized the AI tick rate and threading in general.
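
    A minimal sketch of the idea, assuming hypothetical names, ranges, and a plain Python stand-in (nothing here is ARK's or UE4's actual code): dinos near a player run full AI every tick, farther ones drop to a reduced tick rate or go dormant, so shrinking the activation ranges directly cuts server-side AI work.

    import math

    # Hypothetical distance-gated AI activation; all names and range values are made up.
    FULL_AI_RANGE = 3000.0      # within this distance of a player: full AI every tick
    REDUCED_AI_RANGE = 8000.0   # between the two ranges: AI runs at a reduced tick rate

    def ai_mode_for_dino(dino_pos, player_positions):
        """Return 'full', 'reduced' or 'dormant' based on the nearest player."""
        nearest = min((math.dist(dino_pos, p) for p in player_positions),
                      default=float("inf"))
        if nearest <= FULL_AI_RANGE:
            return "full"
        if nearest <= REDUCED_AI_RANGE:
            return "reduced"
        return "dormant"

    # Example: one player at the origin, dinos scattered around the map.
    players = [(0.0, 0.0, 0.0)]
    dinos = {"raptor": (1000.0, 500.0, 0.0),
             "rex": (5000.0, 0.0, 0.0),
             "dodo": (20000.0, 0.0, 0.0)}
    for name, pos in dinos.items():
        print(name, ai_mode_for_dino(pos, players))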

    The graphical bottleneck is another issue, one you could not really optimize without some comprehensive mod, which was beyond my understanding of UE4 and the time I could spare. I wonder if they actually improved graphical performance.

    I was working for some time on a smaller custom map which combined the interesting parts of all the official maps. It ran well because there were not that many dinos, but I did not finish work on all the geometry joins, so I never released it.

    That tells you something about the developers. They simply ignored the weak points of their design and made them even bigger.
     
    MarkyG likes this.
  12. AlmondMan

    AlmondMan Master Guru

    Messages:
    439
    Likes Received:
    26
    GPU:
    Sapphire 480 Nitro+
    The devs stated on Twitter in response that this was a very early stage of implementation and optimisation, and that it wouldn't be in the game at launch but would be included in a patch sometime post-launch.
     
  13. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,696
    Likes Received:
    169
    GPU:
    EVGA GTX 1080Ti SC
    We did go for AMD cards in the 5xxx-2xx era because they were competitive... I loved my 7970s. Then Fury happened. Then Vega happened. Now Navi is going to happen.

    AMD incredibly lacks focus in the GPU segment.
     
    Noisiv and fantaskarsef like this.
  14. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,137
    Likes Received:
    253
    GPU:
    GTX 1080 Ti
    So to summarize, this next gen is going to be a hilarious baby step over Pascal, aside from the gimmicky ray tracing, which is itself a first baby step. Okay, good to know developers will have to optimize games a little bit so I can keep using my 1080 Ti.
     
  15. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,603
    Likes Received:
    461
    GPU:
    RTX 2070 Strix

  16. nevcairiel

    nevcairiel Master Guru

    Messages:
    559
    Likes Received:
    173
    GPU:
    MSI 1080 Gaming X
    Turns out they are only going to support DLSS anti-aliasing, which should be a pretty decent GPU performance boost, and one the game can certainly use.

    I've never played ARK on an official server, or even a big custom server, only ~20-player custom servers or so, and I really can't complain much. Sure, it has its quirks, but it's still plenty of fun.
     
    MarkyG likes this.
  17. MarkyG

    MarkyG Master Guru

    Messages:
    803
    Likes Received:
    53
    GPU:
    Nvidia GTX 970
    Interesting. Oh, yeah, the game is fun. No question. Just hoping the performance gets sorted.
     
  18. Camaxide

    Camaxide Member

    Messages:
    40
    Likes Received:
    9
    GPU:
    MSI 1080 Ti Gaming X SLI
    deliver "better" results.. surely not :) depends on weather you like of course - but better results is usually defined as more realistic results - and these cheap methods look anything but real. You look at a still image for half a second and you recognize it as rasterized graphics just by looking at the fake sharp shadows and uniformity of shading. - If ray traced shadows dont do what should be sharp, sharp - then that is the fault of those who set up the settings for it, not the tech..
     
