NVIDIA releases some RTX 2080 performance numbers and some info on DLSS

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 22, 2018.

  1. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
Yes, everyone is abandoning nVidia in droves in order to use AMD's FreeSync, CrossFire, TressFX, and every other copied nVidia technology (G-Sync, SLI, HairWorks). But I guess that is to be expected from the company that got started making clones of Intel's processors (the 8088 and the Am286)... /rolleyes
     
    BangTail and pharma like this.
  2. RzrTrek

    RzrTrek Guest

    Messages:
    2,547
    Likes Received:
    741
    GPU:
    -
I do agree that the pricing is aggressive to the point of being extreme, but I also know that GN is biased towards AMD, and that's why I'm not subscribed.
     
  3. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,466
    Likes Received:
    2,579
    GPU:
    ROG RTX 6090 Ultra
My suspicion as well. "Sold out" in a lot of stores already? That sounds very fishy for a card without a single verifiable gaming benchmark.

On the other hand, miners all over the world have pockets deep enough to give it a try and hope it beats existing cards significantly. And if it doesn't, who cares; they can just resell it to a gamer.

Btw, XBT and ETH seem to be slightly on the rise again (or at least not dropping anymore) since that official teaser video with "2080" all over it. Coincidence? I don't believe in coincidences.
     
  4. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
A lot of people have been holding out for two years, chomping at the bit for something new.
I'm not chomping that badly, but show me numbers and I'm ready to decide.
     
    fantaskarsef likes this.

  5. Prince Valiant

    Prince Valiant Master Guru

    Messages:
    819
    Likes Received:
    146
    GPU:
    EVGA GTX 1080 ti
    You know there's an emoticon :rolleyes:?

    That said:
G-Sync still requires expensive hardware, but what does it offer over FreeSync at this point? 3dfx made SLI long before Nvidia did. TressFX (Tomb Raider, Mar. 2013) was out before HairWorks (CoD: Ghosts, Nov. 2013), unless I'm mistaken?

There's nothing stopping these stores from listing small quantities on their store pages to make it look like the card is continually 'sold out', in an attempt to drive up demand.
     
  6. pharma

    pharma Ancient Guru

    Messages:
    2,497
    Likes Received:
    1,198
    GPU:
    Asus Strix GTX 1080
    Wrong thread.
     
    Last edited: Aug 26, 2018
  7. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    G-sync has slightly less latency than freesync, but otherwise you are completely right. Tressfx even runs tons better than hairdoesntwork.
     
  8. alanm

    alanm Ancient Guru

    Messages:
    12,287
    Likes Received:
    4,490
    GPU:
    RTX 4080
Very easy for any store to be "sold out" if they only had a tiny number of units to start with. Hype usually builds up ahead of any GPU release, especially one coming 2 1/2 years after the last one.
     
  9. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
In the tests that matter, FreeSync has lower lag.
To be precise, at very low fps (around 30~40 fps) G-Sync has slightly lower lag than FreeSync. As if I would care that a slideshow responds 4 ms faster to input.
At higher frame rates (100+ fps), FreeSync's input lag is lower than G-Sync's. I bet everyone understands what that means: it is better where it matters.
     
  10. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
G-Sync is lower at 144 Hz locked to 142 fps.


@ 14:04 in the video

It doesn't really matter, as the increased lag would not be perceptible with either setup.
     
    Last edited: Aug 27, 2018
    BetA and Dragam1337 like this.

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Do you understand that with such a small difference in the result, 1~2 ms, it is G-Sync that is slower?
Do you know why?
It's because of the GPUs they compared. The added lag from the frame render time of the GTX 1080 is half the added lag from the frame render time of the RX 570.
Locked fps or not does not change that: the GTX 1080 simply produces a frame in half the time it takes the RX 570 to make one.

The result would go the other way around if you took a Vega 64 vs. a GTX 1060. Don't they understand what the entire "motion to pixel" chain is made of?

It is highlighted by them picking Overwatch, which has a broken internal framerate limiter. That's why their 300 fps (in-game) limit produces 35 ms average input delay instead of 14 ms when externally limited to 144 fps.

But you are right that nobody would really notice a 1~2 ms difference. They could have made a simple DirectX test project that ticks internally 10,000 times per second and has just a black background and one polygon that changes colour when it receives the left-mouse-down event (rough sketch of that idea below).
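Purely as an illustration of that idea, here is a minimal sketch of such a click-to-colour-change target. It is written in Python with pygame rather than the DirectX project Fox2232 suggests, and it uses a plain uncapped render loop instead of a 10,000 Hz internal tick; the window size, colours and triangle coordinates are placeholder assumptions, not anything from the thread.

import pygame

# Hypothetical stand-in for the "black background + one polygon" latency target.
pygame.init()
screen = pygame.display.set_mode((640, 480))
pygame.display.set_caption("click-to-photon test target")

pressed = False
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.MOUSEBUTTONDOWN and event.button == 1:
            pressed = True       # left mouse button went down
        elif event.type == pygame.MOUSEBUTTONUP and event.button == 1:
            pressed = False

    screen.fill((0, 0, 0))                                   # black background
    colour = (255, 255, 255) if pressed else (40, 40, 40)    # flip colour on click
    pygame.draw.polygon(screen, colour, [(320, 140), (420, 340), (220, 340)])
    pygame.display.flip()        # present as fast as possible; film the screen and
                                 # the mouse with a high-speed camera to read the delay

pygame.quit()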
     
  12. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    You did not read the graph properly.

You compare the difference of FreeSync off vs. on, at 144 Hz with the cap at 142 fps (values in ms):

FreeSync off / FreeSync on
Max = 20.83 / 21.67
Avg = 15.00 / 15.28
Min = 11.67 / 11.67

Max showed almost 1 ms more; avg showed a 0.28 ms increase in input lag.

Compare the same way for G-Sync.

G-Sync only adds 0.06 ms of input lag.
It also shows a lower minimum and the same maximum.

That shows G-Sync has less lag 'where it matters', as you say.

    But like I said, either way is imperceptible.

Lastly, I'm very familiar with Overwatch and how it works: the in-game fps limiter is much better for input lag than an external limiter.
Nor does the 300 fps cap cause any increase in input lag, which battlenonsense verifies multiple times in his analysis videos.
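For what it's worth, the deltas quoted above are just on-minus-off differences; a quick sketch of that arithmetic in Python, using only the ms figures listed in this post (the 0.06 ms G-Sync figure is quoted directly, since its raw off/on values are not given here):

# Input-lag deltas from the FreeSync off/on figures above (ms, 144 Hz, 142 fps cap).
freesync_off = {"max": 20.83, "avg": 15.00, "min": 11.67}
freesync_on  = {"max": 21.67, "avg": 15.28, "min": 11.67}

for key in ("max", "avg", "min"):
    delta = freesync_on[key] - freesync_off[key]
    print(f"FreeSync {key}: +{delta:.2f} ms")   # max +0.84, avg +0.28, min +0.00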
     
    Last edited: Aug 27, 2018
    Dragam1337 likes this.
  13. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
You lived without lots of things for 20 years; by that logic, is anything innovative?
     
    Maddness likes this.
  14. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
If all you played for 20 years was Minesweeper, then no, you don't need it and never will.
     
    Maddness likes this.
  15. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
He talked about problems he faced. Sadly, he did not show a frametime graph in the video. It is a good idea to look at one, I tell you; I am not taking anyone's word about frametimes without an RTSS graph in the video.
Secondly, he took very few samples; that's why you think the nV side does not change in delay. It simply is not measured.
At 14:08, check 60 Hz / 60 fps, G-Sync OFF/ON... G-Sync increased the average delay by ~1 ms, but at the same time he measured that it decreased the minimum and maximum delay by ~1 ms.
Then again, who can blame him; a 1200 fps camera is not enough. It captures a frame every 0.833 ms => 0.417 ms average error (0.833 ms max error)... Considering he is measuring time frames as small as 9 ms, that is a huge error (rough sketch of the arithmetic below).
So a result like Max = 20.83 / 21.67 is really just one frame that either falls within a single camera-frame window where the camera does not capture the change, or hits the edge, so a small time difference is measured as large.
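A rough sketch of that sampling-error arithmetic, assuming nothing beyond the figures in this post (the 9 ms value is simply the small measurement window mentioned above):

# Quantisation error of a 1200 fps measurement camera, relative to a ~9 ms reading.
camera_fps = 1200
frame_period_ms = 1000 / camera_fps      # ~0.833 ms between captured frames
avg_error_ms = frame_period_ms / 2       # ~0.417 ms average error
max_error_ms = frame_period_ms           # worst case: one full camera frame

measured_ms = 9.0
print(f"frame period: {frame_period_ms:.3f} ms")
print(f"avg error:    {avg_error_ms:.3f} ms (~{avg_error_ms / measured_ms:.0%} of a {measured_ms:.0f} ms reading)")
print(f"max error:    {max_error_ms:.3f} ms (~{max_error_ms / measured_ms:.0%})")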

I have asked people around here before about Overwatch and posting an RTSS frametime graph, without much success on our little forum. People would rather not know.
     

  16. alanm

    alanm Ancient Guru

    Messages:
    12,287
    Likes Received:
    4,490
    GPU:
    RTX 4080
  17. RavenMaster

    RavenMaster Maha Guru

    Messages:
    1,360
    Likes Received:
    253
    GPU:
    1x RTX 3080 FE
  18. alanm

    alanm Ancient Guru

    Messages:
    12,287
    Likes Received:
    4,490
    GPU:
    RTX 4080
    Looks perfectly in line with expected performance. What would constitute a 'non-fake' score to you?
     
  19. RavenMaster

    RavenMaster Maha Guru

    Messages:
    1,360
    Likes Received:
    253
    GPU:
    1x RTX 3080 FE
    Official review scores by Hilbert ;)
     
