High End GPUs Benchmarked at 4K Resolutions

Discussion in 'Frontpage news' started by Watcher, May 2, 2013.

  1. Watcher

    Watcher Ancient Guru

    Messages:
    2,695
    Likes Received:
    367
    GPU:
    Asus Dual RX6700 XT
    Last edited: May 2, 2013
  2. Blackops_2

    Blackops_2 Guest

    Messages:
    319
    Likes Received:
    0
    GPU:
    EVGA 780 Classified
As the review notes, we're still a good ways away from gaming at 4K on a single GPU. There's also the fact that the only 4K monitor I've seen is the Sharp PN-K321, which is supposed to retail at $5,500 lol. For $4,000 less I can get pretty close to the same pixel count in a multi-monitor setup. Sure, I'll have to deal with bezels and multi-GPU setups, but it still isn't anywhere near that range of expense.
     
  3. FULMTL

    FULMTL Ancient Guru

    Messages:
    6,704
    Likes Received:
    2
    GPU:
    AOC 27"
    Is 50" too big to sit in front of? It would definitely go above eye level. Maybe take the base off and lean it against the wall to be a little more comfortable.
     
  4. kanej2007

    kanej2007 Guest

    Messages:
    8,394
    Likes Received:
    60
    GPU:
    MSI GTX 1080 TI 11GB
Wow, even the 6GB Titan GPU is brought to its knees, averaging 20 FPS when fully maxed.

Current-generation cards are not sufficient for maxing titles such as Crysis 3 & Far Cry 3.

Even SLI/CrossFire struggles. Maybe next-gen cards, maybe!
     

  5. bemaniac

    bemaniac Master Guru

    Messages:
    341
    Likes Received:
    27
    GPU:
    Zotac 4090
Urgh, that review made me realise how badly CrossFire sucks compared to SLI. I'm using CrossFire and I often see 50 FPS but feel 20 FPS... God damn, those frame-time drivers for the 7990 are badly needed by 7970 CrossFire users right now.
     
  6. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    0
    GPU:
    GTX 970
I think some people out there are getting a little too far ahead of themselves. There isn't even a single GPU that can max C3 with a decent level of AA at 1080p, and somehow people have gotten the idea into their heads that the current batch can push out 4K, even in a multi-GPU configuration.
     
  7. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    The drivers are coming in June.

Why don't you use VSync? It eliminates microstutter, you get smoother gameplay even if there were no microstutter (due to syncing), you don't overwork the cards when they run over 60 FPS (they'd be capped), and you don't dip below 60 FPS (since multi-GPU AFR setups work inherently similarly to Triple Buffering, which keeps FPS up).

Minimize input lag by capping FPS to 60 with MSI Afterburner or RadeonPro, and there you go.

    Win-win-win-win situation IMO.

If you're still worried about VSync-induced lag (it should be very minimal at this point), then you should also know that any microstutter fixes, from AMD or Nvidia, add a frame of delay in order to smooth things out. This means that if you were thinking of running with VSync off and microstutter optimizations on, you will still get that frame of lag you were running away from with VSync. Limitations of AFR.

    So try VSync and see how it goes.
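For what it's worth, the cap idea above is simple enough to sketch. This is a hypothetical limiter loop in Python (the function name `run_capped` and its structure are my own illustration, not how MSI Afterburner or RadeonPro actually implement their caps): sleep until each frame's absolute deadline, so the GPU idles instead of racing ahead.

```python
import time

def run_capped(render, target_fps=60, frames=60):
    """Call `render` at most `target_fps` times per second by sleeping
    until each frame's deadline (absolute deadlines avoid drift)."""
    period = 1.0 / target_fps
    deadline = time.perf_counter()
    stamps = []
    for _ in range(frames):
        render()                      # the actual frame work goes here
        deadline += period
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)     # idle instead of rendering early
        stamps.append(time.perf_counter())
    # frame-to-frame intervals; ideally all close to `period`
    return [b - a for a, b in zip(stamps, stamps[1:])]

if __name__ == "__main__":
    intervals = run_capped(lambda: None, target_fps=60, frames=30)
    avg = sum(intervals) / len(intervals)
    print(f"average frame time: {avg * 1000:.2f} ms")  # ~16.67 ms
```

Because a missed deadline just shortens the next sleep instead of shifting every later frame, the average frame time converges on the target period even when the OS oversleeps occasionally.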
     
  8. MM10X

    MM10X Guest

    Messages:
    4,240
    Likes Received:
    1
    GPU:
    3080 FE
I wouldn't be interested in one until they can push 96-120 Hz!

I was chatting with my buddy about this topic. I hope display resolutions continue to evolve like this - I'd rather see higher resolution at this point than higher 3D quality. Optimization for more detail would be fantastic!
     
  9. Krogtheclown

    Krogtheclown Master Guru

    Messages:
    960
    Likes Received:
    0
    GPU:
    FuryXfire 3440x14freesync
I use RadeonPro and do not notice any microstutter. That doesn't mean you won't, but I don't notice it.
     
  10. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,725
    Likes Received:
    1,854
    GPU:
    EVGA 1070Ti Black
4K gaming, or hell, 4K as a standard, is a LONG way away for PCs, never mind consoles. I say 5 years at minimum, maybe more, before it becomes standard.
     

  11. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    There is no microstutter when using RadeonPro with the proper configs. It's not about if one notices it or not; it just isn't there.
     
  12. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
And I play games on the moon all day and all night. Come on, we're pros here: whether you like it or not, microstutter always was, still is, and always will be there with mGPU+AFR. It's just reduced with RadeonPro and DFC+TP+VSync :)
     
  13. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Vsync doesn't eliminate microstutter.
     
  14. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,411
    Likes Received:
    3,078
    GPU:
    PNY RTX4090
    THANK YOU!!!

In some cases, maybe a lot more than people think, VSync (even with triple buffering) can make microstutter even worse. The game engine tries to sync the frames, but those "runt frames" are still there: at the hardware AND driver level, the cards simply are not synced. Using dynamic frame rate control or capping frame rates, and increasing pre-rendered frames, can help, but once the stutter is gone you will more than likely get mouse lag. AMD needs to adopt a fix at the hardware level like Nvidia has done.
     
  15. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
I talk from a practical perspective and not from a theoretical one. If the frames are perfectly paced, with the setup outputting a frame every 16.67ms and each card starting its next frame while the other card is half done with its own, how is there practical, let alone theoretical, microstutter?

    Yes it does.

I am at 60 FPS with VSync off and there is immense microstutter. I enable VSync and it's completely smooth, as smooth as a single card. If I gave you a frametime graph from FCAT, it would be a straight line, and there's no reason it wouldn't be.

I am below 60 FPS with VSync off; again, immense microstutter, and observed FPS is below 30. I enable the RadeonPro OSD and toggle VSync on. Suddenly, observed FPS matches raw FPS.

    I'm not blind.

EDIT: CPC_RedDawn, that doesn't make sense. Runt frames don't work that way. Runt frames are the result of a lack of frame pacing. They are real rendered frames, and this can be proven: a benchmark that runs at slightly over 60 FPS and exhibits runt frames can be VSynced to a constant, smooth 60 FPS where no frames are dropped. The frames are displayed.

For anyone who disagrees: this is personal experience and extensive testing, and others have verified it. So don't argue against it if you haven't tried it.
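Purely as an illustration of the pacing argument (hypothetical numbers, not FCAT data): in 2-way AFR each card finishes a frame every 33.33ms, and whether the output looks smooth depends entirely on the offset between the two cards' schedules. A few lines of Python make the runt-frame pattern visible:

```python
def frame_times(offset_ms, n=8, gpu_period_ms=100 / 3):
    """Timestamps (ms) of displayed frames for 2-way AFR.
    offset_ms: how far GPU B's schedule lags GPU A's."""
    a = [i * gpu_period_ms for i in range(n)]
    b = [i * gpu_period_ms + offset_ms for i in range(n)]
    return sorted(a + b)

def intervals(ts):
    """Frame-to-frame gaps in ms."""
    return [round(b - a, 2) for a, b in zip(ts, ts[1:])]

paced   = intervals(frame_times(100 / 6))  # B offset by half a GPU period
unpaced = intervals(frame_times(2.0))      # B finishes only 2 ms after A

print(paced)    # uniform 16.67 ms gaps -> smooth 60 FPS
print(unpaced)  # alternating 2 ms / 31.33 ms gaps -> "60 FPS" that feels like 30
```

Both runs display the same number of frames per second; only the paced one spaces them evenly, which is why a raw FPS counter can't tell the two apart but the eye can.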
     
    Last edited: May 3, 2013

  16. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
We'll discuss whatever we like. And it's great that you believe personal experience somehow applies to everyone. VSync does not eliminate microstutter. That is a fact. Frame rate capping can do a lot to minimize it. You can argue with Nvidia and AMD about it.
     
  17. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    I use both VSync and frame-capping. It applies to every single game.

I'm telling you I feel the microstutter with it off; with it on, I can't distinguish the feeling from a single card. Then you tell me I "believe personal experience somehow applies to everyone". So basically, another person with the same setup as mine, using the same settings, will be getting microstutter?

    Should I start writing peer-reviewed articles to please you?

EDIT: Frame rate capping reduces microstutter by reducing GPU usage. VSync does the same above 60 FPS.
     
  18. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Indeed. Combined with frame rate capping. Vsync alone does not eliminate microstutter. That is what I took issue with.
     
  19. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
Oh, fine. Maybe alone it doesn't. I always use a cap anyway.

    However, I still get greatly reduced / eliminated microstutter below 60FPS, VSync on, while the FPS cap is at 60FPS.

    Also, this article (that convinced me to go multi-GPU):
    http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-11.html

They used Dynamic FRC for the AMD cards, which eliminates microstutter. No VSync involved (so shouldn't they call it Dynamic FRC rather than Dynamic VSync?).

They used Adaptive VSync for the Nvidia cards, which eliminates microstutter in the places where the GTX 690 was hitting 60 FPS.

Wouldn't this imply that VSync and FPS capping have the same effect on microstutter, at least when raw FPS is above the cap / refresh rate?
     
  20. Krogtheclown

    Krogtheclown Master Guru

    Messages:
    960
    Likes Received:
    0
    GPU:
    FuryXfire 3440x14freesync
I also use frame capping, but no matter what you say, someone will see it differently... which is why I say I don't notice microstutter; I'm sure others will say it's there.
     