How to play pc games at 30fps?

Discussion in 'Videocards - AMD Radeon' started by CrunchyBiscuit, Jan 16, 2013.

  1. angmar

    angmar Guest

    There is already an incredibly easy way to do this, and has been for a while. Download MSI Afterburner. When it's running, look in your system tray for a thing called RivaTuner Statistics Server. Open it up, and where it says "framerate limit", set it to whatever you want to cap frames at... as someone else already said over a year ago at the beginning of this thread.
     
  2. CrunchyBiscuit

    CrunchyBiscuit Master Guru

    Messages:
    343
    Likes Received:
    126
    GPU:
    MSI GTX 980 Ti
    I'm pretty sure you didn't read this thread carefully enough.

    I'll take some time to reply tomorrow or during the weekend. I accidentally deleted my post due to an image edit and am too tired to repost.
     
    Last edited: Jul 24, 2015
  3. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Let me participate in the necro too. It's one of my favorite subjects. In order to get properly paced 30fps, you need to have double vsync enabled and use a frame limiter. Those two are very different things. Vsync by itself won't pace the frames evenly down to 30 (you might get judder), and a frame limiter by itself can usually only pace so much.
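
    A quick back-of-the-envelope sketch of the vsync half of that claim (the render times are made-up numbers, and the model is a strict double buffer with no render-ahead):

        # Double-buffered vsync alone at 60 Hz, with hypothetical render
        # times hovering around 33 ms: each finished frame waits for the
        # next vblank, so it occupies either 2 or 3 refreshes.
        import math

        VBLANK_MS = 1000.0 / 60.0                    # 16.67 ms per refresh
        render_ms = [32, 34, 33, 35, 31, 34]         # hypothetical frame times

        print([math.ceil(t / VBLANK_MS) for t in render_ms])
        # [2, 3, 2, 3, 2, 3] -> an uneven 30/20 fps mix, i.e. judder

    With a limiter pacing presents to exactly two refresh intervals (plus a bit of render-ahead to absorb the 34-35 ms frames), every frame lands in the 2-vblank bucket. That's why the combination works where either piece alone doesn't.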
     
  4. flow

    flow Maha Guru

    Messages:
    1,023
    Likes Received:
    17
    GPU:
    Asus TUF RTX3080 oc
    30fps on consoles is acceptable since those games are tailored to that medium.
    On PC it's an entirely different situation, and playing at 30fps is getting close to torture.
    Or else play at low resolutions, but that doesn't look nice on a PC screen.
    Again, console games are tailored to that medium.
     

  5. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Except there's at least a 70% overlap between PC and console games, and the whole point of owning a PC is the choice it gives you. As someone who plays The Witcher 3 downsampled from 1440p@30 with a DualShock 2, I disagree, sir :p
     
  6. CrunchyBiscuit

    CrunchyBiscuit Master Guru

    Messages:
    343
    Likes Received:
    126
    GPU:
    MSI GTX 980 Ti
    Hey there, cool you took the time to participate!

    Sound advice. I have 29 background processes running when I check Ctrl+Alt+Del on Win7 x64 at idle - all default Windows tasks. Useless stuff has been disabled.

    I always make sure the frame times are near perfect in all titles I use as examples, or as perfect as possible. All factors that could cause any form of stutter related to uneven frame times are eliminated, as are potential mouse polling rate and other input frequency misalignments. Tests are best done on game engines that don't have any odd engine-related stutters or VRAM access/load stutters. Indeed, it varies from engine to engine and even from game to game on the same engine.

    Back to the general topic - a manual fps cap using arbitrary numbers (RadeonPro, Dxtory, RTSS, FPS_Limiter) is, without exception, fundamentally unable to guarantee that the manually set frame rate is a proper and exact fraction (half, for example) of the refresh rate. This will always eventually lead to judder when using half frame rates (30fps@60Hz, for example) or thirds (20fps@60Hz, for example).
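
    Here's a back-of-the-envelope sketch of that drift (the 59.94Hz figure is a made-up "true" panel rate for illustration, not a measurement of any specific monitor):

        # A limiter set to a nominal 30 fps racing a panel that actually
        # refreshes at 59.94 Hz. Each frame should cover exactly 2 vblanks;
        # the tiny mismatch periodically squeezes a frame down to 1 vblank.
        from fractions import Fraction
        import math

        CAP_FPS = 30                          # the limiter's setting
        REFRESH_HZ = Fraction(5994, 100)      # hypothetical true rate, 59.94 Hz

        # frame n appears on the first vblank at/after n / CAP_FPS
        durations = [math.ceil((n + 1) * REFRESH_HZ / CAP_FPS)
                     - math.ceil(n * REFRESH_HZ / CAP_FPS)
                     for n in range(1200)]    # 1200 frames = 40 s at 30 fps

        print([i for i, d in enumerate(durations) if d != 2])
        # [499, 999]: a single-vblank frame every 500 frames (~16.7 s) -
        # that's the periodic judder; at exactly 59.94/2 fps it'd be empty

    Any mismatch at all, however small, only stretches the interval between those slips; it never removes them.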

    Only a 1/2 refresh rate fps cap or double v-sync fps cap feature can potentially guarantee a judder-free experience, which is why the feature is so useful.

    Unless you're on G-Sync or FreeSync, individual frame timing isn't the problem here; this is about judder and the basics of frequencies and harmonics on traditional monitors and display devices.

    I'll try explaining the issue in some more detail using FRAFS graphs, and also try to demonstrate that these graphs won't (are literally unable to) capture this form of judder.

    I tested many games (old and new) on various platforms (both AMD and Intel, various motherboards and RAM, some on SSD), and all give pretty much the same results with regards to judder or frame rate/refresh rate misalignment. I'll use the game Two Worlds II as an example below, since this game has few issues, is very predictable in its behavior and runs in DX10, so it reacts properly to RadeonPro's Double v-sync setting. I also don't have any other DX10/11 games installed at the moment (RadeonPro's Double Vsync feature only works in DX10/11 games). All graphs were produced using the same ingame conditions, with triple buffering and v-sync enabled through RadeonPro (on an HD 6950 2GB, Win7 x64):

    [frame time graph]
    This is Two Worlds II, capped with DFC (RadeonPro's fps capper) manually at 30fps@60Hz (produces small but perceivable judders once every few seconds).

    To rule out the frame rate being the variable, to keep the graphs consistent and to be able to make my point, I lowered my refresh rate instead of my frame rate in the graph below:

    [frame time graph]
    This is Two Worlds II, capped with DFC manually at the same 30fps as on the first graph, but @50Hz this time (produces a VERY noticeable judder all the time).

    Both graphs look pretty much identical. However, 30fps@50Hz very clearly does not look as smooth in motion as 30fps@60Hz does (despite the graph suggesting otherwise), for the same reason (well known in the TV industry) that 25fps doesn't look smooth at 60Hz and 30fps doesn't look smooth at 75Hz: the frame rate doesn't align with the refresh rate. The easily observable and very obvious judder at 30fps@50Hz gets lost in translation and can't be seen on the graph.
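
    The cadence arithmetic behind this is easy to check. A quick sketch, assuming perfectly even frame presentation (an idealization - real frames wander a little):

        # How many vblanks each frame occupies, assuming frames are presented
        # at a perfectly even rate and displayed on the next vblank.
        import math

        def cadence(fps, hz, frames=12):
            return [math.ceil((n + 1) * hz / fps) - math.ceil(n * hz / fps)
                    for n in range(frames)]

        print(cadence(30, 60))  # [2, 2, 2, ...]          even hold -> smooth
        print(cadence(30, 50))  # [2, 2, 1, 2, 2, 1, ...] uneven -> judder
        print(cadence(25, 60))  # [3, 2, 3, 2, 2, ...]    the classic TV case

    A frame time graph only ever sees the left-hand side of this (when frames are produced), never the right-hand side (how long each one is held on screen), which is exactly why the 50Hz judder is invisible in it.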

    [frame time graph]
    This is Two Worlds II using RadeonPro's double v-sync without an arbitrary fps cap @60Hz (no judder at all, perfectly smooth visually, just like 30fps on my consoles).

    The third graph looks pretty much the same as the first two, but on closer inspection it looks less precise. Despite the graph indicating slightly worse frame times and more timing variance, there was absolutely no noticeable judder AT ALL during this last test. The visual results were by far the best, most fluid and smoothest (perfect 30fps, just like on a console) compared to the other two tests. There were no judders, no stutters, no missed or lost frames observed, just a perfectly consistent 30fps. There was very bad input lag while playing though, which is to be expected when using the double v-sync method without a manual fps cap alongside it. Again, the graph does not reflect the observed behaviour, because it simply can't - it's not designed to.

    An absolutely perfectly consistent manually capped 30fps@50Hz looks way worse in action than a double v-synced 30fps@60Hz for obvious reasons, even though the graph might show a near flat line using the arbitrary cap while it shows more spikes and fluctuations using double v-sync without a cap. These graphs do not take the refresh rate of the monitor into consideration.

    As I mentioned in a previous post, to properly put the issue described in this thread in a graph, we'd need to know both the timing/rate of the monitor's blank intervals and the timing/rate of the video card's frame output, not just the latter. These graphs are based on data intercepted only from the video card, not from the monitor (frame time benchmarks still work when the monitor is off). It's just like the tearline visible during movement when v-sync is off - an internal screenshot will never capture that tearline, and in the same way a frame timing graph can't capture the type of judder this thread is about. It's different with G-Sync and FreeSync, since fps=Hz in those cases.

    Not sure, but I'm very curious too.

    Indeed, many devices have a slight offset when it comes to refresh rates, as you already noticed, which can definitely cause discrepancies. Both PowerStrip and ReClock can also display the exact current display refresh rate. I've also noticed the default 60Hz mode on my primary display device (Dell 2209WA) is different from the one on my Samsung television (2nd monitor). My brother's monitor runs at yet another frequency close to 60Hz, but not exact. CRU is an awesome little tool that's extremely useful in this area indeed. I'm not sure exactly how consoles handle these matters; I know the guys over at Digital Foundry are able to measure frame rates on consoles, so maybe they can offer some insight into how consoles handle the frame cap as well.

    No matter how close I get, I'll never be able to perfectly sync fps/Hz manually. I tried, for far too long. It's quite literally impossible; neither CRU nor any frame limiter offers enough precision. The refresh rate or the frame rate always ends up either too high or too low compared to the other, with no smaller adjustable steps in between.
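
    You can put a number on how hopeless that chase is. A rough rule of thumb (my own back-of-the-envelope model, with made-up example rates): the residual beat between twice the cap and the refresh rate tells you how often a frame slips.

        # Seconds between judder events for a manual cap at ~half refresh
        # (rule-of-thumb model; the rates are hypothetical examples).
        def judder_period(cap_fps, refresh_hz):
            beat = abs(2.0 * cap_fps - refresh_hz)   # residual beat in Hz
            return float('inf') if beat == 0 else 1.0 / beat

        print(judder_period(30.000, 60.000))  # inf: a perfect half rate
        print(judder_period(30.000, 59.940))  # ~16.7 s between slips
        print(judder_period(29.985, 59.971))  # ~1000 s: close, never exact

    Even that last case still judders eventually, which is the whole argument for letting the display itself dictate the rate instead of chasing it with a limiter.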

    I think the importance and usefulness of a feature such as nVidia's 1/2 refresh rate or RadeonPro's Double Vsync is largely overlooked. It's not the same as an arbitrary fps capper (the frame rate is dictated by the monitor), and it offers the perfect solution to the judder problem encountered while using a regular fps cap. The feature should be more widely understood, to get rid of the misconceptions, and more widely supported, to get rid of the juddering.

    G-Sync and FreeSync might also offer a solution, but I have yet to try those out. I've seen some in action but didn't have the time to test properly and compare. I imagine frame times are more important with G-Sync and FreeSync, since frames are being displayed immediately.

    If you want to play at perfect half frame rates in DX9, it seems like the only option is to go nVIDIA.

    Yes, flip queue size can and will affect frame timing, provided the game actually reacts to the setting (some games override it). However, adjusting the setting rarely helped me with solving any stutter issues; it mostly just causes them, or moves them from one area to another. It can help with reducing input lag though, which is useful in games that perform flawlessly regardless of what the setting's at. Using a value of 1 is almost always bad when v-sync is enabled and will more often cause stutters than solve them, but it got rid of the cursor skips in Lara Croft and the Guardian of Light. The situation might be different using G-Sync or FreeSync.
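
    For a sense of scale (rule-of-thumb arithmetic, not measurements): each pre-rendered frame sitting in the flip queue adds roughly one frame time of input latency, which is brutal at 30fps.

        # Rough input-lag contribution of the flip queue (pre-rendered
        # frames): each queued frame adds about one frame time of latency.
        FRAME_MS = 1000.0 / 30.0                 # 33.3 ms per frame at 30 fps
        for depth in (1, 2, 3):
            print(depth, "->", round(depth * FRAME_MS, 1), "ms")
        # 1 -> 33.3 ms, 2 -> 66.7 ms, 3 -> 100.0 ms of added queue latency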

    Official support from AMD for features that offer solutions to this problem would be great indeed!

    I agree, low frame rates can be very acceptable when they're synchronized and being displayed in a perfectly consistent fashion. Nothing beats the glorious smoothness of 120fps@120Hz though.
     
    Last edited: Oct 15, 2015
  7. NiGMa46

    NiGMa46 Guest

    Messages:
    3
    Likes Received:
    0
    GPU:
    TG Extreem Dark 1066 6GB
    Thanks for the thorough reply.

    I have been trying to use regular vsync lately, without triple buffering, then setting the ingame settings to be demanding enough to keep the frame rate between integer multiples of the refresh rate (i.e. with a 75Hz monitor, 37.5 to <75 FPS; see the sketch below). This gives me the smoothest and most consistent frame times.

    Unfortunately, modern games tend to have triple buffering enabled by default; this wasn't the case a few years ago, because I remember having to enable it with external apps. I wasn't aware of judder back then, but I find it extremely distracting now. Even if the frame time drops just a little bit, there is a massive interruption to the motion, so triple buffering really isn't a solution, since I'd have to set the game's graphics settings low enough to prevent those dips anyway. Because I like to use downsampling, triple buffering also uses up quite a bit of RAM I'm told, so it's really not doing much for me.
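
    Here's the arithmetic I'm relying on, as a simplified model (it ignores the render-ahead queue entirely, and the 75Hz numbers are just my monitor's case):

        # Double-buffered vsync: a finished frame waits for the next vblank,
        # so its duration rounds up to a whole number of refresh intervals
        # (simplified model with no render-ahead queue).
        import math

        HZ = 75.0
        VBLANK_MS = 1000.0 / HZ                  # ~13.33 ms at 75 Hz

        def effective_fps(render_ms):
            return HZ / math.ceil(render_ms / VBLANK_MS)

        for render_ms in (10, 14, 20, 26, 28):
            print(render_ms, "ms ->", effective_fps(render_ms), "fps")
        # 10 -> 75.0, 14 -> 37.5, 20 -> 37.5, 26 -> 37.5, 28 -> 25.0

    As long as every frame takes between one and two refresh intervals, vsync itself locks the output to a steady 37.5fps; the judder only starts when frame times straddle one of those boundaries.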

    Ideally it would be good to have double vsync working solidly. We just need to educate the masses on consistent frame times... e.g. telling someone: imagine going to the movies and watching a movie that stutters while people are talking or the camera is panning, haha.
     
  8. CrunchyBiscuit

    CrunchyBiscuit Master Guru

    Messages:
    343
    Likes Received:
    126
    GPU:
    MSI GTX 980 Ti
    I spent the last two days trying out everything GeDoSaTo has to offer. I'm impressed by its features; the frame rate limiter works great and is very precise, more precise than RadeonPro's frame limiter in many titles known for fluctuating frame times (Skyrim, Far Cry 3).

    It also has a double and triple v-sync feature, which just works in DX9. Combining it with a frame limiter capping the fps slightly below half of the refresh rate (for example, 32.499fps@65Hz) gives excellent results without any added input lag. Frame rates don't drop back to a third of the refresh rate when half can't be kept up. Durante's frame limiting method also offers prediction, which in fact works pretty well (it noticeably reduces input lag when the frame rate is above the v-sync target).
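
    The reasoning behind the odd-looking 32.499 cap, in rough numbers (my own beat-frequency estimate, not anything GeDoSaTo documents): capping a hair below half refresh keeps the limiter, rather than the swap chain, as the thing the game waits on, and the cost is only a rare repeated frame.

        # Cost of capping just below half refresh (32.499 fps @ 65 Hz):
        cap_fps, refresh_hz = 32.499, 65.0
        beat = abs(2.0 * cap_fps - refresh_hz)   # ~0.002 Hz residual beat
        print(1.0 / beat)                        # ~500 s between slips
        # one extra-held frame roughly every 8 minutes, in exchange for the
        # limiter keeping the queue drained -> no v-sync back-pressure lag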

    The only downside (which unfortunately renders the utility completely useless to me) is that just running GeDoSaTo in the background with all effects disabled (as clean and bare-bones as possible) already reduces my frame rate quite a lot in CPU-limited scenarios (from a 34fps minimum down to a 29fps minimum in the same scene). With GeDoSaTo running, I cannot keep up a minimum of 32.5fps (half of 65Hz) in Skyrim.

    I don't understand why there is such a heavy impact on performance. Like I stated before, not even enabling ANY options at all in GeDoSaTo - just running it in the background - impacts performance noticeably. What a bummer. Still no proper solution to this. Back to 32fps@65Hz without v-sync.
     
  9. Nurmi

    Nurmi Guest

    Messages:
    140
    Likes Received:
    0
    GPU:
    R9 Fury Tri-X
    I tried to suggest on the AMD forum some sort of frame doubler: a simple setting to copy frames 2x/3x/4x, with an option to cap where it turns off,
    OR the same but with a target fps, so that it automatically doubles or triples them.

    I personally cannot comprehend what's going on in a 30fps game; to me it looks very similar to extreme motion blur, like wtf? For example, I really haven't been able to enjoy DA:I's cutscene story, as it's locked to 30fps, which is really sad. And the workaround breaks MP.

    https://community.amd.com/message/2673042
     
    Last edited: Sep 28, 2015
  10. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    RTSS frame limiter set to 30.
    Run the game in fullscreen borderless windowed mode.
    Disable in-game v-sync.

    This is the smoothest way of playing games, in my opinion.
     

  11. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    MGSV is such a well-optimized game. Sometimes I can't believe it.
    I did what I said on page 2, and it held a solid 33ms frametime (30fps) through an entire mission. It didn't feel great though, as I'm used to higher FPS.
     
  12. dieandromeda

    dieandromeda Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    Gigabyte 980 Ti Xtreme WF
    Some people like me would rather use 30fps for smooth gameplay instead of a higher but inconsistent framerate. Catalyst is only able to limit fps to ~50; they should add an fps limiter like RadeonPro/MSI Afterburner have.

    I'm happy with my current setting, which includes VSR + a 30fps limit using MSI Afterburner. I tested with older games, which look perfectly fine at 30fps (except first-person shooters, which I set to remain @ 60fps).

    Refer here for video. I tested this setting using Windows 10 and the new Catalyst.

     
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Nice! I wouldn't have bet that the 270X had the horsepower for that, but it's quite impressive.
     
  14. GanjaStar

    GanjaStar Guest

    Messages:
    1,146
    Likes Received:
    2
    GPU:
    MSI 4G gtx970 1506/8000
    The problem with 30fps is that, for me at least, I'd have to play everything at 30fps to get used to it. Just looking at the above Project CARS 30fps vid looks horrible to me, because I've been playing games at a constant 60fps with the 970, and tweaked settings until I had a constant 60. Even at 55fps I immediately notice choppiness.


    Replaying Crysis 1 and not being able to get a constant 60fps in some scenes, as the game is notoriously CPU-limited even to this day, makes me cringe.

    I can imagine how the 120fps people feel when looking at a 30fps vid. Must be eye-straining :)
     
  15. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    What's wrong with 50FPS or even 40FPS? It's better than 30FPS :D
     

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    A minimum of 60fps in everything is quite unrealistic today. Even with multiple GPUs you have to rely on multi-GPU profiles for the thing to work, and then you get latencies and spikes. Since games will probably stay stuck at today's visual levels for at least 3-4 years, next-gen GPUs will probably manage a minimum 60fps at 1080p.
     
  17. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    I run games at 50FPS. I found my GPU is most stable at this FPS. If I up it to 60FPS I get unstable frametimes.
     
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    That's one of the nicest tricks to do actually ;)
     
  19. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    You are sadly right. :(

    Even with enough raw GPU power, the CrossFire dependency on profiles and the frametime problems are the norm, and the gameplay suffers.

    Mantle was not a solution; the Mantle vs DX11 case in BF4 shows it sometimes performs even worse!

    Is "miracle" DX12 the all-in-one universal multi-GPU/multi-core CPU gaming solution?

    I'll believe it when I see it. :nerd:
     
    Last edited: Oct 1, 2015
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    We will now depend on developers for multi-GPU tweaking. I don't believe much will change, but we'll see. My geek self trusts AMD and NVIDIA engineers more than any game developer :p
     
