The truth about PRE-RENDERING 0?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Tastic, Jul 16, 2012.

  1. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
Benchmark scores start to decrease as you decrease the number of frames rendered ahead. In drivers before r300, zero would have had a negative impact on benchmark scores, but not with r300+ drivers.

    Combine this with the fact that AMD removed the ability to tweak flip queue in their drivers and you begin to see the primary motivation Nvidia had for changing the setting's behavior in recent drivers.
     
    Last edited: Jul 17, 2012
  2. nick0323

    nick0323 Maha Guru

    Messages:
    1,032
    Likes Received:
    77
    GPU:
    Asus DUAL RTX2060S
Very bottlenecked. I had a Q9450 running at 3.2GHz and BF3 didn't like it; I had to run the game at Medium quality, of all things.

Switched to an i5 2500k running at 4.5GHz and I can now run BF3 at High / Very High with the same GTX470, which I still have now and also had in my previous system.

Not going to make the same "costly" mistake I made on my AGP motherboard, which was to keep buying a better graphics card for a bottlenecked CPU. Then again, we also hit the bandwidth wall with AGP.

I'll have to give this a go as I'm curious now. I'll try 1 and 2, but BF3 works a treat for me as it is.
     
  3. tweakpower

    tweakpower Banned

    Messages:
    932
    Likes Received:
    1
    GPU:
    MSI HD 6770
Actually, the graph clearly shows lower FPS with RFA = 0, but that could be a CPU limitation in this game. On the other hand, I agree with what you said about recent systems, maybe, with recent drivers of course.

The graph shows latency between frames, which in my opinion is much more important than average FPS. As you can see, RFA 0 added some extra latency in some parts (I actually tested the heaviest part of Mafia II on my system), and it clearly shows a GPU+CPU limitation in those parts. But there is another thing you maybe can't notice in this graph: FPS is steadier with RFA 0 (except for those few parts), and because it was around 30-40 FPS, that doesn't show up very clearly. The actual gameplay feel changed with RFA = 0, in a good way.

Back on topic: with Nvidia cards I find the best option to be RFA = 1; it is a good balance between framerate and lag. I think (not tested, though) it would matter more on quad+ core CPUs.

Here are some tests I found on a backup CD with a 9800GT and RFA = 1. All games were tested at high/maximum details, 16x AF and 4x AA. Resolutions were lower, 1024x768 or 1280x960; you can see the difference between well and badly optimised games, and capped games.

    [image] Not so good performance here.
    [image] Almost perfect, despite low FPS.
    [image] Capped game.
    [image] Good performance.
    [image] Badly optimised game; it needs an average FPS well over 200 to run well.

    From best to worst performing games: F1 2010, COD Modern Warfare 2, Doom 3, Trackmania United Forever, FIFA 11. There you go, go figure...
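    The frame-latency point above can be sketched with a toy calculation: two hypothetical frame-time traces with the same average FPS can have very different worst-case latency, which is why a frame-time graph tells you more than an FPS counter. All numbers here are invented for illustration.

    ```python
    # Sketch: why frame-time consistency matters more than average FPS.
    # Both traces below are made up; they share the same average FPS but
    # differ wildly in their worst 1% of frames.

    def avg_fps(frametimes_ms):
        """Average FPS = frames rendered / total time."""
        return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

    def worst_percentile(frametimes_ms, pct=0.99):
        """Frame time at the given percentile (higher = worse stutter)."""
        ordered = sorted(frametimes_ms)
        idx = min(int(pct * len(ordered)), len(ordered) - 1)
        return ordered[idx]

    steady = [25.0] * 100                # rock-solid 40 FPS
    spiky = [20.0] * 90 + [70.0] * 10    # same average, with big spikes

    for name, trace in [("steady", steady), ("spiky", spiky)]:
        print(name, avg_fps(trace), "avg FPS,",
              worst_percentile(trace), "ms 99th percentile")
    # Both traces average 40.0 FPS, but the spiky one has a 70 ms
    # 99th-percentile frame time versus 25 ms for the steady one.
    ```

    An FPS counter would call these two runs identical; only the frame-time view shows the difference you feel.
    
    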
     
  4. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    That's the reason Nvidia sets it to 3 by default.
     

  5. tweakpower

    tweakpower Banned

    Messages:
    932
    Likes Received:
    1
    GPU:
    MSI HD 6770
    Yes, but with 1 it is about the same as 3. OK, there is a bit of difference when FPS goes above 100, and minimum FPS is lower with 1, but overall it is more stable with 1 or 0.
     
  6. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    It can't be generalized as such, as performance will vary depending on game, hardware, etc.
     
  7. tweakpower

    tweakpower Banned

    Messages:
    932
    Likes Received:
    1
    GPU:
    MSI HD 6770
    Agreed, but from one system you can assume how it will behave on another. For example, if a game uses 4 or more threads, you can assume you will stress the CPU less with RFA 1, because every thread will prepare 1 frame ahead = 4 frames. But with a dual-core CPU that means 2 frames ahead, etc. That is the difference. More frames prepared ahead = more lag, so you could assume it is strongly advisable to use as few as possible with quad+ core CPUs and games that actually use them.

    Off topic: that is the reason I changed to AMD. From past experience I know Nvidia cards stress the CPU more than ATI, and that trend is still the same. A less stressed CPU and a different way the drivers work = less lag, and in some games fewer FPS with GPUs in the same range (similar performance), and I can live with that. In the end it is about what you really need: more FPS, or more stable FPS, a less stressed CPU (in my case very important), and less lag. As I always said, Nvidia drivers are very superior to AMD's, and to be clear, that is exactly the reason I go with AMD; for others it can be a reason to go with Nvidia. It depends what your aim is.
     
  8. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    You can't assume that the CPU always prepares several frames in advance. That generally only occurs when the GPU is under heavy load (or vsync'd).
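    A minimal sketch of that behaviour, using an invented back-pressure model (the timings and the queueing rule are illustrative, not any real driver's implementation): the CPU may only run ahead of the GPU up to the pre-render cap, so the queue only fills when the GPU is the slower side.

    ```python
    # Toy model of the pre-render ("flip") queue.

    def peak_queue_depth(cpu_ms, gpu_ms, max_prerender, frames=200):
        """Peak number of frames sitting between CPU submission and GPU completion."""
        cpu_done = 0.0   # time the CPU finishes preparing the current frame
        gpu_free = 0.0   # time the GPU finishes rendering the current frame
        done = []        # GPU completion time of each frame
        peak = 0
        for i in range(frames):
            # Back-pressure: the CPU may not run more than max_prerender
            # frames ahead, so it stalls until a queue slot frees up.
            if i >= max_prerender:
                cpu_done = max(cpu_done, done[i - max_prerender])
            cpu_done += cpu_ms
            gpu_free = max(gpu_free, cpu_done) + gpu_ms
            done.append(gpu_free)
            # Queue depth = frames submitted but not yet finished by the GPU.
            depth = sum(1 for t in done if t > cpu_done)
            peak = max(peak, depth)
        return peak

    # Fast CPU + slow GPU: the queue fills right up to the cap (added latency).
    print(peak_queue_depth(cpu_ms=1.0, gpu_ms=10.0, max_prerender=3))   # 3
    # Slow CPU + fast GPU: the queue never builds up; the cap is irrelevant.
    print(peak_queue_depth(cpu_ms=10.0, gpu_ms=1.0, max_prerender=3))   # 1
    ```

    In the CPU-limited case the setting changes nothing, which matches the point above: pre-rendered frames only pile up when the GPU (or vsync) is holding things back.
    
    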
     
  9. tweakpower

    tweakpower Banned

    Messages:
    932
    Likes Received:
    1
    GPU:
    MSI HD 6770
    Actually, that is news to me; I didn't know that. Well, that's why I always had unbearable lag with Nvidia when using vsync, but not with AMD. And even without vsync, as I tend to push for more FPS to get rid of lag. But I think you have a point there, since in some games I experienced less lag at 120 FPS than at 200 FPS on an NV card.
     
  10. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    A majority of the lag associated with vsync and pre-rendering can be eliminated by limiting FPS at the CPU level.
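    A rough sketch of such a CPU-side limiter (the class name and pacing scheme are illustrative, not from any real game or tool): by stalling *before* the frame is prepared, input is sampled as late as possible and the pre-render queue stays empty.

    ```python
    import time

    # Hypothetical CPU-side frame limiter: block before preparing each
    # frame so the CPU never races ahead of the display.

    class FrameLimiter:
        def __init__(self, target_fps):
            self.interval = 1.0 / target_fps
            self.next_deadline = time.monotonic()

        def wait(self):
            """Block until the next frame is due, then schedule the one after."""
            now = time.monotonic()
            if now < self.next_deadline:
                time.sleep(self.next_deadline - now)
            # Anchor to the deadline (not `now`) so the pace stays steady
            # instead of drifting with sleep jitter.
            self.next_deadline = max(self.next_deadline, now) + self.interval

    # Usage: cap a (simulated) render loop at 200 FPS.
    limiter = FrameLimiter(200)
    start = time.monotonic()
    for _ in range(20):
        limiter.wait()
        pass  # poll input, simulate, submit draw calls here
    elapsed = time.monotonic() - start   # roughly 19 intervals, ~0.095 s
    ```

    Capping slightly below what the GPU (or the refresh rate, with vsync) can sustain is the usual suggestion, since it keeps the back-pressure from ever engaging.
    
    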
     

  11. inplayruns

    inplayruns Active Member

    Messages:
    93
    Likes Received:
    13
    GPU:
    Radeon 6600 XT
    So as a BF3 player with VSync enabled, what would be the best setting? BF3's default is 3 IIRC.
     
  12. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    None. You don't use vsync for FPS games.
     
  13. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    Three is somewhat of an established standard for D3D applications/drivers. OpenGL on the other hand, Nvidia tends to use a value of 2.
     
  14. inplayruns

    inplayruns Active Member

    Messages:
    93
    Likes Received:
    13
    GPU:
    Radeon 6600 XT
    Yes you do, if you can't stand tearing.

    The smoothness VSync gives is unparalleled; I'd much rather deal with the so-called input "lag" than with tearing.
     
  15. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    Yeah, the 'so called'. Please don't talk any more about this.
     

  16. inplayruns

    inplayruns Active Member

    Messages:
    93
    Likes Received:
    13
    GPU:
    Radeon 6600 XT
    Why so defensive brosef?
     
  17. bishi

    bishi Master Guru

    Messages:
    575
    Likes Received:
    17
    GPU:
    GTX 1080 SLI
    He's always like that with the mightier than thou attitude and has a stick up his ass, just ignore him.
     
  18. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    Because input lag added by vsync is not "so called". It is what it is, it's there.
     
    BlindBison likes this.
  19. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    Whether or not we like vsync is personal preference, so let's not fight over it.

    Latency caused by vsync and latency caused by pre-rendering are related, but also somewhat independent of each other.

    A few folks have complained to Nvidia about the recent changes, but it's still unclear whether that will influence future drivers. Perhaps ManuelG can chime in once again.
     
  20. inplayruns

    inplayruns Active Member

    Messages:
    93
    Likes Received:
    13
    GPU:
    Radeon 6600 XT
    I probably should have worded that one differently, I apologize.

    What I meant to say was that input lag, while somewhat noticeable, is not that big of a deal compared to a tearing image.

    There are ways to reduce input lag, and while I'm sure there are ways to reduce tearing as well, I'm pretty sure it will still lack the smoothness of a VSync'ed image.
     
    Last edited: Jul 18, 2012
