Does the performance impact of Anisotropic Texture Filtering increase with Resolution?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, Jun 11, 2021.

  1. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,404
    Likes Received:
    1,128
    GPU:
    RTX 3070
    Regarding Anisotropic Texture Filtering, I was recently watching Hardware Unboxed's video on Red Dead Redemption 2 "Optimized" settings and was intrigued by the performance impact of the texture filtering:
    [attached chart: Hardware Unboxed RDR2 texture filtering benchmark]
    Looks like they tested at 1440p in the video -- although a gap of ~2-4% is arguably not much, it got me thinking: does the performance impact of Anisotropic Filtering grow the higher your base resolution is, or is it a fixed cost regardless of resolution?

    For example, if I were to run 16x Anisotropic Filtering @1080p VS 16x @4K how would the performance cost scale if at all?

    Other effects that run on the GPU clearly increase in cost with resolution (for example, reflections have to be drawn at a higher base resolution and get much more expensive, which is why some console games run reflections at half or quarter res).

    Anyhow, thanks a lot, just something I was curious about since personally I can't really tell the difference between 8x and 16x unless I'm overlooking a scene with very distant, oblique surfaces (and even then it's kind of hard for me to tell).
     
  2. Astyanax

    Astyanax Ancient Guru

    Messages:
    16,996
    Likes Received:
    7,337
    GPU:
    GTX 1080ti
    AF affecting performance at all appears to be a thing specific to Turing and Ampere; I suspect this is due to a poor balance of TMUs to CUDA cores in an SM.

     
    Last edited: Jun 27, 2021
    enkoo1, SmokingPistol and BlindBison like this.
  3. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    I have yet to see any serious performance impact between using no AF and 16x AF.

    I can't even recall which of my old cards was the last one to take a serious hit from it.
     
    BlindBison likes this.
  4. RealNC

    RealNC Ancient Guru

    Messages:
    4,893
    Likes Received:
    3,168
    GPU:
    RTX 4070 Ti Super
    Probably. I mean, there are more pixels to process at higher resolutions. But it's hard to benchmark since the perf differences are always so low for me, at least in games. Maybe you can run an actual 3D benchmarking application and test that way: run Unigine Superposition or 3DMark with AF forced off and then forced on and see how much it affects the score.
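    If you capture per-frame times with whatever frame-time tool you prefer, comparing the two runs is trivial. A minimal sketch (the frame-time numbers below are made-up placeholders, not measurements):

        // Compare two benchmark captures, e.g. AF forced off vs. forced to 16x.
        // Feed it per-frame times in milliseconds from any frame-time capture tool.
        #include <algorithm>
        #include <cstdio>
        #include <vector>

        struct RunStats { double avgFps; double onePercentLowFps; };

        RunStats analyze(std::vector<double> frameTimesMs) {
            std::sort(frameTimesMs.begin(), frameTimesMs.end());     // ascending
            double total = 0;
            for (double t : frameTimesMs) total += t;
            double avgFps = 1000.0 * frameTimesMs.size() / total;
            // 1% lows: average FPS over the slowest 1% of frames.
            size_t n = std::max<size_t>(1, frameTimesMs.size() / 100);
            double worst = 0;
            for (size_t i = frameTimesMs.size() - n; i < frameTimesMs.size(); ++i)
                worst += frameTimesMs[i];
            return {avgFps, 1000.0 * n / worst};
        }

        int main() {
            std::vector<double> afOff = {6.8, 6.9, 7.0, 7.1, 7.3};   // placeholder data
            std::vector<double> af16x = {7.0, 7.1, 7.2, 7.3, 7.6};   // placeholder data
            RunStats a = analyze(afOff), b = analyze(af16x);
            printf("AF off : %.1f avg fps (%.1f 1%% low)\n", a.avgFps, a.onePercentLowFps);
            printf("AF 16x : %.1f avg fps (%.1f 1%% low)\n", b.avgFps, b.onePercentLowFps);
            printf("delta  : %.1f%%\n", 100.0 * (a.avgFps - b.avgFps) / a.avgFps);
        }

    Repeating that at both 1080p and 4K would answer the original question directly: compare the two deltas rather than the raw FPS numbers.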
     
    SmokingPistol and BlindBison like this.

  5. windrunnerxj

    windrunnerxj Master Guru

    Messages:
    487
    Likes Received:
    127
    GPU:
    RTX 4060
    Games with shitty coding exist :)
    Take Apex, for example: performance with AF enabled takes a nosedive on my GTX 1060, and at high AF levels it makes the game unplayable in certain conditions.
    Same result whether texture filtering is set in the game options or forced via the driver.
     
    Last edited: Jun 11, 2021
    BlindBison and artina90 like this.
  6. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,404
    Likes Received:
    1,128
    GPU:
    RTX 3070
    That’s very interesting, huh — I know people say AF is so cheap that you can just set it and forget it, but whenever I’ve sought out tests, there does appear to be a measurable (though small) impact on overall FPS; TweakTown’s Overwatch tests and HWU’s Red Dead 2 tests, for example.
     
  7. janos666

    janos666 Ancient Guru

    Messages:
    1,645
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    Even if it were expensive, I would still set it to at least 8x due to the huge quality difference (compared to no AF or only 4x).
     
    BlindBison likes this.
  8. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,404
    Likes Received:
    1,128
    GPU:
    RTX 3070
    Yeah, I hear you — I don’t think I’d ever want less than 4x, and it’s only once I hit 8x or so that I start having to really look for differences.
     
  9. fellix

    fellix Master Guru

    Messages:
    252
    Likes Received:
    87
    GPU:
    MSI RTX 4080
    Texture sampling and filtering is independent of screen-space rasterization. Any performance cost from texture AF comes down to the particular TMU implementation. Higher AF modes will always incur some (marginal) performance hit, due to the time it takes the multi-pass sampling cycle to complete. Anyway, the relative performance cost has become so small that there hasn't been any incentive to invest more transistor budget in wider sampling logic and faster texture caches for several generations.
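    To make the multi-pass point concrete, here is a rough software sketch of what an anisotropic lookup conceptually does (heavily simplified; real TMUs differ in the details, and the helper here is a stand-in): the elongation of the pixel's footprint in texture space decides how many trilinear taps get averaged, capped by the configured AF level, so 16x only costs more than 4x on surfaces viewed at very oblique angles.

        #include <algorithm>
        #include <cmath>

        struct Vec2  { float x, y; };
        struct Color { float r, g, b, a; };

        // Stand-in for an ordinary trilinear (mip-mapped) texture lookup.
        static Color trilinearSample(Vec2 /*uv*/, float /*lod*/) { return {0.5f, 0.5f, 0.5f, 1.0f}; }

        Color sampleAnisotropic(Vec2 uv, Vec2 dUVdx, Vec2 dUVdy, int maxAniso /* 1, 2, 4, 8 or 16 */) {
            // Axis lengths of the pixel's footprint projected into texture space.
            float lenX  = std::sqrt(dUVdx.x * dUVdx.x + dUVdx.y * dUVdx.y);
            float lenY  = std::sqrt(dUVdy.x * dUVdy.x + dUVdy.y * dUVdy.y);
            float major = std::max(lenX, lenY);
            float minor = std::max(std::min(lenX, lenY), 1e-8f);

            // Degree of anisotropy, clamped to the configured AF level.
            int taps  = (int)std::ceil(std::min(major / minor, (float)maxAniso));
            float lod = std::log2(std::max(major / taps, 1e-8f));  // sharper mip than plain trilinear would pick
            Vec2 axis = (lenX > lenY) ? dUVdx : dUVdy;             // step along the elongated axis

            Color sum{0, 0, 0, 0};
            for (int i = 0; i < taps; ++i) {                       // more anisotropy -> more taps -> more cycles
                float t = (i + 0.5f) / taps - 0.5f;
                Color c = trilinearSample({uv.x + axis.x * t, uv.y + axis.y * t}, lod);
                sum.r += c.r; sum.g += c.g; sum.b += c.b; sum.a += c.a;
            }
            float inv = 1.0f / taps;
            sum.r *= inv; sum.g *= inv; sum.b *= inv; sum.a *= inv;
            return sum;
        }

    On a mostly front-facing scene the ratio stays near 1 and almost every lookup collapses to a single trilinear tap, which is part of why the average cost of 16x vs. 8x is so hard to measure.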
     
    Last edited: Jun 13, 2021
    BlindBison and dr_rus like this.
  10. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,876
    Likes Received:
    1,014
    GPU:
    RTX 4090
    AF cost would be the same since it happens in texture space (supersampling along the texture XYs), but the overall picture may change since you're likely to hit a different set of performance bottlenecks at 4K compared to 1080p.
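    In other words, the per-pixel cost is roughly constant, but the total amount of texture work still scales with the pixel count, just like every other per-pixel cost. A back-of-envelope sketch of that scaling (the taps-per-pixel figure is an assumed placeholder, not a measured value):

        // Not hardware-accurate: the taps a sampler takes per pixel depend on the
        // anisotropy ratio and the AF cap, not on the screen resolution, so the
        // per-pixel cost is flat while the total cost scales with pixel count.
        #include <cstdio>

        int main() {
            const double avgTapsPerPixel = 4.0;   // assumed average under 16x AF (placeholder)
            const struct { const char* name; double pixels; } res[] = {
                {"1080p", 1920.0 * 1080.0},
                {"1440p", 2560.0 * 1440.0},
                {"4K",    3840.0 * 2160.0},
            };
            for (const auto& r : res) {
                // Total filtering work grows linearly with resolution...
                printf("%-6s ~%.0fM taps/frame at %.1f taps/pixel\n",
                       r.name, r.pixels * avgTapsPerPixel / 1e6, avgTapsPerPixel);
            }
            // ...but so does all the other per-pixel work, which is why the
            // relative hit of AF tends to stay in the low single digits.
            return 0;
        }

    So the absolute cost rises with resolution, but whether the percentage hit grows depends on which bottleneck you land on at 4K, as noted above.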
     
    BlindBison likes this.

  11. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,404
    Likes Received:
    1,128
    GPU:
    RTX 3070
  12. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    The visual quality is at a level that makes me use 16x only.
    If it adds a noticeable performance hit in certain games, it's probably something else in the game's engine causing it.
     
    BlindBison likes this.
  13. PeskyPotato

    PeskyPotato Member

    Messages:
    46
    Likes Received:
    24
    GPU:
    GTX 970 M
    Yes, that's part of why most FPS pros always use 4:3 resolutions like 1024x768 or 1280x960.

    Since they're smaller and easier to render, they get the best performance: these resolutions give the highest FPS with the least input latency, not to mention the hitboxes being larger if the game allows stretched 4:3.

    It's true that if you use AA@16x at a 4:3 resolution like 1024x768 or 1280x960, you'll often get higher FPS than you would at 1920x1080 with AA@16x.

    But even more notably, even with all the graphics quality settings turned off, those 4:3 resolutions give more FPS and the least input delay. However, newer games don't allow stretched 4:3 as often (VALORANT, for example), so some pro players have started switching to 16:9 resolutions, which makes sense since they all have RTX 2080 Ti or Ampere cards.

    Note, though, that even though you could afford higher quality video settings at the lower 4:3 resolutions, most pro players still use the lowest settings across the board. Why is that, you may ask, when they have RTX 2000 and RTX 3000 series cards? It's not about FPS. Rather, it's because increasing almost any video quality setting creates more input lag: turning on 16x AA or 4x MSAA adds a noticeable amount of input lag, and higher quality textures, shadows, raytracing, and so on all cause a noticeable increase in input lag.
     
  14. janos666

    janos666 Ancient Guru

    Messages:
    1,645
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    ^Thanks for the "tales from the cave".
     
    Xtreme512 likes this.
  15. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    AF, not AA.
    You seem to know the difference, but I've seen quite a lot of people who think AF is the same thing as AA.
    Heck, some even think that Super Sampling / DSR is the same thing as AA.

    Also, the pros go for as high an fps as possible, so they choose stronger cards for higher fps that can keep up with high refresh rate monitors; it does make enough of a difference to pick 240Hz over 120Hz.
    Quality is turned down when it's distracting, adds a performance hit, or has an impact on input lag.
    I've also seen far fewer pros play at 4:3 resolutions these days; 1080p seems to be the new sweet spot, which also makes sense looking at current monitor technology.
     
    Last edited: Jun 14, 2021

  16. janos666

    janos666 Ancient Guru

    Messages:
    1,645
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    Well, kind of. The original SuperSampling AA is the oldest form of AA (it was introduced some 20 years ago but excluded from the control panel for performance reasons), and DSR also reduces aliasing in a similar way. Only MSAA is unique in being, well, MultiSample AA rather than a type of SSAA (like the original SSAA or DSR) or post-process AA (like FXAA).
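    In practice, SSAA and DSR both boil down to rendering at a multiple of the target resolution and filtering that back down, while MSAA only takes extra coverage samples at triangle edges and post-process AA works on the finished image. A minimal illustration of the downsample step (greyscale buffer, plain box filter; DSR uses a smarter, adjustable smoothness filter):

        #include <vector>

        // Average every k x k block of the high-res render into one output pixel.
        std::vector<float> downsample(const std::vector<float>& hiRes,
                                      int hiW, int hiH, int k) {
            int loW = hiW / k, loH = hiH / k;
            std::vector<float> loRes(loW * loH, 0.0f);
            for (int y = 0; y < loH; ++y)
                for (int x = 0; x < loW; ++x) {
                    float sum = 0.0f;
                    for (int dy = 0; dy < k; ++dy)
                        for (int dx = 0; dx < k; ++dx)
                            sum += hiRes[(y * k + dy) * hiW + (x * k + dx)];
                    loRes[y * loW + x] = sum / (k * k);  // box-filter average
                }
            return loRes;
        }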
     
    BlindBison likes this.
  17. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    SSAA uses a proper resolve algorithm of one kind or another, while the supersampling I have seen in more recent games doesn't seem to, yet it still comes with roughly the same performance cost as SSAA.
    Personally I prefer MSAA, and even DLSS 2.0 at its better image quality settings, over any post-processed AA; they do a good job with a performance hit I can live with.
     
  18. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,251
    Likes Received:
    1,758
    GPU:
    7800 XT Hellhound
    A general thought on the initial question: the performance hit of 4:1 vs. 16:1 isn't perceivable, whereas the visual difference definitely is, so I consider 16:1 a no-brainer. It's sad that some games don't apply the AF setting configured in their graphics options to all textures (incompatibility cases like virtual texturing not counted). I really wish the Nvidia Vulkan driver allowed forcing AF the way the Nvidia D3D12 driver does (and of course AMD doesn't even support it for D3D12).
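    For context on why a game can miss textures: in Vulkan (as in D3D12), AF isn't a global switch but something the engine opts into on every sampler object it creates, so any sampler that skips the flag leaves its textures at plain trilinear, which is exactly the gap a driver-side override papers over. A minimal sketch of the per-sampler setup, assuming a valid VkDevice with the samplerAnisotropy feature enabled:

        #include <vulkan/vulkan.h>

        VkSampler createAnisotropicSampler(VkDevice device, float maxAniso /* e.g. 16.0f */) {
            VkSamplerCreateInfo info{};
            info.sType        = VK_STRUCTURE_TYPE_SAMPLER_CREATE_INFO;
            info.magFilter    = VK_FILTER_LINEAR;
            info.minFilter    = VK_FILTER_LINEAR;
            info.mipmapMode   = VK_SAMPLER_MIPMAP_MODE_LINEAR;    // trilinear base
            info.addressModeU = VK_SAMPLER_ADDRESS_MODE_REPEAT;
            info.addressModeV = VK_SAMPLER_ADDRESS_MODE_REPEAT;
            info.addressModeW = VK_SAMPLER_ADDRESS_MODE_REPEAT;
            info.anisotropyEnable = VK_TRUE;                      // the per-sampler AF switch
            info.maxAnisotropy    = maxAniso;                     // 1.0 .. 16.0
            info.maxLod           = VK_LOD_CLAMP_NONE;

            VkSampler sampler = VK_NULL_HANDLE;
            vkCreateSampler(device, &info, nullptr, &sampler);    // error handling omitted
            return sampler;
        }

    Every sampler an engine creates without those two fields set is a group of textures that silently misses out on AF, no matter what the in-game setting says.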
     
    BlindBison likes this.
