Does the performance impact of Anisotropic Texture Filtering increase with Resolution?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, Jun 11, 2021.

  1. BlindBison

    BlindBison Master Guru

    Messages:
    674
    Likes Received:
    130
    GPU:
    RTX 2080 Super
    Regarding Anisotropic Texture Filtering, I was recently watching Hardware Unboxed's video regarding Red Dead Redemption 2 "Optimized" settings and was interested by the performance impact of the texture filtering:
    [image: Hardware Unboxed's RDR2 texture filtering benchmark results]
    Looks like in the video they tested at 1440p -- although a gap of ~2-4% is arguably not much, it got me thinking -- does the performance impact of Anisotropic filtering grow the higher your base resolution is or is it irrespective of resolution/the same fixed cost regardless of resolution?

    For example, if I were to run 16x Anisotropic Filtering @1080p VS 16x @4K how would the performance cost scale if at all?

    For other effects which run on the GPU they clearly increase in cost with resolution (for example, Reflections have to draw at a higher base resolution and get much more expensive hence why some console games run reflections at half or quarter rez in some cases).

    Anyhow, thanks a lot, just something I was curious about since personally I can't really tell the difference between 8x and 16x myself unless I'm really overlooking a scene with very distant oblique angles (even then it's kind of hard for me to tell).
     
  2. Astyanax

    Astyanax Ancient Guru

    Messages:
    10,612
    Likes Received:
    3,889
    GPU:
    GTX 1080ti
    AF affecting performance at all appears to be a thing specific to Turing and Ampere. I suspect this is due to a poor balance of TMUs to CUDA cores in an SM.

     
    enkoo1, SmokingPistol and BlindBison like this.
  3. Mineria

    Mineria Ancient Guru

    Messages:
    4,424
    Likes Received:
    200
    GPU:
    Asus RTX 2080 Super
    I have yet to see any serious performance impact between using no AF and using 16x AF.

    I can't even recall the last of my old ancient cards that took a big hit from it.
     
    BlindBison likes this.
  4. RealNC

    RealNC Ancient Guru

    Messages:
    3,385
    Likes Received:
    1,580
    GPU:
    EVGA GTX 980 Ti FTW
    Probably. I mean, there are more pixels to process at higher resolutions. But it's hard to benchmark since the perf differences are always so low for me, at least in games. Maybe you can run an actual 3D benchmarking application and test that way: run Unigine Superposition or 3DMark with AF forced off and then forced on, and see how much it affects the score.
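    Turning the two scores from such an A/B run into a cost figure is one line of arithmetic; here's a tiny sketch (the FPS numbers below are made up for illustration, not real Superposition results):

    ```python
    def af_cost_percent(fps_af_off: float, fps_af_on: float) -> float:
        """Relative FPS cost of enabling AF, as a percentage of the AF-off score."""
        return (fps_af_off - fps_af_on) / fps_af_off * 100.0

    # Hypothetical scores from two benchmark runs (AF forced off, then forced on):
    print(round(af_cost_percent(120.0, 117.0), 1))  # -> 2.5
    ```

    A ~2-3% delta like this matches the gap Hardware Unboxed measured, which is small enough that run-to-run variance in games can easily swamp it, hence the suggestion to use a repeatable synthetic benchmark.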
     
    SmokingPistol and BlindBison like this.

  5. windrunnerxj

    windrunnerxj Master Guru

    Messages:
    366
    Likes Received:
    59
    GPU:
    MSI Gaming 1060 6G
    Games with shitty coding exist :)
    Take Apex for example: performance with AF enabled takes a nosedive on my GTX 1060, and at high AF levels it makes the game unplayable in certain conditions.
    Same result whether texture filtering is set in the game options or forced via the driver.
     
    Last edited: Jun 11, 2021
    BlindBison and artina90 like this.
  6. BlindBison

    BlindBison Master Guru

    Messages:
    674
    Likes Received:
    130
    GPU:
    RTX 2080 Super
    That’s very interesting, huh — I know people say AF is so cheap that you can just set it and forget it, but whenever I’ve sought out tests, there does appear to be a measurable (though small) impact to overall FPS. TweakTown’s Overwatch tests and HWU's Red Dead 2 tests, for example.
     
  7. janos666

    janos666 Maha Guru

    Messages:
    1,025
    Likes Received:
    168
    GPU:
    MSI RTX3080 10Gb
    Even if it were expensive, I would still set it to at least 8x due to the huge quality difference (compared to no AF or only 4x).
     
    BlindBison likes this.
  8. BlindBison

    BlindBison Master Guru

    Messages:
    674
    Likes Received:
    130
    GPU:
    RTX 2080 Super
    Yeah, I hear you — I don’t think I’d ever want less than 4x, and it’s only once I hit 8x or so that I start having to really look for differences.
     
  9. fellix

    fellix Member Guru

    Messages:
    193
    Likes Received:
    30
    GPU:
    KFA² GTX 1080 Ti
    Texture sampling and filtering is independent of screen-space rasterization. Any performance cost from texture AF would come from the particular TMU implementation. Higher AF modes will always incur some (marginal) performance hit, due to the time it takes the multi-pass sampling cycle to complete. Anyway, the relative performance cost has become so small that there hasn't been any incentive to invest more transistor budget in wider sampling logic and faster texture caches for several generations.
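    The multi-pass cycle fellix describes can be sketched with a toy model: the hardware takes roughly one trilinear probe per unit of anisotropy of the pixel's texel footprint, clamped at the selected AF level. This is a simplified illustration (the derivative-based footprint math and one-probe-per-unit rule are textbook approximations, not any vendor's actual TMU logic):

    ```python
    import math

    def af_probes(dudx: float, dvdx: float, dudy: float, dvdy: float,
                  max_aniso: int = 16) -> int:
        """Estimate trilinear probes for one pixel from its texture-coordinate
        screen-space derivatives (simplified anisotropic filtering model)."""
        # Lengths of the texel footprint's two screen-space axes:
        px = math.hypot(dudx, dvdx)
        py = math.hypot(dudy, dvdy)
        p_major = max(px, py)
        p_minor = max(min(px, py), 1e-8)  # avoid division by zero
        # Anisotropy ratio, clamped to the driver's AF setting:
        ratio = min(p_major / p_minor, max_aniso)
        return max(1, math.ceil(ratio))

    # A surface viewed nearly edge-on (elongated footprint) vs head-on:
    print(af_probes(8.0, 0.0, 0.0, 1.0))  # -> 8 probes
    print(af_probes(1.0, 0.0, 0.0, 1.0))  # -> 1 probe
    ```

    This is why only the oblique, distant surfaces pay the higher probe counts: most pixels on screen-facing geometry have a ratio near 1 and cost the same as plain trilinear, which keeps the average hit marginal.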
     
    Last edited: Jun 13, 2021
    BlindBison and dr_rus like this.
  10. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,013
    Likes Received:
    379
    GPU:
    RTX 3080
    The AF cost per pixel would be the same since it's happening in texture space (supersampling along the texture's U/V axes), but the overall picture may change since you're likely to hit a different set of performance bottlenecks at 4K compared to 1080p.
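    Put in toy-model terms: if each shaded pixel pays the same per-pixel AF cost, the *relative* overhead stays flat across resolutions even though the absolute texture work scales with pixel count (the probe-per-pixel figures below are invented purely for illustration):

    ```python
    def total_probes(width: int, height: int, probes_per_pixel: float) -> float:
        """Total texture probes for one frame in this simplified model."""
        return width * height * probes_per_pixel

    # Assumed average probe counts: 1 for plain trilinear, 4 with AF enabled.
    base_1080p = total_probes(1920, 1080, 1)
    af_1080p   = total_probes(1920, 1080, 4)
    base_4k    = total_probes(3840, 2160, 1)
    af_4k      = total_probes(3840, 2160, 4)

    # The AF-to-baseline ratio is identical at both resolutions:
    print(af_1080p / base_1080p, af_4k / base_4k)  # -> 4.0 4.0
    ```

    In practice the frame-time percentage can still differ between resolutions, because at 4K the frame may be limited by something other than texture sampling (bandwidth, shading, ROPs), which is the bottleneck shift dr_rus mentions.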
     
    BlindBison likes this.

  11. BlindBison

    BlindBison Master Guru

    Messages:
    674
    Likes Received:
    130
    GPU:
    RTX 2080 Super
  12. Mineria

    Mineria Ancient Guru

    Messages:
    4,424
    Likes Received:
    200
    GPU:
    Asus RTX 2080 Super
    The visual quality is at a level that makes me use 16x only.
    If it adds a noticeable performance hit in certain games, it's probably something else in the game's engine causing it.
     
    BlindBison likes this.
  13. PeskyPotato

    PeskyPotato Member

    Messages:
    11
    Likes Received:
    1
    GPU:
    GTX 970 M
    Yes, that's part of why most FPS pros always use 4:3 resolutions like 1024x768 or 1280x960.

    Since they're smaller and easier to render, these resolutions give the highest FPS with the least input latency, not to mention larger hitboxes if the game allows the stretched 4:3 resolution.

    It's true that with AA@16x at a 4:3 resolution like 1024x768 or 1280x960, you'll often get higher FPS than you would at 1920x1080 with AA@16x. But even more notably, even with all graphics quality and video settings off, those 4:3 resolutions will give more FPS and the least input delay. However, newer games aren't allowing stretched 4:3 as often (VALORANT, for example), so this has led some pro players to switch things up.

    Due to that, some pro players have started moving to 16:9 resolutions, which makes sense since they all have RTX 2080 Ti or Ampere cards. Note, though, that even if you can afford higher quality video settings at lower 4:3 resolutions, most pros still run the lowest settings across the board.

    Why, you may ask, do players with RTX 2000 and RTX 3000 series cards use all the lowest settings? Well, it's not due to FPS.

    Rather, it's because increasing almost any video quality setting creates more input lag. Turning on 16x AA or 4x MSAA adds a noticeable amount of input lag, and higher quality textures, shadows, raytracing, and so on all cause a noticeable increase in input lag.
     
  14. janos666

    janos666 Maha Guru

    Messages:
    1,025
    Likes Received:
    168
    GPU:
    MSI RTX3080 10Gb
    ^Thanks for the "tales from the cave".
     
    Xtreme512 likes this.
  15. Mineria

    Mineria Ancient Guru

    Messages:
    4,424
    Likes Received:
    200
    GPU:
    Asus RTX 2080 Super
    AF, not AA.
    You seem to know the difference, but I've seen quite a lot of people who think AF is the same thing as AA.
    Heck, some even think Super Sampling / DSR is the same thing as AA.

    Also, the pros go for as high an fps as possible, so they choose stronger cards for the higher fps that can keep up with high-refresh-rate monitors; it makes enough of a difference to pick 240Hz over 120Hz.
    Quality is turned down when it's distracting, hurts performance, or adds input lag.
    I've also seen far fewer pros play at 4:3 resolutions these days; 1080p seems to be the new sweet spot, which also makes sense looking at current monitor technology.
     
    Last edited: Jun 14, 2021
