I've read about this in the past, and most people (including Alex from Digital Foundry) seem to recommend simply setting Anisotropic Filtering to its maximum (16x) globally in the driver control panel and more or less forgetting about it, since the performance hit is supposedly negligible. My question is: is that actually true? I've seen benchmarks from TweakTown for Overwatch, for example (and some other modern games), where raising this setting from 1x to 16x has a discernible performance impact on modern cards, despite the general perception. Granted, the penalty in those tests is relatively small (1 or 2 fps between 4x and 16x, say), but I'd argue even that can be a meaningful difference in some cases. Notably, I recall that on last-gen consoles something like 4x AF was commonly treated as the "sweet spot" for this setting.

For myself, I have a very hard time telling an image at 8x AF apart from one at 16x, so I wanted to ask what the recommended value is on a forum where I expect people actually know about this sort of thing. If the performance impact really is negligible and 16x is the preferred default across the board, I'll start doing it, but I wanted to verify that first, since 16x seems like major overkill unless I'm missing something. Thanks for your time, I appreciate it.
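P.S. For what it's worth, my (possibly shaky) understanding of why maxed AF is usually cheap: the level you set is a cap, not a fixed cost. The hardware only takes up to that many extra samples where a surface is viewed at an oblique angle, so most of the screen never pays for the full 16x. Here's a minimal sketch of the app-side API in OpenGL (via EXT_texture_filter_anisotropic, promoted to core in GL 4.6; it assumes a current GL context and a bound, mipmapped 2D texture), mostly to illustrate that it's an upper bound:

```c
/* Minimal sketch: requesting anisotropic filtering in OpenGL.
 * Assumes a current GL context and a bound, mipmapped 2D texture. */
#include <GL/gl.h>
#include <math.h>

/* Enum values from EXT_texture_filter_anisotropic, in case the
 * header doesn't define them. */
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

void set_max_anisotropy(GLfloat requested) /* e.g. 16.0f */
{
    GLfloat hw_max = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &hw_max);

    /* This sets a CAP, not a fixed sample count: the GPU only takes
     * up to this many samples on obliquely viewed surfaces, so
     * screen-facing texels stay roughly as cheap as at 1x. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    fminf(requested, hw_max));
}
```

As I understand it, the control-panel override just forces that cap driver-side regardless of what the game requests, which would explain why the measured cost varies so much with the content being rendered.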