I laugh at people who say you don't need anti-aliasing, let alone the full amount of it, at 4k.

Discussion in 'Games, Gaming & Game-demos' started by Cronik, Jul 4, 2021.

Thread Status:
Not open for further replies.
  1. Cronik

    Cronik Maha Guru

    Messages:
    1,316
    Likes Received:
    121
    GPU:
    Nvidia GTX 1080
    Let's have an either/or count.

    1. You sit too far away from your monitor/TV. That crap should be on your lap and not as far away as some douche bag in an article said it needs to be.

    2. You need to get your eyes checked

    3. You have glaucoma

    4. You have cataracts

    5. you're a smooth brain

    People said the same thing about 1080p. "Y0u d0N'T n33d anti-aliasing @ ten eighty pee!"

When I upgraded from my CRT to a 900p monitor I needed the full amount of anti-aliasing available. When I upgraded from my 900p monitor to a 1080p monitor I still needed the full amount of anti-aliasing (8x MSAA, FXAA on high, etc.).

    I got a chance to play games on a 4k monitor. Guess what? I needed anti-aliasing, and the full amount. Turned TAA on in Squad and suddenly I could shoot every enemy pixel.
     
    Dragam1337 likes this.
  2. RealNC

    RealNC Ancient Guru

    Messages:
    3,590
    Likes Received:
    1,770
    GPU:
    EVGA GTX 980 Ti FTW
    If somebody tells you that, just send them to this test:

    https://www.testufo.com/aliasing-vi...&background=000000&antialiasing=0&thickness=4

    The only displays right now that actually don't need anti-aliasing are some mobile phones with ultra high DPI displays. The Sony Xperia 1 II for example has a 6.5" 4K display with a pixel density of 643 DPI. But that's just the display. I doubt that such phones actually make use of these resolutions. They mostly run at lower res and upscale.

    With a monitor or TV, you'd need something like a 34" 8K display to get pixel densities that might not require anti-aliasing. And I'm not sure it would be enough even then.

    How much pixel density do you need before anti-aliasing becomes unnecessary? Not sure. But it's safe to say that once the density is high enough that each individual pixel is too small to perceive, you no longer need anti-aliasing.

    But even if we had such monitors, good luck running games at 8K (or 10K, or whatever "K" is required to reach the needed pixel density.) Anti-aliasing will be required for many years to come.
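
    For anyone who wants to sanity-check those density figures, here's a quick back-of-the-envelope calc (plain Pythagoras on the pixel grid; the Xperia's 3840x1644 panel resolution and the 16:9 geometry for the hypothetical 8K monitor are my assumptions):

    ```python
    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixels per inch: diagonal pixel count divided by diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    # Sony Xperia 1 II: 6.5" panel at 3840x1644 ("4K" 21:9)
    print(round(ppi(3840, 1644, 6.5)))   # ~643, matching the quoted figure

    # Hypothetical 34" 16:9 8K monitor (7680x4320)
    print(round(ppi(7680, 4320, 34)))    # ~259, nowhere near phone territory
    ```

    Which also shows why the "8K might not even be enough" caveat is fair: a 34" 8K desktop panel would still be well under half the density of that phone.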
     
    Last edited: Jul 4, 2021
    Dragam1337 likes this.
  3. Ghosty

    Ghosty Ancient Guru

    Messages:
    6,109
    Likes Received:
    301
    GPU:
    AMD Radeon Graphics
    I never favor AA. I prefer AF almost all of the time @ 8/16x. AMD(ATI) has always offered very good AF image quality. It all depends on your preferences and what works best at the time.
     
  4. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,180
    Likes Received:
    940
    GPU:
    EVGA 1070Ti Black
    That is like telling people that with vsync OFF you get tearing even if you're running 120Hz/fps or higher; it just tends to be hard to see. I've lost count of how many friends claim they don't run vsync and don't see tearing, until I "pointed" it out to them and they started seeing it. Or how some say you don't need 4k on 32", just like they said you didn't need 1080p on 32" back in the day versus 40" or 50". I can see a clear difference between 1080p on 32" versus 40-50" and bigger, just like I see a clear difference between 4k on 27-32" versus 40-50" and bigger: a smaller screen has a much higher PPI and a much clearer/sharper image. The only problem is scaling needs to be done right or things get hard to read.

    Anyway, I like AA; some games REALLY need it, others don't. Seeing as I tend to game on a 55" screen 10 feet away, unless a game has horribly bad aliasing I don't use it, and unless the game supports AA in-game in some form, be that FXAA or something else, NOT TAA, I don't use it either. MSAA is a lost art these days. I still prefer it over most forms, but it's not compatible with a lot of things.

    AF these days is just 16x globally.
     

  5. user1

    user1 Ancient Guru

    Messages:
    1,736
    Likes Received:
    604
    GPU:
    hd 6870
    I think it's largely subjective. My eyes aren't terrible, but aliasing has never bothered me, even at <1080p. Perhaps having grown up with computers from an early age has tainted my perception. I almost prefer it painfully sharp; I have disdain for blurring of any kind.
     
  6. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    29,094
    Likes Received:
    1,936
    GPU:
    GTX1070 iChillx4
    It totally depends on the game for me. I find aliasing disgusting; I can't stand it! Especially FXAA and SMAA. Terrible, just terrible.

    Older games work fine at 4k with no AA though; newer games unfortunately do not.
     
  7. RealNC

    RealNC Ancient Guru

    Messages:
    3,590
    Likes Received:
    1,770
    GPU:
    EVGA GTX 980 Ti FTW
    Whenever I play a game that doesn't have AA, or only useless AA like FXAA (as in this example), at first I'm like "this isn't so bad, I can live with it." Then 10 minutes later I come across some sort of monstrosity like this:

    no-aa.jpg
    :eek:
    (And it's ten times worse during motion compared to a static screenshot.)

    I then race to the "NVidia anti-aliasing bits" thread to find out how to force MSAA:

    msaa.jpg
     
    Last edited: Jul 4, 2021
    user1 likes this.
  8. user1

    user1 Ancient Guru

    Messages:
    1,736
    Likes Received:
    604
    GPU:
    hd 6870
    I see your point, though I will say, I feel like that's more of an issue with the assets than the resolution/AA; a pattern like that is just asking for problems. Thin lines are best avoided entirely, AA or not.
     
  9. DannyD

    DannyD Ancient Guru

    Messages:
    3,137
    Likes Received:
    2,073
    GPU:
    EVGA 2060
    I don't ever use AA. I really fail to see what business it is of anyone else's though?
    Can't you just enjoy your own crap and let others do the same?
     
    alanm, AsiJu and tfam26 like this.
  10. tfam26

    tfam26 Guest

    I agree with Danny here, it is nobody's damn business why I'm in AA.

    Enjoy your 4th.
     
    Maddness, alanm and CrunchyBiscuit like this.

  11. Ghosty

    Ghosty Ancient Guru

    Messages:
    6,109
    Likes Received:
    301
    GPU:
    AMD Radeon Graphics
    FXAA is the worst for image quality. No one uses it.
     
  12. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,897
    Likes Received:
    1,736
    GPU:
    Rtx 3090 Strix OC
    Yeah, all the temporal effects in newer games make some form of temporal anti-aliasing necessary. Like in Red Dead Redemption 2: even if you downsample from 10k to 4k, there is a lot of nasty aliasing without TAA enabled.
     
  13. metagamer

    metagamer Ancient Guru

    Messages:
    2,244
    Likes Received:
    908
    GPU:
    Palit GameRock 2080
    You don't need AA at 4k on a 27" PC monitor. You will need AA on a 4k 65" TV sat a metre away from your eyes. You also won't need AA on the same TV sat 5 metres away from your eyes.
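
    The screen-size-versus-distance point can be folded into one number: pixels per degree of visual angle. A rough sketch below; the ~60 PPD "jaggies stop being obvious" threshold (1 arcminute, roughly 20/20 acuity) and the exact viewing distances are my assumptions, not settled fact:

    ```python
    import math

    def ppd(width_px, height_px, diagonal_in, distance_in):
        """Pixels per degree of visual angle at a given viewing distance."""
        ppi = math.hypot(width_px, height_px) / diagonal_in
        # how many inches one degree of visual angle spans at that distance
        one_degree = 2 * distance_in * math.tan(math.radians(0.5))
        return ppi * one_degree

    METRE = 39.37  # inches

    print(round(ppd(3840, 2160, 27, 0.7 * METRE)))  # 27" 4K at ~70 cm desk distance: above 60
    print(round(ppd(3840, 2160, 65, 1.0 * METRE)))  # 65" 4K TV at 1 m: below 60
    print(round(ppd(3840, 2160, 65, 5.0 * METRE)))  # same TV at 5 m: well above 60
    ```

    Directionally this matches the claim above: the same 4k panel lands on opposite sides of the threshold depending purely on how far away you sit.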

    What even is this thread?
     
    Last edited: Jul 5, 2021
  14. Cronik

    Cronik Maha Guru

    Messages:
    1,316
    Likes Received:
    121
    GPU:
    Nvidia GTX 1080
    Yes you do. Seen it for myself. No PPI currently is high enough to hide jaggies. Needing a high PPI is just a theory anyway.

    I had a 2k res phone (Galaxy S8+) also. I still saw aliasing in mobile games.
     
    Last edited: Jul 4, 2021
    Dragam1337 likes this.
  15. RealNC

    RealNC Ancient Guru

    Messages:
    3,590
    Likes Received:
    1,770
    GPU:
    EVGA GTX 980 Ti FTW
    This thread isn't about you.
     

  16. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,897
    Likes Received:
    1,736
    GPU:
    Rtx 3090 Strix OC
    Yeah, and your eyes can't see more than 24 fps...

    Either you haven't actually tried a 4k 27" monitor, or you need to visit an optician.
     
  17. metagamer

    metagamer Ancient Guru

    Messages:
    2,244
    Likes Received:
    908
    GPU:
    Palit GameRock 2080
    I have an LG OLED 4k TV right behind me as I type this, and an Odyssey G9 monitor right in front of me. 240Hz, just for your information. I also have a couple of 1440p monitors in the cupboard. One is a 120Hz Qnix and the other is a 165Hz G-SYNC monitor. Both IPS. Maybe if you upgraded from your 2133MHz RAM and your ancient CPU, your input on high refresh rates and 4k at 27" would be more reasonable.

    If you can't distinguish between having a 27" 4k monitor in your face and a 4k TV 6ft from your face, I think you need to...

    1. take a chill pill
    2. take another one

    I'm typing this on my PC that has an Odyssey G9 about a metre from my face. 5120x1440 at 240Hz. I know my crap. Do you know yours?
     
    Last edited: Jul 5, 2021
  18. metagamer

    metagamer Ancient Guru

    Messages:
    2,244
    Likes Received:
    908
    GPU:
    Palit GameRock 2080
    You're right, that's why I'm typing this from a 240hz monitor. Because I can't see more than 24 fps.

    Go get yourself a medal, Chad.
     
    Last edited: Jul 5, 2021
  19. metagamer

    metagamer Ancient Guru

    Messages:
    2,244
    Likes Received:
    908
    GPU:
    Palit GameRock 2080
    It comes down to what is acceptable to you. 4k at 27" will be plenty sharp for the vast majority of people. But if you're not one of them, fair play, throw some AA at it. Generally speaking though, 4k at 27" will be pretty sweet. Will it be completely free of jaggies? No, of course not. It also depends on how far you sit from your screen. It's not an exact science.
     
  20. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    9,847
    Likes Received:
    3,717
    GPU:
    ASUS RX 470 Strix
    Your mother was a hamster, and your father smelt of elderberries... is the vibe this thread gives off. haha.
     
    alanm, AsiJu, user1 and 1 other person like this.
