Why do ATI cards give less crisp images than Nvidia?

Discussion in 'Videocards - AMD Radeon' started by ehsancgfx, Oct 26, 2010.

  1. DSparil

    DSparil Guest

    Messages:
    3,295
    Likes Received:
    33
    GPU:
    GeForce RTX 3080
    This is a very good point. Whenever you go back to a game you thought looked kick-ass years ago, it doesn't quite feel the same. Especially after a hardware change and a generational leap in gaming.

    lol, is this Maleficarus in disguise? Who's looking like the ass: someone asking a legit question because they need help, or someone chiming in out of nowhere and getting defensive about it?
     
  2. Daaknite

    Daaknite Guest

    Messages:
    27
    Likes Received:
    0
    GPU:
    EVGA GTX 680 2gb
    The same HDMI cable was used for both the ATI and Nvidia cards. The TV is a Toshiba Regza 40" running at full 1080p, and there is a night-and-day difference between the two cards.

    Is something wrong with my TV that makes ATI look poor while Nvidia looks much better?
     
  3. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    :nerd: I always thought the opposite, and I still think it's better on Radeon. I had an ATI 9600 Pro once and it had a much better image with sharper filtering compared to the old GeForce2 MX400 or Riva TNT2 Pro, or even to the 6600GT later lol :pc1:

    Nvidia improved anisotropic filtering with the 7800 GTX and a little more with the 8800 GTX (G80), but it has stayed like that ever since, even on the GTX 480 today.

    So anyway, it's better on ATI/AMD Radeon because it uses angle-independent anisotropic filtering, while on Nvidia it's still somewhat angle-restricted.
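
    For anyone wondering what that means at the API level, here is a minimal sketch using the standard GL_EXT_texture_filter_anisotropic extension (a generic illustration only; the function name is made up and this is not either vendor's internal code). The request a game makes is identical on ATI and Nvidia; any angle-dependence comes from how the driver and hardware actually carry out the sampling.

    /* Sketch: request the strongest anisotropic filtering the driver offers. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    void enable_max_af(GLuint texture)
    {
        GLfloat max_aniso = 1.0f;

        /* Query the highest anisotropy level the driver exposes (commonly 16.0). */
        glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);

        /* Apply it to the texture; the quality of the result is up to the GPU. */
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
    }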
     
  4. Omagana

    Omagana Guest

    Messages:
    2,532
    Likes Received:
    1
    GPU:
    RTX 2070 Super
    ^+1

    This isn't the first time people have argued about IQ...
     

  5. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,673
    GPU:
    Aorus 3090 Xtreme
    Have you made sure that the image is scaled correctly and that you are definitely running at the native res of the display?

    The settings for image scaling are in the same area as the Pixel Format described earlier.
    Open CCC, Desktops & Displays
    Right click the small monitor icon for your screen at the bottom, select Configure.
    Select the Scaling Options tab and make sure overscan is set to 0%

    If the image is now too big for the screen, adjust the TV's overscan to suit.
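
    If you want to double-check the native resolution point without relying on the TV's own info screen, you can query the mode the desktop is actually running in. A rough sketch for Windows using the standard EnumDisplaySettings call (it assumes the panel's native mode is 1920x1080, as stated for the Regza earlier in the thread):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm = {0};
        dm.dmSize = sizeof(dm);

        /* ENUM_CURRENT_SETTINGS returns the mode the desktop is actually using. */
        if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
            printf("Desktop mode: %lux%lu @ %lu Hz\n",
                   dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

            /* Assumed native panel mode: 1920x1080, per the TV mentioned above. */
            if (dm.dmPelsWidth != 1920 || dm.dmPelsHeight != 1080)
                printf("Not 1920x1080 - something is scaling, expect a soft image.\n");
        }
        return 0;
    }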
     
  6. Death_Lord

    Death_Lord Guest

    Messages:
    722
    Likes Received:
    9
    GPU:
    Aorus 1080 GTX 8GB
    I'm running a 42-inch HDTV on an old ATI 1600XT with an HDMI cable, and it looks exactly the same on both my Nvidia and ATI cards. An HDTV works like a monitor: you set a resolution and it displays the input, nothing else; there are no magical tweaks, it's pure and simple. I bet you were running at a non-native resolution and are blaming the ATI card when it could be a bad configuration in CCC or even in the Windows screen settings.

    @DSparil: I'm sorry, I'm not Maleficarus. The thing that got me upset is all the trolling, when there are hundreds of image comparisons that prove ATI and Nvidia look exactly the same. I hate fanboys from both sides; in the end the thread gets closed because of all the ****ty flaming and the "I've got the biggest cock" game.
     
  7. Gromuhl'Djun

    Gromuhl'Djun Ancient Guru

    Messages:
    5,452
    Likes Received:
    30
    GPU:
    4070ti
    Go to the options, set everything to max, then set your resolution as high as possible.

    It could be that the game is automatically setting a lower detail level/resolution for your ATI card, which is silly, just like the discussions above.
     
  8. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    I went from an 8800GT to a 4870 to a GTX260 and noticed no huge difference: a slightly different look, but better or worse comes down to personal preference, I think.

    I did lol at the "yes out of the box nvidia does look better ati takes a lot of tweaking to get it the way you want thats why i say its for the advanced user"

    That's just dumb. It sounds like you're saying AMD intentionally makes image quality worse, or doesn't try to make it better, and just leaves it to the end user to fix their mistakes, which is a horrible way to treat customers; sounds like Fallout NV lol.

    That definitely wasn't the case two years ago, and I'm sure it isn't now. Some people need to think things over if they want to defend something.
     
    Last edited: Oct 27, 2010
  9. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    If you're using an HDMI cable to a TV with an ATI card, make sure you use CCC to set the level of overscan, not the TV. The TV's overscan function just stretches the image to fit, and as a result you end up with a blurry image (happens on my 37-inch).

    But if you use CCC to set it to GPU scaling, you end up with a much sharper image, as it's using true 1:1 pixel mapping for 1920x1080 instead of stretching the image to fit. Could just be because I'm using an IPS panel, but there is a notable difference.

    And yeah, there's no difference between ATI and Nvidia image quality. You just need to adjust the brightness to suit, add in AA and AF, and they both end up looking the same.
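
    A back-of-the-envelope sketch of why TV-side overscan softens things (the 5% overscan figure is just an assumed example): once the panel has to stretch the 1920 source columns across a different number of visible pixels, the 1:1 relationship is gone and everything gets resampled.

    #include <stdio.h>

    int main(void)
    {
        const int    source_width = 1920;  /* pixels the GPU sends over HDMI */
        const double overscan     = 0.05;  /* assumed 5% TV-side overscan    */

        /* The TV enlarges the picture so the edges fall off the panel: only
         * ~95% of the source columns stay visible, stretched back across all
         * 1920 panel columns, so no panel pixel maps 1:1 to a source pixel. */
        double visible_source = source_width * (1.0 - overscan);
        double panel_per_src  = source_width / visible_source;

        printf("Visible source columns: %.0f of %d\n", visible_source, source_width);
        printf("Each source pixel is stretched over %.3f panel pixels\n", panel_per_src);
        return 0;
    }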
     
  10. DSparil

    DSparil Guest

    Messages:
    3,295
    Likes Received:
    33
    GPU:
    GeForce RTX 3080
    I can see your point, but I doubt this thread was an attempt at proving a bigger ball sack. There are definitely many threads that turn into fanboy flame arguments, but sometimes those can be entertaining to read lol. Nevertheless, they are stupid, and there is a lot of evidence that image quality isn't superior on either brand. I've been happy with both Nvidia and ATI cards to date, and I like to play around with my settings in CCC to see their effect on image quality and performance. For me, that's part of the fun of PC gaming!

    flame wars :biggun:
     

  11. Bluedog

    Bluedog Ancient Guru

    Messages:
    2,689
    Likes Received:
    0
    GPU:
    MSI N680GTX TF 2GD5/OC
    I'm just the opposite, as every Nvidia card I've had (6000 to 8800 series) died. I got two replaced under the EVGA warranty replacement program, but they eventually died also, and I finally sold those replacements. My current ATI 4870 has been nothing but rock solid, with no driver issues.
     
  12. UnclePappi

    UnclePappi Banned

    Messages:
    5,082
    Likes Received:
    1
    GPU:
    Asus 680 2gb 1250mhz
    I've gone back and forth between Nvidia and ATI, and yes, there is a difference. ATI has more true-to-life color but a less sharp image, while Nvidia has a "colder" color profile and a slightly sharper image. It seems Nvidia's AF and AA are sharper too. I'm on a 1920x1200 widescreen CRT, so the sharpness isn't as much of a concern, but on my old LCD it was much more apparent. Although ATI's color does look better on the CRT, the combination of a slightly less sharp monitor with slightly less sharp ATI graphics makes for a kind of dull image; but my god, the color would make you pluck your eyes out because you couldn't handle it.
     
  13. Death_Lord

    Death_Lord Guest

    Messages:
    722
    Likes Received:
    9
    GPU:
    Aorus 1080 GTX 8GB
    I think most people confuse what colour profiles look like across different brands with graphics quality, like how the textures get rendered or the amount of LOD used.

    Since no one here has the exact same system with the same monitors, each person has to learn how to calibrate their screen to get the most out of the brand they use.

    ATI and Nvidia programmers also have their own tastes, so they set up the default colour configuration the way they like it best.
     
  14. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,673
    GPU:
    Aorus 3090 Xtreme
    Are you discussing analogue only?
     
  15. Legendary_Agent

    Legendary_Agent Guest

    Messages:
    888
    Likes Received:
    0
    GPU:
    Asus HD7970 DirectCU II
    Fanboys, fanboys. Whether you like it or not, Nvidia didn't upgrade anything in their AA and AF implementation, so no, it actually looks worse than ATI. The ATI HD 5000 series has better image filtering for both AF and AA compared to any Nvidia card, including the latest ones, since those are still using the same old approach to image quality, while ATI improved it even over the HD 4000 series. If you force AA on a crap game it will look blurry, of course; try forcing AA on a decent game like DiRT 2, NFS Shift, or Crysis, then compare both.
     

  16. UnclePappi

    UnclePappi Banned

    Messages:
    5,082
    Likes Received:
    1
    GPU:
    Asus 680 2gb 1250mhz
    I've tested ATI and NVIDIA on both analogue and digital. It's the same with both.

    While I agree that *technically* ATI has the superior AA and AF... Nvidia's is sharper, no doubt about it. I'll find a link... here ya go: Clicky
     
    Last edited: Oct 27, 2010
  17. DSparil

    DSparil Guest

    Messages:
    3,295
    Likes Received:
    33
    GPU:
    GeForce RTX 3080
    Color looks better on a high-end CRT, period. The days when they were vastly better than LCDs for gaming are over, but CRT vs LCD doesn't have anything to do with ATI's picture quality vs Nvidia's...
     
  18. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,213
    Likes Received:
    1,537
    GPU:
    NVIDIA RTX 4080 FE
    I came from an NVIDIA GTX 280 to an ATI HD 5870 and in the same games with the same MSAA settings I can't say I've noticed any differences in image quality personally. They're pretty much the same.

    NVIDIA drivers are better at forcing AA in games, though; that's about it. I actually miss being able to force AA in BioShock 2 DX10, for example.
     
  19. jskyg

    jskyg Master Guru

    Messages:
    434
    Likes Received:
    0
    GPU:
    powercolor 5850@775/1125
    From what I've seen, the brand of monitor you're using makes more of a difference in IQ with a given video card than anything else.

    It seems some monitors favour ATI and others Nvidia.
     
  20. cloudman

    cloudman Maha Guru

    Messages:
    1,409
    Likes Received:
    1
    GPU:
    Asus CU II GTX 780
    I just recently moved to the red side with my 5770s. I run a 40" LED LCD Samsung over HDMI at its native 1920x1080. After reading this thread I made a few adjustments and it looks marvelous. There are so many adjustments on these new panels that it can take some tweaking to get it right.
     
