Why do ATI cards give less crisp images than Nvidia?

Discussion in 'Videocards - AMD Radeon' started by ehsancgfx, Oct 26, 2010.

  1. ehsancgfx

    ehsancgfx Active Member

    Messages:
    75
    Likes Received:
    0
    GPU:
    Nvidia 560GTX & ATI 6770m
    I was an Nvidia fan for many years, and my last card was the GeForce 8800GT, which was amazing. I then converted to AMD (HD 5770) because of its DirectX 11 support and its affordability at the time, and I was pretty happy with the card. But when I played Call of Duty: Modern Warfare 2 with this ATI card, I got a shock. I had already finished that game with the GeForce card some time ago, and it was an amazing game with breathtaking graphics; the depth of the image was amazing on the 8800GT. With the ATI card, though, the textures seem flattened out and have lost a lot of their crispness. On the 8800GT I could make out the pores of the army uniforms, and those intricate details were so impressive. I was expecting the same output from a DirectX 11 card, if not better. All settings were at default on both cards. I sold the Nvidia card, so I can't show you any comparison screenshots, but I noticed the difference straight away because I was looking for those details so badly. So is it a driver issue, a hardware issue, or do Nvidia cards genuinely have better image quality, with ATI just chasing higher FPS at the cost of render quality?
     
  2. nexuno

    nexuno Master Guru

    Messages:
    279
    Likes Received:
    0
    GPU:
    ASUS ROG MATRIX R9 290X
    I don't know... the only difference I ever noticed is colour, which is more vivid by default on Nvidia, but you can play with the settings on AMD cards too.
    In terms of IQ it is pretty much the same thing between AMD and Nvidia; maybe you just need to play with CCC. I don't know what your problem may be. Look at the IQ comparison screenshots in the Guru3D reviews.

    Another idea: did you change the cable (I mean the VGA cable to the monitor)?
     
  3. Covert

    Covert Maha Guru

    Messages:
    1,187
    Likes Received:
    0
    GPU:
    Asus HD 5770 1000/1300
    You can call me a troll or whatever, but I won't be buying an ATI card ever again. Yeah, the 5770 was fast and cheap, but it's so lacking everywhere else. I had the same feeling; I came from the 9600GT, which was on par with the 8800GT.

    Nothing ever works properly; even the AA looks bad when you can get it to work.

    My next card is definitely an Nvidia :)
     
  4. nexuno

    nexuno Master Guru

    Messages:
    279
    Likes Received:
    0
    GPU:
    ASUS ROG MATRIX R9 290X
    Strange ppl is strange ppl
     

  5. bobdude

    bobdude Ancient Guru

    Messages:
    1,949
    Likes Received:
    0
    GPU:
    GTX 1060 6gb
    Well, ATI is for more advanced users; the cards need to be set up properly. Nvidia, on the other hand, is for the common folk: plug it in and play.
     
  6. nexuno

    nexuno Master Guru

    Messages:
    279
    Likes Received:
    0
    GPU:
    ASUS ROG MATRIX R9 290X
    So you are actually saying they are right :D
     
  7. Daaknite

    Daaknite Member

    Messages:
    27
    Likes Received:
    0
    GPU:
    EVGA GTX 680 2gb
    I bought a 40" 1080p TV to use as a monitor some months back. Back then I had two 5850s in CrossFire. It took a LOT of tweaking to get the display (ordinary Windows desktop tasks as well as games) to look halfway decent.

    I came to accept that a TV could never look as great as the 28" monitor I had before. The extra 12" were great, but I wasn't thrilled with the IQ.

    After switching sides to the cards in my sig because of CrossFire problems, I was BLOWN AWAY by the difference. My TV now looks as clear, crisp, and sharp as a monitor does.

    Like you, my next upgrade will DEFINITELY be Nvidia. ATI is no good at proper HDTV output.
     
  8. bobdude

    bobdude Ancient Guru

    Messages:
    1,949
    Likes Received:
    0
    GPU:
    GTX 1060 6gb
    Yes, out of the box Nvidia does look better; ATI takes a lot of tweaking to get it the way you want. That's why I say it's for the advanced user.
     
  9. phill1978

    phill1978 Master Guru

    Messages:
    715
    Likes Received:
    0
    GPU:
    Saphire 1GB 5850@840/1140
    Hmm, you could be onto something image-quality wise. Having said that, I'm running at a 7-million-pixel resolution on one ATI card where I'd need two Nvidias to do that, so which is better now? ;)
     
  10. Metz

    Metz Master Guru

    Messages:
    229
    Likes Received:
    0
    GPU:
    2x 5850 TOP CF
    Before I bought these 5850s I was using and benchmarking several Nvidia cards: two 460s, two 8800 GTs, and also an 8800 Ultra a woman gave me. The image quality of the 5 series is better than the 8800s. The 460s looked great, but I found ATI on my machine and TV to produce better colour, with better native support for my HDTV.

    At default driver settings, ATI may be using some filtering options Nvidia is not. You need to go in and modify your CCC settings. Do some research and you should be able to find out how to maximize IQ on your 5-series card. You can't expect different cards on different drivers to produce similar results with default settings; you kind of have to match them up.

    I had two 460s before the 5850s. You can see which made the cut.

    You can also try the INF tweak for the 10.10a hotfix drivers, which lets you disable surface format optimizations within Catalyst A.I. (and also lets you enable morphological AA). This should give you a good boost in texture quality, but by the sound of your problem, attempting this may just cause more of a headache for you.

    First and foremost, make sure your Mipmap Detail slider is on High Quality in CCC.


    Edit: It must just be chance, but with ATI my Sony Bravia auto-detects 1080p and displays everything perfectly crisp. Everything from the 8800 GT to the 460 looked quite pixelated and aliased. I found native HDTV support better with ATI and thought maybe it was drivers. Kind of funny.
     
    Last edited: Oct 26, 2010
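
    On the filtering point above: one plausible mechanism for "flattened" textures is a driver-side texture LOD bias or a lowered Mipmap Detail setting, which makes the GPU sample a lower-resolution mip level than a one-texel-per-pixel mapping would. A minimal sketch of how level selection behaves (the function and numbers are illustrative, not any driver's actual code):

    ```python
    import math

    def mip_level(texel_footprint, lod_bias=0.0, max_level=10):
        """Select the mipmap level for a given screen-space texel footprint.

        A footprint of 1.0 means one texel per pixel (mip 0, the sharpest level);
        each doubling of the footprint moves one level down the mip chain.
        A positive lod_bias picks a blurrier level, a negative one a sharper level.
        """
        level = math.log2(max(texel_footprint, 1.0)) + lod_bias
        return min(max(round(level), 0), max_level)

    # One texel per pixel: the full-resolution texture is sampled.
    print(mip_level(1.0))                  # 0
    # A driver "optimization" biasing LOD by +1 halves the sampled texture resolution.
    print(mip_level(1.0, lod_bias=1.0))    # 1
    print(mip_level(4.0))                  # 2 (texture minified 4x)
    ```

    The point is simply that a bias of even one level halves the effective texture resolution, which would read as exactly the loss of pore-level detail described in the first post.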

  11. perosmct

    perosmct Banned

    Messages:
    1,072
    Likes Received:
    0
    GPU:
    unknown
    If you don't know how to touch the registry and modify it to unleash CCC, then ATI's quality is much inferior to Nvidia's...
     
  12. Omagana

    Omagana Ancient Guru

    Messages:
    2,532
    Likes Received:
    1
    GPU:
    GTX 780 SLI
    Perhaps it's just better in your memory than it actually was. I often play older games and think to myself, "at the time I thought this looked amazing".

    I say this because I have two systems in my house: the one in my sig and a second backup with a 9800GT. I've played games, including COD, on both machines (recently as well), and they always looked the same in terms of image quality.
     
  13. Daaknite

    Daaknite Member

    Messages:
    27
    Likes Received:
    0
    GPU:
    EVGA GTX 680 2gb
    Drivers and configurations are such a weird thing. I messed with everything in CCC to try to get better IQ; no dice.

    Honestly, I did not try editing any registry settings. Had I heard about that, I would have tried it. Thing is, the output from my ATI cards was superb on a monitor, but on the TV everything went downhill.

    Conversely, with the NV setup, I plugged in, installed drivers, and everything was perfect.

    Admittedly, ATI has way better multi-monitor support, no argument there. But they really need to take better care of their CrossFire customers. The taste in my mouth after CrossFire is more bitter than young, green aloes.
     
  14. Mufflore

    Mufflore Ancient Guru

    Messages:
    11,648
    Likes Received:
    741
    GPU:
    1080Ti + Xtreme III
    I found no noticeable difference in image quality, sharpness, etc. moving from a GTX260 to 2x 5770, on both my TV and monitor.
    If you're using analogue, it's unsurprising there will be differences, but with digital the differences come down to driver settings.

    For example, some drivers set the wrong colour mode on HDTVs, making the image too dark or the colours strange.
    As long as you know where to find the setting that only appears for HDTVs (not monitors), you can change it back.
    It is called 'Pixel Format' and will not appear unless you are actually 'using' an HDTV (not just have one connected).

    To adjust it:
    Open CCC, Desktops & Displays
    Right click the small monitor icon for your screen at the bottom, select Configure.
    You will see the 'Pixel Format' tab if using an HDTV.

    If you suffer other problems, in CCC go to Options and select Preferences/Restore Factory Defaults.
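
    For background on why the Pixel Format setting matters: HDTV signalling traditionally reserves a limited range (16-235) for video levels, while PCs use the full 0-255 range. If the GPU outputs one range and the display expects the other, blacks and whites get crushed or washed out. A rough sketch of the level mapping (illustrative only, not the driver's actual conversion code):

    ```python
    def full_to_limited(v):
        """Map a full-range (0-255) pixel value into HDTV limited range (16-235)."""
        return round(16 + v * (235 - 16) / 255)

    # If the GPU sends limited range but the display interprets it as full range,
    # black is shown as dark grey and white as light grey: a washed-out, flat image.
    print(full_to_limited(0))    # 16  (black becomes dark grey)
    print(full_to_limited(255))  # 235 (white becomes light grey)
    print(full_to_limited(128))  # 126
    ```

    This is exactly the "too dark or strange colours" symptom described above, and it is fixed by matching the Pixel Format to what the TV expects rather than by swapping cards.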
     
  15. IcE

    IcE Don Snow Staff Member

    Messages:
    10,693
    Likes Received:
    73
    GPU:
    Zotac GTX 1070 Mini
    I didn't really notice a difference in image quality (bar higher settings of course) from my 4850. I did however notice that the colors are a bit more to my liking (more vibrant), and text seems a wee bit sharper than before.
     

  16. kapu

    kapu Ancient Guru

    Messages:
    3,745
    Likes Received:
    6
    GPU:
    MSI Geforce 1060 6gb
    There are hundreds of tests on the net showing there is no real image-quality difference between AMD and Nvidia.

    Your subjective opinion on the topic doesn't add anything, nor does it matter.

    But I would agree that forced AA worked more often on Nvidia than on AMD.
     
  17. isidore

    isidore Ancient Guru

    Messages:
    6,210
    Likes Received:
    22
    GPU:
    RTX 2080TI GamingOC
    It's just in your mind. See some image-comparison reviews between the two; you will see there's no difference.

    I get the same feeling after I replay an older game that I thought looked great in the past.
     
  18. GhostXL

    GhostXL Ancient Guru

    Messages:
    6,011
    Likes Received:
    16
    GPU:
    ASUS STRIX 2080 Ti
    It's definitely not a mind game. Nvidia does seem to have had better image quality since the 8 series. Since selling my HD 5870 and going to GTX 480s, I've noticed a lot more fine detail.

    Though ATI definitely had smoother/crisper video playback, IMO.
     
  19. DSparil

    DSparil Ancient Guru

    Messages:
    3,258
    Likes Received:
    17
    GPU:
    ASUS ROG StriX RX480, 8GB
    Hard to say. One thing could be Catalyst A.I. being on full blast in the Catalyst Control Center. I don't know if that would make any significant difference to the kind of detail you're talking about, but all those things add up. I can say that I switched from a 7950GT to an HD5750 and have noticed no such loss.
     
  20. Death_Lord

    Death_Lord Master Guru

    Messages:
    714
    Likes Received:
    7
    GPU:
    Aorus 1080 GTX 8GB
    Are these people joking or something? I will slap myself if these guys are wannabe noobs complaining about blurry images when their resolution is not set properly...

    To the guy with the 42-inch screen: are you using an HDMI cable? Does your TV support Full HD? HD Ready TVs use up-scaling, so they can't look great. And the other guy, what's wrong with him? The image looks exactly the same out of the box; games are made to run exactly the same way on any graphics card, except for some exclusive titles.

    I use both brands, NVIDIA and ATI, and they work the same way, apart from performance differences between cards and maybe some special AA features; the rest is exactly the same. We don't live in the 3dfx era anymore, where different brands made games look so different.

    Two tips: configure your resolution and refresh rate properly, and stop posting troll posts without anything to back them up; it makes you look like an ass.
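
    On the up-scaling point: a panel showing a non-native resolution has to resample the signal, and resampling cannot invent detail; at best it duplicates or blends pixels. A toy nearest-neighbour example of a single scanline (illustrative only, not what any TV actually implements):

    ```python
    def nearest_resize(row, new_len):
        """Nearest-neighbour resample of a 1-D row of pixel values."""
        old_len = len(row)
        return [row[min(i * old_len // new_len, old_len - 1)] for i in range(new_len)]

    src = [10, 20, 30, 40]                 # 4-pixel source line
    # Scaling up to 6 pixels just repeats source pixels: blockier, never sharper.
    print(nearest_resize(src, 6))          # [10, 10, 20, 30, 30, 40]
    ```

    A 1:1 pixel mapping (native resolution over a digital connection) sidesteps this entirely, which is why getting the resolution right matters more than the card brand.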
     
