Testing NVIDIA vs. AMD Image Quality

Discussion in 'Frontpage news' started by MAD-OGRE, Nov 20, 2010.

  1. Konrad321

    Konrad321 Banned

    Messages:
    259
    Likes Received:
    0

    Confirmed by several German tech sites. It's true.
     
  2. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    I was about to say something regarding flame-bait, Nvidia and the OP, but I came to my senses. It's simply no use.

    Consider this one vote towards deleting the thread instead, nothing good could possibly come of it.
     
  3. buddyfriendo

    buddyfriendo Guest

    Messages:
    3,404
    Likes Received:
    5
    GPU:
    2070 Super
    True? That it's biased is painfully obvious.
     
  4. Konrad321

    Konrad321 Banned

    Messages:
    259
    Likes Received:
    0

    It's a fact.
     

  5. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    this thread screams fanboy....
     
  6. buddyfriendo

    buddyfriendo Guest

    Messages:
    3,404
    Likes Received:
    5
    GPU:
    2070 Super
    If by fact you mean a load of fanboy **** then yes, indeed it is.
     
  7. Metz

    Metz Master Guru

    Messages:
    229
    Likes Received:
    0
    GPU:
    2x 5850 TOP CF
    I think a few of you need to go outside and get some sun. :3eyes:
     
  8. Konrad321

    Konrad321 Banned

    Messages:
    259
    Likes Received:
    0
  9. qstoffe

    qstoffe Active Member

    Messages:
    55
    Likes Received:
    0
    GPU:
    nVidia GTX 580 @ stock
    Reading this thread reminds me of a driver update for my 4870X2 a while back. I was playing Dragon Age and getting 30-60 avg fps. Then a new driver came out and my fps was rock solid at 60. However, I distinctly noticed that the rendered image was blurrier than before. In fact, I rolled back to the previous driver and the picture in the game was sharper again. I also compared this with my 8800 GTX: the older ATI driver was sharper than the nVidia card, and the newer (blurrier) ATI driver matched the IQ of my 8800 GTX.

    Because of this I'm sure such IQ tweaks are done by both nVidia and AMD. The only frustrating thing about it is that we users don't get more control over it; make the driver settings available, etc. It's very annoying when obvious changes like this are made in "secret".
     
    Last edited: Nov 21, 2010
  10. Xzibit

    Xzibit Banned

    Messages:
    4,382
    Likes Received:
    0
    GPU:
    7970 Windforce
    :wanker::wanker::wanker:
    It's copy-pasted (translated) and inspired by Nvidia's sh!t, I mean that test.
    What a buzz around the 500 series. Interesting, very interesting: how much did this campaign cost Nvidia?
     

  11. JimBobb

    JimBobb Guest

    Messages:
    70
    Likes Received:
    0
    How much does AMD pay to spread disinformation? :heh:
     
  12. TLDR

    TLDR New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    Radeon HD 5850
    Actually, nVidia's claims, esp. about aniso quality, are valid. But they derail the whole piece with that marketing white-knight "we never cheat - trust us!!!" BS.
     
  13. dchalf10

    dchalf10 Banned

    Messages:
    4,032
    Likes Received:
    0
    GPU:
    GTX670 1293/6800
    I saw your bilinear post, and I honestly have no idea what you're talking about. I have played Fear 1 and 2 a hundred times and never seen anything like what you're suggesting. Maybe it's a hardware configuration issue, or a driver incompatibility issue, but I never once saw anything like that in Fear 1 or 2 on my 280 or 480.

    EDIT:

    There was an edit to the article added to the bottom:

    "EDITOR'S NOTE: This is a disturbing article, and the sources here are critical for legitimacy. NVIDIA is a direct competitor to AMD and is the author of this article, which led some readers to ignore the message. However, it was several independent review websites that first brought this issue to the forefront and proved it exists. I personally trust these websites, particularly 3DCenter.org, and have found them to be unbiased over the years.

    Benchmark Reviews can confirm that issues with filtering still exist, and pointed this out in our Radeon HD 6850 and Radeon HD 6870 launch articles. We also made it public that certain AMD partners were sending 'juiced' video card samples to review sites, ours included, with details published in our 1120-Core "Fixed" Radeon HD 6850 Review Samples Shipped to Media article. So could this be AMD's last-ditch effort to compete with NVIDIA by manipulating performance?"

    I'd like to see some ATI white-knighters defend that B.S.
     
    Last edited: Nov 21, 2010
  14. Xzibit

    Xzibit Banned

    Messages:
    4,382
    Likes Received:
    0
    GPU:
    7970 Windforce
    Nvidia now wants to recover the sales that AMD took away from them with the release of the 5000 series, so they invent all sorts of tricks to get people to buy their products.
    But we know that Nvidia pulls hot sh!t and cheats people :eyebrows:.
    BTW, the 500 series is a big joke, agreed?
     
  15. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,449
    Likes Received:
    3,128
    GPU:
    PNY RTX4090
    I will say this: everyone sees things differently. No matter who you are, you will see things differently from the next person. Everyone's eyes are different (hence why some need glasses and some don't).

    I have recently gone from ATI to Nvidia (4870X2 to GTX480); my previous card before my 4870X2 was an 8800GTX. I can honestly say there is literally no difference in IQ, except that Nvidia's blacks seem darker than ATI's. That could be seen as better by some or worse by others (this is all on the same monitor, same settings, same games, etc.). It is the only difference I can say I have seen between the two companies in terms of IQ.

    Both have great AA/AF rendering and both can produce some awesome-looking stuff. But there are people who are fanboys, and no matter what they see they will always go for the one they support (much like a football supporter: no matter how rubbish his team plays and how great the rivals play, he will always support his team). Then there are people who simply cannot tell the difference and don't bother to look for IQ differences. Then there are the people who just go for whatever card they can afford or is best for their system, and just listen to and read all the stupid fanboy comments...

    At the end of the day, who cares? Your gaming experience is not going to drastically change because one side produces microscopically fewer jaggies or slightly less blurry textures. Play the games, have fun, and stop trying to compare and one-up everyone else. This is like console fanboys who buy a 360 or PS3 just because they think the graphics are slightly better on one than the other... WHO CARES!! You're still playing and experiencing the same game!
     

  16. buddyfriendo

    buddyfriendo Guest

    Messages:
    3,404
    Likes Received:
    5
    GPU:
    2070 Super
    You caught me! I'm the AMD fanboy with an Nvidia GPU! HUZZAH!

    You've got quite the thick head, don't you? :bang:

    As for these clueless people you speak of, speak for yourself, guy.
     
    Last edited: Nov 21, 2010
  17. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Both sides are at it; it's competitive business.
    Nvidia have been doing it for years, and so have ATI/AMD, though recently they have done less of it because Nvidia were so poor that they didn't have to. Since the GTX460, however, Nvidia have actually been on top form, beating AMD to a new card, and bringing the GTX580 out in the middle of the 6970 hype was a very clever move.

    A lot of people are treating this as an IQ thread, which, contrary to the thread title, I don't think it is.

    AMD have actually gone up in my book with this; it's nice to see them being more aggressive.
    Slightly degrading image quality at default settings to get a few FPS here and there is sneaky, but Nvidia would, and have, done the same, and it's not like they are selling you a poor product. The majority probably wouldn't see the difference, and the rest would be clued-up enough to sort the issue themselves.
     
  18. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]

    Well then you must be blind.

    It's a fine, almost invisible line of bilinear filtering, a bow-wash effect in front of you while you move. It happens if you use in-game AF (Call of Juarez 2, Fear 1 and 2, Far Cry 2, Burnout Paradise, ...); it's the same in Mass Effect 1 if you don't apply trilinear in the config, or in Batman... Mirror's Edge has it enabled by default.

    And like I said, it's OK if you use in-game trilinear and force AF in the driver, though in most cases in COJ2 that didn't help.

    I have my monitor ~70cm away from me and I can clearly see this ugly line. I'm using high quality, no optimizations, and yet it's still doing it in certain games.
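    For anyone unsure what line TheHunter means: it is the mipmap boundary. With plain bilinear filtering the GPU samples a single mip level, so texture detail jumps where the level-of-detail crosses to the next mip, and that jump sweeps across the floor as you move; trilinear blends the two nearest levels and hides the seam. A minimal numeric sketch of the difference (the distance-based LOD curve here is an illustrative assumption, not any driver's actual math, which uses texture-coordinate derivatives):

    ```python
    import math

    def mip_lod(distance):
        # Illustrative LOD curve: farther surfaces get coarser mip levels.
        return max(0.0, math.log2(max(distance, 1.0)))

    def bilinear_mip(distance):
        # Bilinear filtering samples one mip level only, so the chosen
        # level snaps at LOD boundaries -> the visible band on the ground.
        return round(mip_lod(distance))

    def trilinear(distance):
        # Trilinear blends the two nearest mip levels by the fractional
        # LOD, smoothing the transition away.
        lod = mip_lod(distance)
        lower = math.floor(lod)
        frac = lod - lower
        return lower * (1.0 - frac) + (lower + 1) * frac
    ```

    Stepping `distance` from 1.0 to 2.0, `bilinear_mip` jumps from level 0 to level 1 in a single step (the line you see in motion), while `trilinear` climbs smoothly between the same two levels.
    
    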
     
    Last edited: Nov 21, 2010
  19. Xzibit

    Xzibit Banned

    Messages:
    4,382
    Likes Received:
    0
    GPU:
    7970 Windforce
    Before my 4870X2 I had a 7800GTX and an 8800GTX. I am not a fanboy, but:
    I hate Nvidia's strategy with their PhysX and CUDA; their policies in commerce and advertising are unfair.
    I know they have the software side better, and AMD's drivers are a little worse, but not a complete failure, nothing critical. Yet some people (nvidia trolls) find between them (I mean nvidia and AMD drivers, as in this article) a substantial difference, which in fact there is not!

    Redemption80
     
    Last edited: Nov 21, 2010
  20. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    http://www.guru3d.com/article/radeon-hd-6850-6870-review/9

    According to the fairly unbiased Hilbert, there's no notable difference.

    Going from 5850 CrossFire to a GTX580, I haven't noticed any differences once I've tweaked them a bit. I still have a fair bit of tweaking to go as well, since Nvidia's panel confuses me a little.

    Basically, both sides are more or less the same. The only major difference I can see comes from the different AA methods each side has.
     

Share This Page