Exploring ATI's Image Quality Optimizations [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Dec 2, 2010.

  1. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    They have to be done on max, it's the only true way of knowing.

    Using those pics above as an example, it's obvious there is some sort of optimisation being done by both vendors, which is to be expected; it just seems that the AMD ones are more aggressive.
     
  2. alanm

    alanm Ancient Guru

    Messages:
    12,267
    Likes Received:
    4,467
    GPU:
    RTX 4080
    I wonder why the Trackmania banner is in reverse in the NV pic.
     
  3. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    I assumed the pic was mirrored so they could split it down the middle.
     
  4. Bo_Fox

    Bo_Fox Active Member

    Messages:
    57
    Likes Received:
    0
    GPU:
    4870 1GB, 8800GTX
    Nah, I think that's just how the maps are used in Trackmania. Certain banners are mirrored because they didn't bother making a new texture for the left image... still a good game considering that it's FREE.
     

  5. Evasive

    Evasive Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    EVGA 580 GTX
    I just think an Nvidia article is needed now too.

    The thing that caught me off guard was that the article acted like this was something new, and horrible because AMD never told anyone.

    But I mean, I thought most of the hardware community and reviewers KNEW that Nvidia and ATI default IQ to Quality rather than High Quality...

    I guess that's why I'm flabbergasted this article was done without showing Nvidia benchmarks as well.
     
  6. forciano

    forciano Guest

    Messages:
    26
    Likes Received:
    0
    It's very funny to see how both pro-ATI and pro-NVIDIA fans are 100% sure neither company has done anything wrong at all.

    Both optimize and gain performance in games by doing this. Do you really think that every time there is a performance increase it just means they optimized their code?

    With that said, ATI should have clear information stating what these options do, if they don't already.

    And it's also funny to see how NVIDIA tries to prove this point with images that show barely any IQ degradation, if any at all.
     
  7. forciano

    forciano Guest

    Messages:
    26
    Likes Received:
    0
    LOL, what would I do if I went outside and found out God is applying optimizations to the IQ of fat chicks when I get drunk?

    :infinity:
     
  8. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    What the ..
     
  9. forciano

    forciano Guest

    Messages:
    26
    Likes Received:
    0
    exactly my point...
     
  10. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    nVidia has done it countless times in the past. The GF6 and GF7 series were plagued with poor IQ due to "driver optimizations".... Didn't seem to bother anyone that nVidia did it... so why is it such a big deal that AMD is doing it?

    It always amazes me how it's fine for Intel or nVidia to cheat customers... but if AMD does it, people react like the world is coming to an end. Every company does something like this. Look at Creative... they release drivers that barely work, and never bother to fix any of the bugs... but the same people that bitch and moan about AMD/nVidia cheating customers praise Creative.
     

  11. Omagana

    Omagana Guest

    Messages:
    2,532
    Likes Received:
    1
    GPU:
    RTX 2070 Super
    Lol, I think you just compared a graphics vendor to God, forciano.
     
  12. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Don't think anyone ever said it was ok for Nvidia or Intel to do it, far from it; those two and MS are the first to get torn into the second they do anything wrong, and usually deservedly so.

    The irony is the AMD fans accepting it, or excusing it, but you know it wouldn't be the same if it was Nvidia; that's what's amusing lol.
     
  13. alanm

    alanm Ancient Guru

    Messages:
    12,267
    Likes Received:
    4,467
    GPU:
    RTX 4080
    ^ yep. I've long viewed Nvidia as the premier scandal-mongers' target in the tech world, with many Charlie Demerjians out there ready to pounce on any hint of wrongdoing on their part. The tech press has Nvidia under the microscope in the same manner that the celebrity press had Michael Jackson. Come to think of it, it's a bit odd that Charlie has been quiet on this so far. I KNOW he is aware of it (from checking the SA forum recently). And he is an intelligent person, although vindictive to a fault. I wonder if he is quiet because he knows more about it and doesn't think what he has hurts Nvidia any more than it hurts ATI at this point.
     
  14. Black_ice_Spain

    Black_ice_Spain Guest

    Messages:
    4,584
    Likes Received:
    17
    GPU:
    5700XT
    If we need to make a review: get 300 static images and analyse them at +300% zoom.

    And then play "spot the 7 differences"? (A rough sketch of automating that game is at the end of this post.)

    Where's the problem in real gaming? Gimme that 10% performance, goddamnit!


    I always enable every optimization if I need performance, and I did the same when I had Nvidia; who cares. Truth is I don't notice quality or performance changes lol :| .


    Anyway, I'm an Nvidia fanboy so die AMD, j/k, but at the same performance/money I would pick Nvidia... idk why T_T.
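
    Joking aside, that "spot the differences" comparison can be automated instead of eyeballed. A minimal sketch (assuming two same-resolution screenshots under the hypothetical names amd.png and nv.png, and the Pillow and NumPy libraries) might look like this:

        # Rough sketch: quantify and visualize per-pixel differences
        # between two vendor screenshots of the same frame.
        from PIL import Image, ImageChops
        import numpy as np

        amd = Image.open("amd.png").convert("RGB")   # hypothetical filenames
        nv = Image.open("nv.png").convert("RGB")

        # Per-channel absolute difference; identical pixels come out black.
        diff = ImageChops.difference(amd, nv)

        # Summary numbers instead of squinting at 300% zoom.
        arr = np.asarray(diff, dtype=np.float64)
        print(f"Mean per-channel difference: {arr.mean():.2f} / 255")
        print(f"Share of pixels that differ: {(arr.max(axis=2) > 0).mean():.1%}")

        # Amplify the difference image so subtle filtering changes stand out.
        Image.eval(diff, lambda px: min(px * 8, 255)).save("diff_x8.png")

    Numbers like that only say how much two captures differ, not whether the difference is visible in motion, so it's at best a starting point for the "10% performance vs. invisible IQ loss" argument, not a verdict.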
     
    Last edited: Dec 5, 2010
  15. newbuser

    newbuser New Member

    Messages:
    4
    Likes Received:
    0
    GPU:
    8800gtx
    I registered just to make a reply here; I've been a long-time visitor to the site though, and wanted to offer my two cents' worth.

    IMO, products should be reviewed as they are 'out of the box' with the latest available drivers.

    When reviewing/benching new CPUs you do not start off by overclocking and then running benchmarks; you test the product as the manufacturer intended it to be used. Then perhaps you can start looking at overclocking potential. If I were presented with two CPUs by different manufacturers, of the exact same spec, performance and price, I would pick the one with higher overclocking potential.

    I feel it's the same situation here: if a company decides to compromise on quality, however little the decrease, to increase performance, however little the gain, it should be up to them. Otherwise AMD/Nvidia should just be making chip-for-chip the exact same products with the exact same specs, IQ, price and performance. As a consumer I do not want that; I already have only two companies to choose between.

    BUT, I do think IQ matters. I would like to know what a graphics card is CAPABLE of, just as I want to know the overclocking potential of a CPU. Both will influence my decision to purchase x over y. And just as a reviewer might recommend a lower-spec item over a higher-spec one due to factors like price, noise and heat, _I_ will choose which of the cons I can live with.

    I think the moral (should I say, ethical?) debate is pointless. And I think it all comes down to the fact that it's drivers we are really talking about. If one manufacturer made its 'optimizations' in a chip, and it was visible enough for regular folk to notice, the manufacturer would just be known for being weaker in a particular area; nobody would be outraged.


    What it all comes down to for me, and what I personally want to know, is:

    Does it do what it says on the box, out of the box, and what potential does it have if I poke around a bit?

    Thanks for listening :nerd:
     

  16. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    Most sites test it this way..

    like here
    http://www.xbitlabs.com/articles/video/display/radeon-hd6870-hd6850_12.html#sect1
     
  17. heflys20

    heflys20 Member

    Messages:
    35
    Likes Received:
    0
    GPU:
    ATI 5830
    Basically, what I gathered from that Trackmania screen is that both companies use optimizations at the default setting that affect certain aspects of the IQ. If that's the case, then why is AMD being called out for it? I can clearly see aspects where the AMD card looks superior, and aspects where it doesn't (the track).

    So, again, I ask, why is AMD being singled out? Am I missing something?

    Also, I noticed something else in that screen and circled it with a fat line.

    [screenshot attachment]
     
  18. Sota

    Sota Ancient Guru

    Messages:
    2,520
    Likes Received:
    2
    GPU:
    MSI GTX 980 Gaming 4G
    Yes, but Anisotropic Optimization is off. That is the difference. In the default quality setting, Trilinear Optimization is on, but Anisotropic Optimization is off. In the new Cat, not only is Trilinear Optimization enabled, but so is Anisotropic Optimization, and the only way to disable it is to slide the bar to HQ; there is no other way to turn it off. Thus, there are extra optimizations helping the Radeons with performance. That is all well and good for end users who want, and like, the extra FPS with the minimal loss in IQ, but for objective reviewing purposes it makes comparisons impossible with driver defaults enabled. The only way to correct that would be to enable Nvidia's Anisotropic Optimization, which is disabled by default, or to bench both with the HQ setting.

    Hilbert explained everything in plain English. Also, I don't think he would have written the article if something had not changed from previous Cat releases that compromised his ability to do an objective comparison. It doesn't matter if you are willing to game with the optimization enabled. What matters is getting accurate, objective reviews out to the community without having to resort to tinkering with either vendor's control panel. And to do that, AMD needs to turn off the Anisotropic Optimization at the driver default, just as Nvidia has theirs. If you want to enable it yourself, then help yourself to it.
     
  19. heflys20

    heflys20 Member

    Messages:
    35
    Likes Received:
    0
    GPU:
    ATI 5830
    Or Guru3D could start reviewing games in HQ and not demand AMD change their optimizations to suit their reviewing methods. Most sites do it now, TBH.

    However, I'm still trying to understand why certain aspects of Nvidia's IQ look worse. They clearly have optimizations as well.
     
    Last edited: Dec 6, 2010
  20. ross

    ross Master Guru

    Messages:
    717
    Likes Received:
    0
    GPU:
    Diamond R9 290X 4GB
    Hmmm if the owner of one of the biggest hardware review sites can't get AMD to take this out of their driver, I'm not sure who can.

    They would be fools to leave it in their driver after this bad publicity on the issue.

    I'm considering AMD's new high-end single card whenever it's available, but reading this makes me want to buy an NVIDIA card.

    Thanks for the read.
     
