Exploring ATI's Image Quality Optimizations [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Dec 2, 2010.

  1. Sr7

    Sr7 Master Guru

    Messages:
    246
    Likes Received:
    0
    It amazes me that some of you can defend this.

    Think about it this way. First you have the texture format substitution, which produces a slightly different picture but a mostly unnoticeable IQ loss (see the sketch at the end of this post). Then you add in the AF hacks, giving slightly worse AF both in static shots and in motion. Here it's only "partially noticeable". Then ask yourself what other hacks have been done that aren't publicized, or what other hacks will be done in the immediate future.

    Each of those is, relatively speaking, a small decrease in IQ. Taken as a whole, however, the IQ deviates from the original much more noticeably.

    Then factor in that they're gaining performance by offering lesser IQ than NVIDIA purely to win benchmarks... at a time when most midrange GPUs can run maxed-out settings, why not use that GPU horsepower for something useful? Why regress towards console quality, taking away one of the advantages the PC has?

    That's why this trend is unacceptable.
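
    For anyone wondering what "texture format substitution" actually means, here is a minimal OpenGL sketch of the FP16 demotion case being discussed (my own illustration, not ATI's driver code; the function names are made up):

        #include <GL/gl.h>
        #include <GL/glext.h> // for GL_RGBA16F, GL_R11F_G11F_B10F, etc.

        // What the game asks for: a 64-bit FP16 texture, alpha included.
        void allocateAsRequested(GLuint tex, GLsizei w, GLsizei h) {
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, w, h, 0,
                         GL_RGBA, GL_HALF_FLOAT, nullptr);
        }

        // What a demoting driver effectively substitutes: a 32-bit packed
        // float format with smaller mantissas and no alpha channel. Half
        // the bandwidth, usually similar-looking, but not what was asked for.
        void allocateAsDemoted(GLuint tex, GLsizei w, GLsizei h) {
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_R11F_G11F_B10F, w, h, 0,
                         GL_RGB, GL_UNSIGNED_INT_10F_11F_11F_REV, nullptr);
        }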
     
  2. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    You mean NVIDIA too; with NVIDIA you need High Quality as well, since it's set to Quality by default.

    I always select High Quality and set the texture LOD to Clamp before I start testing... I don't want any optimizations like the trilinear optimization and the anisotropic sample optimization, no thank you.
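
    In API terms, those driver toggles roughly map onto plain, unbiased sampler state. A minimal OpenGL sketch of what "no optimizations" means for a single texture (my reading of the settings, not the driver's actual behaviour):

        #include <GL/gl.h>
        #include <GL/glext.h> // for GL_TEXTURE_MAX_ANISOTROPY_EXT

        // Assumes a mipmapped texture is already bound to GL_TEXTURE_2D.
        void setUnoptimizedFiltering() {
            // Full trilinear: linear filtering within and between mip levels.
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                            GL_LINEAR_MIPMAP_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            // LOD "Clamp": no negative bias, so no oversharpening that
            // shimmers in motion.
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, 0.0f);

            // Full 16x anisotropic filtering, no sample-count reduction.
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0f);
        }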
     
  3. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    I wouldn't mind a 10% gain from something I can't see while playing.
     
  4. perosmct

    perosmct Banned

    Messages:
    1,072
    Likes Received:
    0
    GPU:
    unknown
    That's why, from now on, we don't care about benchmarks anymore, only features... ATI is "dead"... I don't give a **** about benchmarks, with all due respect... no thank you. Many optimizations are "hidden"; I have analysed the CCC, the driver and the registry, and most of them are buried. That means you will never get the game's original IQ, because ATI always leaves something silently "on". Next time I will skip the comparisons on the many sites... it's mind control and a waste of time. We should demand that AMD change tactics and lower its prices even more, because in the end they have even weaker GPUs than NVIDIA...
     

  5. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    I ran NVIDIA cards all the time up until the 5870s, and to me ATI picture quality seems better on my screen.

    Perhaps it's some "placebo effect", but I have certainly never seen anything worth shouting about in terms of image quality.

    What on earth is this discussion all about? Seems like a big pile of turd to me. Sorry.

    If there is something worth complaining about, it's ATI's drivers and Crossfire on the whole.

    Hey, maybe that's why they have been wank for me since 10.5!! It's because the Catalyst drivers contain all these extra "optimisations".

    But hey, I'll let you know soon what it's like to be back on NVIDIA.

    You know what, I'd bet a lot of money that I'll be able to say there is hardly any notable visual difference, and that some games run better on NVIDIA than they did on my ATI setup.

    So we prove ATI does things differently... big deal. Are we then going to list all of NVIDIA's BS and cheap tactics to win custom?

    Swings and roundabouts, chicos :)
     
  6. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    This sums it up pretty well.

    So I get 8-10% extra performance for something I can't see?

    Well... good job, AMD?
     
  7. Lycronis

    Lycronis Maha Guru

    Messages:
    1,046
    Likes Received:
    0
    GPU:
    Gigabyte GTX 970 G1
    Yeah, look back a few years to when NVIDIA was optimizing for 3DMark and notice who was doing all the bitching. It goes both ways, so don't come in here thinking you can bash NVIDIA fans for the same thing ATI fans would do if the situation were reversed. In fact, if this were about NVIDIA you would see MUCH more bitching from ATI fans, because of the usual "underdog" syndrome.

    This is an issue that needs to be addressed, and it is something that can be seen in certain situations. I personally don't consider this a type of cheat, per se, but I do believe end users need to be made aware of it. Fair is fair. Tests should be as equal as possible, regardless of personal preference or whether or not you can notice a difference some of the time. If there is a difference in image quality in order to gain performance, then it should be clearly stated as such, or tested at the same competing level.
     
  8. perosmct

    perosmct Banned

    Messages:
    1,072
    Likes Received:
    0
    GPU:
    unknown
    The point is that now that we all admit it, benchmarks from now on must be analysed to ensure the same quality on both sides... and then we will see who really has the stronger GPUs... ;)
     
  9. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    But you don't just buy a GPU based on absolute strength, do you? Or is having the bigger e-peen the most important thing...

    Personally I care about stability/noise/heat... (OK, I don't want a weener of a GPU, of course) :)

    But I guess I see why Guru3D created this topic. People do have a right to know. I just think it's all a bit of unnecessary fuel for flaming and more NVIDIA vs. AMD stuff...
     
  10. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    @perosmct

    If it bothers you that much, then turn it up to High Quality.

    Why would you run at Quality if you want better IQ anyway... simple as that.


    NV cheats too at its default Quality setting, so...
     
    Last edited: Dec 2, 2010

  11. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    That actually is a fairly sound argument. +1.
     
  12. morbias

    morbias Don TazeMeBro

    Messages:
    13,444
    Likes Received:
    37
    GPU:
    -
    Firstly, Nvidia used to have 'Quality' as the default texture processing setting in Forceware drivers, not 'High Quality'. They don't do it now but it's not like they never did the same thing.

    Secondly, the hypocrisy in this thread is hilarious; if this article was focused on Nvidia most of the posts herein would include the words 'epic' and 'fail', but because it's ATI... apparently it's a feature.
     
  13. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    That's only because NVIDIA has been an epic fail in some areas, though they have redeemed themselves with the 460 and the 580.

    You could also argue that this is a "smear" attempt to discredit the company that has been selling the most GPUs recently.

    That said, I am fed up with ATI, hehe. Does that make me a hypocrite?... I have always tried to be impartial ;)

    If the 580 performs worse than my Crossfire setup, it's going straight back.
     
  14. arrrdawg

    arrrdawg Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    NVIDIA GeForce GTX 460m
    Maybe I'm just dumb, but doesn't NVIDIA default to 'Quality', which I assume involves some sort of optimizations? I also see that by default the trilinear optimization is on, while the anisotropic sample optimization is off. Both optimization options gray out if you select High Quality. So how is this different from what ATI is doing?

    Also, the article says something like "anisotropic at 16x and trilinear filtering on"... I thought anisotropic was always trilinear unless you enable optimizations that substitute bilinear where it wouldn't make much of an image quality difference. I could just be ignorant when it comes to this stuff.
     
  15. 3x3cUt0r

    3x3cUt0r Master Guru

    Messages:
    331
    Likes Received:
    0
    GPU:
    MSI 980 Ti Gaming 6G
    While I'm fine with performance improvements that have little to no impact on IQ, I hate that some ATI users claim they have better IQ.
     

  16. perosmct

    perosmct Banned

    Messages:
    1,072
    Likes Received:
    0
    GPU:
    unknown
    ATI has no options to turn the anisotropic/trilinear optimizations on/off... they are always on and hidden (that's my belief until I'm proved wrong)... that's a FAIL... NVIDIA exposes many more options openly to its users... it's time for AMD to step up!!! ;)
     
  17. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    That's trilinear anisotropic filtering vs. bilinear anisotropic filtering... and no, that has nothing to do with the trilinear optimization :)


    It can still render bilinear anisotropic filtering even if you set High Quality... you need to specify this explicitly, either in-game or in a config file.

    For example, F.E.A.R. 1 or F.E.A.R. 2 on NVIDIA: if you use the in-game anisotropic filtering, it renders with bilinear anisotropic filtering, but if you select trilinear in-game and force anisotropic filtering in the driver profile, then you get a normal image without that bow-wave effect (bilinear anisotropic) moving in front of you.
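
    The whole difference is one mip-filter parameter. A minimal OpenGL sketch (my illustration only; F.E.A.R. is a D3D game, this is just the same concept in GL terms):

        #include <GL/gl.h>
        #include <GL/glext.h> // for GL_TEXTURE_MAX_ANISOTROPY_EXT

        // Bilinear anisotropic: filtering stays within the nearest mip
        // level, so the hard seam between mip levels shows up as the "bow
        // wave" band that travels ahead of the camera on floors and roads.
        void setBilinearAF() {
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                            GL_LINEAR_MIPMAP_NEAREST);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0f);
        }

        // Trilinear anisotropic: also blends between adjacent mip levels,
        // which hides that seam.
        void setTrilinearAF() {
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                            GL_LINEAR_MIPMAP_LINEAR);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0f);
        }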
     
  18. Memorian

    Memorian Ancient Guru

    Messages:
    4,017
    Likes Received:
    883
    GPU:
    RTX 4090
    NVIDIA's High Quality texture filtering setting turns off almost all optimizations, if not all of them.
    The Radeon drivers are missing a switch to do the same thing.
     
  19. perosmct

    perosmct Banned

    Messages:
    1,072
    Likes Received:
    0
    GPU:
    unknown
    ^^ Thanks... I did it and the annoying bow wave just disappeared on my ATI too... ;):banana:
     
  20. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    Doesn't Catalyst AI at High Quality disable it anyway? It was like that back in the 9700 Pro era... well, that was the last time I had ATI, but looking at this image it didn't change much...

    And why would you run at default Quality and then bitch about it... pointless!

    Because it's the same with NV at default, although they don't sacrifice so much that it can actually disable an effect or two... maybe you'd see that in Performance mode, but I never tried it, and why would I, just to get 4-5 fps more and a much uglier image?
     
    Last edited: Dec 2, 2010
