Exploring ATI's Image Quality Optimizations [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Dec 2, 2010.

  1. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    That's not what Nvidia are saying in their response. They say they do have those optimisations, but that they don't impact quality. Something you could only check by comparing every beta and driver one by one, and of course nobody wants to do that.

    NVIDIA Technical Marketer Jeffrey Yen

    I think there's a misunderstanding with how our profiles function. The complete quote in our guide should be "NVIDIA's official driver optimization policy is never to introduce a performance optimization via .exe detection that alters the application's image quality, however subtle the difference."

    That doesn't mean that profiles don't look for .exe files. Just that we're unwilling to alter the application's image quality.

    I'm sure you're familiar with many of the performance improvements across games and other applications that our drivers have enabled over the years.
     
    Last edited: Dec 2, 2010
  2. Moricon

    Moricon New Member

    Messages:
    5
    Likes Received:
    0
    GPU:
    2 x HD5850 875/1175
    Is this a big deal... NO!

    Should AMD default to High Quality in the driver... NO!

    Should AMD publicly announce the correct settings for comparable benchmarks for hardware testing... YES!

    There is a simple solution: AMD should just make consumers aware of the settings in the drivers so people can make up their own minds about whether they prefer the optimisations or not, and they should inform every hardware reviewer of these settings and advise like-for-like settings across the different platforms for a fair comparison!

    They have not done anything wrong with these optimizations; they are good performance boosters for a very, very small image quality hit, so small it's really not noticeable. I have only noticed it once, in LOTRO: with the camera at a certain zoom and angle I get that exact AA effect of a solid gridline, but setting High Quality in CCC removes it. I have not seen this happen in any other game.

    We all know AMD does not have the faster cards; that crown belongs to Nvidia, but you pay the price for the faster cards! If you want performance for your pound, AMD is the way to go! If you want pure performance irrespective of cost, go with Nvidia!
     
  3. Kohlendioxidus

    Kohlendioxidus Guest

    Messages:
    1,399
    Likes Received:
    13
    GPU:
    Sapphire Vega 56 Pu
    What's the point of this thread?? :3eyes:

    I don't play with a microscope connected to my eyes, and I don't see the point of debating whether ATI or Nvidia should use "Quality" or "High Quality" settings... It's up to the user to decide. ATI has always had better IQ, especially when watching movies. In games I see nil difference... maybe there is one, but at a microscopic scale...
     
    Last edited: Dec 2, 2010
  4. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
    How did people come to the conclusion that Nvidia does not use optimizations at its default setting, and that you need to deduct 8% in performance to get the 'real' or raw performance?
     

  5. alanm

    alanm Ancient Guru

    Messages:
    12,234
    Likes Received:
    4,435
    GPU:
    RTX 4080
    The 'point of this thread' is that unless this is addressed by ATI - specifically with regard to what they supposedly did with the default settings of these two drivers (10.10 and 10.11?) and whether they continue it - it may hang over their heads as continued controversy. I can just see future bench comparisons with Nvidia owners saying 'uh, but you have to deduct 8% off the ATI figures to make it fair', etc.
     
  6. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
    Deduct 8% to get raw performance and compare it with Nvidia, which uses optimizations in its default setting as well? I'm pretty sure HH said that BOTH brands would need to use no optimizations in their default settings.
     
  7. sutyi

    sutyi Member Guru

    Messages:
    106
    Likes Received:
    0
    GPU:
    Gainward GTX 660 OC 2GB
    Can I ask why AA was enabled on the GTX 580 while making screenshots from ME2 for a texture filtering test?
     
  8. dirthurts

    dirthurts Guest

    Messages:
    765
    Likes Received:
    13
    GPU:
    Vega 56 Pulse
    I never have and never will notice this in game. I can barely see it in a screenshot comparison.
    I'll take my added performance and run with it.
    Nvidia could do this as well, and in my opinion they should. It's a great little trick.
     
  9. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    Last edited: Dec 2, 2010
  10. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,558
    GPU:
    AMD | NVIDIA
    LOL - of all the things to do, the wrong NV screenshot was uploaded initially (one with AA still enabled); this has been corrected.
     

  11. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
    Why does it say the user edited the post when it looks like a MOD edit? Is this a bug or working as intended?
     
  12. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,558
    GPU:
    AMD | NVIDIA
    The user edited the message before a MOD had to edit it? ;)
     
  13. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    The point is that ATI is fabricating extra performance in games by default. If ATI was the only GPU company this would not matter. But they are not.
     
  14. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    The difference in image quality is irrelevant; it's the difference in performance that's the issue, and if this were the other way round, we all know people would be speaking very differently.

    It means it's now going to be even harder to compare one review to another, and no doubt every site will be using different settings.

    I just hope we don't get an influx of new users who only want to post in this thread.
     
  15. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    You can't test this just by using the same driver and switching between HQ and Quality. You need to take an old driver set to "Standard" and the new driver set to "Quality", and look at the difference (ideally across all three modes)... Then you can measure the FPS gain between them.
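
    For illustration, a minimal sketch of that kind of comparison (this is only an assumed workflow: it expects two plain-text logs with one frame time in milliseconds per line, and the file names are hypothetical):

        # Minimal sketch: compare average FPS between two benchmark runs,
        # e.g. old driver at "Standard" vs. new driver at "Quality".
        # Assumes each log holds one frame time in milliseconds per line;
        # the file names below are hypothetical.

        def average_fps(path):
            with open(path) as log:
                frame_times_ms = [float(line) for line in log if line.strip()]
            total_seconds = sum(frame_times_ms) / 1000.0
            return len(frame_times_ms) / total_seconds

        old_standard = average_fps("old_driver_standard.txt")
        new_quality = average_fps("new_driver_quality.txt")

        gain = (new_quality - old_standard) / old_standard * 100.0
        print(f"Old driver, Standard: {old_standard:.1f} FPS")
        print(f"New driver, Quality:  {new_quality:.1f} FPS")
        print(f"Difference: {gain:+.1f}%")

    Run against logs captured from the same timedemo or benchmark pass, the sign and size of that percentage show the gain being talked about here.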
     

  16. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    Hilbert, did you enable trilinear filtering in Mass Effect? If you didn't (it's off unless changed in the config file), there is bilinear-like banding even with 16xAF. That (bilinear filtering + AF) will most likely make AF look identical between cards. The difference between bilinear filtering + AF and trilinear filtering + AF is clearly noticeable in game, especially when walking around the Citadel.

    The same goes for Mass Effect 2. It's beyond my comprehension why BioWare doesn't have trilinear filtering enabled by default.
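
    For reference, a minimal sketch of flipping that setting programmatically (the config path and the "Trilinear" key under the ini's [SystemSettings] section are assumptions about Mass Effect's UE3-style config; check your own install and back the file up first):

        # Minimal sketch: enable trilinear filtering in Mass Effect's ini.
        # The path and the "Trilinear=" key are assumptions; adjust as needed.
        from pathlib import Path

        ini_path = Path.home() / "Documents/BioWare/Mass Effect/Config/BIOEngine.ini"
        text = ini_path.read_text()

        if "Trilinear=False" in text:
            ini_path.write_text(text.replace("Trilinear=False", "Trilinear=True"))
            print("Trilinear filtering enabled.")
        else:
            print("Key not found or already enabled - edit the file by hand.")

    Mass Effect 2 presumably has an equivalent switch in its own config files, as the post above suggests.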
     
    Last edited: Dec 2, 2010
  17. mapel110

    mapel110 Active Member

    Messages:
    92
    Likes Received:
    0
    GPU:
    GTX 460 Gigabyte OC
    Screenshots don't show what you actually see; videos are better.
    In the 3DCenter forums we have compared several games by making videos with Fraps.
     
  18. PinguX

    PinguX Maha Guru

    Messages:
    1,123
    Likes Received:
    303
    GPU:
    Sapphire RX580 8GB
    If optimisation is considered to be fabricating, then aren't Nvidia doing the same thing with all the "The Way It's Meant to be Played" games?
     
  19. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Nothing like TWIMTBP, exact opposite actually.

    With Nvidia and the TWIMTBP, AMD users lose out.
    With AMD and this, AMD users lose out.
     
  20. Ukraver

    Ukraver Master Guru

    Messages:
    542
    Likes Received:
    0
    GPU:
    msi r280x oc
    I think this is daft. All of us users with mid to high-end GPUs, both ATI and Nvidia, should enable High Quality as the default in both CCC and NVCPL, which renders the whole argument void :bang:
     
