Guru3D.com Forums

Exploring ATI's Image Quality Optimizations [Guru3D.com]
  (#1)
Guru3D News
Ancient Guru
 
 
Processor: HAL 9000
Exploring ATI's Image Quality Optimizations [Guru3D.com] - 12-02-2010, 12:49 | posts: 6,462

A somewhat heated topic amongst graphics card manufacturers is how to get as much performance as possible out of a graphics card with as little image quality loss as possible. In the past both ATI and NVIDIA have...

More...
   
  (#2)
Mkilbride
Banned
 
Videocard: EVGA GTX470 1.2GB
Processor: 2500K @ 4.5GHZ
Mainboard: Asrock P67 Professional
Memory: 8GB DDR 1600MHZ
Soundcard: X-Fi Titanium Prof.
PSU: Corsair HX850
12-02-2010, 13:00 | posts: 8,078 | Location: US, New Hampshire

So for every ATi review, basically I'll take 10% performance off and that'll be the real performance.
   
  (#3)
k3vst3r
Ancient Guru
 
 
Videocard: Tri-fire 290x Qnix 120Hz
Processor: i7 4770k 4.6
Mainboard: Asus Maximus Hero VI
Memory: 4x4GB 2400
Soundcard: SB X-FI Titanium
PSU: Corsair AX1200i
12-02-2010, 13:07 | posts: 2,978 | Location: Sheffield UK

Quote:
Originally Posted by Mkilbride
So for every ATi review, basically I'll take 10% performance off and that'll be the real performance.
pretty much


   
  (#4)
Matt26LFC
Maha Guru
 
 
Videocard: 7970Ghz CF Qnix2710 96Hz
Processor: i5 3570K/Raystorm/120.5
Mainboard: Z77X-UD5H
Memory: 8GB 1866Mhz Platinums
Soundcard: ALC889 / HD449s
PSU: HX1050
12-02-2010, 13:07 | posts: 2,478 | Location: UK

Quote:
Originally Posted by Mkilbride
So for every ATi review, basically I'll take 10% performance off and that'll be the real performance.
Unless the review states that the image quality setting has been changed to HQ. Obviously for reviews on this site you'd deduct around 8%, as Hilbert, I believe, said he leaves it on the default setting.

Doesn't bother me too much that they've done this; I know I can manually change the setting to HQ if I want to. Not that I own an AMD card anyway.
   
  (#5)
JonasBeckman
Ancient Guru
 
 
Videocard: R9 290 Tri-X Vapor-X OC
Processor: i7-3930K @ 4.1Ghz
Mainboard: Asus Rampage IV Extreme
Memory: Corsair Vengeance 16GB
Soundcard: Asus ROG Phoebus
PSU: Corsair AX 1200W
12-02-2010, 13:11 | posts: 9,440 | Location: Sweden

Hasn't it been like this for a really long time, though? I don't know much about this, but as an older-generation ATI card user I don't have the new AI options. Isn't it the same technique as the Mip-Map Filtering Quality option, which by default is set to High rather than Very High? Similarly, ATI's default has long been "Balanced" instead of "Very High" in the default, non-advanced view of the CCC, hasn't it?
(Can't say how it's changed for the ATI 5K and 6K series, but I imagine the defaults are comparable.)

EDIT: Isn't NVIDIA similar in a way, too? Balanced defaults in the non-advanced view, though with more control over whether you let that decide, override it with the driver's settings, or use a mix of both, along with application profile settings which can also apply optimizations.
(Mostly related to trilinear and AF optimizations, if the view on the second computer with its 7800 GTX is still accurate: they are disabled and grayed out when switched to High Quality instead of the default Quality option.)

Last edited by JonasBeckman; 12-02-2010 at 13:14.
   
  (#6)
Ven0m
Maha Guru
 
 
Videocard: ASUS GTX 680 D2CU TOP
Processor: i7 920 @ 4GHz
Mainboard: ASUS Rampage II Extreme
Memory: 12GB Patriot Viper
Soundcard: Xonar D2X>Custom amp>K712
PSU: Seasonic M12D 850W
12-02-2010, 13:22 | posts: 1,353 | Location: Warsaw, Poland

The problem arises when we compare two cards, one from AMD and one from NVIDIA, and AMD marginally wins. The first impression for quite a lot of people is that AMD is faster, which is not the case in this scenario.

Because of that, readers should be explicitly informed that these tests are performed with different image quality settings, as we're not really comparing cards in a 100% fair way. The other solutions: decrease quality settings for NV cards (not cool), or let NV get worse scores because they care more about IQ (not cool either).

We may say that it's a rare case. It's not really visible in many third-person perspective or strategy games, but in games with a low camera angle it may be annoying. If you play racing games, MMOs, the FIFA series, etc., then comparing AMD and NV cards at default settings without an IQ notice is just unfair.
   
  (#7)
nicugoalkeper
Master Guru
 
 
Videocard: Gigabyte GTX 660 OC 2GB
Processor: Q6600 G0 @ 3600
Mainboard: Asus P5E
Memory: 4Gb DDR2 800 Geil Ultra
Soundcard: Soundmax
PSU: CoolMaster 620 Real Power
12-02-2010, 13:44 | posts: 791 | Location: RO TM

Sorry to say it, but ATI is doing this for speed, not quality!
So +1 to NVIDIA (higher price, but better quality and other nice features).
   
  (#8)
cold2010
Master Guru
 
Videocard: HD 4870
Processor: 2600k
Mainboard: ASUS SABERTOOTH P67
Memory: G.SKILL Ripjaws 8GB 2 x 4
PSU: Cooler Master GX 750W
12-02-2010, 13:44 | posts: 180

http://ht4u.net/reviews/2010/amd_rad...ed/index11.php
   
  (#9)
John Dolan
Maha Guru
 
 
Videocard: 2x GTX 780 SLI
Processor: 2600K EK H2o
Mainboard: Asus P8P67 PRO Intel P67
Memory: Corsair Vengeance 8GB
Soundcard: x-fi platinum fatality 7.
PSU: Corsair AX1200i Digital
12-02-2010, 13:46 | posts: 2,248 | Location: Nottingham, UK

I've used half a dozen of each brand over the last decade or so, and I've always thought that the ATI cards gave better IQ. The older NVIDIA 7 series used to look particularly bad in comparison.
   
  (#10)
WhiteLightning
Ancient Guru
 
 
Videocard: EVGA 780GTX ACX
Processor: i7-2600k HT @ 4.5 +H70 PP
Mainboard: MSI Z77A-GD65 GAMING
Memory: Gskill 2133Mhz 8GB
Soundcard: Onboard
PSU: Corsair 1000 watt
12-02-2010, 13:47 | posts: 23,191 | Location: Hoek van Holland, Netherlands

OK, it's clear: I've been a cheater for about 2 years now, aargh...
   
  (#11)
Undying
Ancient Guru
 
 
Videocard: R9 280X Vapor-X
Processor: Intel i5 2500k @ 4.5ghz
Mainboard: Gigabyte GA-Z68AP-D3
Memory: Patriot Signature 8GB
Soundcard: Asus Xonar DG
PSU: Tt Toughpower XT 775W
12-02-2010, 13:50 | posts: 3,966 | Location: Serbia, NS

Quote:
Originally Posted by WhiteLightning
OK, it's clear: I've been a cheater for about 2 years now, aargh...
No one is a cheater; people just get what they pay for.
   
  (#12)
k3vst3r
Ancient Guru
 
 
Videocard: Tri-fire 290x Qnix 120Hz
Processor: i7 4770k 4.6
Mainboard: Asus Maximus Hero VI
Memory: 4x4GB 2400
Soundcard: SB X-FI Titanium
PSU: Corsair AX1200i
12-02-2010, 13:54 | posts: 2,978 | Location: Sheffield UK

HH said recently:

Currently, with the Radeon HD 6000 series release, the Catalyst drivers (Catalyst AI) have a new setting which allows control over texture filtering, with settings for 'High Quality', 'Quality' and 'Performance'.

High Quality turns off all optimizations and lets the software run exactly as it was originally intended to. Quality, which is now the default setting, applies some optimizations that AMD believes remain objective and keep the integrity of the image quality at high levels while gaining some performance. The last setting is the Performance setting, which applies supplementary optimizations to gain even more performance.


What HH is explaining is that the newest drivers have optimizations enabled by default, whereas NVIDIA's don't.
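
As a rough illustration of what the "deduct ~8%" idea floating around this thread would mean when normalizing review numbers, here is a minimal Python sketch. The setting names mirror the Catalyst options HH describes, but the percentage figures are just the ballpark numbers quoted in this thread, assumptions rather than measurements.

Code:
# Hypothetical fps gain of each Catalyst texture filtering setting,
# relative to 'High Quality' (all optimizations off). The 8% figure is
# the ballpark quoted in this thread; the Performance figure is a guess.
ASSUMED_GAIN = {
    "High Quality": 0.00,
    "Quality": 0.08,      # new default setting
    "Performance": 0.15,  # placeholder value
}

def normalized_fps(measured_fps, setting):
    """Estimate what a card benched at `setting` would score at High Quality."""
    return measured_fps / (1.0 + ASSUMED_GAIN[setting])

# A card scoring 100 fps at the default 'Quality' setting would land
# around 92.6 fps with all optimizations disabled, under this assumption.
print(round(normalized_fps(100.0, "Quality"), 1))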


   
  (#13)
Exodite
Maha Guru
 
Videocard: Gigabyte 6950 2GB @ Stock
Processor: Intel i7 2600K @ 1.15V
Mainboard: ASUS P8P67 Pro B3
Memory: 8GB Kingston PC10600
Soundcard: Realtek 892
PSU: Seasonic SS-460FL
12-02-2010, 13:59 | posts: 1,652 | Location: Luleå, Sweden

This is getting tiresome.

Both vendors use optimizations that can adversely affect image quality in the default settings. Both vendors allow such optimizations to be disabled.

It's completely inane to apply some kind of scaling, pulled entirely from the backside, to 'compensate' for this when testing different cards. Obviously testing should be done at default settings and the image quality compared.

If there are obvious differences in image quality, they have to be accounted for when reviewing a product, of course. But reading quite a lot of reviews, both here and on other sites, it's obvious there are no such differences.

Bar the odd bug, obviously.
   
  (#14)
VultureX
Maha Guru
 
Videocard: GTX670 4GB SLI
Processor: Core i7 2700K @4.8GHz H2O
Mainboard: Asrock Z68 Extreme3 Gen3
Memory: 8GB G.Skill 2133MHz CL9
Soundcard: Xonar Essence ST - Z-5500
PSU: Corsair TX850 V2
12-02-2010, 14:00 | posts: 2,280 | Location: Netherlands

The first comparison shot is not clear to me.

Why does it say NV (16xQ) in the left screenshot (that looks like an NVIDIA-only anti-aliasing mode to me) and 16xAF in the right screenshot?

The article doesn't make it any clearer:
Quote:
To your right you can see NVIDIA at work to the left an ATI Radeon 5000 series card
And further down:
Quote:
You can see that the right image from the Radeon 5000 card
The left screenshot looks better, so I guess that would be NVIDIA... But do they use exactly the same AA and AF settings? Otherwise this would not be a fair comparison :s

Another point would be the use of JPG compression... I'd take the comparisons and save the screens as lossless PNG to do it right...

Last edited by VultureX; 12-02-2010 at 14:02.
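
Doing the comparison VultureX suggests is easy to script once the captures are lossless. A minimal sketch using the Pillow library; the file names are placeholders, and it assumes both screenshots were taken at identical resolution and in-game settings:

Code:
from PIL import Image, ImageChops

a = Image.open("radeon.png").convert("RGB")
b = Image.open("geforce.png").convert("RGB")
assert a.size == b.size, "screenshots must match in resolution"

# Per-pixel absolute difference; any non-black pixel differs between cards.
diff = ImageChops.difference(a, b)
changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
print(f"{100.0 * changed / (a.width * a.height):.2f}% of pixels differ")

diff.save("diff.png")  # inspect where the filtering actually diverges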
   
  (#15)
TDurden
Maha Guru
 
Videocard: Gigabyte HD7870 2GB
Processor: Core 2 Quad Q9550@3.4GHz
Mainboard: Asus P5E X38(X48)
Memory: 8GB A-DATA DDR2 800
Soundcard: X-Fi Xtreme Gamer
PSU: CM Real Power Pro m520
12-02-2010, 14:03 | posts: 1,801 | Location: LT

The difference is visible in some games... AMD on the right.
http://www.3dcenter.org/image/view/3746/_original
http://www.3dcenter.org/image/view/3747/_original
http://www.3dcenter.org/image/view/3731/_original
   
  (#16)
wolvy
Newbie
 
Videocard: 4570
Processor: T4300
PSU: -
12-02-2010, 14:07 | posts: 14 | Location: Netherlands

And what about the users with lower-range video cards? I'm not a big quality freak, I don't play games a lot, but when I do I want them to be playable, not running at 25 fps on ULTRA HIGH settings. That extra 8% performance gain is more than welcome for me, "cheating" or not...
   
  (#17)
Lane
Ancient Guru
 
Videocard: 2x HD7970 - EK Waterblock
Processor: I7 4930K H2o EK Supremacy
Mainboard: Asus X79 Deluxe
Memory: G-Skill C9 2133mhz 16GB
Soundcard: X-FI Titanium HD + SP2500
PSU: CM 1000W
12-02-2010, 14:08 | posts: 5,382 | Location: Switzerland

I have one question anyway: have you tested the fps using an older driver and then the new one with the new settings option? It looks like you only used HQ and Performance to determine the difference on the same driver. It's the same for NVIDIA: if you remove all optimizations in the driver (the per-game optimizations), you will see a similar drop in performance.

It was also the case if you set Cat AI off with the old drivers (because then there are no longer any per-game profiles, so bug fixes and other fixes, plus game-specific optimizations, will not work, including CrossFire profiles).

If you want to compare fps lost or gained, you need to test Cat 10.9 or 10.10 with the driver set to "Standard", and then the new Catalyst set to "Quality", HQ, etc. Then you can see how much you gain or lose between "Quality" on the new driver and "Standard" on the old. I really doubt the difference is 8%; most likely 1% (so at most 1-2 fps, nothing that can make you say they cheat).

Trackmania on the HD 5870 is the worst example, as this game hits a problem in the AF algorithm, which bugs out with noisy textures. You can't use this game for a comparison, because whatever the reason, it is not due to an optimization or whatever.

Go read the 3DCenter article claiming AMD has a hardware problem with AF on the HD 5870/HD 4870/HD 3870, and you will see they had already used TM...

In reality, it's been 3 years that 3DCenter has used TM and HL2 for claiming a difference in AF between ATI and NVIDIA...

This is one concern I have about all of this: why not use BC2, DiRT 2, F1 2010 or CoD to check the quality difference, instead of games like HL2 (and, strangely enough, not L4D), Quake 3, TM or Oblivion? How many years has it been since you saw a review that used those games?

Last edited by Lane; 12-02-2010 at 14:33.
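
Lane's suggested methodology amounts to benchmarking like-for-like driver/setting pairs against a common baseline. A minimal sketch of that bookkeeping; the driver labels follow the versions mentioned in this thread and the fps values are made up, standing in for real benchmark runs:

Code:
# Placeholder numbers only; the point is the comparison Lane describes:
# old driver at Cat AI 'Standard' versus the new driver at each new setting.
runs = {
    ("Cat 10.10", "Standard"): 60.0,
    ("Cat 10.11", "High Quality"): 57.5,
    ("Cat 10.11", "Quality"): 60.5,
    ("Cat 10.11", "Performance"): 63.0,
}

baseline = runs[("Cat 10.10", "Standard")]
for (driver, setting), fps in runs.items():
    delta = 100.0 * (fps - baseline) / baseline
    print(f"{driver} / {setting:<13}: {fps:5.1f} fps ({delta:+.1f}%)")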
   
  (#18)
Hilbert Hagedoorn
Don Vito Corleone
 
 
Videocard: AMD | NVIDIA
Processor: Core i7 4770K
Mainboard: Z77
Memory: 8GB
Soundcard: X-Fi - GigaWorks 7.1
PSU: 1200 Watt
12-02-2010, 14:11 | posts: 20,451 | Location: Guru3D testlab

Quote:
Originally Posted by VultureX
Another point would be the use of JPG compression... I'd take the comparisons and save the screens as lossless PNG to do it right...
Fixed, right is Radeon, obviously.

On the PNG/JPG: the full-blown 24-bit PNG (or even BMP) is 13 MB per image, so I opted for JPG at maximum quality (100%). With the screenshots at 2560x1600 pixels you will not notice the difference.
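
For anyone wanting to verify the size/quality tradeoff Hilbert mentions, a minimal sketch (Pillow again; the input file name is a placeholder) that re-encodes the same capture both ways and prints the resulting file sizes:

Code:
import os
from PIL import Image

shot = Image.open("screenshot.bmp").convert("RGB")
shot.save("shot.png")                              # lossless PNG
shot.save("shot.jpg", quality=100, subsampling=0)  # maximum-quality JPEG

for name in ("shot.png", "shot.jpg"):
    size_mb = os.path.getsize(name) / (1024 * 1024)
    print(f"{name}: {size_mb:.1f} MB")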


  (#19)
Darren Hodgson
Ancient Guru
 
 
Videocard: EVGA NVIDIA GTX 780
Processor: Intel Core i7-4770K
Mainboard: ASUS Z87 Deluxe
Memory: 16GB Corsair Veng 1600MHz
Soundcard: SB X-Fi Titanium HD
PSU: XFX Pro B.E. 850W
12-02-2010, 14:12 | posts: 11,624 | Location: England

When I had my HD 5870 CFX setup, I would always up the (I think they were) texture filtering settings from Quality to High Quality so I was always aware that by not doing that I was compromising the image quality. Now I'm back with NVIDIA I noticed that they too default to a texture filtering setting of Quality and, again, I have raised that to High Quality at the cost of a little performance.

So, unless I'm misunderstanding this article, both NVIDIA and AMD seem to apply the same texture filtering optimisations and neither uses the best quality settings; it is up to the end-user to select them.
   
  (#20)
k3vst3r
Ancient Guru
 
 
Videocard: Tri-fire 290x Qnix 120Hz
Processor: i7 4770k 4.6
Mainboard: Asus Maximus Hero VI
Memory: 4x4GB 2400
Soundcard: SB X-FI Titanium
PSU: Corsair AX1200i
12-02-2010, 14:15 | posts: 2,978 | Location: Sheffield UK

Quote:
Originally Posted by Darren Hodgson
When I had my HD 5870 CFX setup, I would always up the (I think they were) texture filtering settings from Quality to High Quality so I was always aware that by not doing that I was compromising the image quality. Now I'm back with NVIDIA I noticed that they too default to a texture filtering setting of Quality and, again, I have raised that to High Quality at the cost of a little performance.

So, unless I'm misunderstanding this article, both NVIDIA and AMD seem to apply the same texture filtering optimisations and neither use the best quality settings, it is up to the end-user to select them.
Quality on NVIDIA doesn't have texture filtering optimizations enabled.

What HH is saying is that you get 100% image quality with the Quality setting on NVIDIA cards by default, but 99% image quality on ATI cards by default with the latest drivers.


   
  (#21)
Lane
Ancient Guru
 
Videocard: 2x HD7970 - EK Waterblock
Processor: I7 4930K H2o EK Supremacy
Mainboard: Asus X79 Deluxe
Memory: G-Skill C9 2133mhz 16GB
Soundcard: X-FI Titanium HD + SP2500
PSU: CM 1000W
12-02-2010, 14:18 | posts: 5,382 | Location: Switzerland

Quote:
Originally Posted by k3vst3r
Quality on NVIDIA doesn't have texture filtering optimizations enabled.

What HH is saying is that you get 100% image quality with the Quality setting on NVIDIA cards by default, but 99% image quality on ATI cards by default with the latest drivers.
That's not what NVIDIA said when they responded to it; they say they do have those optimizations, but that they don't impact quality. Something you can only check by comparing all the beta and release drivers one by one, and of course nobody wants to do that.

NVIDIA Technical Marketer Jeffrey Yen

I think there's a misunderstanding with how our profiles function. The complete quote in our guide should be "NVIDIA's official driver optimization's policy is never to introduce a performance optimization via .exe detection that alters the application's image quality, however subtle the difference."

That doesn't mean that profiles don't look for .exe files. Just that we're unwilling to alter the application's image quality.

I'm sure you're familiar with many of the performance improvements across games and other applications that our drivers have enabled over the years.

Last edited by Lane; 12-02-2010 at 14:29.
   
  (#22)
Moricon
Newbie
 
 
Videocard: 2 x HD5850 875/1175
Processor: i5 3570K @ 4.4Ghz
Mainboard: Asus Sabertooth Z77
Memory: 4x2GB XMS3 1600MHZ 7,8,7,
Soundcard: Asus XFI Onboard
PSU: Corsair 750TX
12-02-2010, 14:23 | posts: 5 | Location: UK Sussex

Is this a big deal? NO!

Should AMD default to High Quality in the driver? NO!

Should AMD publicly announce the correct settings for comparable benchmarks in hardware testing? YES!

The solution is simple. AMD should just make consumers aware of the settings in the drivers, so people can make up their own minds about whether they prefer the optimisations or not, and they should inform every hardware reviewer of these settings and advise like-for-like settings across the different platforms for a fair comparison!

They have not done anything wrong with these optimizations; they are good performance boosters for a very, very small image quality hit, so small it's really not noticeable. I have noticed it in only one game, LOTRO: with the camera at a certain zoom and angle I get that exact AA effect of a solid gridline, but setting High Quality in CCC removes it. In no other game have I seen this happen.

We all know AMD does not have the fastest cards; that crown belongs to NVIDIA, but you pay the price for the faster cards! If you want performance for your pound, the way to go is AMD! If you want pure performance irrespective of cost, go with NVIDIA!
   
  (#23)
Kohlendioxidus
Maha Guru
 
 
Videocard: HIS 7950 IceQ Turbo CFX
Processor: AMD FX 8350
Mainboard: ASUS 990FX Crosshair V
Memory: 8Gig Crucial Balist @1866
Soundcard: Creative XFi Fatality Pro Gamer
PSU: Corsair TX 750W
12-02-2010, 14:31 | posts: 1,273 | Location: Germany

what's the point of this thread??

I don't play with a microscope connected to my eyes, and I don't see the point of debating whether ATI or NVIDIA should use "Quality" or "High Quality" settings... It's up to the user to decide. ATI always had better IQ, especially when watching movies. Regarding games I see nil difference... maybe there is one, but at a microscopic scale...

Last edited by Kohlendioxidus; 12-02-2010 at 14:35.
   
  (#24)
chanw4
Maha Guru
 
Videocard: ASUS GTX 670 4GB GDDR5
Processor: Intel i7 3770K
Mainboard: ASUS MAXIMUS V EXTREME
Memory: OCZ Platinum DDR3-1600 4G
PSU: Enermax Platimax 600W
12-02-2010, 14:41 | posts: 2,270 | Location: Hong Kong

How did people come to the conclusion that NVIDIA does not use optimizations at default settings, and that you need to deduct 8% in performance to get the 'real performance' or raw performance?


   
  (#25)
alanm
Ancient Guru
 
 
Videocard: MSI TF GTX 770
Processor: i5-3570k
Mainboard: Asrock Z77 Pro4
Memory: 8gb G.Skill DDR3 1600
Soundcard: SB Zx
PSU: XFX Pro 750w
12-02-2010, 14:49 | posts: 5,453

Quote:
Originally Posted by Kohlendioxidus
what's the point of this thread??

I don't play with a microscope connected to my eyes, and I don't see the point of debating whether ATI or NVIDIA should use "Quality" or "High Quality" settings... It's up to the user to decide. ATI always had better IQ, especially when watching movies. Regarding games I see nil difference... maybe there is one, but at a microscopic scale...
The 'point of this thread' is that unless this is addressed by ATI - and only in regard to what they supposedly did with the default settings of these two drivers (10.10 and 10.11?) and whether they continue it - it may hang over their head as a continued controversy. I can just see future bench comparisons with NVIDIA owners saying 'uh, but you have to deduct 8% off the ATI figures to make it fair', etc.
   