Guru3D.com Forums

Exploring ATI's Image Quality Optimizations [Guru3D.com]
  (#1)
Guru3D News
Ancient Guru

Processor: HAL 9000

Exploring ATI's Image Quality Optimizations [Guru3D.com] - 12-02-2010, 12:49 | posts: 6,462

A somewhat heated topic amongst graphics card manufacturers is how to get as much performance out of a graphics card with as little as possible image quality loss. In the past both ATI and NVIDIA have...

More...
   
 
  (#2)
Mkilbride
Banned

Videocard: EVGA GTX470 1.2GB
Processor: 2500K @ 4.5GHZ
Mainboard: Asrock P67 Professional
Memory: 8GB DDR 1600MHZ
Soundcard: X-Fi Titanium Prof.
PSU: Corsair HX850

12-02-2010, 13:00 | posts: 8,078 | Location: US, New Hampshire

So for every ATi review, basically I'll take 10% performance off and that'll be the real performance.
   
  (#3)
k3vst3r
Ancient Guru

Videocard: Tri-fire 290x Qnix 120Hz
Processor: i7 4770k 4.6
Mainboard: Asus Maximus Hero VI
Memory: 4x4GB 2400
Soundcard: SB X-FI Titanium
PSU: Corsair AX1200i

12-02-2010, 13:07 | posts: 2,928 | Location: Sheffield UK

Quote:
Originally Posted by Mkilbride View Post
So for every ATi review, basically I'll take 10% performance off and that'll be the real performance.
pretty much


   
  (#4)
Matt26LFC
Maha Guru

Videocard: 7970Ghz CF Qnix2710 96Hz
Processor: i5 3570K/Raystorm/120.5
Mainboard: Z77X-UD5H
Memory: 8GB 1866Mhz Platinums
Soundcard: ALC889 / HD555s
PSU: HX1050

12-02-2010, 13:07 | posts: 2,436 | Location: UK

Quote:
Originally Posted by Mkilbride View Post
So for every ATi review, basically I'll take 10% performance off and that'll be the real performance.
Unless the review states that the image quality setting has been changed to HQ. For reviews on this site you'd deduct around 8%, as I believe Hilbert said he leaves it on its default setting.

Don't think it bothers me too much that they've done this; I know I can manually change the setting to HQ if I want to. Not that I own an AMD card anyway
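
If you really did want to normalize review numbers the way these posts suggest, it's one line of arithmetic. A toy sketch with the posters' rough 8% figure (an estimate from this thread, not a measured value):

Code:
# Back an assumed optimization gain out of a reviewed fps figure.
ASSUMED_GAIN = 0.08  # the posters' rough 8% estimate, not a measurement

def normalized_fps(reviewed_fps: float) -> float:
    """Estimate fps with the default-driver optimizations backed out."""
    return reviewed_fps / (1 + ASSUMED_GAIN)

print(f"{normalized_fps(60.0):.1f} fps")  # a reviewed 60 fps becomes ~55.6 fps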
   
 
  (#5)
JonasBeckman
Ancient Guru

Videocard: Sapphire R9 280X Dual-X
Processor: i7-3930K @ 4.1Ghz
Mainboard: Asus Rampage IV Extreme
Memory: Corsair Vengeance 16GB
Soundcard: Asus ROG Phoebus
PSU: Corsair AX 1200W

12-02-2010, 13:11 | posts: 9,068 | Location: Sweden

Hasn't it been like this for a really long time, though? I don't know much about this, but as an older-generation ATI card user I don't have the new AI options. Isn't it the same technique as the Mip-Map Filtering Quality option, which defaults to high rather than very high? Similarly, ATI's default has been "balanced" instead of "very high" in the default, non-advanced view of the CCC for a long time, hasn't it?
(Can't say how it's changed for the ATI 5K and 6K series, but I imagine the defaults are comparable.)

EDIT: Isn't NVIDIA similar in a way too? Balanced defaults in the non-advanced view, though with more control over whether you let that decide, override it with the default settings, or use a mix of both along with app-profile settings, which can also apply optimizations.
(Mostly related to trilinear and AF optimizations, if the view on the second computer with its 7800 GTX is still accurate: they are disabled and grayed out when switched to high quality instead of the default quality option.)

Last edited by JonasBeckman; 12-02-2010 at 13:14.
   
  (#6)
shane_p
Member Guru

Videocard: 590GTX HYDRO COPPER CLASS
Processor: i7 980x @4.4Ghz H20
Mainboard: Gigabyte UD9
Memory: G.Skill 12GB
Soundcard: Fatal1ty Titanium Pro
PSU: 1500Watt Strider

12-02-2010, 17:57 | posts: 46

Wow, now I am upset, WITH NVIDIA!

Why don't they do the same thing and get my 480 some more performance, for a minor tweak that 99.99% of you guys cannot even see?

LOL, come on Nvidia, do the same thing here. I see people are upset with ATI, but Nvidia is behind the ball, I say... lol
   
  (#7)
Ven0m
Maha Guru

Videocard: ASUS GTX 680 D2CU TOP
Processor: i7 920 @ 4GHz
Mainboard: ASUS Rampage II Extreme
Memory: 12GB Patriot Viper
Soundcard: Xonar D2X, Philips HP1000
PSU: Seasonic M12D 850W

12-02-2010, 13:22 | posts: 1,220 | Location: Warsaw, Poland

The problem arises when we compare two cards, one from AMD and one from NV, and AMD marginally wins. The first impression for quite a lot of people is that AMD is faster, which is not the case in this scenario.

Because of that, readers should be explicitly informed that these tests are performed with different image quality settings, as we're not really comparing the cards in a 100% fair way. The other solutions, lowering the quality settings for NV cards (not cool) or letting NV get worse scores because they care more about IQ (not cool either), aren't great.

We may say that it's a rare case. It's not really visible in many third-person or strategy games, but in games with a low camera angle it can be annoying. If you play racing games, MMOs, the FIFA series, etc., then comparing AMD and NV cards at default settings without an IQ notice is just unfair.
   
  (#8)
nicugoalkeper
Master Guru

Videocard: Gigabyte GTX 660 OC 2GB
Processor: Q6600 G0 @ 3600
Mainboard: Asus P5E
Memory: 4Gb DDR2 800 Geil Ultra
Soundcard: Soundmax
PSU: CoolMaster 620 Real Power

12-02-2010, 13:44 | posts: 790 | Location: RO TM

Sorry to say it, but ATI is doing this for speed, not quality!
So +1 to Nvidia (higher price, but better quality and other nice features)
   
  (#9)
sykozis
Ancient Guru

Videocard: eVGA GTX660SC SLI
Processor: Core i7 2600K
Mainboard: ASRock Z77 Extreme4
Memory: 8gb G.Skill DDR3-1866
Soundcard: Creative Recon3D PCIe
PSU: SeaSonic M12II 620 Bronze

12-05-2010, 00:29 | posts: 15,586 | Location: US East Coast

Quote:
Originally Posted by nicugoalkeper View Post
Sorry to say it, but ATI is doing this for speed, not quality!
So +1 to Nvidia (higher price, but better quality and other nice features)
nVidia has done it countless times in the past. The GF6 and GF7 series were plagued with poor IQ due to "driver optimizations"... It didn't seem to bother anyone that nVidia did it... so why is it such a big deal that AMD is doing it?

It always amazes me how it's fine for Intel or nVidia to cheat customers... but if AMD does it, people react like the world is coming to an end. Every company does something like this. Look at Creative... they release drivers that barely work and never bother to fix any of the bugs... but the same people that bitch and moan about AMD/nVidia cheating customers praise Creative.
   
  (#10)
Omagana
Maha Guru

Videocard: Evga GTX 780
Processor: Intel Core i7 4770k
Mainboard: Asus Sabertooth Z77
Memory: Corsair XMS3 16GB
Soundcard: Asus Xonar D2X 7.1
PSU: Corsair HX850W

12-05-2010, 00:41 | posts: 2,333 | Location: Scotland

Lol, I think you just compared a graphics vendor to God, forciano
   
 
  (#11)
cold2010
Master Guru

Videocard: HD 4870
Processor: 2600k
Mainboard: ASUS SABERTOOTH P67
Memory: G.SKILL Ripjaws 8GB 2 x 4
PSU: Cooler Master GX 750W

12-02-2010, 13:44 | posts: 180

http://ht4u.net/reviews/2010/amd_rad...ed/index11.php
   
  (#12)
John Dolan
Maha Guru

Videocard: 2x GTX 780 SLI
Processor: 2600K EK H2o
Mainboard: Asus P8P67 PRO Intel P67
Memory: Corsair Vengeance 8GB
Soundcard: x-fi platinum fatality 7.
PSU: Corsair AX1200i Digital

12-02-2010, 13:46 | posts: 2,217 | Location: Nottingham, UK

I've used half a dozen cards from each brand over the last decade or so, and I've always thought the ATI cards gave better IQ. The older Nvidia 7 series used to look particularly bad in comparison.
   
  (#13)
WhiteLightning
Ancient Guru

Videocard: EVGA 780GTX ACX 1215/1750
Processor: i7-2600k HT @ 4.5 +H70 PP
Mainboard: MSI Z77A-GD65 GAMING
Memory: Gskill 2133Mhz 8GB
Soundcard: Onboard
PSU: Corsair 1000 watt

12-02-2010, 13:47 | posts: 22,491 | Location: Netherlands

OK, it's clear: I've been a cheater for about 2 years now, aargh...
   
  (#14)
Undying
Ancient Guru

12-02-2010, 13:50 | posts: 3,696 | Location: Serbia, NS

Quote:
Originally Posted by WhiteLightning View Post
OK, it's clear: I've been a cheater for about 2 years now, aargh...
No one is a cheater; people just get what they pay for.
   
  (#15)
Exodite
Maha Guru

Videocard: Gigabyte 6950 2GB @ Stock
Processor: Intel i7 2600K @ 1.15V
Mainboard: ASUS P8P67 Pro B3
Memory: 8GB Kingston PC10600
Soundcard: Realtek 892
PSU: Seasonic SS-460FL

12-02-2010, 13:59 | posts: 1,652 | Location: Luleå, Sweden

This is getting tiresome.

Both vendors use optimizations that can adversely affect image quality in the default settings. Both vendors allow such optimizations to be disabled.

It's completely inane to apply some kind of scaling pulled entirely from the backside to 'compensate' for this when testing different cards. Obviously testing should be done at default settings and image quality compared.

If there are obvious differences in image quality, they have to be accounted for when reviewing a product, obviously. But having read quite a lot of reviews both here and on other sites, it's clear there are no such differences.

Bar the odd bug, obviously.
   
  (#16)
VultureX
Maha Guru

Videocard: GTX670 4GB SLI
Processor: Core i7 2700K @4.8GHz H2O
Mainboard: Asrock Z68 Extreme3 Gen3
Memory: 8GB G.Skill 2133MHz CL9
Soundcard: Xonar Essence ST - Z-5500
PSU: Corsair TX850 V2

12-02-2010, 14:00 | posts: 2,139 | Location: Netherlands

The first comparison shot is not clear to me.

Why does it say NV (16xQ) in the left screenshot (looks like an Nvidia-only anti-aliasing mode to me) and 16xAF in the right one?

The article doesn't make it any clearer:
Quote:
To your right you can see NVIDIA at work to the left an ATI Radeon 5000 series card
And further down:
Quote:
You can see that the right image from the Radeon 5000 card
The left screenshot looks better, so I guess that would be Nvidia... But do they use exactly the same AA and AF settings? Otherwise this would not be a fair comparison :s

Another point would be the use of JPG compression... I'd take the comparisons and save the screens as lossless PNG to do it right...

Last edited by VultureX; 12-02-2010 at 14:02.
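
A minimal sketch of the comparison VultureX describes, assuming two same-resolution lossless captures of the same frame (the filenames ati.png and nvidia.png are hypothetical placeholders); it uses Pillow and NumPy to quantify how far apart the two screenshots actually are:

Code:
# Quantify the per-pixel difference between two lossless screenshots.
# Requires: pip install pillow numpy
import numpy as np
from PIL import Image

# Hypothetical filenames; both captures must show the same frame at the same resolution.
a = np.asarray(Image.open("ati.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("nvidia.png").convert("RGB"), dtype=np.int16)
assert a.shape == b.shape, "screenshots must match in resolution"

diff = np.abs(a - b)                        # per-channel absolute error
print("mean abs difference:", diff.mean())  # 0 means the images are identical
print("pixels that differ:", (diff.sum(axis=2) > 0).mean() * 100, "%")

# Save an amplified difference map so subtle filtering artifacts stand out.
Image.fromarray((diff.clip(0, 31) * 8).astype(np.uint8)).save("diff.png")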
   
  (#17)
Hilbert Hagedoorn
Don Vito Corleone

Videocard: AMD | NVIDIA
Processor: Core i7 4770
Mainboard: Z77
Memory: 8GB
Soundcard: X-Fi - GigaWorks 7.1
PSU: 1200 Watt

12-02-2010, 14:11 | posts: 19,564 | Location: Guru3D testlab

Quote:
Originally Posted by VultureX View Post
Another point would be the use of jpg compression... I'd take comparisons and save the screens as lossless png to do it right...
Fixed; the right one is the Radeon, obviously.

On the PNG/JPG: the full-blown 24-bit PNG (or even BMP) is 13 MB per image, so I opted for JPG at maximum quality (100%). With the screenshots at 2560x1600 pixels you will not notice the difference.
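
For anyone who wants to check the size/quality trade-off Hilbert describes, here's a minimal sketch (the source capture shot.png is a hypothetical placeholder) that re-encodes a lossless capture as maximum-quality JPEG and measures both the size saving and the pixel error the compression introduces:

Code:
# Compare a lossless PNG against a maximum-quality JPEG re-encode of it.
# Requires: pip install pillow numpy
import os
import numpy as np
from PIL import Image

src = Image.open("shot.png").convert("RGB")  # hypothetical lossless capture
src.save("shot.jpg", quality=100)            # maximum-quality JPEG, as in the review

png_kb = os.path.getsize("shot.png") / 1024
jpg_kb = os.path.getsize("shot.jpg") / 1024
print(f"PNG: {png_kb:.0f} KB, JPEG q=100: {jpg_kb:.0f} KB")

# Measure what the re-encode actually changed, channel by channel.
a = np.asarray(src, dtype=np.int16)
b = np.asarray(Image.open("shot.jpg").convert("RGB"), dtype=np.int16)
print("mean abs error introduced by JPEG:", np.abs(a - b).mean())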


  (#18)
k3vst3r
Ancient Guru

Videocard: Tri-fire 290x Qnix 120Hz
Processor: i7 4770k 4.6
Mainboard: Asus Maximus Hero VI
Memory: 4x4GB 2400
Soundcard: SB X-FI Titanium
PSU: Corsair AX1200i

12-02-2010, 13:54 | posts: 2,928 | Location: Sheffield UK

HH said recently:

Currently with the Radeon HD 6000 series release, the Catalyst drivers (Catalyst AI) have a new setting which allows control over texture filtering, with settings for 'High Quality', 'Quality' and 'Performance'.

High Quality turns off all optimizations and lets the software run exactly as it was originally intended to. Quality, which is now the default setting, applies some optimizations that AMD believes remain objective and keep the integrity of the image quality at high levels while gaining some performance. The last setting is the Performance setting, which applies supplementary optimizations to gain even more performance.

What HH is explaining is that the newest drivers have optimizations enabled by default, whereas Nvidia's don't.


   
  (#19)
Lane
Ancient Guru

Videocard: 2x HD7970 - EK Waterblock
Processor: I7 4930K H2o EK Supremacy
Mainboard: Asus X79 Deluxe
Memory: G-Skill C9 2133mhz 16GB
Soundcard: X-FI Titanium HD + SP2500
PSU: CM 1000W

12-02-2010, 14:08 | posts: 5,196 | Location: Switzerland

I have one question anyway: have you tested fps with an older driver against the new one with the new settings option? It looks like you only used HQ and Performance to determine the difference on the same driver. It's the same for Nvidia: if you remove all optimizations in the driver (per-game optimizations included) you will get a similar drop in performance.

It was also the case if you set Cat AI off with the old drivers (because then no per-game profiles apply, so bug fixes and other fixes, plus specific optimizations, will not work, including CrossFire profiles).

If you want to compare fps lost or gained, you need to test Cat 10.9 or 10.10 with the driver set to Cat "Standard", and then the new Catalyst set to "Quality", HQ, etc. Then you can see how much you gain or lose between "Quality" on the new driver and "Standard" on the old. I really doubt the difference is 8%; most likely 1% (so at most 1-2 fps, nothing that lets you say they cheat).

TrackMania on the HD 5870 is the worst example, as that game hits a problem with the AF algorithm that produces noisy textures. You can't use it for a comparison, because whatever the reason, it is not due to an optimization.

Go read the 3DCenter article claiming AMD has a hardware AF problem on the HD 5870/4870/3870, and you will see they already used TM.

In reality, 3DCenter has been using TM and HL2 for three years to claim a difference in AF between ATI and Nvidia...

This is one concern I have about all of this: why not use BC2, DiRT 2, F1 2010 or COD to check the quality difference, instead of games like HL2 (and strangely enough not L4D), Quake 3, TM or Oblivion? How many years has it been since you saw a review that used those games?

Last edited by Lane; 12-02-2010 at 14:33.
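
Lane's methodology point reduces to simple arithmetic: average the fps of the old driver at "Standard" and the new driver at "Quality", then compare those means, rather than comparing HQ against Performance on a single driver. A minimal sketch with made-up run data (every number below is a hypothetical placeholder, not a measurement):

Code:
# Relative fps difference between two driver/setting combinations.
from statistics import mean

# Hypothetical benchmark runs (fps); replace with real measurements.
old_standard = [62.1, 61.8, 62.4]  # e.g. Catalyst 10.10, Cat AI "Standard"
new_quality = [62.9, 63.1, 62.7]   # e.g. new Catalyst, texture filtering "Quality"

delta = (mean(new_quality) - mean(old_standard)) / mean(old_standard) * 100
print(f"performance delta: {delta:+.1f}%")  # ~+1% here, in line with Lane's estimate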
   
  (#20)
TDurden
Maha Guru

Videocard: Asus HD6870 1GB
Processor: Core 2 Quad Q9550@3.4GHz
Mainboard: Asus P5E X38(X48)
Memory: 8GB A-DATA DDR2 800
Soundcard: X-Fi Xtreme Gamer
PSU: CM Real Power Pro m520

12-02-2010, 14:03 | posts: 1,799 | Location: LT

The difference is visible in some games... AMD on the right.
http://www.3dcenter.org/image/view/3746/_original
http://www.3dcenter.org/image/view/3747/_original
http://www.3dcenter.org/image/view/3731/_original
   
  (#21)
wolvy
Newbie

Videocard: 4570
Processor: T4300

12-02-2010, 14:07 | posts: 14 | Location: Netherlands

And what about the users with lower-range video cards? I'm not a big quality freak, I don't play games a lot, but when I do I want them to be playable, not running at 25 fps on ULTRA HIGH settings. That extra 8% performance gain is more than welcome for me, "cheating" or not...
   
  (#22)
Toli001
Newbie

Videocard: GeForce GTX 560 Ti SLI
Processor: Intel Core i7 950
Mainboard: Asus X58 Sabertooth
Memory: Corsair 1600MHz C8 3x2GB
Soundcard: Integrated
PSU: XFX Black Edition 750W

12-03-2010, 08:46 | posts: 6


Look at the writing in the background of screenshots 1 and 2, "TrackMania". ATI looks much better if you ask me, especially in the second screenshot.
   
  (#23)
Ryu5uzaku
Ancient Guru

Videocard: 290X & 7950
Processor: 3770K & 2500K
Mainboard: GB Z77X-D3H / GB Z77-D3H
Memory: 16GB & 16GB G.Skill
Soundcard: Meridian 2G / xf-i fata
PSU: 1000W & 500W silverstone

12-03-2010, 11:04 | posts: 3,763 | Location: Finland

Quote:
Originally Posted by Toli001 View Post
Look at the writing in the background of screenshots 1 and 2, "TrackMania". ATI looks much better if you ask me, especially in the second screenshot.
Rofl, that's cool stuff. You can see that both images have their strong points, which is kinda funny; in the distance Nvidia looks really poor in comparison, the text isn't clear at all.
   
  (#24)
Darren Hodgson
Ancient Guru

Videocard: EVGA NVIDIA GTX 780
Processor: Intel Core i7-4770K
Mainboard: ASUS Z87 Deluxe
Memory: 16GB Corsair Veng 1600MHz
Soundcard: SB X-Fi Titanium HD
PSU: CM Silent Pro M 850W

12-02-2010, 14:12 | posts: 11,257 | Location: England

When I had my HD 5870 CFX setup, I would always up the (I think they were) texture filtering settings from Quality to High Quality so I was always aware that by not doing that I was compromising the image quality. Now I'm back with NVIDIA I noticed that they too default to a texture filtering setting of Quality and, again, I have raised that to High Quality at the cost of a little performance.

So, unless I'm misunderstanding this article, both NVIDIA and AMD seem to apply the same texture filtering optimisations and neither use the best quality settings, it is up to the end-user to select them.
   
  (#25)
k3vst3r
Ancient Guru

Videocard: Tri-fire 290x Qnix 120Hz
Processor: i7 4770k 4.6
Mainboard: Asus Maximus Hero VI
Memory: 4x4GB 2400
Soundcard: SB X-FI Titanium
PSU: Corsair AX1200i

12-02-2010, 14:15 | posts: 2,928 | Location: Sheffield UK

Quote:
Originally Posted by Darren Hodgson View Post
When I had my HD 5870 CFX setup, I would always up the (I think they were) texture filtering settings from Quality to High Quality so I was always aware that by not doing that I was compromising the image quality. Now I'm back with NVIDIA I noticed that they too default to a texture filtering setting of Quality and, again, I have raised that to High Quality at the cost of a little performance.

So, unless I'm misunderstanding this article, both NVIDIA and AMD seem to apply the same texture filtering optimisations and neither use the best quality settings, it is up to the end-user to select them.
Quality on Nvidia doesn't have texture filtering optimizations enabled.

What HH is saying is that you get 100% image quality on the Quality setting for Nvidia cards by default, but 99% image quality on ATI cards by default with the latest drivers.


   