Guru3D.com Forums

  (#51)
Omagana
Maha Guru
 
Videocard: Evga GTX 780
Processor: Intel Core i7 4770k
Mainboard: Asus Sabertooth Z77
Memory: Corsair XMS3 16GB
Soundcard: Asus Xonar D2X 7.1
PSU: Corsair HX850W
Default 12-02-2010, 18:11 | posts: 2,361 | Location: Scotland

Personally,

An 8-10% increase in performance from removing bits of fine detail I can't even notice without comparing two screenshots and looking very closely sounds like intelligent design to me. Plus it can be turned off if knowing about it bothers you. When I start to visually notice something, then I'll be "up in arms".

The only folks bitching about this are Nvidia users who want a reason to claim a bigger e-peen lol... sad.
   
  (#52)
Kohlendioxidus
Maha Guru
 
Videocard: HIS 7950 IceQ Turbo CFX
Processor: AMD FX 8350
Mainboard: ASUS 990FX Crosshair V
Memory: 8Gig Crucial Balist @1866
Soundcard: Creative XFi Fatality Pro Gamer
PSU: Corsair TX 750W
Cool 12-02-2010, 18:17 | posts: 1,273 | Location: Germany

Quote:
Originally Posted by Omagana
Personally,

An 8-10% increase in performance from removing bits of fine detail I can't even notice without comparing two screenshots and looking very closely sounds like intelligent design to me. Plus it can be turned off if knowing about it bothers you. When I start to visually notice something, then I'll be "up in arms".

The only folks bitching about this are Nvidia users who want a reason to claim a bigger e-peen lol... sad.
Seems... you're right!
   
  (#53)
PinguX
Maha Guru
 
Videocard: AMD 7870 LE
Processor: i5 2500K
Mainboard: Asrock Z77 Pro
Memory: G.Skill DDR3 8GB
Soundcard: Asus Xonar DG
PSU: OCZ Stealth XStream2 600W
Default 12-02-2010, 18:25 | posts: 829

Quote:
Originally Posted by Redemption80
Nothing like TWIMTBP, exact opposite actually.

With Nvidia and the TWIMTBP, AMD users lose out.
With AMD and this, AMD users lose out.
How do AMD users lose out if they're getting extra performance without a noticeable penalty to image quality?
   
  (#54)
alanm
Ancient Guru
 
Videocard: MSI TF GTX 770
Processor: i5-3570k
Mainboard: Asrock Z77 Pro4
Memory: 8gb G.Skill DDR3 1600
Soundcard: SB Zx
PSU: XFX Pro 750w
Default 12-02-2010, 18:36 | posts: 5,484

Quote:
Originally Posted by Omagana
Personally,

An 8-10% increase in performance from removing bits of fine detail I can't even notice without comparing two screenshots and looking very closely sounds like intelligent design to me. Plus it can be turned off if knowing about it bothers you. When I start to visually notice something, then I'll be "up in arms".
I see nothing wrong with ATI's little trick from a practical standpoint for its users. In fact I wouldn't mind if Nvidia did this themselves, for an 8-10% perf increase vs a negligible IQ penalty. But sadly, it doesn't look like it's being done for the benefit of its users, but rather as a marketing advantage for cards that appear to be 8-10% faster vs the competition.

I'm glad that this happened so both sides are now aware that intense scrutiny will be on all future driver releases.
   
  (#55)
Omagana
Maha Guru
 
Videocard: Evga GTX 780
Processor: Intel Core i7 4770k
Mainboard: Asus Sabertooth Z77
Memory: Corsair XMS3 16GB
Soundcard: Asus Xonar D2X 7.1
PSU: Corsair HX850W
Default 12-02-2010, 18:42 | posts: 2,361 | Location: Scotland

Quote:
Originally Posted by alanm
But sadly, it doesn't look like it's being done for the benefit of its users, but rather as a marketing advantage for cards that appear to be 8-10% faster vs the competition.
It's an advantage in marketing because it's a great "trick"; you just said yourself you wouldn't mind if Nvidia did it.

Obviously it's done to benefit users.
   
  (#56)
Ryu5uzaku
Ancient Guru
 
Videocard: 290X & 7950
Processor: 3770K & 2500K
Mainboard: GB Z77X-D3H / GB Z77-D3H
Memory: 16GB & 16GB G.Skill
Soundcard: Meridian 2G / xf-i fata
PSU: 1000W & 500W silverstone
Default 12-02-2010, 18:47 | posts: 3,928 | Location: Finland

Quote:
Originally Posted by alanm
Which, if you read the article, was rebutted by Nvidia with a detailed technical reply.
For sure, but if ATI doesn't have anything blocking such optimizations, it's like always testing ATI at, say, 8xAA while Nvidia runs at 4xAA... or so. Really, though, both do optimizations: Nvidia does it in game profiles and ATI does it with that arse Catalyst A.I. You can turn everything off and get crappier performance with next to no change in IQ unless you go checking really hard.
   
  (#57)
deltatux
Ancient Guru
 
Videocard: GIGABYTE Radeon R9 280
Processor: Intel Core i5 3570K @4.5
Mainboard: GIGABYTE GA-Z77X-UD5H
Memory: Patriot 4 x 4GB DDR3-1600
Soundcard: Auzentech X-Raider 7.1
PSU: OCZ ModXStream Pro 500W
Default 12-02-2010, 19:10 | posts: 19,055 | Location: Toronto, Canada

Yes, lowering IQ for faster FPS could be considered cheating, but really, if you're in a fast-paced battle, would you really notice the difference? I think I'd be more focused on not getting my ass killed online than on saying "there are missing textures here, the AF is off there".

I think ATi set it to "Quality" by default since most people don't give a crap because they don't usually notice it.

deltatux
   
  (#58)
ClaymoreMD
Newbie
 
Videocard: Asus 4870x2
Processor: Phenom 965
Mainboard:
Memory:
Soundcard:
PSU: Corsair 850w
Default 12-02-2010, 19:54 | posts: 25

My opinion is that both cards should be tested under the same visual quality conditions. If that means ATI on High Quality and Nvidia on Quality then so be it, assuming both look the same. Another option would be to put a note under every benchmark that there is a slight but noticeable difference in visual quality between the cards.

Even if this is more of a marketing issue than a real issue, I believe the competition should be fair and transparent. It is possible that the evolution of graphics cards will lead to a situation where all the competitors offer such different solutions and optimizations that it will be impossible to compare them under the same settings. I say same visual quality even if it means different settings, or default settings with a note in every test and review, not just this article. People need to know what they are buying.

I still like ATI, will buy it again and leave this optimization on because with anything but these preselected images, it will be nearly impossible to notice. But the difference is there and competition should be as fair as possible.
   
  (#59)
|Ano|
Master Guru
 
Videocard: GTX560Ti Twin FrozR II
Processor: I7-950 4200Mhz HT
Mainboard: Sabertooth X58
Memory: 6Gb XMS3 1680Mhz CL8
Soundcard: Asus Xonar DX
PSU: Tx850W
Default 12-02-2010, 20:03 | posts: 275 | Location: Sweden

Well, since they compare a 6870 (a budget card) to a GTX 580 (a premium enthusiast card), I think the comparison fails before it has even started.

BUT!

I do not like what I'm hearing. If this is true, my next card will be from nVidia, since AMD pretty much scammed every single customer by releasing the 6850 and 6870 with tweaked drivers. Those numbers were not accurate since they lowered the IQ.
   
  (#60)
Vistixx
Newbie
 
Videocard: 2x GTX 460
Processor: i7 860 @3,8
Mainboard: MSI P55-GD65
Memory: Corsair 4GB DDR3 1600Mhz
Soundcard: X-FI TITANIUM
PSU: Corsair 850W
Default 12-02-2010, 20:14 | posts: 16

And since when does image quality differ between a budget card and a "premium enthusiast card"?
Also, I wouldn't call a 6870 a budget card really; it's as fast as a 5850 and costs about 200 bucks.
   
  (#61)
Sr7
Master Guru
 
Videocard:
Processor:
Mainboard:
Memory:
Soundcard:
PSU:
Default 12-02-2010, 20:16 | posts: 246

It amazes me that some of you can defend this.

Think about it this way. First you have the texture format substitution, resulting in a slightly different picture but a mostly unnoticeable IQ change. Then you add in AF hacks, resulting in slightly worse AF, both static and in motion; here it's only "partially noticeable". Then ask yourself what other hacks have been done that aren't publicized, or what other hacks will be done in the immediate future.

Each of those was, relatively speaking, a small decrease in IQ. However, taken as a whole, the IQ deviated from the original IQ much more noticeably.

Then factor in that they're getting performance by offering lesser IQ than NVIDIA purely to win benchmarks... In a time when most midrange GPUs can run maxed-out settings, why not use that GPU horsepower for something useful? Why regress towards console quality, taking away one of the advantages the PC has?

That's why this trend is unacceptable.
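
For readers wondering why a "texture format substitution" is worth real frame rate at all: the likely win is simply fewer bytes read and written per frame. A rough back-of-the-envelope sketch in Python; the RGBA16F vs. R11G11B10F pairing is an assumption for illustration, not something this thread or the article confirms:

```python
# Rough bandwidth estimate for a single 1920x1080 render target.
# The format pairing (RGBA16F vs. R11G11B10F) is an assumption for
# illustration; the exact substituted formats are not confirmed here.

WIDTH, HEIGHT = 1920, 1080
BYTES_RGBA16F = 8      # 4 channels x 16-bit float = 64 bits per pixel
BYTES_R11G11B10F = 4   # packed 11+11+10-bit floats = 32 bits per pixel

full = WIDTH * HEIGHT * BYTES_RGBA16F
packed = WIDTH * HEIGHT * BYTES_R11G11B10F

print(f"RGBA16F target:    {full / 2**20:.1f} MiB per pass")
print(f"R11G11B10F target: {packed / 2**20:.1f} MiB per pass")
print(f"Bytes saved per pass: {1 - packed / full:.0%}")
```

Whether halving the bytes per pass maps onto the 8-10% measured in benchmarks depends on how bandwidth-bound a given scene is, but it shows why the temptation exists.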
   
  (#62)
TheHunter
Banned
 
Videocard: MSi N570GTX TFIII [OC|PE]
Processor: Intel C2Q 9450 @ 3.576GHZ
Mainboard: Gigabyte GA-X48-DS5 [F8H]
Memory: Corsair D. 2x2GB @1073MHZ
Soundcard: XFi Fatality Pro [SB046A]
PSU: Tagan Piperock 600W [48A]
Default 12-02-2010, 20:20 | posts: 13,439 | Location: √╥

Quote:
Originally Posted by ClaymoreMD
My opinion is that both cards should be tested under the same visual quality conditions. If that means ATI on High Quality and Nvidia on Quality then so be it, assuming both look the same. Another option would be to put a note under every benchmark that there is a slight but noticeable difference in visual quality between the cards.

Even if this is more of a marketing issue than a real issue, I believe the competition should be fair and transparent. It is possible that the evolution of graphics cards will lead to a situation where all the competitors offer such different solutions and optimizations that it will be impossible to compare them under the same settings. I say same visual quality even if it means different settings, or default settings with a note in every test and review, not just this article. People need to know what they are buying.

I still like ATI, will buy it again and leave this optimization on because with anything but these preselected images, it will be nearly impossible to notice. But the difference is there and competition should be as fair as possible.
You mean Nvidia also at High Quality then; it's set to Quality by default.

I always select High Quality and set texture LOD to Clamp before I start testing. I don't want any optimizations like the trilinear optimization and anisotropic sample optimization, no thank you.
   
  (#63)
kapu
Ancient Guru
 
Videocard: Gigabyte HD7970 GHz TOP
Processor: Intel i5 750 @ 4.0 /1.38v
Mainboard: Asus P7P55D
Memory: GoodRAM 8GB , 1600 MHz
Soundcard: X-Fi Gamer|Siberia V2
PSU: Be Quiet! E9 680W
Default 12-02-2010, 20:22 | posts: 3,648 | Location: Poland

I wouldn't mind a 10% gain from something I can't see while playing.
   
  (#64)
perosmct
Banned
 
Videocard: unknown
Processor: unknown
Mainboard: unknown
Memory: unknown
Soundcard: unknown
PSU: unknown
Default 12-02-2010, 20:28 | posts: 1,072 | Location: unknown

That's why, from now on, we don't care about benchmarks anymore, only features... ATI is "dead"... I don't give a ****... As for benchmarks, with all due respect, no thank you... Many optimizations are "hidden"; I have analysed the CCC, the driver, and the registry, and most of them are... That means you will never get the game's original IQ, because ATI always leaves something silently "on"... Next time I will skip the comparisons on many sites... it's mind control and wasted time... We should demand that AMD change tactics and lower its prices even more, because in the end their GPUs are even weaker than Nvidia's...
   
  (#65)
mitzi76
Ancient Guru
 
Videocard: MSI 580 (Agua)
Processor: CoreI7 920@4ghz(EK Sup)
Mainboard: Asus P6T Deluxe
Memory: 6gb Corsair 1600
Soundcard: Asus Xonar D1
PSU: 850w Antec
Default 12-02-2010, 20:31 | posts: 7,900 | Location: UK

I ran Nvidia cards all the time up until the 5870s, and to me ATI picture quality seems better on my screen.

Perhaps some "placebo effect" but certainly I have never seen anything worth shouting about in terms of image quality.

What on earth is this discussion all about? Seems like a big pile of turd to me. Sorry.

If there was something worth complaining about, it's ATI drivers and Crossfire on the whole.

Hey, maybe that's why they have been wank for me since 10.5!! It's because the Catalyst drivers contain all these extra "optimisations".

But hey, I'll let you know soon what it's like to be back on Nvidia.

You know what, I'd bet a lot of money that I'll be able to say there is hardly any notable visual difference, and that some games run better on Nvidia than they did on my ATI setup.

So we prove ATI does things differently... big deal. Are we then going to list all of Nvidia's BS and cheap tactics to win custom?

Swings and roundabouts chicos
   
  (#66)
kapu
Ancient Guru
 
Videocard: Gigabyte HD7970 GHz TOP
Processor: Intel i5 750 @ 4.0 /1.38v
Mainboard: Asus P7P55D
Memory: GoodRAM 8GB , 1600 MHz
Soundcard: X-Fi Gamer|Siberia V2
PSU: Be Quiet! E9 680W
Default 12-02-2010, 20:31 | posts: 3,648 | Location: Poland

Quote:

We have a hard time spotting differences just as much as you do, and while making the screenshots we increased the gamma setting to 50% and used a resolution of 2560x1600 to try to make them more visible.

Do you spot the difference? Probably not.
This sums it up pretty well.

So I get 8-10% extra performance for something I can't see?

Well... good job, AMD?
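
If you would rather measure than squint at gamma-boosted screenshots, an amplified difference image shows exactly where two captures diverge. A minimal sketch, assuming Pillow and NumPy are installed and the two screenshots were taken at the same resolution; the file names are placeholders:

```python
# Amplify the per-pixel difference between two same-sized screenshots so
# subtle filtering changes become visible. File names are placeholders.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("shot_default_quality.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("shot_high_quality.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                                    # 0 where pixels match
amplified = np.clip(diff * 8, 0, 255).astype(np.uint8)  # boost faint deltas

Image.fromarray(amplified).save("difference_x8.png")
print("max per-channel delta:", int(diff.max()))
```

Anything that lights up in the output is a pixel the two settings render differently, even if your eyes would never catch it in motion.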
   
  (#67)
Lycronis
Maha Guru
 
Videocard: 2x GTX460 1GB SLI
Processor: i7 4770K - Kuhler 920
Mainboard: Asus Maximus VI Hero
Memory: 16GB Vengeance Pro 1866
Soundcard: Focusrite 2i4 & KRK R6G2
PSU: Antec TPNew 750 Blue
Default 12-02-2010, 20:32 | posts: 1,047 | Location: South China, Maine USA

Quote:
Originally Posted by Omagana
The only folks bitching about this are Nvidia users who want a reason to claim a bigger e-peen lol... sad.
Yeah, look back a few years when Nvidia was optimizing for 3DMark and notice who was doing all the bitching. It goes both ways so don't come in here thinking you can bash Nvidia fans for the same thing ATI fans would do if the situation was reversed. In fact, if this was about Nvidia you would see MUCH more bitching by ATI fans because of the usual "underdog" syndrome.

This is an issue that needs to be addressed and it is something that can be seen in certain situations. I personally don't consider this a type of cheat, per se, but I do believe end users need to be made aware of it. Fair is fair. Tests should be as equal as possible, regardless of personal preference or whether or not you can notice a difference some of the time. IF there is a difference in image quality in order to gain performance then it should be clearly stated as such, or tested at the same competing level.
   
  (#68)
perosmct
Banned
 
Videocard: unknown
Processor: unknown
Mainboard: unknown
Memory: unknown
Soundcard: unknown
PSU: unknown
Default 12-02-2010, 20:34 | posts: 1,072 | Location: unknown

The point now is that we all admit it; new benchmarks from now on must be analysed to ensure the same quality on both sides... and then we will see who really has the stronger GPUs...
   
  (#69)
mitzi76
Ancient Guru
 
Videocard: MSI 580 (Agua)
Processor: CoreI7 920@4ghz(EK Sup)
Mainboard: Asus P6T Deluxe
Memory: 6gb Corsair 1600
Soundcard: Asus Xonar D1
PSU: 850w Antec
Default 12-02-2010, 20:38 | posts: 7,900 | Location: UK

Quote:
Originally Posted by perosmct
The point now is that we all admit it; new benchmarks from now on must be analysed to ensure the same quality on both sides... and then we will see who really has the stronger GPUs...
But you don't just buy a GPU based on absolute strength, do you? Or is having the bigger e-peen the most important thing...

Personally I care about stability/noise/heat... (OK, I don't want a weener of a GPU ofc)

But I guess I see why the Guru created this topic. People do have a right to know. I think it's all a bit of unnecessary fuel for flaming and more Nvidia vs AMD stuff...
   
  (#70)
TheHunter
Banned
 
Videocard: MSi N570GTX TFIII [OC|PE]
Processor: Intel C2Q 9450 @ 3.576GHZ
Mainboard: Gigabyte GA-X48-DS5 [F8H]
Memory: Corsair D. 2x2GB @1073MHZ
Soundcard: XFi Fatality Pro [SB046A]
PSU: Tagan Piperock 600W [48A]
Default 12-02-2010, 20:41 | posts: 13,439 | Location: √╥

@perosmct

If it bothers you that much, then turn it up to High Quality?

Why would you want to run at Quality if you want better IQ anyway... simple as that.


NV cheats too at default Quality, so...

Last edited by TheHunter; 12-02-2010 at 20:43.
   
  (#71)
mitzi76
Ancient Guru
 
Videocard: MSI 580 (Agua)
Processor: CoreI7 920@4ghz(EK Sup)
Mainboard: Asus P6T Deluxe
Memory: 6gb Corsair 1600
Soundcard: Asus Xonar D1
PSU: 850w Antec
Default 12-02-2010, 20:42 | posts: 7,900 | Location: UK

Quote:
Originally Posted by Lycronis
Yeah, look back a few years when Nvidia was optimizing for 3DMark and notice who was doing all the bitching. It goes both ways so don't come in here thinking you can bash Nvidia fans for the same thing ATI fans would do if the situation was reversed. In fact, if this was about Nvidia you would see MUCH more bitching by ATI fans because of the usual "underdog" syndrome.

This is an issue that needs to be addressed and it is something that can be seen in certain situations. I personally don't consider this a type of cheat, per se, but I do believe end users need to be made aware of it. Fair is fair. Tests should be as equal as possible, regardless of personal preference or whether or not you can notice a difference some of the time. IF there is a difference in image quality in order to gain performance then it should be clearly stated as such, or tested at the same competing level.
That actually is a fairly sound argument. +1.
   
  (#72)
morbias
Don Tommasino
 
Videocard: GTX 480
Processor: i7 920 D0 @4.2 GHz
Mainboard: Gigabyte EX58-UD5
Memory: 6GB HyperX @1600 6-7-6-18
Soundcard: Audigy 2 ZS + CL 6700's
PSU: ePower Tiger 1200W
Default 12-02-2010, 20:42 | posts: 12,620 | Location: Southampton, UK

Firstly, Nvidia used to have 'Quality' as the default texture processing setting in Forceware drivers, not 'High Quality'. They don't do it now but it's not like they never did the same thing.

Secondly, the hypocrisy in this thread is hilarious; if this article was focused on Nvidia most of the posts herein would include the words 'epic' and 'fail', but because it's ATI... apparently it's a feature.


  (#73)
mitzi76
Ancient Guru
 
Videocard: MSI 580 (Agua)
Processor: CoreI7 920@4ghz(EK Sup)
Mainboard: Asus P6T Deluxe
Memory: 6gb Corsair 1600
Soundcard: Asus Xonar D1
PSU: 850w Antec
Default 12-02-2010, 20:46 | posts: 7,900 | Location: UK

Quote:
Originally Posted by morbias
Firstly, Nvidia used to have 'Quality' as the default texture processing setting in Forceware drivers, not 'High Quality'. They don't do it now but it's not like they never did the same thing.

Secondly, the hypocrisy in this thread is hilarious; if this article was focused on Nvidia most of the posts herein would include the words 'epic' and 'fail', but because it's ATI... apparently it's a feature.
That's only because Nvidia has been an epic fail in some areas, but they've redeemed themselves with the 460 and the 580.

You could also argue that this is a "smear" attempt to discredit the company that has been selling the most GPUs recently.

Saying that, I am fed up with ATI hehe. Does that make me a hypocrite?... I have always tried to be impartial.

If the 580 performs worse than my Crossfire setup, it's going straight back.
   
  (#74)
arrrdawg
Newbie
 
Videocard: NVIDIA GeForce GTX 460m
Processor: Intel Core i7 740QM
Mainboard:
Memory:
Soundcard:
PSU: Laptop
Default 12-02-2010, 20:50 | posts: 28

Maybe I'm just dumb, but doesn't Nvidia default to 'Quality', which I am assuming has some sort of optimizations? Also, I see that by default the trilinear optimization is on but the anisotropic sample optimization is off. Both optimization options gray out if you select High Quality. So is this different from what ATI is doing?

Also, the article says something like "anisotropic at 16x and trilinear filtering on"... I thought anisotropic was always trilinear unless you enable optimizations that substitute bilinear where it wouldn't make much of an image quality difference. I could just be ignorant when it comes to this stuff.
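
On the trilinear question: trilinear filtering just blends between the two nearest mip levels instead of snapping to one, and the "trilinear optimization" both vendors expose narrows that blend so most pixels are effectively sampled bilinearly. A toy one-dimensional sketch of the idea in Python; the 0.25 threshold is made up, since the real driver heuristics aren't public:

```python
# Toy illustration of trilinear filtering vs. a "trilinear optimization".
# lod_frac is how far a pixel sits between mip level N and N+1 (0.0 - 1.0).

def trilinear_weight(lod_frac: float) -> float:
    """Full trilinear: blend weight for the next mip equals the LOD fraction."""
    return lod_frac

def optimized_weight(lod_frac: float, threshold: float = 0.25) -> float:
    """Assumed optimization: snap to a single mip (bilinear) unless the pixel
    is near the transition, where a narrowed blend hides the mip seam.
    The 0.25 threshold is a made-up number for illustration."""
    if lod_frac < threshold:
        return 0.0                      # sample only mip N   (bilinear)
    if lod_frac > 1.0 - threshold:
        return 1.0                      # sample only mip N+1 (bilinear)
    # remap the middle band back to a 0..1 blend
    return (lod_frac - threshold) / (1.0 - 2.0 * threshold)

for f in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"lod_frac {f:.1f}: trilinear {trilinear_weight(f):.2f}, "
          f"optimized {optimized_weight(f):.2f}")
```

The narrower the blend band, the fewer pixels need samples from two mip levels, which is where the speed comes from; push it too far and the mip transitions start to show as visible banding or shimmer.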
   
  (#75)
3x3cUt0r
Master Guru
 
Videocard: nVidia EVGA GTX470
Processor: Intel Core 2 Duo E7300
Mainboard: Gigabyte GA-P35-DS3R
Memory: Crucial Ballistix 1066
Soundcard: Creative Audigy 2
PSU: Thermaltake Purepower RX
Default 12-02-2010, 20:54 | posts: 319 | Location: Colombia

While I agree with performance improvements that have little to no impact on IQ, I hate that some ATI users claim they have better IQ.
   