Discussion in 'RivaTuner Advanced Discussion forum' started by applejack, Apr 6, 2005.
my Doom 3: Resurrection of Evil looks ugly :\
Yes, I just want to stress that what AlecRyben is saying is right. These 'shader versions' you guys keep talking about are irrelevant to OpenGL. OpenGL 2.0 has nothing to do with 'shaders v2.0' - that's about fragment and vertex program profiles for the Cg or GLSL compiler, and those work EXACTLY the same under GL 2.0 and 1.5 - you just go through the ARB extensions (GL_ARB_shader_objects and friends) instead of the core entry points to use GLSL shaders under 1.5.
So... yeah... I don't know WHAT you guys are talking about, but suffice to say, most of it is misleading. It's possible I just don't understand the *way* you're talking about it, but even if that IS the case, coming from someone who programs in OpenGL: you should probably re-word things, or just not talk about things you don't know about. You need to re-evaluate what you're saying. Badly.
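To make that concrete, here is a rough sketch in plain C of what "GLSL under 1.5 vs 2.0" actually amounts to at the API level. It assumes a GL context is already current, and the function name is mine, not from any particular engine; the only point is that GLSL is either core (2.0) or an ARB extension (1.5), and has nothing to do with "shader model 2.0".

    #include <string.h>
    #include <GL/gl.h>

    /* Returns 1 if GLSL shaders can be used on this context, 0 otherwise. */
    int glsl_available(void)
    {
        const char *version    = (const char *)glGetString(GL_VERSION);
        const char *extensions = (const char *)glGetString(GL_EXTENSIONS);

        if (version && version[0] >= '2')        /* GL 2.0+: GLSL entry points are core */
            return 1;
        if (extensions &&
            strstr(extensions, "GL_ARB_shading_language_100") &&
            strstr(extensions, "GL_ARB_shader_objects"))
            return 1;                            /* GL 1.5: GLSL exposed via ARB extensions */
        return 0;
    }

Either way, the shader source you feed the compiler is the same.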
So what you are saying is that no matter which card I use (GF4 / FX 5600...), Doom 3, for example, will use the same shading technology and give equivalent results in terms of shading quality?
The GF4 does not support the fragment program extension (GL_ARB_fragment_program), so it will not give you the same shading quality as the FX 5600.
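Doom 3 (like most GL engines of that era) just looks at the extension string to decide which back-end to use. Very roughly it works like the sketch below; the function name and exact logic are my own illustration, not id's actual code:

    #include <string.h>
    #include <GL/gl.h>

    /* Pick a render path based on what the driver advertises. */
    const char *pick_render_path(void)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);

        if (ext && strstr(ext, "GL_ARB_fragment_program"))
            return "arb2";   /* FX class and up: full fragment programs, best quality */
        if (ext && strstr(ext, "GL_NV_register_combiners") &&
                   strstr(ext, "GL_NV_texture_shader"))
            return "nv20";   /* GeForce 3 / GeForce 4 Ti class hardware */
        return "arb";        /* basic fallback path */
    }

Since the GF4 Ti never reports GL_ARB_fragment_program, it always ends up on the lower-quality path.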
Did you mean GF4 Ti? (coz I did)
Is there any way I can make an FX 5600 use GF4 Ti extensions only? And will it give better performance if so?
By GF4 I mean the GeForce4 Ti 4200/4200-8X, 4400/4800 SE and 4600/4800.
GeForce4 MX is a DX7 part => GeForce2 in disguise.
Forcing the FX 5600 to report a lesser version of OpenGL is definitely possible. You will get better performance but lower IQ (image quality). You can do it either by editing the registry, by using some third-party utility, or, the simplest and preferred way, by lowering the quality settings in your game.
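If all you want is to see the GF4 Ti path in Doom 3 specifically, you don't need the registry at all: as far as I remember, Doom 3 exposes its render path through the r_renderer cvar, so something like this in the console (or autoexec.cfg) should do it:

    seta r_renderer "nv20"   // force the GeForce 3 / GeForce 4 Ti back-end instead of "best"
    vid_restart              // restart the renderer so the change takes effect

Set it back to "best" (or "arb2") afterwards to get the full-quality FX path again.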
Well, sometimes lowering the quality in-game is just not enough.
Anyway, it seems like I got my answer earlier, though Mr. Subtestube made me think we were talking nonsense around here... thanks for making it clear again.