I did not know how to use RT... so I used NVMax

Discussion in 'RivaTuner Advanced Discussion forum' started by TO_raptor, Oct 20, 2001.

  1. TO_raptor

    TO_raptor Guest

    I used NVMax on my GeForce2 Pro.<br>
    I am sorry, Unwinder, but I just couldn't find the help I needed. I really need to know what to change and what not to change.<br>
    <br>
    <IMG SRC="smileys/knock.gif"> <br>
    <br>
    There is no point in downloading a utility and blindly changing settings when you have no idea what effect they might produce.<br>
    <br>
    Unwinder, the utility seems awesome and well coded, but it's bad at explaining what to do with it. Your advice of reading the help on each setting is not very useful.<br>
    <br>
    I used NVMax; it at least has a very simple interface, and to some extent its help works in the good old-fashioned way.<br>
    <br>
    For one last time: please, please, please tell me where I can find details that let me tweak my card using RT effectively, and not technical details that just throw some elusive term at you.<br>
    <br>
    TO_raptor
     
  2. LoyC

    LoyC Member

    <A HREF="http://www.xtremepcuk.com/guides/gfg/index.shtml" TARGET=_blank>http://www.xtremepcuk.com/guides/gfg/index.shtml</A><br>
    <br>
    Check out this address; it's old, but still useful for the time being.
     
  3. Neeyik

    Neeyik Guest

    Since I wrote that original guide (which was about getting the highest possible 3DMark score), I think it's only fair that I post a basic info guide about each of the settings in RivaTuner's "customising" section.<br>
    <br>
    In the latest version of RT, there are 5 buttons: four cover the various hardware and driver adjustments, while the last one is a general "reset to the standard settings" button. Let's deal with them one at a time:<br>
    <br>
    <b>SYSTEM</b><br>
    There are 4 sections to this part...<br>
    <br>
    <b>Overclocking</b><br>
    <i>Enable Hardware Overclocking</i> = tick the box to allow you to change how fast the graphics processing chip and the RAM on the video card run. If you're ticking the box for the first time, a panel will pop up asking you to either reboot or attempt to read the speeds now. For absolute accuracy, it's best to reboot.<br>
    <br>
    <i>Core clock</i> = this slider sets the speed of the graphics processing chip. The two numbers to the left of this slider are the actual speed of the chip, and the one displayed by other programs and the drivers. The proper speed is the bottom one. <br>
    <br>
    Moving this slider to the right will increase the speed of the chip, making your graphics card work faster - however, this produces more heat and makes the card less stable. Most people should be able to get a 10% increase in the clock speed without too much trouble. Once you've set the slider accordingly, you must click the <i>Test</i> button at the bottom. The monitor should reset itself and a panel pops up. Wait for the monitor to reset itself once more, and the clock speed will now be set.<br>
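As a rough back-of-the-envelope for that 10% figure, here is a tiny sketch. The 200 MHz core / 400 MHz memory defaults below are assumed GeForce2 Pro-class values for illustration only, not speeds read from any hardware:

```python
# Sketch: work out a conservative overclock target from a default clock.
# The default speeds here are illustrative assumptions, not detected values.

def overclock_target(default_mhz, percent=10):
    """Return the clock after a given whole-percent increase (integer MHz)."""
    return default_mhz * (100 + percent) // 100

print(overclock_target(200))  # core:   200 MHz -> 220 MHz
print(overclock_target(400))  # memory: 400 MHz -> 440 MHz
```

Anything beyond that 10% is card-by-card luck, which is exactly why you should use the Test button after every change.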
    <br>
    <i>Memory clock</i> = this slider sets the speed of the memory chips on the graphics card, and works in the same way as the core clock slider. You can usually get a bigger increase in speed compared to the core, but not always. Again, don't forget to click the Test button.<br>
    <br>
    <i>Apply overclocking...</i> = with this ticked, every time Windows starts up the graphics card will have its speed set to whatever you had the sliders fixed to when you clicked the <i>Save</i> button.<br>
    <br>
    <i>Test</i> = already mentioned it!<br>
    <br>
    <i>Defaults</i> = puts all the sliders and speeds back to the normal settings.<br>
    <br>
    <br>
    <b>Compatibility</b><br>
    Different motherboard chipsets work the AGP slot in different ways. The appropriate section of this part will be selected according to what chipset your PC has. For example, if it's an Intel one then you won't be able to adjust anything in the VIA or AMD section. The Help button explains everything that you need to know here.<br>
    <br>
    <br>
    <b>AGP</b><br>
    <i>Enable AGP</i> = you should really have this button ticked. Even though there isn't much difference between the various AGP speeds, there's a world of difference between AGP mode and PCI-66 mode. The following two graphs show roughly how the various AGP bus speeds pan out in 3DMark 2001 and Quake 3 Arena (Quaver demo). Each test was done using a 1.33GHz Athlon and GeForce2 Pro:<br>
    <br>
    <IMG SRC="http://www.btinternet.com/~neeyik/images/agptest1.jpg"><br>
    <br>
    <IMG SRC="http://www.btinternet.com/~neeyik/images/agptest2.jpg"><br>
    <br>
    For a beginner to RT and video card tweaking, it's best to ignore this section. Questions about it can always be posted here!<br>
    <br>
    <b>Overlay</b><br>
    This section typically applies to running video files.<br>
    <br>
    <i>Zooming area</i> = pretty self-explanatory. It lets you zoom into certain sections of the video footage, but the program (and video card drivers) must allow this in the first place. If nothing happens, you know why!<br>
    <br>
    <i>Force hardware...</i> = If you've got several monitors attached to your graphics card and you have the desktop spanned across them, this setting forces the overlay controls to work on both monitors. Note: this is for Windows 2000 systems only.<br>
    <br>
    <i>Use busmastering...</i> = for Windows 9x and ME systems only, it's a "problem"-solving setting. If you get funny things on-screen when using certain TV tuners, it can help to have this setting on. Other than that, it's nothing to worry about.<br>
    <br>
    <br>
    <br>
    <b>DIRECT 3D</b><br>
    There are 7 sections to this part...<br>
    <br>
    <b>Mipmapping</b><br>
    <i>Mipmap LOD bias</i> = Moving this slider to the left can produce better looking images in games that use Direct 3D, but at a cost to performance. Set it too far over to the left and you'll find that things look too "bitty". Swing the slider over to the right (from the default position of 0) and images will look more "fuzzy" and less detailed. The performance does increase because of this, though.<br>
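For anyone curious what the slider actually does underneath, it feeds a bias term into the standard mipmap level-of-detail calculation. This is the textbook form as given in the OpenGL specification; Direct3D behaves equivalently:

```latex
% Mipmap level actually sampled, for a texture with scale factor \rho:
\lambda = \log_2(\rho) + \text{bias}
% bias < 0 : larger, sharper mipmap levels (slower; "bitty" if overdone)
% bias > 0 : smaller, blurrier mipmap levels (faster)
```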
    <br>
    <i>User mipmaps...</i> = leave this ticked.<br>
    <br>
    <i>Automatic mipmap...</i><br>
    As stated in the help file, NVIDIA cards newer than the original GeForce cards don't allow you to adjust the automatic usage of mipmaps, and the ability to adjust this for the cards that do was removed from the drivers a while back. If you <i>can</i> adjust these settings, then the number of mipmaps produced affects how good something will look versus how quickly it will run. The filtering adjusts how good the mipmapping itself looks - bilinear is quicker than trilinear but doesn't look as good.<br>
    <br>
    <br>
    <b>Depth buffering</b><br>
    Leave everything in this section at the default settings. You're not going to gain any real performance increases by changing things here, and at best, it's just going to make certain games look worse.<br>
    <br>
    <br>
    <b>Blitting</b><br>
    Same as above - leave it alone if you don't know what the settings are for.<br>
    <br>
    <br>
    <b>VSync</b><br>
    For the most part, it's best to leave this set to "Auto" and have the game decide what VSync setting it will use. Forcing it on will help reduce the "tearing" effect you sometimes get in games when you turn about very quickly - however, it can put a cap on the maximum possible performance. Forcing it off means no cap, but the potential for this tearing. The <i>prerender limit</i> should be set to 2 or 3 for the best balance between performance and visuals.<br>
    <br>
    <br>
    <b>Textures</b><br>
    <i>Texture format</i> = leave this alone!<br>
    <br>
    <i>Texture memory</i> = if you've got an AGP card, put it to zero (if it isn't already). If you've got a PCI card then try setting it to around 1/4 or 1/3 of the total amount of memory you've got in your PC.<br>
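Putting the PCI-card advice into numbers - the 128 MB of system RAM below is just an assumed example, not a detected value:

```python
# Sketch: the suggested texture memory window for a PCI card,
# i.e. a quarter to a third of total system RAM (assumed 128 MB here).

def texture_memory_range_mb(system_ram_mb):
    """Return the (low, high) suggested texture memory sizes in MB."""
    return system_ram_mb // 4, system_ram_mb // 3

low, high = texture_memory_range_mb(128)
print(low, high)  # 32 42
```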
    <br>
    <i>Texture filtering</i> = usually best to leave this at "determined by Direct3D application", but if you want to you can force the card to use different levels of filtering. The higher the level, the better textures will appear in a game, but at the cost of decreased performance. Anything higher than level 2 is only available for GeForce3 cards.<br>
    <br>
    <br>
    <b>Compatibility</b><br>
    Leave everything set to the defaults - there's no need to adjust them these days.<br>
    <br>
    <br>
    <b>Antialiasing</b><br>
    <i>Enable antialiasing</i> = ticking this box doesn't mean that you will use FSAA in every game you've got. It just means that the feature is available if the program wants to use it.<br>
    <br>
    <i>Antialiasing method</i> = you'll get different levels according to what video card you've got, but unless you have something like a GeForce2 Pro/Ultra or a GeForce3, the performance hit you take by using FSAA isn't worth the increase in visual quality you get.<br>
    <br>
    <i>Force antialiasing...</i> = kinda obvious this one. Stuff what the program may want to do - if you want FSAA on, no matter what, then tick this box!<br>
    <br>
    <br>
    <b>OPEN GL</b><br>
    There are 5 sections to this part...<br>
    <br>
    <b>VSync</b><br>
    Have a read of the VSync feature mentioned above. It works in the same way.<br>
    <br>
    <i>Buffer flipping mode</i> = leave this set to auto, unless you have real problems getting an Open GL program to work. If so, setting this to "block transfer" can sometimes help out.<br>
    <br>
    <br>
    <b>Rendering Quality</b><br>
    <i>Default bit depth</i> = the "Help" button explains this one fine. If you've got a top graphics card like a GeForce2 or better, then pop it to "32-bits per pixel".<br>
    <br>
    <i>Enable S3TC trick</i> = it's worth ticking this box for games that use compressed textures (eg. Quake 3, Serious Sam). It makes them look better with only a very small decrease in performance, but it only works on cards newer than the original GeForce.<br>
    <br>
    <i>Force fast..</i> = if you have a TNT/TNT2 then pop it on. You gain a bit of performance for a very little loss in quality.<br>
    <br>
    <i>Default degree...</i> = works in the same way as the texture filtering section in Direct3D. The higher the level then the better your OpenGL game should look - I tend to leave it at Level 2 for most games.<br>
    <br>
    <br>
    <b>Hardware compatibility</b><br>
    Leave everything alone here but make sure that the <i>Disable support for enhanced CPU...</i> button is NOT ticked! The <i>Texture memory</i> setting works like the one in Direct3D above.<br>
    <br>
    <br>
    <b>Professional</b><br>
    Don't bother with this section, even if you can adjust things - again though, if you want to know more use the Help button or post in the forum.<br>
    <br>
    <br>
    <b>Antialiasing</b><br>
    Works very similarly to the section in Direct3D, BUT if you tick the <i>Enable Antialiasing</i> button, then all OpenGL programs will run with FSAA on!<br>
    <br>
    <br>
    <br>
    <b>COLOUR ADJUSTMENT</b><br>
    There are two sections to this part. For a beginner I would seriously recommend that you don't play around with these controls - it's not that you'll do any damage as you can just click the default button to put it back to normal. It's just that unless you're really into computer art or image processing, then there's no need to adjust the colour controls.<br>
    <br>
    The <i>Digital Vibrance</i> control works for GeForce2 MX and GeForce3 cards only - it can make colours seem more "kapow!" but without making the whole screen look washed out. Experiment to see if you like it or not.<br>
    <br>
    I've mentioned overlays earlier (mainly to do with video files) and you can adjust the colour profile for these too.<br>
    <br>
    <br>
    This is only meant as a simple guide for beginners to RivaTuner - the help system gives more detail and the forums are even better for getting that specific answer to your question. Any mistakes are mine alone and nothing to do with RivaTuner or its author Unwinder. Any spelling mistakes are due to my bad typing!<br><br><i>This message was edited by Neeyik on 21 Oct 2001 12:09 PM</i>
     
  4. Kakaru2

    Kakaru2 Master Guru

    Nicely said, Neeyik.<br>
    But VSync in OGL is, in my opinion, best left Always off. The tearing in OGL is almost zero compared with D3D. You know Microsoft wants people to play with VSync on, don't you?
     
