Why can't I create my own "low power 3D" clock settings??

Discussion in 'RivaTuner Advanced Discussion forum' started by PIRATA!, May 5, 2007.

  1. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
The question is about understanding why this setting has been disabled by default.

    I mean... how could a "beginner" break anything by setting wrong clocks?

    What is this "low power 3D" scheme?

    And most importantly: when does this scheme come into action? Are the moments/situations that activate this middle scheme the real main problem with using it?


    I would like to prevent Windows from activating my 3D frequencies while using non-game programs that use or simulate something that is recognized as 3D.

    Do I only have to set the "low power 3D" frequencies to the same values as "standard 2D" to always have the same clocks in everything other than games?



I have run some tests and noticed that the automatic clocking does not work well, because the GPU and RAM clocks just can't change at the same time: if I set the "standard 2D" and "low power 3D" schemes to the same clocks, when the "performance 3D" scheme is applied it does not change both clocks; it changes only the GPU or the RAM, depending on which one I moved last before saving the profile.

    Please help me understand why it isn't possible for RivaTuner to manage both the GPU and RAM clocks automatically at the same time.
    Thank you.
     
    Last edited: May 5, 2007
  2. boogieman

    boogieman Ancient Guru

    Messages:
    1,984
    Likes Received:
    49
    GPU:
    MSI GTX 1080X
As "I" understand it... it's the ForceWare drivers, not RT, that cause this.
     
  3. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
As boogieman said, low-power clocks are activated by the driver if:

    1) The GPU idles during 3D and power consumption can be reduced without degrading performance (e.g. while a new level is being loaded in a game).
    2) The GPU is unstable in 3D mode and clocks are reduced by protective throttling mechanisms. In this case the driver first drops clocks to the low-power 3D ones, then to the 2D ones if that didn't help.
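    The two fallback rules above can be modeled conceptually like this. This is not driver code — it is an illustrative sketch of the described behavior, and all names and the function itself are assumptions made up for this example:

    ```python
    # Conceptual sketch (NOT ForceWare code) of the fallback order
    # described above: 3D -> low-power 3D -> 2D.

    PERF_3D, LOW_POWER_3D, STANDARD_2D = "3D", "LP3D", "2D"

    def next_performance_level(current, gpu_idle, unstable):
        """Drop one level when the GPU idles in 3D or throttling kicks in."""
        if current == PERF_3D and (gpu_idle or unstable):
            return LOW_POWER_3D
        if current == LOW_POWER_3D and unstable:
            # LP3D clocks didn't cure the instability, fall back to 2D clocks
            return STANDARD_2D
        return current
    ```

    The point of the sketch is that low-power 3D is an intermediate step the driver passes through, not a scheme the user schedules directly.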
     
  4. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
So you mean there is no way to use this middle scheme to help manage automatic clocking in those situations where desktop apps call some 3D stuff?
     

  5. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
Correct. You cannot change the driver's performance level management approach. However, you can completely replace it with RT's own one by creating a few profiles and associating them with a threshold on the "Hardware acceleration" graph. The graph is built using the statistics server, and you may force the server to ignore desired pseudo-3D applications.
     
  6. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
Now I use different profiles (one 2D profile for desktop use and one 3D profile for games) that I made following these tips, and that I activate myself by clicking on the "Launch >" menu of the taskbar icon.


About what you said: I had worked with that in the past, but I think I have never enabled any function for ignoring desired pseudo-3D applications.

    What I have done is what was written in Marc's Speed Guide, and with that method I get continuous jumps between the 2D and 3D schemes in normal Windows use.

    What is this "forcing the server to ignore desired pseudo-3D applications"?

    Most importantly: how does it work? How does it recognize that it should not activate the 3D scheme but continue using the 2D scheme?


    Thank you.
     
    Last edited: May 5, 2007
  7. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
    "Hardware acceleration" graph (default provider) displays hardware acceleration status detected via statistics server. By default any application using DD/D3D/OGL libraries is detected by server as 3D application, however you may define the profiles in the server and this way prevent any desired application from being detected by it.
The graph is binary, i.e. it shows 0 (no acceleration) and 1 (acceleration is in use). So you can easily define a threshold at, let's say, 0.5 and associate profiles with it to automate 2D/3D clock switching.
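    Since the graph only ever reads 0 or 1, the 0.5 threshold cleanly separates the two states. A minimal sketch of the idea (the function and profile names are hypothetical stand-ins, not RivaTuner identifiers):

    ```python
    # Illustrative sketch of threshold-based profile selection on a
    # binary graph. "2D profile" / "3D profile" are example names for
    # the user's own overclocking profiles.

    THRESHOLD = 0.5  # any value strictly between 0 and 1 works here

    def choose_profile(hw_accel_sample):
        """Map a 0/1 hardware-acceleration sample to a profile name."""
        return "3D profile" if hw_accel_sample > THRESHOLD else "2D profile"
    ```

    Because the samples are binary, there is no jitter around the threshold: every sample is unambiguously above or below it.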
     
  8. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
So do I have to activate the Statistics Server to do this?
     
  9. boogieman

    boogieman Ancient Guru

    Messages:
    1,984
    Likes Received:
    49
    GPU:
    MSI GTX 1080X
    I use core temperature to activate 2D/3D, works well.
     
  10. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
    Sure.
     

  11. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
I am analyzing your post and searching for some relation to what I am seeing now in the Statistics Server.

    Correct me if I am wrong:

    If I am correct, I should enable the Statistics Server on Windows startup and first of all see whether I can fix my continuous bouncing between the 2D and 3D schemes without forcing any particular application to hook any Direct3D or OpenGL stuff.

    Then, if the problem still persists, you are saying I should set the related Direct3D or OpenGL setting for each app that I need to run with the 3D scheme.

    Now I don't understand why I have to define a threshold on the "Hardware acceleration" graph if the Statistics Server already gives the hint for activating the 2D or 3D scheme.

    Or is it because the Statistics Server does not give any hint to switch between the 2D and 3D schemes?

    Thank you.
     
  12. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
Nope, you're understanding everything completely wrong. The server gives you the ability to define your own 2D/3D clock switching approach via the thresholds.
    It doesn't allow you to alter the driver's own 2D/3D switching algorithm in any way. If you don't like the driver's clocking approach - define a few overclocking profiles with 2D/LP3D/3D clocks set to the same values to make the driver's clock switching virtually invisible. Then manage 2D/3D profile loading with RT using the threshold on the "Hardware acceleration" graph.

    You do understand that 2D/3D clock switching is performed by ForceWare and is not in any way related to RT, don't you?
     
    Last edited: May 8, 2007
  13. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
Before installing RT I never noticed any clock changes on my 8800GTX: it always seemed fixed at the default GPU:576/RAM:900.

    Since installing RT, I have used it to have different clocks: one profile for desktop use (GPU:290/RAM:450) and one for gaming (the defaults I mentioned above).

    Before my 8800GTX, with other NVIDIA video cards, I used to automate the clock changing by following Marc's Speed Guide, which uses the "Core clock MHz" graph to manage the launch of the two clock profiles (2D and 3D).

    Now with the 8800GTX I have only tried executing the profiles manually, with no automation, because I want to be extremely sure of when to activate the 2D or 3D profiles.


I understand that RT does a better job of changing the 2D or 3D clocks, and I understand that I must do everything with RT without using anything from the ForceWare drivers.

    I don't understand when you say:
    Which clocking profiles do you refer to? The ForceWare profiles or RT's?

    About the rest: you are saying that applying the threshold to the "Core clock MHz" graph means following the ForceWare driver's stimulus of 3D activation, while applying the threshold to the "Hardware acceleration" graph, as you suggest, would give me the ability to manage the profile loading better and completely with RT, without the use of any ForceWare stimulus... right?

    If yes: in the Statistics Server I can only see which application settings (Direct3D or OpenGL) can or cannot be hooked for which application.

    How does this hooking affect the "Hardware acceleration" graph so that it can manage my 2D and 3D profiles with its threshold?


    Thank you again.

P.S.: I have opened a new thread on the most famous Italian hardware board about all my solved questions on RT, and it has been marked as sticky. Please know that my topic is helping a lot of Italian people use RT, and your help is much appreciated.
    Thank you very much again.
     
  14. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
    ForceWare doesn't have any overclocking profiles. So I mean RivaTuner's profiles of course. For example, if you're going to use 500/700 for 2D mode and 600/800 for 3D mode, you'll need to create two driver-level overclocking profiles in RivaTuner:

    1) 500/700 for 2D/LP3D/3D
    2) 600/800 for 2D/LP3D/3D

In this case you simply won't notice ForceWare's clock frequency adjustment. Even if it tries to boost clocks for a desktop application when the first profile is in use, or tries to throttle clocks during a 3D application, you won't see any changes because all the clocks are the same.
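    The 500/700 vs 600/800 example above can be sketched as data. This is only a conceptual model of the two RivaTuner profiles, not any file format or API the tool actually uses:

    ```python
    # Sketch of the two driver-level overclocking profiles described
    # above. Setting the 2D, LP3D and 3D levels to the same core/memory
    # clocks inside each profile makes ForceWare's own level switching
    # invisible: whichever level the driver picks, the clocks are equal.

    def make_profile(core_mhz, mem_mhz):
        # Same (core, memory) pair for every driver performance level
        return {level: (core_mhz, mem_mhz) for level in ("2D", "LP3D", "3D")}

    desktop_profile = make_profile(500, 700)  # loaded for 2D mode
    gaming_profile = make_profile(600, 800)   # loaded for 3D mode
    ```

    Switching between the two dictionaries (i.e. loading one profile or the other) is then the only clock change the user ever sees.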

Correct. A threshold on the "Core clock" graph allows you to track ForceWare's performance level switching.

    Correct again. The "Hardware acceleration" graph is built by means of the server, which is not linked with ForceWare in any way.

    So what confuses you? The server's profiles allow you to enable D3D/OGL detection for all applications (via the global server profile) or just for desired 3D applications. Basically you don't even need to make any changes to the server's default configuration - by default it allows D3D/OGL detection for all applications besides the most common pseudo-3D ones.
     
  15. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
Yes, yes... I have already made profiles like that: one 2D profile that has all three 2D/LP3D/3D levels set to the same "low" clocks, and a 3D profile that also has all three the same, but this time set to "high" clocks.
    I launch them manually from the desktop and they work just fine!

    About the ForceWare "profiles": I thought you were referring to the new NVIDIA Control Panel profile feature. Never mind... now it's all clear.

    So I just have to run the Statistics Server and then create a threshold at about 0.5 on the "Hardware acceleration" graph to manage my "manual" profiles automatically?
     

  16. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
Yes. And if you come across an application that uses DirectDraw/Direct3D/OpenGL hardware acceleration but doesn't actually render anything (e.g. Everest uses OpenGL/D3D for 3D capabilities reporting, but it is not a real 3D application and you may not wish to apply 3D clocks to it), you'll easily be able to add a profile for it to the server and prevent the server from hooking DD/D3D/OGL for that application.
    Predefined profiles for most such applications (e.g. Fraps, ATITool, ATITrayTool, various media players, browsers using hardware-accelerated Java, etc.) are already included in the server. If you find any other application which is treated as 3D but actually shouldn't be - just let me know and I'll include it in the list of profiles in a future version of the server.
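    The ignore-list behavior described above can be modeled in a few lines. The app names and the function are illustrative assumptions, not the server's real internals:

    ```python
    # Conceptual sketch of per-application server profiles: apps on the
    # ignore list are never reported as 3D even if they load D3D/OpenGL.
    # The executable names below are example entries only.

    IGNORED_PSEUDO_3D = {"everest.exe", "fraps.exe", "atitool.exe"}

    def hardware_acceleration(app_name, uses_d3d_or_ogl):
        """Return 1 or 0 as the "Hardware acceleration" graph would show it."""
        if app_name.lower() in IGNORED_PSEUDO_3D:
            return 0  # hooking is suppressed, so the app never reads as 3D
        return 1 if uses_d3d_or_ogl else 0
    ```

    With such an exclusion in place, a pseudo-3D app can load D3D/OGL libraries all day without ever pushing the graph (and therefore the clocks) into 3D mode.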
     
    Last edited: May 8, 2007
  17. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
    Thank you very much.

I'll also try to see and understand the real differences between using the Statistics Server and not using it.

    P.S.: Is it possible to load the Statistics Server at Windows startup without showing its tray icon in the taskbar? I already have one icon for RT and would like to have only that one...

    Thank you. ;)
     
  18. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
Of course not. That would be illogical; the server is a completely independent application which is bundled with more than just RT, so it must have its own properties accessible via its own tray icon.
     
  19. PIRATA!

    PIRATA! Guest

    Messages:
    77
    Likes Received:
    0
    GPU:
    XFX 8800 ULTRA
    Ok!
I have configured RT like you said and I have to say that this automatic clock switching works really well.

    But tell me this: why can the Statistics Server be used on its own? Which other apps can use it?

    Thank you.
     
  20. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
The server is dedicated to two main tasks:

    1) Monitoring framerate (like FRAPS or D3DGear) and providing framerate statistics to third-party client applications. I.e. any application willing to monitor framerate can easily do it via the server. It can also be used as a completely independent framerate monitoring/displaying solution, functioning like FRAPS, if you tick "Show own statistics in On-Screen Display" in the server's properties. And some people use just the server alone for this purpose.

    2) Providing an On-Screen Display service to any third-party client application. Any application may connect to the server's OSD, like RT does, and display its own text info in the On-Screen Display directly in 3D applications.

    Due to these capabilities the server is bundled with third-party products which require framerate monitoring, OSD features, or both - e.g. with the HIS iTurbo software dedicated to HIS ATI cards.
     
