Fermi BIOS editor guide

Discussion in 'Videocards - NVIDIA GeForce' started by civato, Jan 6, 2011.

  1. raybies

    raybies Guest

    Messages:
    4
    Likes Received:
    0
    GPU:
    Many
    Thanks for the reply civato.

    The other components are... an i5, an H67 board, and a couple of HDDs. The 300 W PSU has more than enough power, but without the fan it may blow if drawing > 100 W, and it starts to squeal (very faintly) over ~75 W.

    Using Afterburner, I've run the card for a couple of hours in FurMark at:

    Core: 810 MHz
    Shader: 1620 MHz
    Memory: 2000 MHz
    Voltage: 0.975 V

    No issues.

    Movie playback (MPC, KMPlayer or Splash) always uses the Domain 15 settings.

    My plan:
    =========================Clocks==============
    Domain 3 & 7: Leave

    Domain 15:

    Tab 0 = 1620
    Tab 1 = 1620
    Tab 2 = 540
    Tab 3 = 1620
    Tab 4 = 1705.2606
    Tab 5 = 2000
    Tab 6 = 540
    Tab 7 = 540
    Tab 8 = 0
    Tab 9 = 0
    Tab 10 = 203
    Tab 11 = 1705.2606

    What are tabs 0,1,2,6,7,8,9,10 for?
    =========================Voltage==============
    Setting 0

    Tab 0 = 0.9v
    Tab 1 = 0.9v
    Setting 1

    Tab 0 = 0.925v
    Tab 1 = 0.925v
    Setting 2

    Tab 0 = 0.975v
    Tab 1 = 1.0125v


    Got it, thanks.
     
    Last edited: Sep 6, 2011
  2. civato

    civato Guest

    Messages:
    918
    Likes Received:
    0
    GPU:
    2xGTX570 sli on H2O EK
    It's a good plan, but tabs 4 and 11 need to be 1705, with no digits after the decimal point.

    Tabs 0 and 1 are the same value as the shader speed, but you don't need to change them.
    You can if you want; some card vendors put the same value in tabs 0, 1 and 3, some don't.
    I usually do it too.
    The other tabs are all related to the other domains, so I would never touch them; what exactly they do I'm not 100% sure.
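    If it helps, here is a minimal sketch (Python, just for illustration; this is not the FermiBiosCalculator itself) of what that boils down to: round whatever the calculator gives you for tabs 4 and 11 to whole MHz before typing it into NiBiTor, and carry the shader clock into tabs 0/1/3 if you want to. The tab meanings here are only what is said in this thread; treat anything beyond that as an assumption.

    # Minimal sketch, not the real FermiBiosCalculator: build the Domain 15 tab
    # values as whole-MHz integers, the way it is described in this thread.
    def nibitor_clock_tabs(shader_mhz, memory_mhz, calc_value):
        """Return the Domain 15 tab values to type into NiBiTor."""
        rounded = round(calc_value)      # e.g. 1705.2606 -> 1705, no decimals
        return {
            0: shader_mhz,               # same as shader (changing it is optional)
            1: shader_mhz,               # same as shader (changing it is optional)
            3: shader_mhz,               # the shader clock itself
            4: rounded,                  # calculator output, rounded to whole MHz
            5: memory_mhz,               # memory clock
            11: rounded,                 # calculator output, rounded to whole MHz
        }

    print(nibitor_clock_tabs(1620, 2000, 1705.2606))
    # {0: 1620, 1: 1620, 3: 1620, 4: 1705, 5: 2000, 11: 1705}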


    Do remember that the voltage reading with the latest 280 series driver is one step off, see a few posts back.
    Nothing to worry about; when you set for example 1.005 V in NiBiTor, you will see 1.013 V in software like Afterburner or AIDA64.
    And software like FurMark is not good for stability testing on the latest GPUs from AMD and NVIDIA; they have a power-draw controller, so when the power draw is too high it clocks the card down. It's better to use heavy games or benchmark runs like Crysis or the Heaven benchmark at the highest settings. The power-draw limiter will not kick in with games or benchmarks like Heaven, Vantage, etc. NVIDIA put it in to protect the cards from burning, because FurMark puts an unrealistic amount of stress on the card. AMD uses this too, so don't think it is NVIDIA only.
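    Just to illustrate what that power-draw controller does (this is only a toy sketch of the general idea, not NVIDIA's or AMD's actual algorithm, and the numbers in it are made up), the logic comes down to: if the measured board power goes over the cap, the clock is stepped down until it is back under it. That is why a power-virus load like FurMark never ends up running at the clocks you set, while a game does.

    # Toy illustration only: NOT the real NVIDIA/AMD limiter. Power cap and
    # step size are invented numbers; the point is just the throttling loop.
    POWER_CAP_W = 170          # assumed board power limit, in watts
    CLOCK_STEP_MHZ = 27        # assumed down-clock step

    def limited_clock(requested_mhz, power_at):
        """Clock the card would actually settle at for a given load model."""
        clock = requested_mhz
        while clock > 0 and power_at(clock) > POWER_CAP_W:
            clock -= CLOCK_STEP_MHZ      # throttle until power is under the cap
        return clock

    # Hypothetical load models: FurMark draws far more power per MHz than a game.
    furmark_power = lambda mhz: mhz * 0.28    # ~227 W at 810 MHz -> throttled
    game_power    = lambda mhz: mhz * 0.18    # ~146 W at 810 MHz -> full speed

    print(limited_clock(810, furmark_power))  # settles well below 810
    print(limited_clock(810, game_power))     # 810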
    Good luck and glad I could help.
     
    Last edited: Sep 6, 2011
  3. raybies

    raybies Guest

    Messages:
    4
    Likes Received:
    0
    GPU:
    Many
    Thanks for your help :)

    Quick question: what determines the domain? E.g. what makes the card swap between clocks?
    I've noticed that when playing back an AVI file the card will flip between 3D performance (810 MHz) and 3D normal (405 MHz) frequently.


    And with no apps running, just GPU-Z and the desktop, it seems to get stuck at 405 MHz instead of the 2D 50 MHz.
    R.
     
    Last edited: Sep 6, 2011
  4. civato

    civato Guest

    Messages:
    918
    Likes Received:
    0
    GPU:
    2xGTX570 sli on H2O EK
    The driver determines this. If you use a multi-monitor setup it sticks in 3D (a solution for this can be found in this thread), and a 120 Hz monitor can also cause this problem; that is normal and a hardware limitation (changing the refresh rate to 110 Hz solves it).
    GPU-Z causes the clocks to change, but normally, if you wait long enough, it should end up at the 2D clocks.
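    If you want to watch those transitions yourself without GPU-Z, a small script works too. Here is a rough sketch using the NVML Python bindings (pynvml); it assumes the bindings are installed and that your driver exposes the clock read-out, which I haven't checked on every Fermi driver.

    # Rough sketch: poll the core/memory clocks via NVML to watch the card move
    # between performance levels (assumes the pynvml / nvidia-ml-py bindings are
    # installed and the driver supports these queries).
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU

    try:
        for _ in range(30):                         # about a minute of samples
            core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
            mem  = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
            print(f"core: {core} MHz   memory: {mem} MHz")
            time.sleep(2)
    finally:
        pynvml.nvmlShutdown()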
     

  5. civato

    civato Guest

    Messages:
    918
    Likes Received:
    0
    GPU:
    2xGTX570 sli on H2O EK
    Update: FermiBiosCalculator

    Version 1.0.0.4 = GTX 550 Ti added.
    App sent to Hilbert to update the hosted download.
     
  6. musvi

    musvi Member

    Messages:
    27
    Likes Received:
    0
    GPU:
    Asus GTX 560 DC ii Top
    Hello, I'm struggling to edit the fan profile of my ASUS ENGTX560 DCII TOP/2DI/1GD5, but no luck, as the latest Fermi BIOS editor can't read the BIOS.

    Any help or clue will be highly appreciated.

    Thank you.
     
  7. iMik

    iMik Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    Leadtek 8800 GTX 768
    I have two different 560 Ti cards: a Gigabyte which I overclocked to 950, and an MSI Hawk which comes overclocked to 950. The clocks from the Fermi calculator are slightly different from the MSI clocks.

    [screenshot]
     
  8. civato

    civato Guest

    Messages:
    918
    Likes Received:
    0
    GPU:
    2xGTX570 sli on H2O EK
    The Fermi BIOS editor doesn't support the 560; you need to use NiBiToR.

    Open your BIOS with NiBiToR and go to Temperatures; there you can set the min and max fan speed. Don't go too low.
    A fan profile like in the fan speed IC settings guide can only be done on older cards like the 200 series, not on Fermi cards.
     
  9. civato

    civato Guest

    Messages:
    918
    Likes Received:
    0
    GPU:
    2xGTX570 sli on H2O EK
    That is no problem; it is possible. There will be a small offset with some card vendors. Follow the calculator; it is based on the NVIDIA stock BIOS, and you won't have any problem.
     
  10. iMik

    iMik Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    Leadtek 8800 GTX 768

    What if I have these two cards with different offsets in SLI?
     

  11. civato

    civato Guest

    Messages:
    918
    Likes Received:
    0
    GPU:
    2xGTX570 sli on H2O EK
    That is no problem, but if you don't feel OK with it, use the settings of your MSI card.
     
  12. djjonastybe

    djjonastybe Master Guru

    Messages:
    725
    Likes Received:
    0
    GPU:
    HD6950(70) 885/1350
    How do I find my clocks in the VBIOS in hex-view mode?

    For example, if I have a core clock of 600 MHz, how do I search for it and edit it manually?
     
  13. baltazhor

    baltazhor Guest

    Messages:
    6
    Likes Received:
    0
    GPU:
    Gigabyte + EVGA GTX460 1G
    Hi, I need your help.

    I flashed my GTX 460 and set a voltage of 1.1 V for 3D Performance.

    But when I push the GPU to 100% (MSI Kombustor), software like AIDA64 or MSI Afterburner says the card is only at 1.025 V. WTF

    [screenshot]
     
    Last edited: Nov 23, 2011
  14. civato

    civato Guest

    Messages:
    918
    Likes Received:
    0
    GPU:
    2xGTX570 sli on H2O EK
    Read the guide again, m8; this is user error.

    You changed the offset value.
    The 3D performance voltage can be found in "Setting 2".
    There you need to change the value of tab 0 to the desired voltage (in your case 1.1 V), and in tab 1 you need to put the same voltage or two voltage steps higher.

    (The 3D performance voltage, or P15 voltage setting, uses "Setting 2" if you didn't change that.)

    Remember, from the 280 series drivers onward the voltage read-out in MSI Afterburner and AIDA64 is one step higher than what you set in the BIOS; that is no problem!!!!
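    For reference, here is a quick sketch of that step arithmetic. The 12.5 mV step size is my assumption (it matches values like 0.975 V and 1.0125 V mentioned in this thread); your card's voltage controller may use something else.

    # Quick sketch of the voltage-step arithmetic described above.
    # ASSUMPTION: a 12.5 mV VID step; check what your own card actually uses.
    STEP_V = 0.0125

    def steps_up(voltage, n):
        """Voltage raised by n VID steps."""
        return round(voltage + n * STEP_V, 4)

    set_in_bios = 1.100                     # what goes into tab 0 of Setting 2
    tab1        = steps_up(set_in_bios, 2)  # same voltage or two steps higher
    readout     = steps_up(set_in_bios, 1)  # 280 series drivers show one step more

    print(tab1)      # 1.125
    print(readout)   # 1.1125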

    And FurMark is not a good tool to stress-test the GPU for stability, because of the power-draw limiter.
    [screenshot]
    Try playing games like Crysis in DX11, or Battlefield 3.
     
    Last edited: Nov 23, 2011
  15. baltazhor

    baltazhor Guest

    Messages:
    6
    Likes Received:
    0
    GPU:
    Gigabyte + EVGA GTX460 1G
    Thank you!

    P.S. How can I edit the fan/temperature profile? If I click "Tools - Read Bios", I get the message: "cant start driver:1275".

    Do you recommend other software to edit the fan speed profile in my BIOS?

    Regards.
     

  16. civato

    civato Guest

    Messages:
    918
    Likes Received:
    0
    GPU:
    2xGTX570 sli on H2O EK
    The error is normal: NiBiToR is not a tool signed by Microsoft, so you cannot read the BIOS with NiBiToR. You can save your BIOS with GPU-Z and open it with NiBiToR (always save it as a .bin file in NiBiToR).
    You can find all BIOS versions here: http://www.techpowerup.com/vgabios/

    You cannot make a fan profile in the BIOS; that is only possible on older cards like the 8 and 200 series from NVIDIA. On Fermi-based cards that is not possible.
    The only thing you can change in the BIOS on the Fermi cards is the fan min and max speed (in the Temperatures tab); don't set the minimum fan speed too low or the fan won't work anymore.

    You can make a fan profile with the MSI Afterburner software.
     
  17. Sneakers

    Sneakers Guest

    Messages:
    2,716
    Likes Received:
    0
    GPU:
    Gigabyte 980Ti Windforce
    Hmm, I'm trying to flash an ASUS GTX 580 DCII with a modded BIOS where the only change is raising the voltage table from 1.150 to 1.215.

    I extracted the BIOS with GPU-Z.

    Saved the original BIOS in .rom format with FermiBiosEditor.

    Modded the original with FermiBiosEditor, also saved to .rom.

    Moved the nvflash files to a formatted, FAT32 bootable USB stick.
    Moved the modded BIOS and original BIOS + autoexec script to the USB stick.

    When booting from the USB stick I am met with an error message in nvflash saying "I/O error: could not open file <Name>.rom", and I cannot get past this.

    Any ideas?
     
  18. baltazhor

    baltazhor Guest

    Messages:
    6
    Likes Received:
    0
    GPU:
    Gigabyte + EVGA GTX460 1G
    Hi again!

    I flashed both cards at 850/2000, and I set 1.050 V for gaming.

    Here comes my new problem:

    [screenshot]

    At idle, the first card is always running at the top voltage (1.050 V), while the other one goes down to 0.875 V.

    Is this normal?

    Regards
     
  19. Seb1

    Seb1 Member

    Messages:
    48
    Likes Received:
    2
    GPU:
    GV-N2060GAMINGOCPRO
    That's not normal... and is GPU1 at the 2D frequency (50 MHz)? By any chance, do you have more than one monitor connected to that video card?
     
  20. baltazhor

    baltazhor Guest

    Messages:
    6
    Likes Received:
    0
    GPU:
    Gigabyte + EVGA GTX460 1G
    Hi,

    At idle:

    GPU1: 405 MHz
    GPU2: 51 MHz

    And yeah! I've got a monitor and an LCD TV connected to the first video card. No screens on the second GTX 460.

    So... is it normal to have a higher temperature when I'm using two screens on the card?

    Regards


    P.S. I unplugged the TV, and now I get the same temperature/GPU usage on both cards.

    Also, I noticed that I get better performance using only one screen:

    Cloned image (TV/monitor) @ 890/2100 - 3DMark Vantage 29,900 GPU score

    Single monitor @ 850/2000 - 3DMark Vantage 30,900 points


    AWESOME
     
    Last edited: Nov 24, 2011
