[TwinTech 8800GT 512Mb HeatPipe Edition] Core temperature reports 0.

Discussion in 'RivaTuner Advanced Discussion forum' started by SinRJ, Jan 27, 2008.

  1. SinRJ

    SinRJ New Member

    Messages:
    1
    Likes Received:
    0
    Hi there !

    First of all, congratulations and many thanks for that very useful software ! :)

    Now for the tricky part: I'm the very recent owner of a 8800GT (TwinTech 512Mb - HeatPipe Edition). I'm under XP Pro SP2, I have ForceWare 169.21 & RivaTuner 2.06 installed.

    Problem is, I'm unable to get any GPU temperature reading: RivaTuner shows a Core temperature of 0° no matter what I do or which plugin I use (ADT7473.dll or NVThermalDiode.dll). Just for info, the Data provider for the core temp is grayed out in the <default> setup, but I don't think that's the problem. I can't get a temperature reading in other monitoring software either (Everest, for example, doesn't even show my GPU in the sensor section).

    My question is: could it be a driver / software related problem, or is it possible that those morons at TwinTech simply removed the thermal probe from the reference PCB for some obscure reason? (I hope not, as the first edition of the TwinTech 8800GT DOES have a sensor, as many owners report on various forums.) :wanker: :rpg:

    If someone has a relevant answer, or even a clue, I would be forever grateful ! :heh:
     
  2. Nicanor

    Nicanor New Member

    Messages:
    8
    Likes Received:
    0
    Hello! I have the same card and the same problem here. I looked on the internet and it's possible it doesn't have a thermal sensor! Does anyone know?
     
  3. JJF

    JJF New Member

    Messages:
    2
    Likes Received:
    0
    I too am having the same problem with my Xpertvision card. I hear that some Palit and Asus cards do the same thing :/

    I wonder if anyone on this site can help :help:
     
  4. Nicanor

    Nicanor New Member

    Messages:
    8
    Likes Received:
    0
    The Asus cards have their own monitoring program, and its latest version resolves that problem.
     

  5. levicki

    levicki Active Member

    Messages:
    66
    Likes Received:
    0
    GPU:
    Asus GTX980
    The problem comes down to a misconfigured (on purpose, perhaps?) thermal table in the card's BIOS, regardless of the card manufacturer.

    A thermal diode does exist in the G92 and it can be read directly (though not from the same register as on the G80), which is what vendor utilities like Asus SmartDoctor do. However, the VGA BIOS does not report the diode calibration data properly, so the ForceWare drivers and all 3rd party utilities such as RivaTuner, Everest, nTune, etc., are unable to calculate the correct temperature.

    I know all this because I have an Asus EN8800GT card and have the same problem as you do.

    The best solution for this issue would be for card vendors to fix their VGABIOS thermal tables. That would automatically enable proper reading in all third party utilities. However, chances for that to happen are slim to none unless they start getting support calls.

    Worst thing someone could do right now is to implement it incorrectly. That way the chance of fixing it properly (by forcing vendors to fix the VGABIOS) would diminish even further.

    I advise complaining to your card vendor and/or NVIDIA about the missing feature.

    @nicanor:
    For all we know, Asus software may be reporting wrong temperature because we have no way to check / nothing to compare it with.
     
  6. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    15,353
    Likes Received:
    2,630
    It looks like we're close to getting direct readings from G9x thermal diodes. I need some user with G92 based card and normal BIOS (which has thermal monitoring working) to perform the following:

    1) Record current core temperature
    2) Run "RivaTuner.exe /rr20008 /rr20400", which will give you a message box containing output of two GPU registers.
    3) Post the results here (i.e. post temperature and values of registers corresponding to it).
     
  7. burebista

    burebista Ancient Guru

    Messages:
    1,736
    Likes Received:
    31
    GPU:
    MSI GTX1060GAMING X
    I dunno if I understood correctly, but here it is:

    Monitoring

    [image]

    Registers output

    [image]
     
  8. Syncrod

    Syncrod Master Guru

    Messages:
    254
    Likes Received:
    0
    GPU:
    EVGA GeForce 780TI
    EVGA 8800GT
    GPU Temp: 56c
    256-bit G92 (A2,112sp) with 512MB DDR3
    Reg 00020008 : c008370b
    Reg 00020400 : 00000000
     
  9. levicki

    levicki Active Member

    Messages:
    66
    Likes Received:
    0
    GPU:
    Asus GTX980
    Could you two please upload your VGA BIOSes? Thanks.
     
  10. burebista

    burebista Ancient Guru

    Messages:
    1,736
    Likes Received:
    31
    GPU:
    MSI GTX1060GAMING X

  11. levicki

    levicki Active Member

    Messages:
    66
    Likes Received:
    0
    GPU:
    Asus GTX980
    Thanks, another possibility ruled out.
     
  12. Nicanor

    Nicanor New Member

    Messages:
    8
    Likes Received:
    0
    I have the problem card from the title of the post: Twintech 8800GT 512 MB DDR3 - Heatpipe Edition. I can't see the temperatures, but when I do the RivaTuner thing I get the following register values:
    Reg 00020008 : c0083665
    Reg 00020400 : 00000000

    Can you tell me what temperature this value corresponds to?
    Can you make a program or a formula we can use to calculate the temperature from this value?
     
  13. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    15,353
    Likes Received:
    2,630
    The formula used by ASUS SmartDoctor for such cards is

    T = (-13115 + raw_diode_data) / 18.7 + 1, where raw_diode_data is the lower 14 bits of Reg 20008.
    So it gives (-13115 + 0x3665) / 18.7 + 1 = 44C

    However, we're currently not at all sure that it gives the real, correct temperature. G80 and older thermal diodes have non-fixed formulas, using a diode inaccuracy value specific to each diode sample, and I'm afraid the ASUS SmartDoctor programmers simply skipped that code to simplify their job. And the dumps/temperatures posted by users do show that the equation used by ASUS is a bit different from the ForceWare's one.
    To find the correct equation I'll need to install a G92 card with working thermal monitoring in my rig, but currently I have no G92 based samples at all. So I'm trying to find one, and maybe I'll then be able to implement direct G92 diode support in the NVThermalDiode plugin in 2.08.
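    The SmartDoctor formula quoted above can be sketched as a small helper. This is only an illustration of the post's equation, assuming (as stated) that the raw diode reading is the lower 14 bits of register 0x20008; the function name and rounding are my own:

    ```python
    def asus_g92_temp(reg_20008: int) -> float:
        """ASUS SmartDoctor approximation for G92 cards:
        T = (-13115 + raw_diode_data) / 18.7 + 1,
        where raw_diode_data is the lower 14 bits of Reg 20008."""
        raw = reg_20008 & 0x3FFF  # keep only the lower 14 bits
        return (raw - 13115) / 18.7 + 1

    # Nicanor's dump (Reg 00020008 : c0083665) -> about 44C, matching the post
    print(round(asus_g92_temp(0xC0083665)))
    ```

    Applied to the other dumps in this thread, the same formula gives about 53C for Syncrod's c008370b and about 58C for pirlouy's c008375e, close to the 56C and 57C those posters report.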
     
  14. Nicanor

    Nicanor New Member

    Messages:
    8
    Likes Received:
    0
    I can tell you that it's almost right! I made some measurements with your method and got 44C idle and 62-64C under full load. I also took measurements with a digital sensor placed directly on the heat pipes, memory and MOSFETs, and got 40 idle and 60 full load for the GPU. So it's almost the same, and I think the formula is more accurate than my external sensor and should be trusted!
    The other temperatures measured were:
    45C idle / 70 full load (memory) and 40C idle / 65 full load (MOSFET)
     
  15. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    15,353
    Likes Received:
    2,630
    Guys, more reports wanted.
     

  16. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    15,353
    Likes Received:
    2,630
    Well, from my experience of ASUS hardware and software analysis I'd say: never trust ASUS!
    Looking at the way ASUS implements their temperature calibration formula in the SmartDoctor code (i.e. T = (offset + raw_diode_data) / divider), I can say with almost 100% certainty that the formula doesn't come from NVIDIA; ASUS approximated it themselves by comparing raw diode readings against external sensor readings. Most likely they recorded the raw diode data for 0C as measured by their external sensor and obtained the offset that way, then recorded the raw diode data for, say, 50C, applied the previously calculated offset and thus derived the divider. This fully explains why they use an offset / divider approach instead of the offset / gain formula traditionally used by the NVIDIA driver on the rest of the G92 cards with properly functioning thermal monitoring.
    Unfortunately, as I've said above, the diodes are NOT ideal and are calibrated differently, so an approximated formula built from one diode can give good results on the same card but lie on others.
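    The two-point calibration procedure hypothesized above can be sketched as follows. The function and the sample numbers are hypothetical, chosen only to show how an offset and divider would fall out of two external-sensor readings (the second sample is picked so the result reproduces SmartDoctor's published constants):

    ```python
    def fit_offset_divider(raw_at_0c: int, raw_at_t: int, known_t: float):
        """Hypothetical reconstruction of the calibration described above:
        the raw diode value at 0C fixes the offset, and a second reading
        at a known temperature fixes the divider in
        T = (offset + raw_diode_data) / divider."""
        offset = -raw_at_0c
        divider = (raw_at_t + offset) / known_t
        return offset, divider

    # Illustrative readings: raw = 13115 at 0C, raw = 14050 at 50C
    offset, divider = fit_offset_divider(13115, 14050, 50.0)
    print(offset, divider)  # -13115 and 18.7, SmartDoctor's constants
    ```

    As the post notes, a formula fitted this way from one diode sample bakes that sample's inaccuracy into the constants, which is exactly why it can drift on other cards.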
     
  17. pirlouy

    pirlouy New Member

    Messages:
    3
    Likes Received:
    0
    I have the same card as Igor Levicki: an Asus 8800GT 512 MB.

    Core T°: 0°C (57°C with SmartDoctor)

    Code:
    256-bit G92 (A2, 112sp) with 512MB DDR3
    
    Reg 00020008 : c008375e
    Reg 00020400 : 00000000
    Can I hope for a solution with RivaTuner, or will the fan speed inevitably stay at maximum? For now I have connected the fan to the motherboard; it runs at 5V.
     
  18. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    15,353
    Likes Received:
    2,630
    Your fan is not controllable for sure; no luck for you.
     
  19. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    15,353
    Likes Received:
    2,630
    I've invested some more time into analyzing the G92 thermal monitoring codepath in the driver and the thermal tables in G92 BIOS images, and I guess we can close the issue, because the situation is more than clear to me now. The facts:

    1) For some unknown reason, the thermal table of all G92 BIOS images (i.e. the special data area in the BIOS which calibrates the card's thermal logic) contains an "Auto detect external thermal sensor" flag, which tells the ForceWare to try to use a supported external thermal sensor instead of the internal one. Reference design G92 boards use an ADT7473 external sensor / PWM controller to automatically adjust fan speed, and this sensor's readings are what the ForceWare reports as the GPU temperature on any G92.
    2) Some vendors, including ASUS, alter the cooling system and then try to make the PCB cheaper by removing the now unneeded ADT7473 controller from the reference PCB; unfortunately this also removes the only sensor the ForceWare supports on these cards.
    3) NVIDIA G92 BIOS images completely lack internal diode calibration info in the thermal table, so it won't be easy to edit the BIOS to force the driver to use the internal sensor instead of the external one. Enabling the diode is not the only thing to be done in the BIOS: diode calibration data must also be provided to the driver so it can convert raw data to temperatures.
    For the same reason it won't be easy to provide G92 diode support in the NVThermalDiode plugin. I've no idea why NVIDIA decided to avoid internal sensor usage on the G92; probably it is broken at the hardware level. If the external sensor is used on the G92 erroneously, then you can hope for a fix in future VGA BIOS updates and ForceWare drivers.
    4) ASUS SmartDoctor's diode calibration formula was certainly approximated by ASUS engineers, and unfortunately that is absolutely not the best way: it can easily give +/-10C inaccuracy compared to real diode temperatures.
     
  20. pirlouy

    pirlouy New Member

    Messages:
    3
    Likes Received:
    0
    What do you mean? How can you tell for sure the fan is not controllable? From the register values I gave you? The BIOS?
    And isn't it possible for Asus to create a new BIOS which allows a slower speed?

    If that's the case, it's a scandal. On their product it is written "fansink". I won't accept this situation.
     
