AMD Catalyst (Modified Build)

Discussion in 'Videocards - AMD Radeon Drivers Section' started by TwL, May 29, 2010.


Should the official INF be modified or not?

Poll closed Jul 21, 2010.
  1. Yes (I want to go through the registry, copy GUIDs to a .reg file, and apply changes separately.)

    5 vote(s)
    8.9%
  2. No (Functions should be editable as usual in CCC straight after install.)

    25 vote(s)
    44.6%
  3. Both (I want the official INF in the install, with a separate modified INF & 'Functions.reg' as an option.)

    26 vote(s)
    46.4%
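For context on poll option 1: Catalyst's tweakable functions live under the Windows display-adapter class key in the registry, so the "copy GUIDs to a .reg file" workflow would amount to exporting and editing entries roughly like this. This is a sketch only: the 0000 instance index varies per system, and 'ExampleFunction' is a placeholder, not an actual value name from this build.

```reg
Windows Registry Editor Version 5.00

; {4D36E968-...} is the Windows display-adapter device class GUID;
; 0000 is the first adapter instance (the index varies per system).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; Placeholder value name and data, for illustration only.
"ExampleFunction"=dword:00000001
```

Double-clicking such a .reg file merges the values back in, which is what "add changes separately" would mean in practice.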
  1. colin87

    colin87 Active Member

    Messages:
    60
    Likes Received:
    0
    GPU:
    270x xfire
    So far the new drivers work pretty great.

    I had a weird issue in Metro 2033, where gameplay became very, very choppy just before the cursed mission. Don't know if it was the drivers, though.

    My 3DMark score went down about 100 points, from just over 16K to just under; don't know if I should be concerned. Not worried too much about it, the real gameplay experience is much more important.

    I was so hoping that my card's weak overclock was due to drivers, but it seems I just got a sh$tty overclocker. Stock speeds are 765/1125; I can get the core to around 800 and cannot touch the memory. Non-reference means no voltage control. Stupid card :(

    Anyhow, I think I will re-install the drivers and do some more benchmarking. I followed your instructions all the way through, so I don't think I messed anything up.
     
  2. Har3inger

    Har3inger Member

    Messages:
    43
    Likes Received:
    0
    GPU:
    ATI HD 8870m 2GB GDDR5
    Just a bit of feedback from me about 3.95. I recently upgraded to those from 10.7 WHQL. I did 2 runs of 3DMark Vantage Entry for each driver (look at my specs, and you'll see why I did that). I'd post screenshots, but I'm lazy, so people will have to take my word that these results are honest.

    10.7:
    First (right after reboot): E5463
    GPU 5896
    CPU 4475

    Second (after doing some stuff): E5495
    GPU 5910
    CPU 4539

    3.95:
    First (reboot): E5503
    GPU 5919
    CPU 4544

    Second (after a bit of gaming): E5494
    GPU 5912
    CPU 4531

    Basically, no significant improvement to be seen on a card as old as mine. Still keeping these, though, since there is a smidge of benefit. I'll have to look into that Bioshock issue; I'll test later to see if it crashes a lot for me.

    For lulz, I did an Extreme run at 1366x768. GPU score is 1026.
     
    Last edited: Aug 13, 2010
  3. Aegnora

    Aegnora Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    HIS radeon 5870@900/1250
    OK, after applying the BFBC2 patch, still no luck in Bioshock; it just crashed two times in a row.

    GPU usage was 43% with a max of 47%; max temperatures were 44.5°C, 45°C and 47.5°C (GPU, mem, VRM).
     
  4. TwL

    TwL Ancient Guru

    Messages:
    1,828
    Likes Received:
    0
    GPU:
    2x5850 2x6950 + 9800GTX
    You should use the BF:BC2 patch on this title. Usage would probably go higher, but this title needs a lot of ATI profiling; it just is 'an NVIDIA title', can't help there.

    I do know that the current drivers benchmark better at lower clocks, but I don't know about overclocking this specific card; core voltage shouldn't be cumulative with memory voltage. I have an ASUS card here myself and have no issues popping around 1175 MHz memory speed on stock voltage. Only when I start raising the core speed to around 900-1000 MHz do I need the same voltages an HD5870 would: 1.162 V has been the most stable, and the next stable voltage has been 1.212 V, and so on.

    Hmm, gonna check this one, as you've got a pretty similar hardware set to mine and I happen to have the game here somewhere.
     
    Last edited: Aug 13, 2010

  5. colin87

    colin87 Active Member

    Messages:
    60
    Likes Received:
    0
    GPU:
    270x xfire
    I'll try that patch, thanx.

    As for the clocks, I'll be testing with 800 core over the weekend. At 850, benchmarks/games fail almost immediately. I've tried 1175 on the memory; same story as with the core. I guess I can play around and find the highest stable memory clock, but the increase in FPS will be almost nothing. At least my temperatures are great, as I need that. Summer is approaching again, the heat here is terrible, and there's nothing I can do about it. Just one of the many perks of living in a 3rd world country :bang:
     
  6. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
    You can try flashing your card.
     
  7. colin87

    colin87 Active Member

    Messages:
    60
    Likes Received:
    0
    GPU:
    270x xfire
    I guess so, but from reading forums, it seems the reason you can change voltages on reference cards is that they have a voltage regulator that the non-reference cards do not; please correct me if I'm wrong. If that is the case, even flashing would not raise the voltages.
     
  8. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
    Oh, I mean you can flash your card with a lower clock. Not sure about the voltage regulator, though; it really depends on the manufacturer, I suppose. If there's no regulator on the board, then I suppose nothing would change the voltage.
     
  9. Aegnora

    Aegnora Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    HIS radeon 5870@900/1250

    Ok thanks TwL.

    After the last two crashes, I've just finished a play session from the middle of Neptune's Bounty to the beginning of Arcadia, with no crash until just now. It crashed during the 'chase' of the Houdini.
     
  10. Laykun

    Laykun Guest

    Messages:
    3,202
    Likes Received:
    0
    GPU:
    2 x EVGA GTX670 4GB
    Unfortunately the Toxic cards do not come with the voltage-control circuitry of the stock HD5850. Any custom PCB usually means you don't get the ability to control voltage on your card. I returned my Sapphire Toxic for an ASUS for this very reason and have been happy ever since.
     

  11. colin87

    colin87 Active Member

    Messages:
    60
    Likes Received:
    0
    GPU:
    270x xfire
    Won't be able to return my card just because of the voltage control; well, not in South Africa anyhow. Maybe sell it sometime :)

    Anyhow, back to the drivers; I don't want to hijack TwL's thread with my OC problems.
     
  12. IcE

    IcE Don Snow

    Messages:
    10,693
    Likes Received:
    79
    GPU:
    3070Ti FE
    Lol, I just noticed you had Adaptive AA enabled by default in your driver... That's hilarious, I was getting the same FPS with it on as I was before with it off. Didn't even notice.
     
  13. aporellos

    aporellos New Member

    Messages:
    2
    Likes Received:
    0
    GPU:
    4850X2
    Hello, my first post.

    The first "Good job TwL". Thanks.

    I have a 4850X2 and find that the 390aX64 (W7 x64) drivers are good for me.
    Are the 350X64 drivers better than the 390aX64 for the 4850X2?

    Where is the post about HD3xxx/4xxx X2 card setups?

    What are the best drivers for XP?

    Sorry for my English
     
  14. TwL

    TwL Ancient Guru

    Messages:
    1,828
    Likes Received:
    0
    GPU:
    2x5850 2x6950 + 9800GTX
    @Aegnora

    OK, Bioshock running at maximum GPU... I didn't remember how much this engine needs fixing.

    1. Run the game once on v1.0 and start a 'New Game' (with no compatibility modes).
    2. Apply the v1.1 patch to the game (with no compatibility modes).
    3. In Control Panel > Sound, make sure only your sound card is enabled and the sound format is 16-bit/24-bit at '48000 Hz' or below. Make sure that under the 'Recording' tab, 'Wave Mixer' or 'Stereo Mix' is an enabled device (right-click the empty space and show hidden/disabled devices; that should find it). Otherwise it's a no-go on Windows 7.
    4. Settings: edit %APPDATA%\Bioshock.ini: find the section '[WinDrv.WindowsClient]' and change these values:
    Code:
    FullscreenViewportX=1920
    FullscreenViewportY=1200
    ^ Add these or your own resolution manually.
    and
    Code:
    TextureDetailInterface=UltraHigh
    TextureDetailTerrain=UltraHigh
    TextureDetailWeaponSkin=UltraHigh
    TextureDetailPlayerSkin=UltraHigh
    TextureDetailWorld=UltraHigh
    TextureDetailRenderMap=UltraHigh
    TextureDetailLightmap=UltraHigh
    (we don't want to play on any lesser settings on new hardware now, do we?)
    5. Find the sections '[D3DDrv10.D3DRenderDevice10]' & '[D3DDrv.D3DRenderDevice]' and in each change:
    Code:
    LevelOfAnisotropy=16
    6. Save the file, then right-click '%APPDATA%\Bioshock.ini' > Properties > check 'Read-only' > OK.
    7. Now open Catalyst and change:
    * Anti-aliasing: Edge-detect AA 8x (24 samples) / in the case of DX10 you should choose 'application selection' for AA in Catalyst, as you cannot apply AA to DX10 mode (as far as I've tested, not by any rename trick; maybe some RadeonPro-style profiling tool?).
    * Anisotropic Filtering > HQ AF (checked) > 'Use Application Selection'
    * Catalyst AI: Standard
    * VSYNC: Off, Unless Application Specifies
    * Enable 'Adaptive Anti-Aliasing'

    Now, if you want the anti-aliasing to apply, create a shortcut:
    Target:
    Code:
    <game install folders>\BioShock\Builds\Release\Bioshock.exe -DX9
    If you want DX10, just launch the normal shortcut.
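    The Bioshock.ini edits above (resolution, the TextureDetail* values, LevelOfAnisotropy, then marking the file read-only) can also be applied with a small script instead of by hand. A minimal sketch in Python, under the assumption that the file is %APPDATA%\Bioshock.ini with exactly the section and key names shown above; the resolution values are examples to replace with your own:

```python
import os
import stat

# Section -> {key: new value}, mirroring the edits described above.
# The resolution values are examples; substitute your own.
EDITS = {
    "[WinDrv.WindowsClient]": {
        "FullscreenViewportX": "1920",
        "FullscreenViewportY": "1200",
        "TextureDetailInterface": "UltraHigh",
        "TextureDetailTerrain": "UltraHigh",
        "TextureDetailWeaponSkin": "UltraHigh",
        "TextureDetailPlayerSkin": "UltraHigh",
        "TextureDetailWorld": "UltraHigh",
        "TextureDetailRenderMap": "UltraHigh",
        "TextureDetailLightmap": "UltraHigh",
    },
    "[D3DDrv10.D3DRenderDevice10]": {"LevelOfAnisotropy": "16"},
    "[D3DDrv.D3DRenderDevice]": {"LevelOfAnisotropy": "16"},
}

def apply_edits(lines, edits):
    """Rewrite key=value lines inside each targeted [Section]."""
    out, section = [], None
    for line in lines:
        stripped = line.strip()
        if stripped.startswith("["):
            section = stripped          # entered a new [Section]
        elif section in edits and "=" in stripped:
            key = stripped.split("=", 1)[0].strip()
            if key in edits[section]:
                line = f"{key}={edits[section][key]}\n"
        out.append(line)
    return out

ini_path = os.path.expandvars(r"%APPDATA%\Bioshock.ini")
if os.path.exists(ini_path):
    with open(ini_path) as f:
        new_lines = apply_edits(f.readlines(), EDITS)
    with open(ini_path, "w") as f:
        f.writelines(new_lines)
    # Same effect as ticking 'Read-only': the game can't revert the file.
    os.chmod(ini_path, stat.S_IREAD)
```

    Run it once before launching the game. To edit the file again later, clear the read-only flag first, e.g. os.chmod(ini_path, stat.S_IREAD | stat.S_IWRITE).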

    I played a couple of levels here and took a few screenshots from the start...

    DirectX 10.0 - Screens from Water / GPU usages etc etc on the corner:


    DirectX 9.0 - Screens with Anti-Aliasing pushed to absolute maximum levels with 16x HQ Anisotropic Filtering pushed up:

    -edit-

    Well, since I now have the game installed, once again I have to go through this SecuROM crap. Might as well play the game a bit further and see if this 'Bioshock' has anything weird in it while playing.

    @Aegnora

    Any suggestion whether I should play in DX9 or DX10 mode? (I'll try to play in whatever mode you are playing, so I can see if there's a crash somewhere while at it.)
     
    Last edited: Aug 13, 2010
  15. TwL

    TwL Ancient Guru

    Messages:
    1,828
    Likes Received:
    0
    GPU:
    2x5850 2x6950 + 9800GTX
    Yeah, screw the bad-quality crap.. kick in decent gaming settings straight up, hehe.. :biggun:

    I haven't yet built the drivers for these HD4xxx X2 cards. According to some people here, 3D detection under D3D9 on the current builds is bad, for HD4870X2 cards at least. I'm not too sure about that, since I don't understand why those people think v3.95 would work for their systems, but according to a few, even v3.90a is bad for HD4xxx X2 cards.

    So I guess I will have to take a look at that as soon as possible.
     
    Last edited: Aug 13, 2010

  16. crushilista

    crushilista Ancient Guru

    Messages:
    3,467
    Likes Received:
    0
    GPU:
    MSI GTX 670
    What I found weird was that I could overclock my 5850 to 950 core and 1250 memory, and with your new drivers and 8x edge-detect, multisample AA, and temporal AA all on in Left 4 Dead 2, I was still getting 75-110 FPS.

    That's just in-fricken-sane.
     
  17. TwL

    TwL Ancient Guru

    Messages:
    1,828
    Likes Received:
    0
    GPU:
    2x5850 2x6950 + 9800GTX
    Good that you reminded me of Valve's Source Engine; gonna take a look at that next.

    Even though these Source Engine people usually understand their engine's fundamentals very well, I honestly believe the problems people are having here are just some user error. Although, if someone wants 221 FPS instead of 215 FPS... well, I'll have to see.
     
  18. crushilista

    crushilista Ancient Guru

    Messages:
    3,467
    Likes Received:
    0
    GPU:
    MSI GTX 670
    In the Bioshock picture, where did you get that fantastic tool that shows the temps? I'd love to have that lol.
     
  19. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Check out RivaTuner; you'll probably need to work around the missing support for ATI 5xxx-series monitoring, though.

    I think MSI Afterburner and that EVGA thing can monitor like that too?
     
  20. TwL

    TwL Ancient Guru

    Messages:
    1,828
    Likes Received:
    0
    GPU:
    2x5850 2x6950 + 9800GTX
    Actually, it's MSI Afterburner v1.6.1. Just check 'Show on OSD' in the settings for the readings you want.

    I suggest you add shortcuts like the ones I have: CTRL-ALT-H toggles the OSD visible/hidden, CTRL-ALT-J takes screenshots from MSI Afterburner, and so on.
     
