Drivers... I'm confused!

Discussion in 'Videocards - AMD Radeon Drivers Section' started by Thug, Mar 31, 2010.

  1. Thug

    Thug Guest

    Messages:
    2,199
    Likes Received:
    10
    GPU:
    RTX 3080
    Getting my 5870 tomorrow (coming from nVidia), but downloading the drivers tonight ready for the install.

    I have noticed several different ones and I'm not sure which one to install.

    So far I have...

    10-3_vista64_win7_64_dd (48.6 mb, file version 8.712.0.0, 30/3/10)

    10-3_vista_win7_32-64_ccc_lang1 (53.9 mb, file version 8.712.0.0, 30/3/10)

    10-3_vista64_win7_64_dd_ccc_wdm_enu (71.5 mb, file version 8.712.0.0, 30/3/10)

    ati_catalyst_10.3_ogl4_preview_win7_vista_march29 (111 meg, file version 0.0.0.0 31/3/10)

    Why can't they make it simple (for me)?

    Which one do I use?
     
    Last edited: Mar 31, 2010
  2. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,213
    Likes Received:
    1,537
    GPU:
    NVIDIA RTX 4080 FE
    I'm in the same boat as you, a soon-to-be ex-NVIDIA card owner awaiting delivery of my HD 5870 Vapour X tomorrow! :D

    The drivers are confusing, but the official WHQL ones, 10.3, are the oldest and are the same as 10.3a I believe. There is a newer version, 10.3b, with bug fixes for Battlefield: Bad Company 2 and the oversized mouse pointer. The latest and newest drivers are the OpenGL 4.0 preview set, dated 29th March, so those are the ones I'm going to install. I'm unsure whether these contain the fixes from the 10.3b drivers, however. There is also an application profile meant for the official 10.3 WHQLs but, again, I'm unsure whether it is included in 10.3b and the 10.3 OpenGL 4.0 preview. :bang:

    As you say... confusing after NVIDIA's mostly straightforward numbering scheme, where the higher version number is usually the latest (but not always!). Instead of referring to these drivers as 10.3, 10.3a, etc., why don't ATI add the version number of the driver itself, 8.7whatever, so that it's easier to tell which is the newest set?
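
    Those internal build numbers would sort naturally too if they surfaced them; something as trivial as this would tell you which build is newer (the second version string below is made up purely for illustration):

        # Compare dotted internal Catalyst build numbers such as "8.712.0.0".
        # "8.702.0.0" is a made-up example value, not a real release.
        def build_key(version: str) -> tuple:
            return tuple(int(part) for part in version.split("."))

        newer = max(["8.712.0.0", "8.702.0.0"], key=build_key)
        print(newer)   # -> 8.712.0.0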
     
  3. Knox

    Knox Ancient Guru

    Messages:
    1,573
    Likes Received:
    0
    GPU:
    Red Devil RX580 8GB
    ati_catalyst_10.3_ogl4_preview_win7_vista_march29 (111 meg, file version 0.0.0.0 31/3/10)

    The OpenGL 4.0 preview driver seems to be the best so far.
     
  4. Thug

    Thug Guest

    Messages:
    2,199
    Likes Received:
    10
    GPU:
    RTX 3080
    Thanks, I will try that one.

    I mainly play BFBC2 (and COD: MW2), so I hope it's optimised.
     

  5. EaglePC

    EaglePC Guest

    Messages:
    332
    Likes Received:
    0
    GPU:
    EVGA GTX 5700
  6. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    Well, these OpenGL 4.0 drivers completely eliminated screen tearing for me in BFBC2, so there are some DX optimisations in there, and COD: MW2 has been running like a dream for me since the 9.12 hotfix. Apparently there's supposed to be a bit of a profile for Metro 2033 in these. For me, it seems to run a little better, but that might just be the placebo effect.
     
  7. bhokuto

    bhokuto Master Guru

    Messages:
    239
    Likes Received:
    0
    GPU:
    MSI R5670 HD 1GB DDR5
    That would be too easy and make too much sense.
     
  8. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,213
    Likes Received:
    1,537
    GPU:
    NVIDIA RTX 4080 FE
    Why are these drivers called OpenGL 4.0 when the release notes refer to OpenGL 3.3 support?!?
     
  9. Decane

    Decane Ancient Guru

    Messages:
    5,195
    Likes Received:
    21
    GPU:
    GTX 1060 6GB
    Because only the HD5xxx series supports OpenGL 4.0, whilst the older HD4xxx, HD3xxx and HD2xxx series get support for OpenGL 3.3.
    ATI doesn't usually do it like this. Usually, they release Catalyst 10.3 in March, and then 10.4 in April, 10.5 in May, etc. with perhaps a few "leaked" beta drivers in between. However, with NVIDIA's Fermi launch, ATI needed to pull a few tricks which they could not afford to leave till April, so they've been releasing the intermediate drivers like crazy. Basically, 10.3 < 10.3a < 10.3b < OpenGL 4.0 Preview Driver (March 29) in terms of how new they are. I omitted a few releases (the first preview driver and the first OpenGL 4.0 driver) because they are generally not used anymore with updated counterparts having been released.
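
    If it helps to see that laid out, here is the same ordering written down as a quick list (ordering only, taken from what's been said in this thread; the labels are my own shorthand, not AMD's release notes):

        # Rough relative age of the March 2010 Catalyst releases discussed above,
        # oldest first. Labels are informal shorthand for this thread.
        catalyst_march_2010 = [
            "Catalyst 10.3 WHQL (reportedly the same as 10.3a)",
            "Catalyst 10.3b hotfix (BFBC2 and mouse-pointer fixes)",
            "OpenGL 4.0 preview driver (March 29)",
        ]
        print("Newest:", catalyst_march_2010[-1])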
     
    Last edited: Mar 31, 2010
  10. bhokuto

    bhokuto Master Guru

    Messages:
    239
    Likes Received:
    0
    GPU:
    MSI R5670 HD 1GB DDR5
    Well, Darren,

    It's a dig at NVIDIA. NVIDIA is still the new guy on the block, while ATI has been around much longer. Rivals, just like Intel and AMD.

    OpenGL 4.0 in reality is still in limbo. Around the corner? Not sure.
    I think a few games actually use OpenGL 3.2? Most are probably still on 2.0.

    3.3 is the current standard going by the clock, and 4.0 is due around the corner, but who's counting?

    This stuff right now is good for benchmarks and some demos. Games are a different story.

    One side likes DirectX, another side likes OpenGL, and the up-and-comer is OpenCL. But does this really matter?

    As long as you get good frame rates and no stretchies, skittles, tears, bleeps, blobs, freezes, stutters and so forth, you won't complain.

    It's also ego: our company reached this plateau first.

    Who's first? That's the game. These companies pay the bodies that create these standards, or have someone sit on them to draft this stuff.
    A consortium. De factos, such is life.

    CEC, TIA, things like that. Standards exist to keep companies on the same playing field, like sports: you have umpires, the umpiring squads, the rule makers and so forth. Standards to guide, keep and uphold, but it's a bit different in the IT world, or the tech world. So really these tag names like OpenGL 4, DX11 and OpenCL are all just specifications to pick and choose from; companies look for OSes that support them and then create games and benchmarking tools.

    Then when a company reaches one of these first, they pat themselves on the back and say 'next', because without goals your company putters around on the golf course. Your 1-wood chips trees instead of driving off the tee.
     

  11. Thug

    Thug Guest

    Messages:
    2,199
    Likes Received:
    10
    GPU:
    RTX 3080
    Well, got it all up and running, and I'm pleased to say it works great.
    I am running BFBC2 at 1920x1200, everything as high as it can go, and it looks lovely in DX11.

    [screenshot: BFBC2 at 1920x1200 with DX11]

    Temps are good too, between 53 and 80 °C after 2 hours of BFBC2.
    Oh, and it's VERY quiet.
     
  12. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,213
    Likes Received:
    1,537
    GPU:
    NVIDIA RTX 4080 FE
    @PCthug - I'm jealous already.

    My card came this morning at 9:15, fifteen minutes after I'd left for work, so as you can no doubt imagine I'm positively itching to get home and install it! I keep looking at the time every minute wishing it was 5pm already!!!

    Good job I'm off work from tomorrow night for 10 days; I can't wait to mess around with it and see how it compares with my dependable, reliable but getting-on-a-bit GTX 280.

    Did you get the Vapour X by the way?

    P.S. I didn't know BFBC2 supported DX11... I'll be sure to check that out. The lighting looks awesome in that screenshot. Does DX11 improve the quality of the AA?
     
    Last edited: Mar 31, 2010
  13. Thug

    Thug Guest

    Messages:
    2,199
    Likes Received:
    10
    GPU:
    RTX 3080
    I'm a bit of a tight @rse, so I went for the cheapest I could find (£295)...
    PowerColor HD 5870 1GB
    http://www.ebuyer.com/product/175996
    Actually, I thought that if it wasn't up to scratch and the 470/480 proved to be VERY good, I could easily resell my 5870 in a few months' time and not lose much at all, as they are going for £270 on eBay.

    Setting up was a doddle. Here is what you need to do...

    Uninstall all nVidia drivers, then run Driver Sweeper to get rid of any leftovers.
    Power down, remove the old card, fit the 5870 (it's BIG and heavy).
    Panic because you can't remember where you put your spare PSU leads (it takes 2x 6-pin, not the 1x 6-pin like the 8800GTS).
    Hunt high and low until you find the lead whilst cussing.
    Eventually locate it and wire it up.
    Start up your PC and log in.
    Install ati_catalyst_10.3_ogl4_preview_win7_vista_march29 drivers.
    Reboot.

    If all went well, you will just need to sort out your resolution and that's it.

    Now go and enjoy the games.

    As for AA, I am not sure. The reason being, I NEVER ran AA with my other card. Everything else was set as high as it could go, but there was too much of a hit with AA. I always figured that when running around trying to last more than 30 seconds in MP, I didn't have time to check out jaggies.
    With this card I have run it with and without AA, and there is only a difference of about 5 fps (when you are getting between 55 and 80 fps, it isn't worth NOT using AA).

    As for the card itself, it's BIG and VERY heavy, but also very nicely made. There are blanking plugs for the unused connectors (inside and out, which is a nice touch), it's really quiet, and so far I am liking it.

    Are there any adjustments I should be making in the BIOS? Or does anyone know of any other settings I should be looking at under CCC's advanced view?
     
    Last edited: Mar 31, 2010
  14. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    You can pretty much always run with 4xAA and 16xAF. You can try different AA modes; so far I've found adaptive only works in some games (e.g. Mass Effect 2) and looks even better. Also, changing to edge-detect will improve things.

    As a general rule, adaptive and super-sampling will lower fps, but I found adaptive didn't affect some games (COD: Modern Warfare 2 / Mass Effect 2), although Napoleon: Total War runs like poo with adaptive.

    I always use 'let application decide', and standard for Catalyst A.I., although having it on advanced is supposed to perhaps enhance things like IQ in certain games.

    P.S. I just added a 2nd 5870. Boy oh boy, it's fast. :)
     
    Last edited: Mar 31, 2010
  15. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,213
    Likes Received:
    1,537
    GPU:
    NVIDIA RTX 4080 FE
    Funnily enough, that's the card I was going to buy, as EBuyer are not far from where I live and I've always found them reliable. In the end, though, I opted for the Vapour X as it has better, quieter cooling, which will come in handy for overclocking.

    Shame mine doesn't have the uber-sexy red ATI RADEON logo on the side though. :(
     

  16. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,213
    Likes Received:
    1,537
    GPU:
    NVIDIA RTX 4080 FE
    I would expect that... it's faster than the HD 5970, isn't it, as that card has lower clock speeds to keep the heat down?
     
  17. Revs

    Revs Member

    Messages:
    35
    Likes Received:
    0
    GPU:
    EVGA GTX970 2.0
    Got my 5850s yesterday, and I too had some fun deciphering the ATI drivers page, but I've got it sussed now. My only problem now is getting CrossFire to work as it should. I think there might be some nV residue in the drivers somewhere :D. Gonna have a blast with Driver Sweeper tonight and see how it goes.
     
  18. evilfury

    evilfury Guest

    Messages:
    542
    Likes Received:
    0
    GPU:
    MSI 1070 GTX Gaming X
    Such modern technology and we can still see ugly, ugly rock textures...
     
  19. Thug

    Thug Guest

    Messages:
    2,199
    Likes Received:
    10
    GPU:
    RTX 3080
    Just played COD: MW2 and found it not to be very smooth.
    It appears to be capped at 60 fps. I know 60 should be fine, but does anyone know how to increase or remove this cap?
    It's not as smooth as I hoped it would be.
     
  20. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,213
    Likes Received:
    1,537
    GPU:
    NVIDIA RTX 4080 FE
    COD: MW2 is benchmarked at well over 60 fps on both NVIDIA and ATI cards, so isn't it just a case of disabling v-sync (you'll get screen tearing, though)?

    If not, then check the config file in the game folder, as there may be a variable that caps the framerate. I've always played the game with v-sync on, as I can't stomach tearing, so I don't know whether it is capped at 60 fps with my GTX 280.

    *EDIT*
    From reading around it seems to be a variable called maxfps that limits the framerate to 85. Setting it to 0 removes the cap.
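
    If anyone wants to script that change rather than edit the file by hand, a rough sketch like this should do it (the cvar name, com_maxfps, and the config location are my guesses from reading around, so check your own players\config_mp.cfg and back it up first):

        # Sketch: remove the frame-rate cap by setting the maxfps cvar in MW2's config.
        # The cvar name (com_maxfps) is an assumption; verify it in your own config_mp.cfg.
        from pathlib import Path
        import sys

        def set_cvar(cfg_path: str, name: str, value: str) -> None:
            path = Path(cfg_path)
            lines = path.read_text(errors="ignore").splitlines()
            new_line = f'seta {name} "{value}"'
            replaced = False
            for i, line in enumerate(lines):
                if line.strip().lower().startswith(f"seta {name.lower()} "):
                    lines[i] = new_line      # overwrite the existing setting
                    replaced = True
            if not replaced:
                lines.append(new_line)       # add it if it wasn't there at all
            path.write_text("\n".join(lines) + "\n")

        if __name__ == "__main__":
            # usage: python set_maxfps.py "C:\path\to\players\config_mp.cfg"
            set_cvar(sys.argv[1], "com_maxfps", "0")   # 0 = uncapped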
     
    Last edited: Mar 31, 2010
