This truly sucks.

Discussion in 'Videocards - AMD Radeon Drivers Section' started by MetalFox, Aug 27, 2008.

  1. MetalFox

    MetalFox Master Guru

    Messages:
    907
    Likes Received:
    1
    GPU:
    GTX670 DCII
    This truly sucks. (ATI slows down FurMark)

    http://en.expreview.com/2008/08/26/...ze-catalyst-for-furmark-making-it-run-slower/

    Oh yeah, I'd call it a virus. Nice job! Nice way to try to tape over the bad cooler and low fan RPMs and fix everything magically!

    Well, what do you think about this? I'd go so far as to say I'm a bit worried about this product. What if, some time in the future, a game is so GPU-intensive that they do the same thing and slow it down to prevent heat... that's frikking funny.

    What a joke!

    Anyhow, you can fix the card by flashing an aftermarket BIOS or modifying it yourself, but fixing the product shouldn't be the buyer's job.

    Let's see how this unfolds.
     
    Last edited: Aug 27, 2008
  2. drouge

    drouge Master Guru

    Messages:
    433
    Likes Received:
    0
    GPU:
    2x HIS HD5970
    Same thing on Cat 8.54.

    So 8.54 must be Cat 8.9 beta?
     
  3. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
    Well, I sent my 48x2 back. I paid way too much for this software... the hardware I like, though.
     
  4. urmysin

    urmysin Master Guru

    Messages:
    237
    Likes Received:
    0
    GPU:
    2x EVGA 9800GT 1GB SLi
    I have to say that is just pitiful. What are they doing to themselves? :bang: Seems to me a better cooling solution is in order here after reading that garbage. Does AMD/ATI have developers who earned their degrees, or did they buy them from a spam email offer? :nerd: People don't want less performance; they want something that is flexible and works straight out of the package.
     

  5. Egregious

    Egregious Member

    Messages:
    17
    Likes Received:
    0
    GPU:
    Mobility 5830 1GB GDDR3
    I guess I didn't really understand the article.

    So they're saying there's more potential in these cards than ATI is letting on?

    I don't see what the problem with that is, honestly. If they're still faster than the competing nVidia card, why should it matter?

    All this means is that by renaming .exe files you can "unlock" the higher potential of your card and in turn melt it?

    As long as the games we get 100 FPS in (while the nVidia cards get 95 FPS in the same games) aren't suddenly being limited to 60 FPS, it shouldn't really matter what measures ATI takes to keep the cards from melting themselves.

    I guess my opinion is that as long as the performance is as advertised I won't be worrying about some sort of "unlockable" aspect to my card to make the FPS jump.

    Just my 2 cents, and you can correct me if I have misunderstood the written article!:)
     
  6. MetalFox

    MetalFox Master Guru

    Messages:
    907
    Likes Received:
    1
    GPU:
    GTX670 DCII
    Nope, the thing slows down 50%, and only in FurMark. It might just be bad optimization, which wouldn't be new. Anyhow, this proves the stock cooler is way too weak for the card and the design is a bit flawed. It can be fixed through modification, but that's not the buyer's job.

    Of course I'm buying an Accelero rev. 2 for this thing, but well, you all get the point.
     
  7. Dr. Vodka

    Dr. Vodka Guest

    Messages:
    3,790
    Likes Received:
    9
    GPU:
    Sapphire R9 290 Tri-X
    They deliberately "optimize" their driver so FurMark doesn't stress their GPUs that much, because there have been reports of FurMark killing HD4800 cards because of the heat (as you can see, nearly 100°C sustained isn't good for the card's health). If you rename furmark.exe to something else, this "optimization" goes away and the bench runs at full throttle.

    I checked this yesterday, got the same results. AS5 + stock cooler 40% fan.

    Furmark.exe: 79° max, 36 FPS avg

    renamed.exe: 94° max, 66 FPS avg

    If I set the fan to 65%, the temps with the renamed FurMark go down to about 76°C, more than healthy for a 4850. I'll be buying an Accelero S1 soon... with summer approaching here in Argentina, I don't want my card to die from extreme heat while gaming, especially with so many games coming out in the next 3 months. I also don't want to lose some hearing to the high-pitched sound of the stock fan.
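    For anyone who wants to try the rename test themselves, here's a minimal sketch in Python of what I did by hand: copy the FurMark executable under another name, launch both copies one after the other, and compare the FPS and temps. The install path and the renamed filename are just assumptions for illustration; adjust them to your own setup.

```python
import shutil
import subprocess
from pathlib import Path

# Assumed FurMark install location; change this to match your system.
FURMARK_DIR = Path(r"C:\Program Files\Geeks3D\FurMark")
ORIGINAL = FURMARK_DIR / "FurMark.exe"
RENAMED = FURMARK_DIR / "renamed.exe"  # hypothetical name; anything other than the original works

def run_rename_test() -> None:
    # Copy the executable under a new name so any name-based
    # detection in the driver no longer matches it.
    shutil.copy2(ORIGINAL, RENAMED)

    # Launch both copies one after the other; watch the reported
    # FPS and GPU temperature in each run and compare.
    for exe in (ORIGINAL, RENAMED):
        print(f"Launching {exe.name} ...")
        subprocess.run([str(exe)], cwd=FURMARK_DIR)

if __name__ == "__main__":
    run_rename_test()
```

    If the renamed copy runs noticeably hotter and faster than the original, you're seeing the same behaviour as in my results above.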
     
  8. Iarwain

    Iarwain Banned

    Messages:
    3,047
    Likes Received:
    0
    GPU:
    4890 990//1120
    Before you start tossing around completely unsubstantiated claims, how can you be sure they purposely wanted to slow down FurMark? For all you know, someone was testing something and forgot to put it back.

    I'm not saying you're wrong; ATI and Nvidia have done worse things. But why not ask ATI instead of just running your mouth before you've even looked into it?
     
  9. Dr. Vodka

    Dr. Vodka Guest

    Messages:
    3,790
    Likes Received:
    9
    GPU:
    Sapphire R9 290 Tri-X
    I can't see any other explanation for this: they want their cards to survive FurMark's heat at 10% fan speeds, so they did this on purpose.

    Renaming FurMark to something else makes it obvious. I can hear my card work much harder, and the VRMs are really running at full tilt with the renamed FurMark; the other way around they sound as if they were almost idle. You know, the high-pitched noise coming from these graphics cards. If you, for example, rename some game's .exe to furmark, it will slow down too, as it loses its own optimizations AND FurMark's driver settings kick in. The only difference is that FurMark is "optimized" for lower performance, ergo lower heat. This must be on purpose; they wouldn't cap their cards' performance in a benchmark by mistake. That's exactly the opposite of what they usually do, which is improve performance.

    My "theory" has a good start, don't you think? I've looked into this, and I have hard proof of it. The strange behaviour is there, and this is the most probable explanation. I'm not the kind of guy who says things before having read about them; of course I'd love to hear what AMD has to say about this.

    Anyway, as long as they don't mess with game performance and pull a GFFX-like driver "optimization" out of their hat, it's alright.
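    None of us can see inside Catalyst, but the behaviour above looks like a per-executable profile lookup keyed on the process name. Here's a rough conceptual sketch of that idea in Python; the table entries and settings are completely made up for illustration and are obviously not ATI's actual code.

```python
# Conceptual illustration only: a profile table keyed by executable name,
# roughly like what the observed behaviour suggests. All entries are invented.
APP_PROFILES = {
    "furmark.exe":  {"throttled": True,  "crossfire": False},  # capped, per the observations above
    "somegame.exe": {"throttled": False, "crossfire": True},   # hypothetical game profile
}

DEFAULT_PROFILE = {"throttled": False, "crossfire": False}

def profile_for(exe_name: str) -> dict:
    # The lookup goes by file name alone, which is why renaming the
    # executable changes which settings get applied.
    return APP_PROFILES.get(exe_name.lower(), DEFAULT_PROFILE)

print(profile_for("FurMark.exe"))   # matches the FurMark entry -> throttled
print(profile_for("renamed.exe"))   # no match -> default profile, full speed
# A game renamed to furmark.exe would match the FurMark entry instead,
# losing its own profile and picking up the cap, which is what I described above.
```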
     
    Last edited: Aug 28, 2008
  10. sutyi

    sutyi Member Guru

    Messages:
    106
    Likes Received:
    0
    GPU:
    Gainward GTX 660 OC 2GB
    Tried this out; it works the same either way for me.
     

  11. kapu

    kapu Ancient Guru

    Messages:
    5,417
    Likes Received:
    796
    GPU:
    Radeon 7800XT
    I am using 8.8 and renaming doesn't change anything.

    73°C at full load with 39% fan. Damn, my HD4870 is cooler than my old 8800GTS, which was going up to 84°C :)
     
  12. allesclar

    allesclar Ancient Guru

    Messages:
    5,768
    Likes Received:
    176
    GPU:
    GeForce GTX 1070
    I hope this is only on ATI cards :( but that does seem weird.
     
  13. Iarwain

    Iarwain Banned

    Messages:
    3,047
    Likes Received:
    0
    GPU:
    4890 990//1120
    I find it unlikely they would do this for any reason that "tries" to make their card look better. They know this is a fairly well-known benchmark, and people are going to notice FPS drops, especially of almost half. They know this, and they also know what has happened in the past when a company made an "optimization" like this. I just find it unlikely that it is on purpose, or meant to make themselves look better.

    Let's be honest, if they wanted, they could just update the driver with new fan speeds, instead of slowing this one program down.
     
  14. Psychlone

    Psychlone Ancient Guru

    Messages:
    3,686
    Likes Received:
    2
    GPU:
    Radeon HD5970 Engineering
    I'm guessing that none of you have ever renamed a game .exe to something else for any reason whatsoever...right?

    A lot of games aren't properly coded to utilize Crossfire, so renaming the .exe to a "known" program that uses Crossfire properly will indeed unleash the potential of the X2 cards.

    Wow... I thought that was common knowledge. This isn't an attempt by ATi to cripple their cards' performance. Think about it for a second: what purpose would dropping benchmark scores on their flagship card serve for them and their reputation, considering it's up against the biggest, baddest nVidia card??? NONE, NOTHING... it's not their intention; it's the poorly written code in some benches and games that causes this, not a conspiracy from ATi!!! :funny:
    Psychlone
     
  15. RejZoR

    RejZoR Ancient Guru

    Messages:
    4,211
    Likes Received:
    0
    GPU:
    Sapphire HD4870 Silent
    Lol. When they "cheat" to gain higher scores in benches, people complain; when they slow something down, they all jump up again and complain even more.
    Yeah sure, better cooling and another 20-30 EUR on the price. Why? So you can run some freaking dumb FurMark? Give me a break. Stock cooling does its job just fine for the job intended. FurMark was specifically designed to torture graphics cards beyond anything ANY game will ever do. Not even Crysis loads graphics cards as much, and it's the most demanding game there is.
     

  16. WaroDaBeast

    WaroDaBeast Ancient Guru

    Messages:
    1,963
    Likes Received:
    0
    GPU:
    Gigabyte HD7950
    @RejZoR: I concur.
     
  17. Mr.StanleyDudek

    Mr.StanleyDudek Master Guru

    Messages:
    339
    Likes Received:
    0
    GPU:
    VisionTek HD 3870 X2 1GB
    Agreed.
     
  18. Stukov

    Stukov Ancient Guru

    Messages:
    4,899
    Likes Received:
    0
    GPU:
    6970/4870X2 (both dead)
    Thus sayeth the trutheth...

    (same for Psychlone too)
     
  19. Risco

    Risco Master Guru

    Messages:
    485
    Likes Received:
    0
    GPU:
    GTX 650M
    While it is annoying, who plays benchmarks? I will be annoyed, though, if it affects all OpenGL games.
     
  20. Alexraptor

    Alexraptor Guest

    Messages:
    1,315
    Likes Received:
    5
    GPU:
    GTX 1080, 8192MB
    That's just unacceptable. I would RMA a card that had those kinds of problems.
    A graphics card should be able to run "any" application without getting damaged or overheating when running at factory specs. If it can't, that is unacceptable and you should get your money back.
     
