
Adobe Flash HW Acceleration / AMD Overdrive driver bug - who else can reproduce this?

Discussion in 'Videocards - AMD Radeon Drivers Section' started by freibooter, Aug 2, 2013.

  1. Offler2

    Offler2 Member Guru

    Messages:
    185
    Likes Received:
    0
    GPU:
    Gigabyte HD 7970OC, 3gb
    Be aware that YouTube does not use only the Flash player to render videos; it also uses a different codec.

    So when we are talking about HW acceleration enabled versus disabled, we may or may not be talking about two completely different codecs...

    If you want to render anything at idle VRAM clocks, it takes a lot of time. The idle clock is usually not enough to perform all tasks, so it's better to clock them up.


    In the past, changing the CPU clock via the multiplier while decreasing the CPU voltage resulted in a very interesting behaviour called VDROOP. In that situation, when the CPU finished a task, it clocked down: first the voltage went UP, and within a few milliseconds it went down to IDLE. This required a change in Intel's VRMs, but you can still write pathological code that forces the CPU to clock up and down; with a bit of luck you can cause the system to crash.

    I decided to prevent this problem: my CPU is clocked to a fixed frequency and set to a fixed voltage. However, I have no such control over the GPU.

    Well-behaved code clocks the GPU to 501 MHz and the memory to 1375 MHz (in the case of my HD 7970) and keeps the GPU clocked like this for a few seconds AFTER the task is finished. This costs a bit more energy, but it's safe.

    If I want to save as much electric energy as possible and I am careless enough, I will write code that sets the GPU back to idle the instant the task finishes. However, the GPU can be multitasked. So if Task 1 sends a request to clock down while Task 2 is still executing, we have a problem. This can be a problem even when Task 2 starts too soon after Task 1 has finished, because switching from 501 MHz to idle and back takes some time, and changing the voltage takes even longer (a few milliseconds).
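    The hold-off delay and the clock-down race described above can be condensed into a toy model (a hypothetical sketch, not actual driver code; the class, method names, and the 2-second hold-off value are all assumptions):

    ```python
    import time

    class ClockGovernor:
        """Toy model of a driver clock governor: drop to idle clocks only
        when no task still needs the GPU, and only after a hold-off delay."""

        IDLE_MHZ = 300
        ACTIVE_MHZ = 501
        HOLD_OFF_S = 2.0  # keep clocks up for a few seconds after the last task

        def __init__(self):
            self.active_tasks = 0
            self.clock_mhz = self.IDLE_MHZ
            self.last_release = float("-inf")

        def task_start(self):
            self.active_tasks += 1
            self.clock_mhz = self.ACTIVE_MHZ

        def task_end(self):
            self.active_tasks = max(0, self.active_tasks - 1)
            self.last_release = time.monotonic()

        def tick(self):
            # Reference counting avoids the race where Task 1's clock-down
            # request lands while Task 2 is still executing; the hold-off
            # covers the case where Task 2 starts shortly after Task 1 ends.
            if (self.active_tasks == 0
                    and time.monotonic() - self.last_release >= self.HOLD_OFF_S):
                self.clock_mhz = self.IDLE_MHZ
    ```

    With this scheme, Task 1 ending while Task 2 runs leaves the clocks untouched; a naive governor that clocked down unconditionally on every task end would reproduce the race.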

    This fits perfectly with the situation where my Opera works well while other browsers are broken. However, I don't have direct access to the code they are using, so this is only a rough description of the situation I have seen.
     
  2. freibooter

    freibooter Active Member

    Messages:
    58
    Likes Received:
    0
    GPU:
    PowerColor R9 280X 3GB OC
    Adobe Flash uses AMD's UVD for HW acceleration (as it technically should); Chrome's PPAPI for some reason does not (while still providing proper video quality).

    The problem lies with AMD's UVD and the ill-conceived way its separate clock speeds are implemented and applied (overriding everything else), not with the applications using it.

    I dislike Adobe as much as the next guy, but this is one time where they aren't to blame.
     
  3. Espionage724

    Espionage724 Guest

    But what is Chrome using, then, if it isn't using UVD? If I recall correctly, my GPU clock speed dropped as intended when I opened some hardware-accelerated Flash player content in Chrome.

    I'm not on Windows at the moment, so I can't really test it :/
     
  4. freibooter

    freibooter Active Member

    Messages:
    58
    Likes Received:
    0
    GPU:
    PowerColor R9 280X 3GB OC
    @Offler2

    I think you may not be understanding the issue here. And sentences like this not only show that, but also show a fundamental misunderstanding of AMD's basic energy-saving mechanics:

    A few seconds, WTF? And what on earth are you talking about with "codecs" when we have already established that AMD's UVD clocks are clearly to blame here?


    Just about everything you wrote has almost nothing to do with the issue at hand. I'll try to explain the sentences you quoted above one more time:

    Do you have a multimonitor setup?

    If more than one monitor is connected to any modern AMD card, energy management for the memory clock is disabled! This is by design.

    The memory clock never downclocks, never changes and always runs at the set maximum. This almost doubles the energy usage during IDLE in a dual-monitor setup.
    So how come AMD's cards change from a dynamic memory clock to a fully static one (not even downclocking a tiny bit) as soon as a second monitor is connected? Again, this is by design and Nvidia does something similar in dual-monitor environments.

    The only time the memory clock ever changes in a multi-monitor setup is when:
    1. It is adjusted manually (e.g. via Overdrive)
    2. Something triggers the separate UVD clock speeds

    Coincidentally, the only time my GPU is ever unstable is when playing a lot of videos that trigger these UVD clock speeds while they don't match the manually set memory clock speed.

    Example:

    My HD 7850 OC has a native max memory clock speed of 1250 MHz; its UVD max clock speed is identical. In a dual-monitor setup this clock speed is constant: during idle desktop and when running FurMark alike, it never, ever changes!

    Now, if I adjust the memory clock to, let's say, 1245 MHz and watch some videos on YouTube, every single time playback stops or resumes for whatever reason, the memory clock switches from 1250 MHz to 1245 MHz and back.

    I assume there must be a darn good reason for AMD to otherwise lock it at 1250 MHz, because the system flickers during every clock change, eventually becomes unstable and, after many videos and some prolonged uptime, shows artifacts and locks up.

    While I'm sure there very much *is* a reason to change the minimum core clock and memory clock during UVD-accelerated video playback, there is absolutely no reason to override the maximum core speeds! But the latter is what AMD does, and that is what's causing all the problems.
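    The behaviour described in this post boils down to a tiny model (purely illustrative; the function and constant names are my own, and 1250 MHz is the BIOS/UVD maximum from the example above):

    ```python
    # Illustrative model: in a multi-monitor setup the memory clock is pinned
    # at the configured maximum, except that UVD playback forces the separate
    # BIOS-default UVD maximum instead of the Overdrive-set value.
    BIOS_UVD_MAX_MHZ = 1250  # BIOS default and UVD max from the example above

    def effective_mem_clock(overdrive_max_mhz, uvd_active):
        return BIOS_UVD_MAX_MHZ if uvd_active else overdrive_max_mhz
    ```

    With Overdrive set to 1245 MHz, every playback start or stop flips the clock between 1245 and 1250 MHz, which is exactly the repeated transition that precedes the flicker and instability described above.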
     
    Last edited: Aug 4, 2013

  5. freibooter

    freibooter Active Member

    Messages:
    58
    Likes Received:
    0
    GPU:
    PowerColor R9 280X 3GB OC
    UVD clocks don't reduce anything, at least not on the current 7xxx cards! In fact, they slightly raise the core clock and vcore during UVD-accelerated playback if the GPU is otherwise idle, which apparently is, or was, the purpose of the separate UVD clocks.

    The problem is that the UVD clocks additionally also change the maximum core and memory clocks.
    At least on current cards these maximum UVD clock speeds are always identical to the card's regular maximums (which shows that these aren't some magic numbers with a special purpose: a Sapphire HD 7850 has a max memory clock and max UVD clock of 1200 MHz, a Sapphire HD 7850 OC of 1250 MHz). So, no, nothing is "reduced" by default.

    The problem appears when you manually override the clocks via AMD Overdrive or similar; it doesn't matter whether you under- or overclock here!

    Every time anything triggers the UVD clocks, the card is forcefully reset to its maximum BIOS default clocks (this can be a reduction or an increase, depending on which direction they were previously adjusted) and then set back to the manually adjusted maximums as soon as the video acceleration ends.
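    That reset-and-restore cycle can be sketched as a timeline (hypothetical names; note it works in both directions, underclock or overclock):

    ```python
    def max_clock_timeline(overdrive_mhz, bios_default_mhz, uvd_events):
        """Yield the effective maximum clock after each UVD "start"/"stop"
        event: any trigger snaps it to the BIOS default, and the manually
        adjusted maximum only returns once acceleration ends."""
        for event in uvd_events:
            yield bios_default_mhz if event == "start" else overdrive_mhz
    ```

    For an underclocked card, `list(max_clock_timeline(1245, 1250, ["start", "stop"]))` gives `[1250, 1245]`; for an overclocked one, `list(max_clock_timeline(1700, 1375, ["start", "stop"]))` gives `[1375, 1700]`: a forced reduction and a forced increase, respectively.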

    And, no, I have no idea why on earth video playback in Chrome doesn't trigger the UVD clocks while still looking good. But you can hardly blame Adobe for using one of AMD's badly implemented features, the one they are supposed to use for this purpose.
     
    Last edited: Aug 4, 2013
  6. Espionage724

    Espionage724 Guest

    The reason Chrome's player looks good is that it's still hardware-accelerated (GPU-Z shows clock changes and heavy GPU usage with Flash content open), and I guess AMD's video enhancement features can still affect it.

    If Chrome doesn't use UVD, though, why else would clock speeds be affected? (I'm pretty sure they are, anyway; going off fuzzy memory.)

    If UVD itself is the issue, then technically anything that uses it should do the same, right? GPU-accelerated decoding enabled in VLC, for example, should cause a similar effect. I can try to mess around with UVD on Linux.
     
  7. Offler2

    Offler2 Member Guru

    Messages:
    185
    Likes Received:
    0
    GPU:
    Gigabyte HD 7970OC, 3gb
    Overdrive sets maximum frequencies for 3D mode only. So even if you decrease it, that frequency will only be used in 3D games or, as in your case, with a dual-monitor setup.

    The UVD frequency is set specifically in the GPU's BIOS. Once a GPU-accelerated video codec starts, a specific frequency is set on the GPU and VRAM.

    And trust me, I know why I am talking about codecs: it's the codec that gives the initial input to change the frequency.

    So if you really want to continue this discussion, please take the time to read my previous post carefully.

    You know, the frequencies don't change of their own free will. The BIOS only provides tables containing the information about which GPU/VRAM frequency will be used. Everything else depends on the software: not the driver, not the BIOS, not the hardware.
     
  8. Offler2

    Offler2 Member Guru

    Messages:
    185
    Likes Received:
    0
    GPU:
    Gigabyte HD 7970OC, 3gb
    I have already tested UVD with various codecs. No such issue here.

    GPU clocks on the Windows desktop, however, react even when you move a window. That doesn't cause any issue.

    Everything I have seen so far in the browsers is caused by badly written software that expects the GPU/VRAM voltage to change immediately (virtually 0 ms), which is physically not possible.
     
  9. freibooter

    freibooter Active Member

    Messages:
    58
    Likes Received:
    0
    GPU:
    PowerColor R9 280X 3GB OC
    Offler2, I know everything you keep repeating here.
    I now know where the UVD clocks are stored, etc.

    I explained to you exactly what the issue is, why it is an issue, and why it can cause performance loss in 3D mode and serious stability problems in dual- (or more-) monitor setups. Could you please address that and not just repeat mostly unrelated facts?

    Again, the above statement clearly shows that you are not grasping the issue.

    1. Overdrive's max memory clock speeds are very much applied and constantly used in "2D mode/desktop idle mode" on multi-monitor setups.

    2. The simple fact that the maximum speeds are not applied to the UVD clocks causes this problem! This is the entire issue!

    Overclock your GPU by 15% and start a game, and it will run 15% faster. Launch a simple YouTube video in the background (e.g. to listen to some music) or watch it on your second monitor, and your gaming performance is decreased by 15%, because the UVD clock maximums are applied during playback (for no good reason at all!) even when running high-intensity 3D applications.
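    The arithmetic behind that 15% loss is simple but worth making explicit (a toy calculation with a made-up 1000 MHz base clock, not measured data):

    ```python
    def in_game_core_clock(bios_max_mhz, oc_percent, uvd_playback_active):
        """During UVD playback the BIOS-default maximum is enforced, so the
        whole overclock is lost; otherwise the overclocked maximum applies."""
        if uvd_playback_active:
            return float(bios_max_mhz)
        return bios_max_mhz * (1 + oc_percent / 100)
    ```

    `in_game_core_clock(1000, 15, False)` gives 1150.0 MHz, while starting a video drops it back to 1000.0 MHz: the full 15% overclock gone for the duration of playback.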

    This is a problem, but not the problem:

    The problem is that the memory clock, in any mode, is never, ever changed in a dual-monitor environment. I'm going to assume that there is a darn good reason to waste all this energy and to never dynamically adjust the memory clock in a dual-monitor environment.

    Except when the stupid UVD clocks are applied! Only then, and only when they differ from the maximum set in Overdrive. And only then does the system become unstable.
     
    Last edited: Aug 12, 2013
  10. freibooter

    freibooter Active Member

    Messages:
    58
    Likes Received:
    0
    GPU:
    PowerColor R9 280X 3GB OC
    That is not the problem! That has absolutely nothing to do with the problem! That only shows that you neither read, nor understood, nor tried to reproduce the problem!

    EDIT:
    Offler2, please at least answer this before you derail this thread any further:
    Do you have more than one monitor?
    Have you changed your core and memory clocks in AMD Overdrive in any direction? (It doesn't matter by how much or whether higher or lower; just change them at least a tiny bit in either direction!)
    Have you tried to reproduce what I explained in the very first post?
    Launch GPU-Z, launch FurMark and look at the clock speeds, then play a HW-accelerated YouTube video in Firefox and watch the clocks during playback while FurMark is still running.
    What are your results?
     
    Last edited: Aug 4, 2013

  11. dellon132

    dellon132 Ancient Guru

    Messages:
    1,898
    Likes Received:
    0
    GPU:
    HD 5570 1 GB DDR3
  12. DrunkenDonkey

    DrunkenDonkey Master Guru

    Messages:
    204
    Likes Received:
    2
    GPU:
    2xPC 290 PCS+
    Sadly, Flash 10 doesn't help either; I just installed it and saw it switching to UVD clocks too, while there is zero need to. I have to hard-reset the computer every day or two because (stupid me) I tried to run a Flash movie. Disabling acceleration really makes for some ugly images, worse in motion than in stills.
     
  13. Offler2

    Offler2 Member Guru

    Messages:
    185
    Likes Received:
    0
    GPU:
    Gigabyte HD 7970OC, 3gb
    I know that when any GPU-accelerated video is running, Overdrive is ignored.

    And ALL this time I have been explaining why it is like this.

    It has nothing to do with Chrome, Adobe Flash, IE, or a dual-display configuration.

    And yes, I reproduced it YEARS ago. I believe the issue has been fixed since the HD 7000 series. Previously, the UVD clock (501 MHz) was used every time GPU-accelerated video/Flash was running; now the 3D clock without Overdrive is used, which is far better.

    If you wish to run video decoding on an overclocked GPU, good luck with synchronizing.

    The "real" Adobe Flash bug that has been mentioned before, mostly in Firefox, is the one where parts of the Flash content get corrupted.

    The bug you mention is nothing new. I don't claim it is absolutely fixed; I claim that AMD applied a workaround a few years ago. The rest is an explanation of why and how.
     
  14. freibooter

    freibooter Active Member

    Messages:
    58
    Likes Received:
    0
    GPU:
    PowerColor R9 280X 3GB OC
    Offler2, you never explained why AMD uses the UVD clock crutch while everybody else (Nvidia, Intel) can do HW-accelerated video just fine without these limitations.

    You never explained why the AMD Overdrive settings cannot be applied to the entirely arbitrary maximum UVD clocks. As I said, there is absolutely no good reason for the specific maximums: they differ between HD 7xxx cards of the same series and are generally identical to the 3D clock maximum unless that is changed manually (the problem being that the change is not applied to the UVD clock maximums).

    And you just completely ignore the fact that this actually causes a serious issue on dual-monitor systems by constantly adjusting the memory clock in a way that makes those systems unstable.

    I'm also still trying to figure out whether you do know what you are talking about and your lack of English skills is preventing you from properly absorbing the facts given here and communicating your own knowledge, or whether you are simply full of dangerous half-knowledge and don't understand what you are talking about. Re-reading some of your comments makes me think it's the latter.
     
  15. DrunkenDonkey

    DrunkenDonkey Master Guru

    Messages:
    204
    Likes Received:
    2
    GPU:
    2xPC 290 PCS+
    If you want to see what the screen looks like when I run a Flash video while gaming, this pic is not mine, but it looks exactly like it:
    http://i.imgur.com/TtZLsi2.jpg

    Almost all the cards I have owned were AMD/ATI, but I have just returned to this camp after a couple of years with Nvidia, and I must say I'm kind of disappointed. It doesn't happen with Flash only; doing a single thing with the card works. But seriously, AMD, just disable that UVD thing if you can't make it work properly and be done with it; you'll have my thanks.
     
    Last edited: Aug 4, 2013

  16. Falkentyne

    Falkentyne Master Guru

    Messages:
    418
    Likes Received:
    2
    GPU:
    Sapphire HD 7970 Ghz Ed.
    Sorry, but you're wrong.
    It does *NOT* use the 3D clock without Overdrive. It uses 501 MHz for the core (at least on 7970s and 7950s) and the BIOS-default memory clock: 1375 MHz on non-GHz-Edition 7970s and 1500 MHz on GHz Edition cards.

    The problem is the memory clock being CHANGED from the 3D Overdrive (not BIOS default) clocks to the BIOS default clocks when a Flash video is played. If the RAM is running at STOCK (1375 MHz) but the CORE is overclocked, the problem does not occur.

    You don't see the problem when the card is idle at 300/150 and you then play a Flash video. Yes, it goes to 501/1375, but that doesn't trigger corruption.

    The corruption happens *IF* the memory clocks are ALREADY actively running at the Overdrive speeds (example: 1700 MHz, such as when a game is running in the background and the GPU is not idle) and THEN a Flash video is played: this forces the GPU to 501 MHz and the memory clocks back to the BIOS default, and that can trigger the scrambled screen.

    This never happens if the GPU is currently at 300/150 idle.

    You can ALWAYS trigger this bug on a 144 Hz monitor at the maximum refresh rate, because at 144 Hz the clocks are 500/(your Overdrive RAM speed) instead of 300/150 at idle. So if you're using 1700 MHz, it is 500/1700 on the desktop (idle). Run a Flash video -> scrambled screen (if you manage to unscramble it, GPU-Z will show 501/1375 as the clocks).

    It does NOT happen at 120 Hz or lower on a 144 Hz monitor.
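    The trigger condition laid out above can be summarised as a predicate (a sketch of the reported behaviour only; names are my own, and 1375 MHz is the BIOS-default memory clock from the 7970 example):

    ```python
    BIOS_MEM_MHZ = 1375  # BIOS-default memory clock in the 7970 example

    def scramble_risk(current_mem_mhz, gpu_at_idle):
        # Idle cards (300/150) ramp up to 501/1375 cleanly; the scramble is
        # reported only when a live memory clock that differs from the BIOS
        # default (e.g. 1700 MHz Overdrive) is yanked back by a UVD trigger.
        return (not gpu_at_idle) and current_mem_mhz != BIOS_MEM_MHZ
    ```

    This also matches the 144 Hz observation: at maximum refresh the desktop already sits at 500/1700, so the card is never idle and every Flash video trips the condition.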

    Not many games keep the GPU at 3D clocks when you Alt-Tab out. Since Black Ops 2 does, I would imagine it might also suffer from the image-corruption problem regardless of refresh rate, since you will still be at 3D clocks while Alt-Tabbed.

    An interesting test: if you DISABLE Overdrive and use Sapphire TriXX to overclock the core and memory, does the problem still happen? (If it doesn't, I am going to guess that the clocks end up being reset to stock even after the video is over.)
     
    Last edited: Aug 5, 2013
  17. Espionage724

    Espionage724 Guest

    The issue with the clock speed being lower while media is playing can also occur on Linux, with the 13.8 beta driver.

    I start up PCSX2, check the clocks, and they are at their highest as expected (950/1200). If I open a video in VLC with GPU acceleration enabled, the core clock drops to 860 MHz (memory stays at its highest; I'm using three screens).

    Video playing in Chrome (accelerated video rendering, software video decoding) doesn't affect the clock speeds from what they're already at (if nothing is open, the clocks stick at 300/1200; if PCSX2 is open, 950/1200).
     
  18. freibooter

    freibooter Active Member

    Messages:
    58
    Likes Received:
    0
    GPU:
    PowerColor R9 280X 3GB OC
    There are a lot of things that can lock the memory clock at maximum speed on AMD cards: 144 Hz monitors, dual monitors ... pretty much anything that isn't a single 60/120 Hz screen will lock your memory clocks at maximum, for some (apparently good) reason. When the UVD clocks are triggered on top of that, it causes the problems described in this thread.

    When you have two screens, playing a game on one and a video on the other is a pretty normal thing to do (at least I did it all the time back when I still had an Nvidia card, when I never thought this could be a problem) ... and this also triggers the bug.

    Last time I tested it, TriXX worked almost exactly like Overdrive: it doesn't affect the UVD clocks either, and the results and problems are practically identical.

    I don't think there is any way to "fix" the UVD clock maximums, since AMD made VBIOS editing nearly impossible with the HD 7xxx series.

    Right now there are basically three options:

    1. Never changing the memory clock, and accepting that we bought cards that don't offer stable over- or underclocking and don't offer proper multitasking (like gaming and watching videos simultaneously) either.

    2. Disabling absolutely everything that could possibly trigger the UVD clocks and accepting the fact that videos now look like crap.

    3. Somehow convincing AMD that this is indeed a serious problem they might want to address in the near future, and not some oh-so-clever feature only AMD thought of that is working as intended (which is basically their attitude right now).
     
    Last edited: Aug 5, 2013
  19. DrunkenDonkey

    DrunkenDonkey Master Guru

    Messages:
    204
    Likes Received:
    2
    GPU:
    2xPC 290 PCS+
    4. Moving back to Nvidia.

    I like AMD, and I like competition too. I loved my 2x 7950 for several hours, before I ran the first movie; then I spent the next days changing monitors, connectors, cables, drivers, BIOSes and frequencies, running all kinds of tests to see whether my cards were somehow broken, and searching the internet...
    It is not a big thing to request, and it is not adding new features: a plain single line in the code saying that if the core and memory frequencies are above the minimum needed to play a video, then simply don't touch them. There, problem gone.
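    The one-line guard suggested above could look roughly like this (a hypothetical sketch, not driver code; the 501/1375 MHz minimums are the UVD clocks mentioned earlier in the thread):

    ```python
    UVD_MIN_CORE_MHZ = 501   # UVD clocks mentioned earlier in the thread
    UVD_MIN_MEM_MHZ = 1375

    def clocks_for_playback(core_mhz, mem_mhz):
        """If the clocks already exceed what playback needs, leave them
        alone; only ramp up when the card is running below the minimums."""
        if core_mhz >= UVD_MIN_CORE_MHZ and mem_mhz >= UVD_MIN_MEM_MHZ:
            return core_mhz, mem_mhz  # fast enough already: don't touch them
        return UVD_MIN_CORE_MHZ, UVD_MIN_MEM_MHZ
    ```

    An idle card at 300/150 would still ramp up to 501/1375 for playback, but a card already gaming at Overdrive speeds would never be yanked back to BIOS defaults, which is the transition that triggers the corruption.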
     
  20. freibooter

    freibooter Active Member

    Messages:
    58
    Likes Received:
    0
    GPU:
    PowerColor R9 280X 3GB OC
    I still like my HD 7850; it's great hardware with a great feature set and a brilliant price/performance ratio.

    The biggest reason to buy it over any Nvidia product, however, was its incredible OC potential. Overclocking to the maximum of what Overdrive allows is something this card was basically built for: it runs at these clock speeds without vcore tweaking, without a hitch, 100% absolutely stable in any situation ...

    ... until the stupid UVD clocks are triggered; then all hell breaks loose.

    And, yes, this would be a relatively simple fix in a driver update. But knowing AMD's attitude, it is unlikely ever to happen. At least not for current-gen cards; maybe future generations will do UVD differently.

    A simple Google search reveals that people have had issues with AMD's unique UVD clocks for years, across several generations of cards.
    I wasn't aware of that when I purchased my HD 7850, but I would have avoided it like the plague if I had known ...

    Why on earth is AMD so god damn stubborn when it comes to driver and design problems?

    Other than the UVD issue, the idiotic default overscan for HDMI is probably the best example of this. Even now, when the majority of monitors and GPUs come with HDMI connections rather than DVI, AMD still thinks it's a brilliant idea to treat them like turn-of-the-century CRT TVs and enable overscan. And disabling this default overscan for all resolutions is only possible via registry hacks; it's utterly ridiculous.

    It makes no sense to enable it by default, it causes serious usability problems, and it could literally be fixed in under five minutes. But it's "by design" and it won't be changed.

    The UVD core problem has one small "advantage", though: unlike the insane overscan default, it does cause crashes and stability issues, and there is no acceptable workaround. So maybe there is a very slim chance that we could convince AMD that this may not be the best way of doing things.

    Maybe if enough people file a bug report they'll take it seriously:

    http://www.amd.com/betareport
     
