Shame on you AMD .... !!!!

Discussion in 'Videocards - AMD Radeon Drivers Section' started by testooo, Nov 25, 2015.

Thread Status:
Not open for further replies.
  1. bobalazs

    bobalazs Guest

    Messages:
    28
    Likes Received:
    6
    GPU:
    RX 480 4Gb
    VLIW cards are not capable of doing most of the stuff the GCN cards do.
    VSR, I know for a fact, can't run on those cards.
     
    Last edited: Nov 26, 2015
  2. Spectrobozo

    Spectrobozo Guest

    Messages:
    48
    Likes Received:
    0
    GPU:
    8600GT
    Can't run because AMD didn't want it to, or is there an actual hardware limitation for VSR? And for Shader Cache?

    What about Frame Rate Target? I'm 100% convinced that they could have done it, but AMD didn't bother enabling it for the older (still supported at the time) hardware,

    while Nvidia added support for DSR and Adaptive Vsync (not the same as frame rate targeting; it's actually more complex and useful) on older cards when they launched those features, mainly to add value to Kepler and Maxwell, at a point when Fermi was already old news (and older than VLIW4).
     
  3. akbaar

    akbaar Master Guru

    Messages:
    426
    Likes Received:
    55
    GPU:
    ASUS TUFF 3080 12Gb
    I say it's about damn time.

    Clean up that mess and drop those old cards from the driver.
     
  4. Espionage724

    Espionage724 Guest

    Honestly, I can't see why something like VSR would even be limited to a certain kind of GPU. I'm positive xrandr can do this on Linux (someone reported it could; I haven't been able to reproduce it), and the only requirement is that you can run Xorg (and a version of xrandr that supports the scaling feature, but that still covers a large number of cards across Intel, AMD, NVIDIA, and likely even the likes of Matrox, SiS, etc.).

    All you're doing is taking a larger resolution and scaling it down to fit a smaller resolution. You could toss hardware acceleration and other complications into the mix, but there's nothing I'm aware of that really restricts a software-side implementation (at least on Linux; I'm not sure exactly what goes on on Windows, but I'd be surprised if it were different).
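    As a rough illustration of what that scaling amounts to, here is a minimal sketch in Python (not any driver's actual code; a box-filter average is just the simplest possible downscale filter):

    ```python
    def downscale(frame, factor):
        """Box-filter a rendered frame (a 2D list of pixel values) down by an
        integer factor -- the core operation behind any VSR-style downscale."""
        out = []
        for y in range(0, len(frame), factor):
            row = []
            for x in range(0, len(frame[0]), factor):
                # Average one factor-by-factor block of the oversized frame
                block = [frame[y + dy][x + dx]
                         for dy in range(factor) for dx in range(factor)]
                row.append(sum(block) // len(block))
            out.append(row)
        return out

    # Render at 4x4 internally, display at 2x2: each output pixel
    # averages one 2x2 block of the oversized frame.
    frame = [[0, 0, 4, 4],
             [0, 0, 4, 4],
             [8, 8, 12, 12],
             [8, 8, 12, 12]]
    print(downscale(frame, 2))  # → [[0, 4], [8, 12]]
    ```

    Real implementations use better filters (bilinear, Lanczos) and run on the GPU, but the principle is the same.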
     

  5. xacid0

    xacid0 Guest

    Messages:
    443
    Likes Received:
    3
    GPU:
    Zotac GTX980Ti AMP! Omega
    About time for them to dump those VLIW cards and focus on GCN.
     
  6. bobalazs

    bobalazs Guest

    Messages:
    28
    Likes Received:
    6
    GPU:
    RX 480 4Gb
    Espionage724 said: ↑

        All you're doing is taking a larger resolution and scaling it to fit at a smaller resolution. You could toss in some hardware-acceleration and other complicated matters into the mix, but there's nothing that I'm aware of really restricting implementation of this software-side (at least on Linux; not sure what goes on on Windows exactly but would be surprised if it was different).


    Exactly. Software side and hardware side are two entirely different things.
    You could possibly downscale a 2D image, sure. Perhaps you could even downscale the Windows desktop, since it is 2D, but there are a lot of other issues that I'm sure AMD doesn't have enough resources to solve. I mean, it took them this long to finally reach a point where a decent Linux driver has been released. Might have something to do with the Steam Machines being Linux-based, perhaps.
    Nvidia has had a completely different architecture from the beginning, so it's pointless to compare.
    If you really want to get deep into how it all works, you should read up on the Scalable Graphics Engine that GCN has.
     
  7. FunkyMike

    FunkyMike Guest

    Messages:
    539
    Likes Received:
    0
    GPU:
    ATI 6850m /Intel HD3000
    I thought we had all been over the fact that VSR can be supported by VLIW and that it was a software limitation.

    Asder & Co did an amazing job at unlocking it for as long as they did.


    Spoiler: GCN had no magic VSR chips.
     
  8. bobalazs

    bobalazs Guest

    Messages:
    28
    Likes Received:
    6
    GPU:
    RX 480 4Gb
    Link your sources. I bet those were GCN cards as well.
     
  9. FunkyMike

    FunkyMike Guest

    Messages:
    539
    Likes Received:
    0
    GPU:
    ATI 6850m /Intel HD3000

    Look, I realise you are new here, but I suggest you go back 7 pages or so.

    This "magic VSR" chip that AMD claimed was needed was debunked in this very forum.

    AMD wasn't in the best of places PR wise after their little "marketing" stunt.
     
  10. CalculuS

    CalculuS Ancient Guru

    Messages:
    3,282
    Likes Received:
    502
    GPU:
    GTX 1660Ti
    I can't believe people are hating on FunkyMike; without him, a good deal of mobile users would have been ****ed sideways by Microsoft and AMD.
     

  11. bobalazs

    bobalazs Guest

    Messages:
    28
    Likes Received:
    6
    GPU:
    RX 480 4Gb
    There is no hate, CalculuS; not among us nerds, for sure.


    I don't know what I'm looking for. This?
    How was game performance on pre-7000 cards with VSR, then, if it was solely a software limitation?
     
  12. Espionage724

    Espionage724 Guest

    On Linux with the OSS radeon driver, you can accelerate Xorg (basically the thing that handles window positions and where things go on a monitor, in this sense) on older (VLIW) GPUs through either EXA (if I understand right, basically a dedicated 2D engine on the GPU) or Glamor (2D gets run through OpenGL and/or EGL). Generally speaking, Glamor would be better to use in most cases, but it can be incompatible with older cards (the Xpress 200M I have in one computer doesn't work at all with Glamor).

    GCN cards don't have that 2D engine/functionality, so all 2D gets run through Glamor. I have no idea how this compares to anything from Intel or NVIDIA, though; the open-source NVIDIA driver, from my understanding, either defaults to Glamor or can only use it.

    So with that in mind, if you perform the scaling with xrandr, it's still running on and interacting with the GPU, considering the X session itself is using either EXA or Glamor (both accelerated by the GPU). 3D content is accelerated through either Glamor or EXA as well, so scaled games also interact with the GPU.

    So, in short, scaling with xrandr on Linux is already mostly hardware-accelerated if you really think about it; but AFAIK it doesn't use the specialized scaler hardware on newer GPUs, if present (at least, I haven't heard of this happening or being developed, but I know nothing of how such specialized hardware actually works anyway).

    Most of that info is based on somewhat weak background knowledge of how Xorg works, though, so I could be wrong (someone please feel free to correct me where needed).

    As for Windows: as I understand it, pre-Aero interfaces were drawn and scaled purely CPU-side, with only the 3D and video APIs (D3D, OGL, DXVA, etc.) running on the GPU, depending on how the GPU driver handled them. I'm not certain how resolution scaling works on Windows nowadays, or whether there's any way to do that sort of thing without the graphics driver; but if it can only be handled by the graphics driver, that would explain why AMD could "specify" any limitation they wanted.
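    For what it's worth, the xrandr approach mentioned above comes down to one command plus a little arithmetic for the oversized framebuffer. A minimal sketch (the output name HDMI-1 is a placeholder; xrandr's --scale option is real, though how well it works varies by driver):

    ```python
    def vsr_framebuffer(panel_w, panel_h, scale):
        """Internal render size for an xrandr --scale <scale>x<scale> setup
        on a panel_w x panel_h output (what VSR would call the super resolution)."""
        return int(panel_w * scale), int(panel_h * scale)

    # Roughly equivalent command on Linux (HDMI-1 is a placeholder output name):
    #   xrandr --output HDMI-1 --mode 1920x1080 --scale 1.5x1.5
    # A 1080p panel at 1.5x renders internally at:
    print(vsr_framebuffer(1920, 1080, 1.5))  # → (2880, 1620)
    ```

    The game then renders into that larger framebuffer, and the scanout scales it back down to the panel, which is essentially what VSR/DSR do on the driver side.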
     
    Last edited by a moderator: Nov 26, 2015
  13. nichenstein

    nichenstein Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    Sapphire R9 295x2
  14. FunkyMike

    FunkyMike Guest

    Messages:
    539
    Likes Received:
    0
    GPU:
    ATI 6850m /Intel HD3000

    Much love dude! Thank you for those kind words : )



    Yes, that is indeed the link. There were three ways of enabling VSR.

    1. The reactive way was to apply a dword fix onto the card, something CCC would generate if VSR was used. That is what you see in the link you posted.

    2. Asder was also releasing modded drivers that had VSR enabled in a more or less native way.

    This was possible because AMD didn't lock the kernel to reject certain dwords that were meant to enable VSR on GCN.

    It turned out you could use them on older hardware too. This was patched by AMD further down the line, but it didn't prevent the reactive dword method.

    3. The third method was to enable it for all cards via CCC; the setting was simply disabled in CCC for certain cards. This, however, was never released.
     
  15. db87

    db87 Member

    Messages:
    28
    Likes Received:
    3
    GPU:
    Gainward GTX 1070 @ 2025
    My secondary system has an AMD Radeon HD5850 1GB graphics card. The card is from September 2009; that's more than 6 years old, which is a long time to get uncompromised support.

    It was already known earlier that the VLIW architecture wasn't a good fit for DirectX 12 because of its inflexible scheduling, so AMD excluded those GPUs from DirectX 12. And don't blame AMD for that; the HD 5000 series were the first DirectX 11 graphics cards ever released.

    I still consider my AMD Radeon HD5850 1GB one of the best GPUs I've bought (210 euro). A lot of games still run great. For fun I recently tried Star Wars Battlefront, a game that officially lists an ATI Radeon HD 7850 with 2 GB as the minimum. My HD5850 ran the game perfectly fine at low detail with textures at medium: 70fps average and 50fps minimum at 1680x1050.

    It's not like you suddenly have a useless graphics card. Legacy support doesn't mean 0% support, either. So get real and stop complaining ;)
     
    Last edited: Nov 26, 2015

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090

    Nope. The 5000 series was fighting Fermi. The 200 series by NVIDIA were a holdout until Fermi was out, just six months after the 5000 series. Even if your *SPAM* was correct (and it's not), the 200 series by NVIDIA are still getting new drivers until 04/2016 even though they are on legacy status. In fact the last driver was as recent as 10/11/2015. Want to compare to what the 4000 series cards get? This:
    [screenshot: driver support offered for the 4000 series]

    Nobody is talking about performance enhancements. But if you believe that you can use a card even for indie OpenGL games with just the Windows Update driver, you are sorely mistaken.
    Orly? This is just one small example. Not only was it working, they wanted to disable it for the original GCN series too, based on the numbering and on supposed hardware "scalers" that only the 200 series had. They backed out at the last moment, and we haven't heard from anyone at AMD since. They also disabled it in the driver itself for the 5000/6000 series with the latest releases, to make sure nobody gets anything more than what they hand out. They did the same sh*t with antialiasing options back in the day too.

    The crux of the matter is: NVIDIA is actually supporting their cards much better. What's happening now, with older GCN cards still getting support from AMD, is just a side effect of the company's policy of sticking to a single architecture, not AMD suddenly caring. NVIDIA is maintaining three different architectures, but AMD can't even maintain two.
     
  17. xacid0

    xacid0 Guest

    Messages:
    443
    Likes Received:
    3
    GPU:
    Zotac GTX980Ti AMP! Omega
    And Nvidia had 2 WHQL drivers that killed GPUs. :banana:
     
  18. TheDukeSD

    TheDukeSD Guest

    Messages:
    145
    Likes Received:
    11
    GPU:
    MSI GT 1030 2GH OC
    Well, supporting their older products in one way or another shows that they care about their customers. Even if that support is more of a token one ("we did the minimum to make it run on Win 10, and after that we just don't touch that code"), it's still good for sales.

    If it were me making the decisions:
    - DX9 card support would have ended around April 8, 2014 (at the same time as Windows XP)
    - DX10 card support would have ended around April 11, 2017 (at the same time as Vista)
    - DX11 card support would have ended around January 14, 2020 (at the same time as Windows 7)
     
    Last edited: Nov 26, 2015
  19. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Even "just the minimum" means that you get actual support for the latest version of Windows, and that things like video acceleration, game profiles, and OpenGL still work. People forget how important OpenGL and profile support are, especially for older hardware that's gonna be used mainly for indie games.
     
  20. Espionage724

    Espionage724 Guest

    NVIDIA only lies about hardware specifications, removes working driver features (Mosaic on Linux, because Windows didn't support it), releases WHQL drivers that cause hardware failure, and openly hinders the development of their OSS Linux driver, but hey, all that can be excused because at least they support older GPUs better on Windows :p

    Some problems easily get a pass, and others don't. The problems that do slip by, though, are what raise the question of why WHQL even exists (like the 1-2 WHQL drivers that caused hardware damage on NVIDIA GPUs).

    Driver developers may as well just call it a "Microsoft tax", because I'm certain the program doesn't exist for legitimate driver quality... If anything, it holds up driver releases, considering OEMs have to wait for the "testing"; though I suppose it does stop script kiddies and other malicious entities (without disposable income) from quickly whipping up a dangerous driver and distributing it.
     
    Last edited by a moderator: Nov 26, 2015
