AMD, I am disappoint

Discussion in 'Videocards - AMD Radeon' started by Neo Cyrus, Feb 27, 2011.

  1. Scyphe

    Scyphe Master Guru

    Messages:
    426
    Likes Received:
    1
    GPU:
    MSI Radeon 7950@1150/1700
    It sets AA to Supersampling mode instead of Multisampling or Adaptive.

    To me, no. But since it seems like a very subjective question this is really meaningless.

    Not that I know of.

    I use a balanced setup as standard, then I use RadeonPro to create overriding settings specific for each game.

    My standard settings:
    4xEQ, Edge Detect
    16xAF
    High Quality AI (AI off)
    AA Mode: SSAA

Personally I'm not super-touchy about AA, and I don't freak out if some edges are jagged. You seem to be quite obsessive about it, to be honest.
     
  2. Nato.dbnz

    Nato.dbnz Ancient Guru

    Messages:
    3,264
    Likes Received:
    0
    GPU:
    5870 Crossfire
Neo Cyrus, can you please post some screenshots of the image issues you're having? I have 5870s and I don't think I'm having the same issues as you. It would be good if you could point out exactly what the issues are, with some screenshots to illustrate.
     
  3. Khronis

    Khronis Member Guru

    Messages:
    163
    Likes Received:
    0
    GPU:
    ATI 5870 x2
I thought adaptive AA combined MSAA and SSAA: using MSAA generally, and SSAA only on pixels (such as alpha-tested textures) that were still aliased. I may be wrong, but that's the general sense I got from Googling.
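If that description is right, the per-pixel decision can be sketched roughly like this (illustrative Python only; the function and flag names are made up, not ATI's actual logic):

```python
# Hypothetical sketch of how "adaptive AA" might pick a sampling mode
# per pixel, assuming the common description: multisample everywhere,
# but fall back to full supersampling on alpha-tested (transparent)
# texels that MSAA alone cannot smooth.

def aa_mode_for_pixel(covers_polygon_edge: bool, alpha_tested: bool) -> str:
    """Choose the anti-aliasing strategy for one pixel."""
    if alpha_tested:
        # MSAA shades only once per pixel, so alpha-test cutouts
        # (fences, foliage) stay jagged; supersample these instead.
        return "supersample"
    if covers_polygon_edge:
        # Geometry edges are handled fine by multisampling.
        return "multisample"
    # Interior pixels need no extra shading samples at all.
    return "single_sample"
```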
     
  4. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,287
    Likes Received:
    346
    GPU:
    GTX 1080 Ti @ 2GHz
I should have worded it better. I know it says it changes from standard MS to adaptive to SS, but what I mean is: what exactly is the supersampling setting? I'm used to nVidia terminology: 1x2, 2x1, 2x2, 3x3, 4x4, etc. for supersampling.

    I always just assumed one number is width and the other is height, so 1x2 would render one dimension at double resolution while leaving the other at default, using less resolution than rendering both at double as 2x2 would. Though I have no idea if that's correct, hence why I asked.
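Under that reading of the NxM notation (an assumption, not confirmed nVidia terminology), the internal resolution and relative cost work out like this:

```python
# Sketch of the NxM supersampling notation described above, assuming
# N and M are the horizontal and vertical sampling factors: 2x2
# renders both axes at double resolution, while 1x2 doubles only one
# axis and so costs half as much.

def supersample_cost(width, height, n, m):
    """Return (internal resolution, relative fill-rate cost) for NxM SSAA."""
    internal = (width * n, height * m)
    cost = n * m  # each final pixel is shaded n*m times
    return internal, cost

# e.g. at 1920x1080: 1x2 renders 1920x2160 (2x cost),
# 2x2 renders 3840x2160 (4x cost).
```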

In CCC/RadeonPro the slider just enables SS but doesn't say what exactly the setting is. As in, is 2x the equivalent of nVidia's 2x2? Because it certainly does not look like it. 8x supersampling on this card looks about the same as 2x2 SS did with my GTX280.

2x2 SS + 8x MS gave great results... I tried 8x SS + MLAA on this card to see if I could get something close; it looked arguably worse than 8x SS on its own and was a massive hit to performance.
    I wanted to do that, but I have no screenshots saved from my GTX280 to show the difference. I'd have to reinstall it and post two screenshots side by side to show you what I mean.
    I didn't find a clear answer... but that seems to be it. Does anyone know for certain?
     
    Last edited: Feb 27, 2011

  5. Nato.dbnz

    Nato.dbnz Ancient Guru

    Messages:
    3,264
    Likes Received:
    0
    GPU:
    5870 Crossfire
You should post some anyway, just so we can have a look. I went from a 280 to my 5870s and I personally thought the IQ increased coming to ATI, although driver stability definitely decreased. Once you've posted some shots, an nVidia user can maybe give us some comparison shots so we can see the differences.

    At the end of the day though, if what you're experiencing IQ-wise is a deal breaker, maybe you should take it as a loss, sell the card and jump back on the green wagon.
     
  6. isidore

    isidore Ancient Guru

    Messages:
    6,210
    Likes Received:
    22
    GPU:
    RTX 2080TI GamingOC
  7. sykozis

    sykozis Ancient Guru

    Messages:
    21,394
    Likes Received:
    804
    GPU:
    MSI RX5700
I've never, at any point, experienced anything but flawless image quality from ATI products. I'm sorry you're having issues. BUT, according to IQ reviews done on cards from each manufacturer, nVidia's GTX460 is their absolute best card for image quality, yet it still falls just short of every ATI product. The review was done using the same image quality testing software that so many people rely on to detect otherwise undetectable "flaws" in images.

    If your image quality is really that bad (for those that have seen just how bad image quality can get, the Radeon 9600XT and GeForce 5600XT were the absolute worst I've ever seen), consider an RMA. The only time I've seen image quality drop to the point that you're describing has been related to a faulty GPU or VRAM (again, with the Radeon 9600XT and GeForce 5600XT)...

    Again, I'm sorry you're having such issues, but this is all I can recommend. I've considered reinstalling my HD4850 to see if there's some driver issue, but I've enough problems getting and keeping my GTX460 working properly to chance it. If your card is an "unlocked" HD6950, it's possible that the disabled shaders were actually damaged, but that's just speculation on my part and not backed by any factual information. If you're simply that pissed off, switch back to nVidia...

Either way, there's really no reason to take your frustration out on forum members who've never experienced the issues you're describing. I've had massive IQ issues with my GTX460 when trying to watch movies in PowerDVD, MPC-HC, WinDVD, Windows Media Player, etc. So far, my issues haven't been resolved by nVidia with any driver release. My issues leave HTPC functions completely unusable... and I bought PowerDVD 10 Ultra Mark II... and it's now completely useless to me after spending $60USD for it, because images are either pixelated, blurred or distorted to the point that I get headaches from trying to watch any movies (including Blu-ray)... Trust me, I know how frustrating poor IQ can be. But it's not worth being an ass to others who actually want to help... even if they really can't.
     
    Last edited: Feb 27, 2011
  8. Krogtheclown

    Krogtheclown Master Guru

    Messages:
    960
    Likes Received:
    0
    GPU:
    FuryXfire 3440x14freesync
I don't think anyone here upgrades video cards more than I do, every 8 months without fail so far. I buy both NV and AMD, and I can say that I prefer ATI's picture quality but think NV has better AA. The only problems I had with ATI drivers were me messing them up, and 5970 quad-Crossfire had some problems. I don't really have any issues with NV drivers, but I've never run a quad setup with NV either.
     
  9. The Chubu

    The Chubu Ancient Guru

    Messages:
    2,540
    Likes Received:
    0
    GPU:
    MSi GTX560 TwinFrozrII OC
MLAA is better when you can't get playable performance at 8x AA, so you combine 2x or 4x AA with MLAA. Not the best quality, but the jaggies are gone.

    I actually like the effect, so I force 4x Box AA with MLAA (I didn't notice the difference between that combination and 4x Edge Detect with MLAA, but Edge Detect feels less smooth) in FO New Vegas so it stays >40 fps (in-game AA causes crashes when loading interior or exterior cells).

    But that's just me. My last two cards before this one were a GF6600LE and a GF8600GT (which surprisingly let me play some texture mods on FO3 thanks to its cheaper 512MB of DDR2), so you can guess that I never set AA higher than 2x before (and 2x on very few games). I can crank 4x AA with MLAA and I'm happy with my first ATi card...
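The MLAA + low-MSAA combination described above works because MLAA is a post-process: it scans the finished frame for color discontinuities and blends across them, so its cost is the same regardless of what MSAA level it runs on top of. A rough 1-D sketch, purely illustrative and not the actual AMD implementation:

```python
# Toy 1-D stand-in for MLAA's pattern-based blending: soften hard
# luminance steps in a row of grayscale pixels after rendering.

def mlaa_blend_1d(row, threshold=0.5):
    """Blend across hard luminance steps in a row of grayscale pixels."""
    out = list(row)
    for i in range(1, len(row)):
        if abs(row[i] - row[i - 1]) > threshold:
            # Average toward the midpoint on both sides of the step,
            # as a crude stand-in for coverage-based blending.
            mid = (row[i] + row[i - 1]) / 2
            out[i - 1] = (row[i - 1] + mid) / 2
            out[i] = (row[i] + mid) / 2
    return out

# A hard 0.0 -> 1.0 edge becomes a gradual ramp.
```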
     
  10. naike

    naike Ancient Guru

    Messages:
    2,021
    Likes Received:
    0
    GPU:
    Asus EAH5870
To be honest, to me it sounds like you are imagining that it's worse.
    If you already looked at your AMD card like it was a piece of crap, then it sure will seem like crap when you try it out.

    I'm not very concerned about IQ, so I don't know.
    Anyway, can you guys post screenshots of any games you have with proper settings and IQ? Just so I can see the difference from mine; any game will do.
    Preferably BC2, as I have it installed.
     

  11. sykozis

    sykozis Ancient Guru

    Messages:
    21,394
    Likes Received:
    804
    GPU:
    MSI RX5700
It's possible that he's imagining it... but I'd say it's unlikely as well. I thought I was imagining my IQ issues with my 460 in HTPC operation... as have others. My wife has seen my 4850, 275 and 460 running PowerDVD and has confirmed that my 460 has vastly degraded IQ during HTPC operation.

    Fact is, not everyone will have the same experience in all cases. I believe he's actually having these IQ issues as he's described, simply because I've seen it in the past. My Radeon 9600XT actually looked worse as IQ settings were increased... as did my GeForce 5600XT. In both cases, the GPU was at fault. He needs to address his concerns of poor IQ with the graphics card maker and see if they will allow an RMA. I've never seen a properly functioning graphics card produce the results he's describing.
     
  12. TheHunter

    TheHunter Banned

    Messages:
    13,408
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    ^
I found the ATI 9600 Pro's IQ way better than my GF2 MX400's, or even my 6600GT's later...
     
  13. TDurden

    TDurden Ancient Guru

    Messages:
    1,980
    Likes Received:
    3
    GPU:
    Sapphire R9 390 Nitro
    Can you give a link?
     
  14. Raiga

    Raiga Maha Guru

    Messages:
    1,099
    Likes Received:
    0
    GPU:
    GPU
    @Neo Cyrus

With the same settings, ATI image quality is better. If you want to disagree, POST some SCREENSHOTS...

    Just because you don't know how to use the driver settings doesn't mean "AMD is at fault".

    And as far as driver settings are concerned, ATI/AMD drivers give you all the options you need to control your graphics. Unlike nVidia, which doesn't officially* give you options to change these features in its Windows drivers; they can only** be changed using a 3rd-party tool.

    LOL, what bull crap... that means you just accept the numbers shown on the driver/tool screen and can't actually tell the difference.

    Plus... did you know ATI/AMD SSAA uses RGSSAA (rotated grid), while the GTX280 used a normal grid-algorithm SSAA (where diagonal lines were not even aliased!!)

    Edit -

    Ahem..correction

    should be

    (where some of the diagonal lines were not even anti-aliased properly)
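For anyone curious, the ordered-grid vs rotated-grid difference can be shown with a toy sketch (the sample positions below are illustrative, not ATI's or nVidia's actual patterns): rotating the grid gives every sample a distinct x and a distinct y coordinate, so near-vertical and near-horizontal edges cross more sample rows/columns.

```python
# Illustrative 4x sample patterns within one pixel (unit square).

def ordered_grid_4x():
    # 2x2 ordered grid: only two distinct x and two distinct y positions.
    return [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

def rotated_grid_4x():
    # 4x rotated grid: four distinct x and four distinct y positions.
    return [(0.375, 0.125), (0.875, 0.375), (0.625, 0.875), (0.125, 0.625)]

def distinct_axis_positions(samples):
    """Count distinct x and y coordinates among the sample points."""
    xs = {x for x, _ in samples}
    ys = {y for _, y in samples}
    return len(xs), len(ys)

# ordered grid -> (2, 2); rotated grid -> (4, 4): more effective
# gradient steps on near-axis-aligned edges for the same sample count.
```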
     
    Last edited: Feb 28, 2011
  15. nikitash

    nikitash Banned

    Messages:
    337
    Likes Received:
    0
    GPU:
    MSI R6870 HAWK

nVidia and ATI Radeon GPUs have different architectures, so it's not necessary for one to have the same features as the other, or vice versa. What matters is that the application you're running supports your hardware and is optimized for it; for that I blame Microsoft and uncompetitive developers.
     

  16. nikitash

    nikitash Banned

    Messages:
    337
    Likes Received:
    0
    GPU:
    MSI R6870 HAWK
Damn Raiga, that's something new I've learned from you about SSAA. Thanks for that :)
     
  17. LinkDrive

    LinkDrive Ancient Guru

    Messages:
    4,675
    Likes Received:
    0
    GPU:
    Toaster
The reason your image is blurry, Neo, is that you're not using a tool that can adjust your LOD. Start using RadeonPro and you'll get image quality on par with 4x4 SSAA, and possibly even 8x8 if you set the AA level that high.
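For reference, the commonly cited rule of thumb for the LOD adjustment mentioned here is a bias of -0.5 * log2(sample count); whether RadeonPro applies exactly this internally is an assumption on my part.

```python
import math

# Rule-of-thumb texture LOD bias for supersampling: with N samples per
# pixel, bias the mipmap LOD by -0.5 * log2(N) so textures are sampled
# at the sharpness the higher internal resolution can support.
# Without this, SSAA tends to look blurry, as described in the thread.

def ssaa_lod_bias(samples: int) -> float:
    """Negative LOD bias matching an N-sample supersampling mode."""
    return -0.5 * math.log2(samples)

# 4x SSAA -> -1.0, 8x -> -1.5, 16x ("4x4") -> -2.0
```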
     
  18. att_user

    att_user Banned

    Messages:
    336
    Likes Received:
    0
    GPU:
    Radeon HD 5870 2 GB
Do not force AA settings via CCC that the game engine does not support. If your game only supports 4x AA, then that is the max AA you can get out of that engine. If you start forcing 8x AA or even SS on the game via the control panel, you may get very blurry images.

    For example, I tried to get Oblivion to work with 4x AA and 8x AA in SS mode, and it looked like ****. It looks even worse if you force 16x AF in CCC combined with 8x SSAA.

    Just use the in-game options and max them all out while setting all CCC options to "Use Application Settings", then only add the EQ Edge-Detect filter, set AA Mode to "Adaptive Multi-Sample AA", and set Catalyst AI to "High Quality".

    I think this is the maximum IQ you can get out of a Radeon card. Using the plain "Multi-Sample AA" setting results in lesser IQ.
     
  19. w1w

    w1w Member

    Messages:
    23
    Likes Received:
    0
    GPU:
    XFX 6950 2GB
OP sounds like a troll from the chans, tbh. He's bitching about something subjective for a start, and consistently mentions in one way or another how nVidia gets it right every time.
    So you know, don't feed the troll and all that.
     
  20. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,287
    Likes Received:
    346
    GPU:
    GTX 1080 Ti @ 2GHz
    I've been using RadeonPro the entire time.

I went to take some screenshots in Mass Effect 2 to show the blur, but it doesn't occur in that game. It must be game-specific then; the first two games I tried were blurred to hell and back, neither of which I have installed right now (Vindictus for anyone who has it, and... why do I forget).

    For those tools with like 2 posts each talking crap to me for daring to say that AMD screwed up:
[IMG]

    Back on topic... I don't suppose anyone knows why enabling super sampling in certain games would cause the screen to become really blurry?

    At least from the ME2 screenshots I found out that the slider settings between the regular and EQ settings are not the same thing for super sampling.

4x -
    [IMG]

    4xEQ -
    [IMG]

    Max -
    [IMG]
     
    Last edited: Mar 1, 2011
