Don't buy R9 290X Crossfire Battlefield 4 Mantle or DirectX

Discussion in 'General Hardware' started by ysn, Mar 8, 2014.

  1. ysn

    ysn Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    MSI R9 290X GAMING
Hey guys,

    This is my first post on this forum. I need help, and I also want to share this important information. One month ago I started building my first new gaming PC, built for Battlefield 4. My specs:
    -------------------------
    i7 4770K @ 4.5
    MSI Z87 Xpower
    2x MSI R9 290X GAMING 4GB (Crossfire)
    16GB Corsair Dominator Platinum 2400MHz
    Samsung SSD 840 Pro 256GB
    PSU Corsair RM1000
    Windows 8.1 64Bit
    Samsung Full LED HD TV UE46F5000, 100Hz Clear Rate Motion
    --------------------------

    I thought I had a beast setup with those 290X in Crossfire, but I was wrong...

    Since day one I had issues playing Battlefield 4: poor smoothness, lower FPS than expected, lag, etc. The FPS I was getting was terrible. I should mention that 290X CROSSFIRE reviews are mostly, or only, done above 1920x1080 resolution.

    My first step was to change my motherboard to the MSI Z87 Xpower (replacing my Asus Z87 Plus/Deluxe). The MSI Z87 Xpower supports PCI-e 3.0 at dual x16/x16. I have seen lots of people say that it doesn't make a difference to have x8/x8 2.0/3.0 etc. In my case, they were wrong: I got a huge FPS boost, I think it was double. I've read about a lot of other guys having R9 290X Crossfire issues, so this might help you. Test it out.

    Went from an average of 80/100 FPS to 180/200 FPS, Crossfire enabled (and there is a big problem with Crossfire I will show and talk about further down).

    I was happy with the results. I was always testing in the TEST RANGE, using the internal BF4 FPS monitor: open the console with the ~ key, then type Perfoverlay.drawfps 1 and press ENTER. I couldn't really see a difference between Crossfire ON and OFF. Both were smooth, and the difference was something like Single GPU 140 FPS vs.
    Crossfire 200 FPS (maybe more; values above 200 are not supported).
    AMD Mantle was Enabled. Using AMD Catalyst version 14.1 (beta).

    I benchmarked with Unigine Heaven 4.0 and got the same results as in this review (unfortunately the link was removed, because of a forum rule).
    (I'm going to try to contact Paul from this video on Newegg TV. At 2:52 you can see that at 1920x1080 resolution the R9 290X has problems with 3-/4-way Crossfire: something like +2 FPS over 2-way R9 290X. At 4:30 he talks about the bad scaling while benchmarking Crossfire, and he mentions BF4 at 1080p being bad. I will try to find out what he means by that. A little further on he says that AMD designed these cards for 4K, not 1080p.)

    Meanwhile I have played Battlefield 4 a lot and have followed steps to get the best FPS boost and smoothness while playing. Found tips on forums (thanks guys :) ) and YouTube.
    -Unparking CPU cores
    -EnableULPS: set every instance found in regedit to 0, meaning off (some sort of power-saving management)
    -Control Panel: under Power Options, select High Performance
    -BIOS: disable Intel EIST


    Still, while I was playing I noticed that something was off, but I just went on playing. I couldn't believe that this was the result of the money I spent. Others on YouTube had older systems than mine and their game experience looked much better. I don't think it was only because they were using desktop monitors with higher resolutions like 2560x1440 etc.

    At this moment I'm using AMD Catalyst version 14.2 (beta). Two important highlights:
    - Mantle: Multi-GPU configurations (up to 4 GPUs) running Battlefield 4 are now supported
    - Frame Pacing for Dual Graphics and non-XDMA configurations above 2560x1600 are now supported with Battlefield 3 and Battlefield 4

    So what I did yesterday was disable Crossfire, and I was shocked. The game looked and felt great while playing; I've never seen Battlefield 4 this smooth. Finally I saw the real FPS that was being shown on screen. The R9 290X is a beast. Unfortunately, only in single-GPU mode. When putting 2x R9 290X Crossfire to work, you lose. Please look at the pictures below. It's weird.

    My description of what is happening, in words. (Probably tomorrow I will record two videos, Crossfire ON and OFF. Not sure if the difference will show up in a recording. I can't record above 60 FPS with Mantle at the moment; I'm using Mirillis Action in desktop mode to record, and changing Fullscreen in the BF4 video settings to Borderless or Windowed.)

    CROSSFIRE enabled shows higher FPS on screen in comparison with a single R9 290X. That is what is supposed to happen, right? But when you actually play, Crossfire is worse. I think it has something to do with the two cards not working together properly, and with drivers. The only information I could find is in the above YouTube video, but that's not the complete story I'm looking for, nor any solution. What is wrong with these cards?

    So after using "Perfoverlay.framefilelogenable 1" and analysing the logs saved in My Documents, I got the pictures below. I played two rounds for testing: Conquest Large, 64 players, on Ultra.

    You can see that the average FPS is only 20/25 FPS higher in Crossfire. With a second card enabled I expect at least somewhere near a 50% FPS boost; 20/25 is too low, it needs to be at least 60 FPS more with CROSSFIRE enabled. And the average GPU FPS is lower in Crossfire than with a single GPU. The minimum GPU FPS shows the same: lower in Crossfire mode.

    These numbers are not shown in the FPS monitor while playing, but while playing you can feel something is off. I see 150 FPS in Crossfire but it feels like 30/60 FPS, while using a single R9 290X is great. You can just see it, and it gets more obvious after playing for a while and switching back to a single R9 290X and great gameplay.
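    As an aside, the framefilelogenable logs are just per-frame times, so you can sanity-check "smoothness" yourself instead of eyeballing the overlay. A minimal sketch, assuming a plain list of frametimes in milliseconds (the actual BF4 log format may differ):

    ```python
    import statistics

    def summarize_frametimes(frametimes_ms):
        """Summarize a list of per-frame render times (milliseconds)."""
        avg_ms = statistics.mean(frametimes_ms)
        return {
            "avg_fps": 1000.0 / avg_ms,
            "min_fps": 1000.0 / max(frametimes_ms),   # worst single frame
            "stdev_ms": statistics.pstdev(frametimes_ms),  # smoothness proxy
        }

    # Perfectly paced 10 ms frames vs. the same average with alternating spikes:
    smooth = summarize_frametimes([10.0] * 8)
    jerky = summarize_frametimes([5.0, 15.0] * 4)
    print(smooth)  # avg 100 FPS, stdev 0
    print(jerky)   # avg 100 FPS too, but stdev 5 ms -> feels far less smooth
    ```

    The point of the stdev: two logs can share the same average FPS while one alternates fast/slow frames and feels far worse, which matches what's described above.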


    Here is where the pictures should be, but the forum doesn't allow links until I've made at least 10 posts (forum rule).




    So guys, first of all thanks for reading. Maybe you have learned something. Finally, I want to ask for your help. I've seen others on YouTube with R9 290X Crossfire and no problems; they use resolutions above 1920x1080. I was also wondering about my TV. It's 100Hz, but I can't select anything above 60Hz in the BF4 settings. Still, I don't think this could be the issue causing the strange FPS drop and smoothness loss while playing in Crossfire. Do you think I should drag my PC to the store and test it with 120Hz?

    Is the resolution the problem? Does R9 290X Crossfire just not work well at 1920x1080? Is it made for 2560x1440, 4K etc.? Is this the wrong place to ask for help?

    I need to do some more testing with other games. I think I'll try Crysis 3 and BF3 tomorrow if possible. For now, thanks.
     
  2. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    For good reason: 290X CF is utterly pointless at 1080p, since you will usually find no CPU capable of pushing the framerates these cards dole out at 1080p, which, as you have seen, tend to be way over 120FPS.

    Because you're running at an insane framerate. When that happens, memory and bus bandwidth become the limiting factor, so boosts occur when you boost your memory and bus speeds.

    You're running a 60Hz TV. The second card is basically of no practical benefit. Your FPS is way above 60FPS with one card already.

    Your framerate is already way too far above your refresh rate for you to be able to tell any difference. Of course there's no difference.

    Someone should try telling Paul that a heavily CPU-bound application such as BF4 cannot take advantage of more than 1-2 such powerful GPUs at a relatively low resolution of 1080p. Of course this CrossFire solution is not made for 1080p. That's like grabbing a Titan and running 640x480.


    Huge huge CPU bottleneck on 64 player maps @ 1080p. One card is more than enough at this resolution. The system can't go any faster. You're not going to get a 50% boost unless your CPU is that extra % faster.
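    That bottleneck argument is easy to put into numbers. A toy model (all FPS figures below are illustrative, not measurements): a frame can't leave the pipeline faster than the slower of the CPU and GPU stages.

    ```python
    def effective_fps(single_gpu_fps, gpu_scaling, cpu_cap_fps):
        """Frames only come as fast as the slower of the two stages:
        the GPUs rendering, or the CPU preparing frames."""
        gpu_fps = single_gpu_fps * gpu_scaling
        return min(gpu_fps, cpu_cap_fps)

    # Suppose a 64-player map lets the CPU feed only ~150 frames/s at 1080p.
    print(effective_fps(140, 1.0, 150))  # one card: GPU-bound at 140
    print(effective_fps(140, 1.9, 150))  # CrossFire: CPU-bound at 150, not 266
    ```

    So even with near-perfect GPU scaling, the second card only buys the gap between the single-GPU framerate and the CPU's ceiling.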

    Your TV has a motion smoothing technology that is called "100Hz". The TV might work internally at 100Hz, might use black frame insertion, or any technique that shows perceptibly smoother motion than running at FPS = refresh rate with no interpolation or fooling around whatsoever.

    The TV is most likely not capable of receiving more than a 60Hz signal. You could attempt to feed it more by using CRU and adding custom resolutions with refresh rates higher than 60Hz. Watch out for HDMI bandwidth limits so stick to testing below ~74-75Hz first and use this test. If the UFO skips, then you're getting frameskip. If you have a keen eye, you could also notice boxes being missed in the frameskip test there, or a camera could show it.
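    For the HDMI bandwidth part, you can estimate whether a custom mode fits before trying it in CRU: the pixel clock is just total horizontal pixels times total vertical lines times refresh rate. A rough check, assuming reduced-blanking 1080p timings of about 2080 x 1111 total and the ~165 MHz TMDS ceiling of older HDMI/single-link DVI ports (HDMI 1.3/1.4 allows roughly 340 MHz):

    ```python
    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        """Pixel clock a video mode needs, in MHz."""
        return h_total * v_total * refresh_hz / 1e6

    # 1080p with reduced-blanking timings (~2080 x 1111 total pixels per frame):
    for hz in (60, 70, 75, 100, 120):
        clk = pixel_clock_mhz(2080, 1111, hz)
        verdict = "fits a 165 MHz link" if clk <= 165 else "needs a >165 MHz link"
        print(f"{hz:3d} Hz -> {clk:6.1f} MHz  ({verdict})")
    ```

    On these assumed timings the 165 MHz limit gives out right around the low 70s of Hz, which lines up with the "test below ~74-75Hz first" advice.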

    You could always use resolution scale in BF4 to simulate a higher refresh rate, and see how the dynamics change. Surely the two cards will choke at 200% resolution scale, and you might see ~100% scaling in that scenario.

    The title is misleading, btw.
     
  3. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    290X xfire is complete overkill for 1080p. I ran 780Ti SLI at 1080p for a month, so I would know. These cards are really made for 1440p or triple screens. Something weaker like 680 SLI or similar will benefit at 1080p if you want to max out games at 60FPS.
     
  4. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Judging from the first post....the OP didn't actually pay attention to the reviews and had no idea what he was doing. People should really seek advice from those with experience before building their first gaming system.
     

  5. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
  6. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Just sell that extra card or swap for something else.

    One 290x is enough like you saw yourself + smooth.
     
  7. ysn

    ysn Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    MSI R9 290X GAMING
    Excuse me, Yasamoka, but your point got through to me; that was what I needed. I want to thank you, man, for such a great reply to my post. I learned a lot from it. The first thing I'm going to try is a different screen, if possible with a higher resolution and higher Hz, to see if there is any difference. Like someone told me on another forum: I connected this beast to an iPhone. Not a good idea. I think that's my problem.

    Do you have any recommendation for a good HD TV for gaming? 46 or 50 inch. I need one with a high refresh rate: 120, 144, 240Hz? And I guess true Hz; there are lots of marketing numbers out there on TVs. I don't have a desk in the room for a smaller monitor.
     
    Last edited: Mar 8, 2014
  8. scoter man1

    scoter man1 Ancient Guru

    Messages:
    4,930
    Likes Received:
    217
    GPU:
    MSI GTX 1070ti
    Grab a Qnix QX2710. It's cheap, 1440p, and most people get them up to around 120Hz.

    Edit: I didn't see 46". I feel like a lot of big TVs have a lot of "special features" that make their response times poor. I suppose you can turn them off...
    Still, that big TV is only going to be 1080p, so that won't solve your "problem".
     
  9. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    A big screen TV will still run 1080p, unless you buy a 4K one. Imo, sell one of the 290Xs or buy a Korean monitor (1440p, 100Hz+). For 1080p 60Hz, you really don't need 290X crossfire.
     
  10. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    If you want a TV, it'll either have to be 1080p or 4K, as eclap said. So if you're going for a 1080p TV, it'll have to be 120Hz to be different from the one you currently own. Few TVs are 120Hz, and the TVs that use active 3D have the best chance of accepting a 120Hz signal. I haven't seen a TV officially supporting 120Hz input, so you'll have to add it as a custom resolution, which is easy to do with CRU.
    Here's a list of confirmed 1080p 120Hz TVs:
    http://www.blurbusters.com/overclock/120hz-pc-to-tv/

    The advice other gurus have given you is sound advice. Either sell the second card, or go for a higher-resolution monitor / 1080p 120Hz monitor / TV.

    On a sidenote, don't TVs almost always have input lag, even if set to PC or game mode? Lowest was about 18ms maybe? Some can reach 100+ ms, that's crazy.
     

  11. AbjectBlitz

    AbjectBlitz Ancient Guru

    Messages:
    3,463
    Likes Received:
    2
    GPU:
    R390 1200/1720
    So much rubbish in this thread. sigh.

    By the sound of it, microstutter is what he is getting, and having 200 FPS on 60Hz does matter. Crossfire at 1080p is good; the more FPS you are pushing, the better, if you are receptive to it. It FEELS better. If you run something like vsync, then it would be a waste.

    Same reason I would not use vsync or run on a high-input-lag screen. Feel is important; many can't notice a difference from mouse to mouse, in input lag, or with vsync. Those things are lost on those folks.

    Playing on a TV is useless; it's not real 100Hz and it has insane input lag. No more to say.

    120 fps feels better than 60 any day on any screen, on any resolution.

    :bang:
     
    Last edited: Mar 10, 2014
  12. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Please explain to me how 290X CF can have microstutter, with frame pacing on, VSync off, in BF4 (the most backed title for AMD), with a CPU bottleneck.

    Cards that used to have microstutter (which is now fixed on DX10+ and Eyefinity) did NOT have ANY microstutter when there was a CPU bottleneck.

    Bah, even if there were 100% microstutter, with the frames of the second card matching the pace of the first and being completely useless, the perceived framerate would be half, which is still way above 60FPS. It cannot possibly feel like 30FPS, as the OP is describing, if it were microstutter.
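    To illustrate the arithmetic in that last point (toy numbers, not measurements): worst-case microstutter means frames arrive in near-simultaneous pairs, so the counter shows the full rate while motion only updates at the rate of the gaps between pairs.

    ```python
    def perceived_fps(frame_intervals_ms, dupe_threshold_ms=1.0):
        """Counter FPS counts every frame; perceived FPS ignores frames that
        arrive almost on top of the previous one (they add no new motion)."""
        total_ms = sum(frame_intervals_ms)
        counter = 1000.0 * len(frame_intervals_ms) / total_ms
        distinct = sum(1 for dt in frame_intervals_ms if dt > dupe_threshold_ms)
        return counter, 1000.0 * distinct / total_ms

    # 150 frames logged per second, but delivered in pairs 0.1 ms apart:
    counter, perceived = perceived_fps([0.1, 13.2333] * 75)
    print(f"counter ~{counter:.0f} FPS, perceived ~{perceived:.0f} FPS")
    ```

    Half of 150 is still 75, comfortably above 60, which is the point: even total microstutter could not make a 150 FPS counter feel like 30.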
     
    Last edited: Mar 10, 2014
  13. AbjectBlitz

    AbjectBlitz Ancient Guru

    Messages:
    3,463
    Likes Received:
    2
    GPU:
    R390 1200/1720
    Microstutter has been reduced, but crossfire is still not as smooth as a single card. My single 290 is smoother at 60 than my old crossfire at 60, even with frame pacing.

    He does not have a CPU bottleneck if he is hitting 200 fps.

    It's been known for a long time that crossfire at 100 FPS does not feel as smooth, but most people don't notice it. Most people don't notice 120Hz vs 60Hz, or 60 vs 120 FPS. Most people are wrong. The OP obviously is noticing something is not as smooth.

    The OP has to post a video, single vs crossfire, and see if there is anything obviously wrong.
     
    Last edited: Mar 10, 2014
  14. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Stop kidding yourself. CrossFire is as smooth as anything out there. What's your "old" CrossFire exactly? And did you lock or VSync at 60FPS?

    He either has a CPU bottleneck or is capping out his framerate at 200FPS. BOTH cause GPU usage to drop below full, and there is NO microstutter when the GPUs are not saturated and hard at work.

    I'm running a 110Hz monitor. CrossFire is as smooth as can be. Games that don't need a second GPU, I just turn that off. Never felt that I gained any smoothness as long as my FPS exceeds my refresh rate.

    Most people don't notice 60 vs. 120? Stop kidding yourself. You have no statistics about this, so don't claim that "most" can't notice the difference.

    There is not a single user on Guru3D that has upgraded from 60Hz to 120Hz and hasn't felt the difference. The vast majority would even refuse to go back to 60Hz.

    And...as long as this discussion is held at Guru3D, all we care about is if *we* can tell the difference. And yes, we can tell. Most goes out of the window.
     
  15. AbjectBlitz

    AbjectBlitz Ancient Guru

    Messages:
    3,463
    Likes Received:
    2
    GPU:
    R390 1200/1720
    I had 290 crossfire and did notice it was smoother with one card.

    Hitting 200 FPS is hardly what I would consider a bottleneck; something else is at play. I did find that with crossfire I was better off setting an FPS limit, such as 59.4 for a 60Hz screen. On a single GPU this is less pronounced.

    Average Joe does not notice that 120Hz or 120 FPS is better than 60; average Joe will argue blind that you don't need more than 30 FPS.

    But you are right "we can tell. Most goes out of the window".
     

  16. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    I most certainly can tell the difference between 60Hz and 120Hz; the difference is quite amazing. I'm sure I am not the only person on here who knows that.
     
  17. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    It depends on the monitor... and the person. On a 60Hz monitor, the difference between 60FPS and 120FPS isn't noticeable (other than screen tearing) because the monitor can only display 60FPS. On a 120Hz or 144Hz (non-G-Sync) display, the difference should be easily noticeable...
     
  18. chinobino

    chinobino Maha Guru

    Messages:
    1,140
    Likes Received:
    75
    GPU:
    MSI 3060Ti Gaming X
    Completely made up statistics.

    I used to play CS:S at 1152 x 864 resolution @ 100 Hz (capital H for Hertz; it is named after the German physicist Heinrich Hertz, after all) on my old CRT and then went down to 60 Hz on a 1920 x 1200 widescreen LCD; the difference was very noticeable.

    I put up with the Samsung 60 Hz LCD monitor for 4 years and when I bought an Asus 120 Hz LED monitor to replace it - the difference was night and day.

    More made up statistics.

    Only if average Joe is a console player and has never seen or used a PC.
     
  19. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    There is a difference. Less frametime variance means more consistency and a smoother experience. Shorter frametimes mean reduced input lag. The wasted frames are the ones that are fully rendered but then superseded by newer frame(s) within one refresh cycle. Those are shown partially, but there's no way to skip rendering them, since we cannot know in advance how long a frame will take to render if we delay its rendering towards the end of the refresh cycle.

    But it's nowhere near groundbreaking. It does make everything feel more responsive though. Used to get that feeling @120FPS @ 60Hz and now @220FPS @ 110Hz (Bioshock Infinite). Don't even have to look at FPS, I just go "ahhh, now FPS is double the refresh rate".
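    A toy calculation of the two quantities in play here, frametime (the input-lag side) and frames beyond the refresh rate (the partially shown, "wasted" side); the 220FPS/110Hz pair is just the Bioshock Infinite example above:

    ```python
    def frame_stats(fps, refresh_hz):
        """Per-frame render time, and how many frames per second exceed what
        the display can show in full (those appear partially or are superseded)."""
        frametime_ms = 1000.0 / fps
        excess_per_sec = max(0, fps - refresh_hz)
        return frametime_ms, excess_per_sec

    for fps, hz in ((120, 60), (220, 110)):
        ft, excess = frame_stats(fps, hz)
        print(f"{fps} FPS @ {hz}Hz: {ft:.2f} ms/frame, {excess} frames/s beyond refresh")
    ```

    Doubling FPS relative to refresh halves the frametime either way, which is one plausible reading of why both configurations "feel" the same.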
     
  20. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Personally, I see/feel no difference between 60fps and 200fps. I don't even notice screen tearing. This is why I said that it depends on the person. Anything below 60fps drives me nuts, but I don't notice (without a frame counter) when it goes over 60fps.
     
