Ryzen 3800x experience and help needed

Discussion in 'Processors and motherboards AMD' started by Bibanu, Dec 14, 2019.

  1. Bibanu

    Bibanu New Member

    Messages:
    8
    Likes Received:
    3
    GPU:
    Rx 480 8gb
    Hello,
    This will be a long post, but please have patience with me.
    Last month I took advantage of Black Friday and got myself an upgrade: Ryzen 3800X, Gigabyte X570 Aorus Elite and 2x8 GB HyperX 3200 MHz CL16. I always hoped AMD would make a comeback, because I remember my rig from way back: AMD 64 3000+ on skt 754, Abit mobo, 2x512 MB DDR, ATI, yes ATI X800 Pro, 80 GB HDD, Chieftec case and PSU and a massive 19-inch monitor. That PC rocked!! Anyway, before this upgrade I had an Intel i5 4460, 2x8 GB DDR3 1600 MHz, an Asus Z97-A mobo, an MSI RX 480 8GB Gaming X that replaced a dead Sapphire R9 390 Nitro, 2 SSDs, a Creative sound card, a Seasonic 750 W PSU and a Zalman H1 case. The last 4 years were OK, but performance started to fall, and then came Black Friday. From the old PC I reused the case, PSU, video card and the SSDs. The sound card wasn't carried over because it's PCI.
    I assembled the PC, everything was in order, updated the BIOS, installed Win 10 Pro, all the drivers and some games. My experience so far is underwhelming. I tested with Cinebench R20 and got 514 single core and 4970 multi core. A good score, but perhaps it can get better? I have not made any modifications other than selecting the XMP profile for the RAM @ 3200 MHz.
    In gaming I am not seeing a lot of improvement. In Jedi: Fallen Order the fps were all over the place, and the only stable setting was a 45 fps cap, so that's how I play it. MechWarrior 5 is a bust. I let the game autodetect the settings and I get between 30 and 40 fps with dips to 25; the fps are all over the place. I don't especially like the AA implementation either: it blurs a lot, and without AA the game looks like crap. The sharpening is not to my liking. I lowered the settings: view distance medium, effects medium, shadows medium, textures high, foliage medium, post processing low, AA high (because medium makes the game look bad), anisotropy 8x, sharpening off. Resolution scale is 100% at 1920x1080 and vsync is off. The fps are between 28 and 75 but mostly around 35, and the CPU and GPU frequencies are all over: Afterburner shows the CPU between 3.8 and 4.45 GHz and the GPU between 960 and 1303 MHz. The temps appear to be OK, 63 for the GPU and up to 70 for the CPU with the stock cooler. Locking the fps with Radeon Chill at 30 fps does the trick and I can play the game.
    The Outer Worlds is a nice game (I got it for free along with Borderlands 3 when I acquired the CPU) and it has that Obsidian touch. With some tweaks I got it to run at 60 fps most of the time; sometimes I see a drop to 45 fps. Settings are: screen effects low, view distance high, shadows medium, textures high, visual effects medium, foliage medium, chromatic aberration off.
    The only improvement is in good old S.T.A.L.K.E.R. I play the Anomaly mod, which is built on an x64 version of the X-Ray engine. I get a stable 60 fps in the latest version of the mod, compared to the 40 fps I was getting on the old PC.
    I have the latest drivers and Win 10 is up to date. Settings in the latest Adrenalin driver are at their defaults. The case has 5 fans. I use Afterburner to set the video card's fan to 35-40% speed in order to keep the temps around 60-65 degrees; the fan curve in the driver, or whatever MSI decided to do with the card, is messed up, and the card starts spinning up its fans like crazy after a while. Sometimes during gaming I get a CTD and then a message telling me that the Radeon WattMan settings have been restored due to an unexpected system failure. That also happened on the previous build, before the upgrade. I checked with GPU-Z and the card runs at PCIe 3.0 x16.
    Any ideas on what tests I should run? I know the stock cooler is rubbish and the video card is old, but still, there should have been some improvement over the 4-year-old PC. I know Intel has some advantage in gaming, but a 3800X should bring some muscle. Perhaps the video card is to blame and is completely useless, or maybe it's about to kick the bucket.
    Thanks and apologies for the long post
     
    insp1re2600 and Tiny_Clanger like this.
  2. Kool64

    Kool64 Master Guru

    Messages:
    495
    Likes Received:
    179
    GPU:
    Gigabyte GTX 1070
    The 480 is the bottleneck. Even a 9900KS would give you the same results. Also, the stock cooler actually isn't rubbish.
     
    moo100times and GarrettL like this.
  3. GarrettL

    GarrettL Active Member

    Messages:
    83
    Likes Received:
    43
    GPU:
    Asus Strix 2070S
    I'm running an RTX 2070 Super and a 3800X at 1440p. I don't have those games to compare against, but BFV, Borderlands 3 etc. run smoothly at high frame rates for me.
     
  4. liviut

    liviut Active Member

    Messages:
    87
    Likes Received:
    5
    GPU:
    Sapphire TRI-X R9 290
    Definitely, the RX 480 is the bottleneck.
     

  5. K.S.

    K.S. Ancient Guru

    Messages:
    2,053
    Likes Received:
    532
    GPU:
    EVGA RTX 2080 Ti XC
    Hi @Bibanu, hopefully we can figure some things out. To recap, this is your current system:
    "Ryzen 3800X, Gigabyte X570 Aorus Elite, 2x8 GB HyperX DDR4-3200 CL16 & MSI RX 480 8GB Gaming X"

    • A reference RX 480 8GB averages 50-55 fps in SW Jedi: Fallen Order @ 1920x1080 on "Epic". Note: update to the latest version of the game; I cannot account for non-retail copies. This game has a known streaming problem that causes stuttering.
    • Please verify your overall GPU performance matches this (note: some results are outliers)
      RX 480 8GB Benchmarks -AnandTech
    • You mentioned updating the BIOS, are you running version F11? It was only posted on the 9th of this month, this Monday.
    • Four different VBIOS have been released since the MSI RX 480 8GB Gaming X launched.
    • To verify if your card qualifies for an in-warranty BIOS update, download and run MSI Live Update 6. Once you're finished updating your VBIOS & restarted your PC - uninstall MSI Live Update.
     
    Last edited: Dec 14, 2019
  6. anticupidon

    anticupidon Ancient Guru

    Messages:
    4,410
    Likes Received:
    1,022
    GPU:
    Vega/Navi
    I have the same stock cooler on my 3700X, and it is as decent as it can be. And quiet, which was a big pleasant surprise.
    That RX 480... I had one, and it shows its age.
    Congrats on the new computer, enjoy it!
     
  7. Bibanu

    Bibanu New Member

    Messages:
    8
    Likes Received:
    3
    GPU:
    Rx 480 8gb
    Hi,
    I have the F11 BIOS on the mobo, and I installed MSI Live Update; it says I have the latest BIOS. The video card was purchased on September 6, 2016. I ran UserBenchmark and it failed the GPU test: the screen went black and the fans started spinning at max. It's either on its last legs, or maybe it needs new thermal paste? As for the stock cooler, perhaps the mobo is to blame for its erratic behavior; maybe Gigabyte will fix this. Anyway, I need to start saving money for a video card.
     
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    16,505
    Likes Received:
    1,538
    GPU:
    Zotac GTX980Ti OC
    - STALKER is very single-threaded and you saw a big boost there, so the CPU is working correctly.
    - Your Cinebench R20 single and multi core scores are good too.
    - The other two games are just buggy, and GPU-bound on top of that.


    WattMan is apparently a driver issue that's being fixed; I've seen it in the release notes.
     
    AsiJu likes this.
  9. K.S.

    K.S. Ancient Guru

    Messages:
    2,053
    Likes Received:
    532
    GPU:
    EVGA RTX 2080 Ti XC
    3 years - a warranty period of 36 months applies to new retail MSI graphics cards.

    - Ouch. Try emailing them/opening an RMA with your proof of purchase and explain that it started failing right after your warranty ended; companies have been known to grant *grace* RMAs that close to the end of coverage.
    (source: handled RMAs at Razer, Apple & Microsoft in previous roles)
     
    Last edited: Dec 15, 2019
  10. Bibanu

    Bibanu New Member

    Messages:
    8
    Likes Received:
    3
    GPU:
    Rx 480 8gb
    Yeah, what are the chances of that happening?
     
    K.S. likes this.

  11. AsiJu

    AsiJu Ancient Guru

    Messages:
    5,875
    Likes Received:
    1,274
    GPU:
    MSI RTX 2070 Armor
    As opposed to what? Those numbers are spot on for a 3800X:

    https://www.guru3d.com/articles_pages/amd_ryzen_7_3800x_review,10.html

    Actually your single core score is excellent.

    As has been said, you're GPU bottlenecked, even more so now.
    That, and your GPU seems to be about to bite the dust.

    Of course, that's a very good reason to upgrade the GPU as well. Save a bit, slap a custom-cooled RX 5700 XT in that rig and see what you've been missing ;)

    For example, the Sapphire Nitro+ seems to sell for a little under 500 EUR give or take, which to me is a very good price point for that card and its features.

    You can find cheaper ones too that still perform about the same. Just get one with a custom cooler.
     
    -Tj- likes this.
  12. SiMallV719

    SiMallV719 Member

    Messages:
    12
    Likes Received:
    0
    GPU:
    Rtx2080ti
    I'm in the process of upgrading (gear comes tomorrow) from an Intel 5930K/X99 to a 3800X/X570 with DDR4-3333 and a 2080 Ti, hoping for a bit more fps. My old 5930K died the other day and this is the replacement. Getting excited, as I haven't done AMD for a number of years, and it's nice to see them up there with Intel again.
     
  13. GarrettL

    GarrettL Active Member

    Messages:
    83
    Likes Received:
    43
    GPU:
    Asus Strix 2070S
    You won't be disappointed! It's a great platform.
     
    SiMallV719 likes this.
  14. Bibanu

    Bibanu New Member

    Messages:
    8
    Likes Received:
    3
    GPU:
    Rx 480 8gb
    Hi, I got a 425 euro bonus for Christmas and I think I am going to get a new card. I am comparing the 5700 XT and the 2060 Super. Which one would be better? The only setback is that my display is a 4-year-old one, full HD resolution at 144 Hz, and it does not have a DisplayPort input. It only has DVI and HDMI; I am using a DVI cable at the moment.
     
  15. GarrettL

    GarrettL Active Member

    Messages:
    83
    Likes Received:
    43
    GPU:
    Asus Strix 2070S
    The 5700 XT; the 2060S isn't even close.

    https://www.guru3d.com/articles_pages/msi_radeon_rx_5700_xt_gaming_x_review,13.html
     

  16. Bibanu

    Bibanu New Member

    Messages:
    8
    Likes Received:
    3
    GPU:
    Rx 480 8gb
    I just ordered the card for 420 euro. The question is whether I should use 2 cables, one 8-pin and one 6-pin, each coming separately from the PSU, or a single cable that splits in two, like I have right now. I have a Seasonic 750 W PSU. I will also do a clean Windows install.
     
  17. bobblunderton

    bobblunderton Active Member

    Messages:
    70
    Likes Received:
    29
    GPU:
    16g 2400mhz (2x8)
    Use the single cable from your power supply that splits. You just need to make sure the connectors say PCI-E or 'GPU' on the end, that's all. If the supply is modular, check where it's plugged in, and refer to the unit's documentation if you have additional questions. The 750 watt Seasonic is a good unit and will power a LOT of computer, and most modern supplies are 'one rail' now, especially if they're EPS standard (multi-rail was common for 'ATX' units, whereas EPS has higher current limit standards, hence forgoing the need for multi-rail entirely).

    Why does only STALKER show the benefit of a better processor?
    STALKER is the only game you tried that wasn't hammering the GPU, and hence, since the GPU wasn't saturated and holding the system back, the processor improvement showed up there.
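    To put rough numbers on that logic, here's a toy frame-time sketch in Python. All the millisecond figures are made up purely for illustration, not measurements from these systems; the point is only that the slower of CPU and GPU sets the fps:

```python
# Toy model: a frame is ready only when both the CPU and the GPU have
# finished their per-frame work, so the slower component sets the fps.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: old i5 ~16 ms of CPU work, RX 480 ~28 ms of GPU work.
old_rig = fps(16.0, 28.0)      # GPU-bound
# Swap in a faster CPU (~8 ms) but keep the RX 480: fps barely moves.
new_cpu = fps(8.0, 28.0)       # still GPU-bound, same fps
# A GPU-light game (~10 ms of GPU work) shows the CPU gain instead.
stalker_old = fps(16.0, 10.0)  # CPU-bound on the old quad core
stalker_new = fps(8.0, 10.0)   # CPU no longer the limit

print(round(old_rig, 1), round(new_cpu, 1),
      round(stalker_old, 1), round(stalker_new, 1))
```

    Same idea as above: upgrading the CPU changed nothing in the GPU-bound case and gave a big jump in the GPU-light one.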

    There is a HUGE, monstrous difference coming from an Intel quad core - even an i7 4790K @ 4.4 GHz with 2400 MHz RAM - to a Ryzen 8-core/16-thread CPU with 3000-3200 MHz (or better) RAM. I upgraded from my 4790K w/32 GB of 2400 MHz CL11 memory last summer, and it's a night and day difference between that old heap and this 3700X. The difference won't be found in single-core speed (though statistically it's a decent bit better than any 4000-series Intel, and it has time-saving newer instructions too) so much as in how the Ryzen will absolutely obliterate the Intel in MODERN games that aren't bound by the RX 480. I just had my own RX 480 go belly up on the 2nd of this month - it artifacted on the desktop, though it had given me 6 months' advance notice, so to speak - and I threw in a 2070 Super, and it's an amazing night and day difference. Regardless of whether you get a 2070, a 2060, the Super variants, or a 5700/5700 XT from AMD, you won't be disappointed. Just make sure it has 8 GB of VRAM, because for that chunk of change, you deserve 8 GB. It's diminishing returns on your investment above a 5700 XT / 2060 Super right now, as they're pretty good bang for the buck (hint: the AMD has better value, a little more fps for your money, but the 2060 Super will ray-trace Quake 2 at 'almost 60 fps', though I question the value of what little ray tracing power it has in modern games). So it's your money, you decide.
    Your Ryzen also has vastly more CACHE memory, which will mean less hitching and less idle time in the CPU, as it won't have to fetch from RAM nearly as much. Playing Cities: Skylines with a media player in the background will be a big difference, let me tell you. If I shut hyper-threading off on my i7 (it was a heat monger until I delidded it in Jan 2017!), I couldn't listen to music without clicks or pops. Playing Skylines or BeamNG.drive on this Ryzen absolutely DEMOLISHES the 4790K. Just do a comparison of zip/unzip work with 7-Zip using all the threads and see what I mean.
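    If you'd rather not install 7-Zip, here's a rough stand-in for that all-threads comparison using only the Python standard library (a toy benchmark, not 7-Zip's actual one): it compresses the same chunks serially and with a thread pool, and checks the results match. zlib releases the GIL while compressing large buffers, so the threads genuinely run in parallel.

```python
# Compress identical data serially vs. with one worker per CPU thread,
# then verify both paths produce byte-identical output.
import time
import zlib
from concurrent.futures import ThreadPoolExecutor

chunks = [bytes(range(256)) * 4096 for _ in range(16)]  # ~16 MB of test data

t0 = time.perf_counter()
serial = [zlib.compress(c, 6) for c in chunks]
t_serial = time.perf_counter() - t0

t0 = time.perf_counter()
with ThreadPoolExecutor() as pool:  # defaults to a worker per CPU thread
    parallel = list(pool.map(lambda c: zlib.compress(c, 6), chunks))
t_parallel = time.perf_counter() - t0

assert serial == parallel           # same output either way
print(f"serial {t_serial:.2f}s, parallel {t_parallel:.2f}s")
```

    On a chip with lots of cores the parallel pass should finish noticeably faster; how much faster depends on the machine.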
    Even if you had merely bought a better video board, anything faster than an RX 480 would have been held back by a locked i5. For modern gaming, the quad core's days are very numbered; the ones that clock high enough (K models) are already having trouble in select titles, and the rest are getting dusted by modern processors. So don't have an ounce of buyer's remorse.
    That all having been said, you'll enjoy your new system for years. If games, applications or other productivity demands warrant it, you might wish to step up to a 12- or 16-core processor down the road (3000 or 4000 series), because you CAN with AMD. Not that you'll have to.
     
  18. GarrettL

    GarrettL Active Member

    Messages:
    83
    Likes Received:
    43
    GPU:
    Asus Strix 2070S
    I went from an i7 960 to a 3800x on the x570 with a 2070 Super as well. An amazing difference in performance. Good to see AMD is back!
     
  19. 386SX

    386SX Master Guru

    Messages:
    659
    Likes Received:
    698
    GPU:
    AMD Vega64 RedDevil
    @bobblunderton
    Giving tips without knowing the exact model of the PSU is not good, and your assumption is wrong, mate.
    Seasonic itself strongly recommends using 2 separate cables because of heat, for ALL their PSUs; it's written on their homepage. Say the GPU has 2x 8-pin connectors: if you use 1 cable it has to carry 300 W, whereas split across 2 cables each one only has to carry 150 W.

    Proof: https://knowledge.seasonic.com/article/6-how-to-connect-power-supply-cables
    Skip to about 1:30.
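    To spell out why the split matters (taking the 300 W above purely as a worked example; the rest is basic electricity), resistive heating in a cable grows with the square of the current, so halving the current per cable cuts the heat in each one to a quarter:

```python
# Current per cable on the 12 V rail, and relative resistive heating
# (P = I^2 * R, same wire resistance R assumed for every cable).
GPU_POWER_W = 300.0  # example load drawn over the PCIe power plugs
RAIL_V = 12.0

single_cable_amps = GPU_POWER_W / RAIL_V       # one cable carries it all
split_cable_amps = (GPU_POWER_W / 2) / RAIL_V  # each of two cables

# Heat in the single cable relative to heat in each of the two cables:
heat_ratio = single_cable_amps**2 / split_cable_amps**2

print(single_cable_amps, split_cable_amps, heat_ratio)  # 25.0 12.5 4.0
```

    So the single daisy-chained cable runs at 25 A and dissipates four times the heat of each cable in the split setup, which is exactly why the recommendation exists.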

    And EPS has a "higher current limit"? Care to share a source?
     
    HandR likes this.
  20. bobblunderton

    bobblunderton Active Member

    Messages:
    70
    Likes Received:
    29
    GPU:
    16g 2400mhz (2x8)
    Well look, if Seasonic designs a PSU where you can't use the full wattage the cables are rated for, then there are larger problems than me, bud.
    Does it really matter, short of overloading the plugs with 6-to-8-pin adapters or other Y-adapters? Is it going to burn down your computer if you hook the cable up to the plugs IT IS DESIGNED TO BE PLUGGED INTO? I've been doing this 25-plus years and never had an issue. Sheesh!
    Assuming you can plug a cable into the device it's designed to power, to use the current it's designed to provide... "Assumptions"... let alone on one of the top brands of power supplies you can buy for a home PC... OK. Sorry I ASSUMED I could do with it what it's designed to do. That's a heck of a lot of liability for a company if you can plug the PCI-E connector into the device that's specifically keyed to accept it - and then suddenly, OH, THAT'S WRONG. It's not going to blow up. Maybe a cheapo unit eventually will, but not a Seasonic. Worst case it will shut off and over-current protection will kick in, but that shouldn't be an issue on a 750 watt unit unless you've got two mega-overclocked Vega 56 cards trying to pull 500 W each (which obviously would be very bad for the unit and could be a major issue!).
    EPS12V is a successor to ATX (often both are listed) that takes into account the higher current limits and demands on the 12 V side of the power supply, dating from the Pentium 4 days when we started needing that 'Pentium 4' plug on the motherboard for the CPU's 12 V feed. It's printed pretty much everywhere; go ahead and look it up on Wikipedia. Prior to this, components loaded the 3.3 V and 5 V rails of the PSU more than the 12 V side.
    EPS12V usually means a supply will come with (at least one) 8-pin CPU 12 V connector for modern enthusiast and HEDT motherboards, though some extreme boards can need as many as two or even THREE of these (I believe I saw 3 on the 3175X's board, though I reckon they let you hook TWO supplies to it).
    Don't quote me on Pentium 4 being the reason, but it was one of the first that needed a dedicated supply for the CPU VRM componentry. Core 2 and AMD AM2 Athlon boards always had the little 4-pin connector by that time. My first Pentium 4 Willamette (1.7 GHz) actually used a plug that wasn't around long, which looked like the ones from the AT days, but by the time Prescott landed it was back to the 4-pin that's still on the majority of lower-end boards today (if they lack the full 8-pin of higher-end boards). I was glad when they developed a standard for it, because the rare plug on that 1.7 GHz board was rather hard to find a proper supply for three to five years later.

    It's time to read!
    https://en.wikipedia.org/wiki/Power_supply_unit_(computer)
    Specifically, read through ATX specification and then Entry-Level Power Specification (EPS).
     
