AMD Ryzen 5 1400 gaming performance leaks - analysed on YT

Discussion in 'Frontpage news' started by Aritra Das, Apr 1, 2017.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I don't know if you've seen the latest *********** video on Ryzen with the initial Windows and BIOS updates. The improvements, especially in the lowest 1% of frametimes, are quite staggering at points.
     
  2. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Yes, I have. Chipsy and I were talking about it. It looks like the majority of the improvements came from the Windows update and game updates, but the max and average FPS are still around what they were with SMT off. Again, any improvements are good improvements, but the biggest jumps in overall performance still come from core overclocking. As long as there are no diminishing returns in RAM overclocking, we could see a 10-15% improvement if we could get 4200MHz DDR4 on Ryzen.
     
  3. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Intel decided to do nothing? Well, first of all, if they had decided to do anything major and push for more performance, there would be no Ryzen, and they would have an utter monopoly. They did exactly what they needed to do: small increases with each generation.

    Newsflash: they are a business, and they had no competition, so they held back. They're not here to bleed and sweat and outdo themselves with each generation just to keep people like you happy. They're here to hit profit margins.

    That being said, they did enough. I can honestly tell you that going from a 2500k @ 4.5GHz to a 7700k @ 5GHz is a good step up in performance.

    Hey, why don't you buy a Ryzen if it's that great?
    If you have an IPC advantage and can clock 1GHz higher, it's quite an advantage. Then it comes down to whether you need more than 12 threads. If you do, Ryzen is the one to get. If you don't, 6800k all the way, there are no two ways about it; or if you only game, the 7700k with its insane IPC and high clocks is the one to get.
     
    Last edited: Apr 4, 2017
  4. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    I watched Gamers Nexus' update on Ryzen gaming performance. With BIOS updates, Windows updates, and fast RAM, some games got a nice performance boost. The two games that gained the most were BF1 and one other I can't remember now. Thing is, those two games had also just received new patches. In the other games, the performance difference was pretty much within the margin of error.

    Also, even with all those increases, Ryzen @ 3.9GHz with 3466MHz RAM was still a fair bit slower than a 7700k at stock with 3200MHz RAM.

    So yes, while Ryzen performance improved (it had to), an overclocked 1700 is still slower than a stock 7700k.

    Anyway, here's the video, check it out and tell me what you think: https://www.youtube.com/watch?v=8cHJ7FDZKg8

    Then there's this https://www.youtube.com/watch?v=nLRCK7RfbUg

    Here you see the 1700 @ 4GHz (overclocked Ryzen) losing to a 7700k @ 4GHz (underclocked Intel) in pretty much every game, be it on an AMD GPU or an NVIDIA GPU. Overclock that 7700k to 5GHz or close to it (I swear all of them will do 4.8GHz) and you're laughing.
     
    Last edited: Apr 5, 2017

  5. 0blivious

    0blivious Ancient Guru

    Messages:
    3,301
    Likes Received:
    824
    GPU:
    7800 XT / 5700 XT
    Seems to be a lot of folks trying to convince themselves of things that just don't jibe with the numbers. Just buy your Ryzen and enjoy it. It's pretty good. But for goodness' sake, stop with the "it's the best CPU for gamers to buy" when it clearly is NOT. If you look at all the benchmarks, it's not even that much better than the FX series was; it sits somewhere in the middle between that and the Intel lineup.

    On a personal front, if Ryzen was that wonderful for gaming, I would have already bought one, because I'm pretty antsy to build. I'm not willing to downgrade my gaming performance, though. I'm actually annoyed that Ryzen turned out to be kind of weak for gaming and strong for everything else. I was going to buy. I'd be better off getting a 7700K, but even that would be pretty pointless since my 4790K is almost as strong.
     
  6. Silva

    Silva Ancient Guru

    Messages:
    2,049
    Likes Received:
    1,199
    GPU:
    Asus Dual RX580 O4G
    I think the real reason Ryzen isn't shining in games is memory latency: the 7700k has half of it, and it clearly plays a role; benchmarks have also shown that Ryzen is sensitive to memory speed.

    Fixed it for you.

    I don't need it, and the price doesn't justify the upgrade yet. I'll be buying an RX 570 first.

    IPC and clock advantages don't mean anything if the software uses all 16 threads.
    The 6800K costs 50€ more than the 1700X; I would get the 1700X instead.
    I would never buy a CPU just for gaming; I use my computer for more than that.
     
  7. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    [IMG]

    Really? A 500MHz core overclock gained more performance than a 1000MHz memory overclock. While I agree Ryzen is sensitive to memory clock, it still gains more from an increase in core clock.
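
    To put rough numbers on that comparison, here's a quick sketch of FPS gained per percent of clock added. All FPS and clock figures are made up for illustration, not the chart's actual data:

    Code:
    # Compare scaling efficiency: FPS gained per percent of clock added.
    # A value of 1.0 would be perfect scaling. All numbers are hypothetical.
    def scaling(base_clk, oc_clk, base_fps, oc_fps):
        clk_gain = 100.0 * (oc_clk - base_clk) / base_clk
        fps_gain = 100.0 * (oc_fps - base_fps) / base_fps
        return fps_gain / clk_gain

    print(round(scaling(3500, 4000, 100.0, 110.0), 2))  # +500 MHz core:  ~0.70
    print(round(scaling(2400, 3400, 100.0, 106.0), 2))  # +1000 MHz DRAM: ~0.14

    On figures like those, the core overclock delivers far more FPS per percent of clock gained, which is the point.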
     
  8. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I was referring more to the minimums. In a lot of the games tested by GN, the 1% minimum was equal to, if not higher than, the 7700k's after the Windows scheduler update.

    Also, the Hardware Unboxed videos only have average and minimum framerates. The average FPS does give an idea, but using raw minimums instead of 1% or 0.1% percentile values is positively retarded. He talks about the Fury X having "low minimums" (which I actually believe it might), and his proof is the "minimum framerate" figure, not an actual analysis.
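
    For reference, those values are usually derived by sorting every frametime in a run and averaging the worst slice, instead of quoting the single worst frame. A minimal sketch in Python (conventions vary between reviewers; this is one common way to do it, and the sample run is hypothetical):

    Code:
    # Derive 1% / 0.1% "low" FPS from a list of frametimes in milliseconds.
    # Averaging the slowest slice of frames is far more robust than quoting
    # the single worst frame ("minimum framerate").
    def low_fps(frametimes_ms, fraction):
        """Average FPS over the slowest `fraction` of frames (0.01 = 1% low)."""
        worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
        n = max(1, int(len(worst) * fraction))        # at least one frame
        return 1000.0 * n / sum(worst[:n])            # ms/frame -> FPS

    # Hypothetical run: ~60 FPS with a handful of 40 ms stutter spikes.
    run = [16.7] * 990 + [40.0] * 10
    print(round(1000.0 * len(run) / sum(run), 1))     # average FPS: ~59.1
    print(round(low_fps(run, 0.01), 1))               # 1% low:   25.0
    print(round(low_fps(run, 0.001), 1))              # 0.1% low: 25.0

    Notice how the average barely moves while the lows collapse; that's exactly the stutter that a bare "minimum framerate" number hides.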

    As for the whole point of his video: unless you get a percentage performance comparison between an AMD and an Intel CPU, using NVIDIA and AMD GPUs as data points, he basically proves sh*t. The logic behind his tests is non-existent. He's basically doing a GPU test with no meaning at all.

    The proper procedure would have been to take the two CPUs he wants to compare (let's say the 7700k and the 1700), set the benchmarks to a preset the AMD cards can keep up with (something like 1080p High would be fine for the Fury X and the 1070), and first measure the two CPUs with one card. Instead, the whole first half of his video is completely useless, and then he introduces Crossfire as a test parameter plus another set of GPUs, without ever doing the basic comparison his test requires. :3eyes:

    This would give you something like this:

    NVIDIA test:
    7700k: 100%
    1700: 75%

    Then you use the AMD GPU with the exact same settings, and you get the percentage differences between the CPUs again:

    AMD test:
    7700k: 100%
    1700: 85%

    That would give you some continuity and an actual argument. Instead, he only compares the same GPUs under ONE title, then switches the GPU set for no serious reason, and then uses the new set of GPUs for ANOTHER title.

    Mother of God.

    Even through all the retardation, below are the actual numbers he gives at the end for the only two tests that had any meaning (and even for that meaning we have to overlook the Crossfire factor, because he's an idiot):
    [IMG]

    15 minutes of video, enabling Crossfire, switching GPUs, for that. :infinity:

    Which, of course, tells us nothing at all, as he discarded the part in Rise of the Tomb Raider where the Titan XP couldn't be utilized at all with Ryzen while the sh*tty CFX 295x2 was churning along just fine.

    Oh, and he obviously doesn't know what a GPU driver is or does, or he would have started by saying it's basically a JIT compiler, and therefore CPU-specific optimization is paramount to its function, especially for NVIDIA hardware, which has critical parts of its scheduling done in software.

    TL;DR:

    He should compare the percentage performance differences between the CPUs, using different GPUs at the same settings. With that data in hand, he could then say, for example:

    "The 7700k is 25% faster when an NVIDIA GPU is used, and 55% faster when an AMD GPU is used."

    Doing the test the way he "did" it, he can't say that.
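
    Something like this minimal sketch is all the normalisation would take (the FPS figures are placeholders chosen to reproduce the example percentages above, not measured data):

    Code:
    # Normalise each CPU's FPS to the fastest CPU *per GPU*, so the GPU
    # itself cancels out and only the CPU/driver difference remains.
    # All FPS values are hypothetical placeholders.
    results = {
        "NVIDIA": {"7700k": 120.0, "1700": 90.0},   # same card, same settings
        "AMD":    {"7700k": 110.0, "1700": 93.5},   # swap only the GPU
    }

    for vendor, fps in results.items():
        baseline = max(fps.values())                # fastest CPU = 100%
        for cpu, value in fps.items():
            print(vendor, cpu, f"{100.0 * value / baseline:.0f}%")
    # NVIDIA: 7700k 100%, 1700 75% -- AMD: 7700k 100%, 1700 85%

    Because the GPU is held constant within each column, the spread between the two columns is a statement about the drivers, not the cards.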
     
    Last edited: Apr 5, 2017
  9. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Oh look, someone finally noticed this... tiny detail, so I don't have to correct everything by myself. Didn't the other guy use 480 CF as well?
    But at least this Unboxed guy is smart enough not to overextend himself by drawing unsubstantiated conclusions like the click-hunting AdoRed Leader.
     
  10. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Unlike Hardware Unboxed, Adored used CFX 480/1070 consistently across multiple tests, and compared 1800X vs 7700k NVIDIA/AMD numbers as percentage increases/drops depending on the GPU you use. He did that under multiple conditions and in actual gameplay in RotTR. He also didn't claim that the abnormally low performance of AMD GPUs under DX11 in RotTR was a "bug".

    Hardware Unboxed did nothing like that: he switched hardware for his second "test" while simultaneously rejecting the Rise of the Tomb Raider readings with the Titan X, despite them being repeatable and basically confirming what Adored was claiming (that the NVIDIA DX12 driver is either bad or doesn't like Ryzen). Hardware Unboxed's Titan X had horrible utilization issues with Ryzen while the AMD GPUs didn't, and he shrugged it away despite those results literally being the crux of the whole matter.

    What I'm saying is that if that video is an indication of Hardware Unboxed's general reasoning capacity, he should be back at school taking basic scientific method classes, learning about control groups, conditions, and the rest.
     
    Last edited: Apr 6, 2017

  11. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
    As I said on this forum, some reviewers like to be bribed, by the green or blue side, very rarely by the red side. :)

    [IMG]

    This explains a lot about NVIDIA's software scheduling versus AMD's hardware scheduling in DX12 Rise of the Tomb Raider.

    But what about this lesson on scheduling:
    https://www.youtube.com/watch?v=nIoZB-cnjc0

    Next time maybe Mr. Hilbert will test CFX RX 480, because NVIDIA drivers are not good with hardware scheduling in DX12. Well, if we ever get a real DX12 game.
     
  12. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    He only drew a general conclusion of the NV driver holding back Ryzen based on one or two games. Am I remembering that right?
    I can't watch him for too long because of his insanely biased, agenda-driven content.

    He could just as well claim that Ryzen is holding back NVIDIA cards, but nooo, that's not even an option for that kook :)
     
  13. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
  14. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Another channel basically just confirmed what he was saying though.

    Since I'm in a super-autist mood lately, I compiled all of the MindBlank Tech results into an Excel sheet. There are some definite trends, like the AMD DX11 driver still being a piece of crap, but the AMD DX12 driver transforms Ryzen into a beautiful swan that seems to give much, much less stutter than the NVIDIA DX12 driver. At 720p, where nothing is GPU-bound, the AMD driver is not really faster on average framerate (just 2.9%), but it's 13.9% faster than the NVIDIA driver on the bottom 0.1% framerate. At 1080p DX12 the gap grows to 20.9% on average with Ryzen.

    There are some dramatic differences at points, like BF1, where the bottom 0.1% of frametimes is 52.6% faster with the RX 480 in absolute terms, and 67% faster as an AMD-vs-NVIDIA driver percentage when using Ryzen. Similar things happen at the 0.1% mark in Hitman.

    At 720p the AMD DX12 driver is faster than the NVIDIA driver with Ryzen in all titles, and at 1080p it's faster in BF1, Gears 4, Hitman, and RotTR.

    Here is the link to the Excel.

    Below is the raw data itself. Also, the 7700k is a f*cking demon, but I still wouldn't get it for more than a couple of years' use. The AMD GPU driver is already less stuttery under DX12 with Ryzen, as Ryzen provides 10% better bottom 0.1% frametime numbers there. I can see this increasing even more as patches and simple compiler changes land and the platform matures.

    I'll start with the most important sheet, the one that takes the cumulative results and shows which driver is faster at what with Ryzen.

    In the CPU numbers below, the 7700k is taken as the reference, so negative numbers mean Ryzen is slower; in the GPU driver numbers, the NVIDIA driver is taken as the base, so negative numbers mean the AMD driver is slower, and vice versa.

    [IMG]
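
    To be explicit about the arithmetic behind those percentages, here's a minimal sketch (the FPS values are placeholders, not MindBlank's actual numbers):

    Code:
    # Percent difference of `value` vs `reference`; negative = slower.
    # CPU columns use the 7700k as the reference, driver columns use NVIDIA.
    def rel_pct(value, reference):
        return 100.0 * (value - reference) / reference

    # Hypothetical 0.1% lows for one title at 1080p DX12:
    ryzen_nv, ryzen_amd = 45.6, 52.0   # R7 1700 on the NVIDIA vs AMD GPU
    i7_nv = 50.7                       # 7700k on the NVIDIA GPU

    print(f"{rel_pct(ryzen_nv, i7_nv):+.1f}%")      # Ryzen vs 7700k:   -10.1%
    print(f"{rel_pct(ryzen_amd, ryzen_nv):+.1f}%")  # AMD vs NV driver: +14.0%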

    The analytic results are in the spoiler below.

    [IMG]

    [IMG]

    [IMG]

    [IMG]
     
    Last edited: Apr 6, 2017
  15. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    If I'm a tractor driver and I have no clue how to change gears...
    Am I holding back the tractor, or is it holding me back?
     

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Funny how the numbers from another YouTuber seem to show there might be something to the claim that there are issues with the NVIDIA driver, yet when the numbers are in, everybody stops talking and the memes continue elsewhere.
     
  17. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Battlefield is the saving grace of your argument there. The 50%-better DX12 0.1% figure really muddies the results; if you take that one out, the results are starkly different. That being said, Ryzen has been good in BF1 from day one, and it's obvious that AMD has continued to work closely with DICE. They need to invest the same dedication with other AAA developers. I would venture to say the vast majority of performance issues come from game engines not using Ryzen properly, and less from NVIDIA's DX12 implementation (granted, there could be some improvements found there, and it would be stupid of NVIDIA not to address it).
     
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    It's also 31% with Hitman and 15% with RotTR. At 720p it's even more: 30% in RotTR. That's not a coincidence, and it's not just BF1 optimization. These differences are GPU driver differences, not CPU performance differences or absolute GPU performance numbers.

    It's basically telling you that the AMD DX12 driver is much better at using Ryzen, especially for stutter prevention in actual gameplay.
     
    Last edited: Apr 6, 2017
  19. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    It's three games; there are plenty of games out there that show the opposite.

    BTW, what a revelation: AMD is currently better in DX12 than NVIDIA. DX12 was based heavily on Mantle, which was designed solely to leverage the strengths of GCN at the low level. Microsoft jumped on it because of Xbox.

    I'm not dissing DX12, however. If it's finally leveraging the power that AMD GPUs have, that's great. NVIDIA has the capital to invest in getting back on top of this new API, and will. So in the end we'll have a more even playing field.

    On the stutter prevention: is that the drivers, or the fact that DX12 utilizes multithreaded engines better and Ryzen has about three times the multithreaded performance of the 7700k?
     
    Last edited: Apr 6, 2017
  20. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Exactly.
    The tractor driver represents the CPU, and the tractor the GPU :D

    There seems to be some kind of issue with R7+NV+DX12. But hell if I know who's to blame, if anyone, or why we should be passing blame around at all.
    But with the AMD community, blaming season lasts the entire year. Freaking drama that never ends.
    BTW, where are the 25x16 and 4K benchmarks now? Oh, but now 720p is suddenly perfectly fine, and yet yesterday 1080p was too low... Jesus...

    Haha!
    What I don't understand is that a lot of people seem to be putting a lot of effort into painting the picture that any time AMD performs worse, it's another company's doing.
    Why does this matter?
    At the end of the day, it's the performance the end user gets if they buy that product; if you aren't happy with the performance, I don't see how it's any comfort that it's not the manufacturer's fault!
    But then, that's what makes a fanboy a fanboy, I guess!


    TS
     
