AMD Ryzen 7 5800X3D - 1080p and 720p gaming gets tested

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 12, 2022.

  1. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    8,326
    Likes Received:
    2,313
    GPU:
    NVIDIA GTX 1070 8GB
    Definitely not.
     
  2. Venix

    Venix Ancient Guru

    Messages:
    2,695
    Likes Received:
    1,398
    GPU:
    Palit 1060 6gb
In my opinion 720p testing should not be a thing, as I said in the past when Intel CPUs were ahead... What's next, 480p to find meaningfully different numbers?
     
    Ufirst likes this.
  3. TLD LARS

    TLD LARS Master Guru

    Messages:
    448
    Likes Received:
    177
    GPU:
    AMD 6900XT
The price difference between the G.Skill 6400 CL32 DDR5 kit and a G.Skill DDR4 4000 CL18 kit is something like 3x. In other words, it is possible to buy the DDR4 4000 32 GB kit, a cheap AM4 motherboard and a 5800X for the same money as the G.Skill 6400 CL32 DDR5 kit alone.

    Get a DDR4 4000 kit, downclock it to AMD's maximum 1:1 speed, and put the €400 saved in your pocket.

    By the way, what did the cheap Dell memory cost? You have mentioned multiple times that it was cheap, but I have never seen the price or part number.
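The cost argument above is simple arithmetic; here is a sketch with placeholder figures (all prices below are assumptions for illustration, not actual 2022 quotes):

```python
# Hypothetical prices in euros -- placeholders, not actual quotes.
ddr5_6400_cl32_kit = 600.0   # assumed high-end DDR5 kit price
ddr4_4000_cl18_kit = 200.0   # assumed 32 GB DDR4 kit price
cheap_am4_board    = 100.0   # assumed budget AM4 motherboard
ryzen_5800x        = 300.0   # assumed 5800X street price

ddr4_build = ddr4_4000_cl18_kit + cheap_am4_board + ryzen_5800x

print(ddr5_6400_cl32_kit / ddr4_4000_cl18_kit)  # 3.0 -- the ~3x kit-to-kit gap claimed
print(ddr4_build)                               # 600.0 -- roughly the DDR5 kit price alone
```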
     
  4. mattm4

    mattm4 Member Guru

    Messages:
    178
    Likes Received:
    22
    GPU:
    Asrock 6900xt
Sheesh... A lot of back and forth in this thread. I don't really care who is the fastest; I'm more impressed with the performance lift that comes from the L3 cache alone. It means that going forward, we can see big gains from a big cache for certain workloads, versus boosting power/thermals to improve IPC for the same result.
     

  5. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,205
    Likes Received:
    2,014
    GPU:
    Sapphire 390
    That doesn't really matter. The CPUs themselves, which are being compared, aren't technically comparable. The 12900KF is 200 bucks more than what the 5800X3D is expected to cost. Since this is a comparison that totally ignores the price, I said the memory shouldn't be comparable either; it should be the fastest model reasonably available. Of course the total memory size should be the same so that the games don't behave differently.
     
    Valken and tunejunky like this.
  6. Undying

    Undying Ancient Guru

    Messages:
    20,332
    Likes Received:
    8,622
    GPU:
    RTX 2080S AMP
The thing is that people test at such low resolutions to test the CPUs, then crank up the details, where any significant difference diminishes.
     
  7. ~AngusHades~

    ~AngusHades~ Active Member

    Messages:
    80
    Likes Received:
    33
    GPU:
    Nvidia RTX 2080
I do know of the problem, but I'll go out on a limb here and say that it was exacerbated a lot by low-IQ Reddit users. I have 5 friends who have 5000 series CPUs and do not have that problem; granted, they just update their BIOS, click XMP and game. I won't defend AMD for not being able to solve or know about this problem with their products, but I'm old enough to know that every time companies reinvent the wheel there are going to be teething problems. You either haven't bought as much hardware as I have or are being deliberately ignorant. Your 11900K is the same Fing ring bus CPU that Intel has been refining for over a decade, and which allowed them to Fing rip customers off.


I also want to add that from the original 2600K, all Intel did was improve the clocks, add USB ports, and remove or add pins to the sockets so you'd rebuy the same CPU every two years. I wouldn't expect you to have any problems with an Intel CPU from the 2600K to an 11 series, as it is all the same with more cores.
     
    Last edited: Apr 12, 2022
    Valken and mohiuddin like this.
  8. mohiuddin

    mohiuddin Master Guru

    Messages:
    950
    Likes Received:
    160
    GPU:
    GTX670 4gb ll RX480 8gb
Yeah, right?
    I've been using an R5 2600 like forever... no hardware-related issues, or even any major software issue.
     
  9. nosirrahx

    nosirrahx Master Guru

    Messages:
    432
    Likes Received:
    125
    GPU:
    HD7700
    Completely agree with this. A lot of game testing in general revolves around resolutions no one is playing at given the price of the hardware being tested and games that have 1% lows over 120 FPS even on midrange hardware.

"This $4000 system plays game X at 547 FPS at 720p." <- wow, such critical information.

    If a CPU/GPU is top of the line, 1440p and 4K are really all that matters. Games that have 1% lows above 120 FPS on 3 year old midrange hardware aren't even worth testing.
     
    ~AngusHades~ likes this.
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,690
    Likes Received:
    2,287
    GPU:
    Zotac GTX980Ti OC
I think my 3733 MHz CL15 is plenty fast, with a tight 35-36 ns latency, for years to come :p
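For reference, the CAS portion of that latency follows directly from the data rate and the CL number; a minimal sketch (note the quoted ~35-36 ns is the full measured memory latency, of which pure CAS delay is only one component):

```python
def cas_latency_ns(data_rate_mts: float, cl: int) -> float:
    """CAS delay in nanoseconds: CL clock cycles at the memory clock,
    which is half the data rate (DDR = double data rate)."""
    memory_clock_mhz = data_rate_mts / 2       # e.g. 3733 MT/s -> 1866.5 MHz
    cycle_time_ns = 1000.0 / memory_clock_mhz  # duration of one clock cycle
    return cl * cycle_time_ns

# DDR4-3733 CL15, as in the post above:
print(round(cas_latency_ns(3733, 15), 2))  # 8.04 ns of pure CAS delay
```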
     

  11. Airbud

    Airbud Ancient Guru

    Messages:
    1,764
    Likes Received:
    2,802
    GPU:
    RTX 4090-48GB-900W
    But, but that's why I love this place!...(now where's my coffee?)
     
  12. H83

    H83 Ancient Guru

    Messages:
    4,271
    Likes Received:
    1,744
    GPU:
    MSI Duke GTX1080Ti
    We need to test at lower resolutions to see the real difference between CPUs, otherwise tests are almost useless.

If we test CPUs at 4K, for example, performance differences are going to be minimal, giving the (false) impression that all the parts are equally fast.

    Anyway, the results seem very good.

I wonder if it's possible to test the CPU with 2 and 4 cores disabled, to see how much the extra cache really matters.

    And also to see how more cores impact performance.

I need Guru3D's review!
     
  13. tunejunky

    tunejunky Ancient Guru

    Messages:
    2,877
    Likes Received:
    1,491
    GPU:
    RX6900XT, 2070
    absolutely
this is why we talk of some games being CPU bound, and where frame rate consistency is best measured. a lot of folks talk of 1% lows and even 0.01% lows.

    but frame-to-frame is the best analysis of gameplay, as it shows (esp. Win 11) scheduling latencies.
    i couldn't care less about hypothetical max frames @ 4k, as only 4-6 GPUs can even get 120Hz+ @ 4k across a broad variety of games.
    what i'm getting at is that the people who see 120+ fps @ 4k as more than a rhetorical exercise are few and far between, numbering in the thousands, while everyone else numbers from the tens of thousands to the millions.
    and sub-4k is where the demons of CPUs live.
    AND win 11 is still a major factor causing fps drops (from win 10, same equipment), large latency spikes between frames, and other related issues at 1440p too, not just 1080p.
     
    Valken likes this.
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,025
    Likes Received:
    3,444
    GPU:
    HIS R9 290
But then it's not an apples-to-apples comparison. Why not just swap out the GPU for a faster one? Why not overclock the Alder Lake? It's a CPU benchmark - you want to keep things as similar as possible except the CPU itself.
    That being said, knowing temperatures would be good, because a good CPU performance test is one where neither CPU is thermally throttled.

    Are you using the same RAM, GPU, and clock settings? Because those will make a difference.

The point of 720p is to make the CPU the bottleneck instead of the GPU. Otherwise, yeah, it's stupid, because once you get past 144 FPS, there's really not much of a noteworthy benefit to going faster. Only a small handful of people can actually take advantage of framerates higher than that, regardless of whether they can see the difference.
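The bottleneck argument can be sketched as a toy frame-time model (the millisecond figures below are made up for illustration, not benchmark data): per-frame time is roughly the slower of the CPU and GPU stages, and lowering the resolution shrinks only the GPU stage, exposing the CPU difference.

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: frame rate is limited by the slower of the two stages."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Two hypothetical CPUs: A needs 5 ms per frame, B needs 4 ms.
cpu_a, cpu_b = 5.0, 4.0

# At 4K the GPU stage dominates (say 10 ms): both CPUs look identical.
print(fps(cpu_a, 10.0), fps(cpu_b, 10.0))  # 100.0 100.0

# At 720p the GPU stage is tiny (say 2 ms): the CPU gap becomes visible.
print(fps(cpu_a, 2.0), fps(cpu_b, 2.0))    # 200.0 250.0
```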
     
    Last edited: Apr 12, 2022
    tunejunky likes this.
  15. nizzen

    nizzen Ancient Guru

    Messages:
    2,241
    Likes Received:
    986
    GPU:
    3x3090/3060ti/2080t
When you test a CPU, take away the other bottlenecks, like the GPU and memory. Memory is a huge bottleneck in many games running at 100+ fps, so why not take away most of this bottleneck too?
     
    mohiuddin, tunejunky and Valken like this.

  16. fredgml7

    fredgml7 Member Guru

    Messages:
    170
    Likes Received:
    49
    GPU:
    Sapphire RX 570 4GB
Based on what you say, I notice at least a preference, and that's OK; you don't need to be ashamed of it. You seem to care a lot about your choices being seen as the best ones, because you "clearly know better"; that's kind of different from just having an opinion.
    In my case, despite it not being decisive, I have a little preference for Intel stuff. In my house we have 5 computers and only my main rig is AMD, but they could all be AMD (based on cost/benefit, opportunity, etc.).
     
    ~AngusHades~ likes this.
  17. alpha007org

    alpha007org Member Guru

    Messages:
    101
    Likes Received:
    32
    GPU:
    Gigabyte 7950 Windforce 3
Games have never been a good test for a CPU. However, because the 5800X3D is targeting gamers, how else should we compare them? At 1440p and above, all of them (from the 5700X up to the 12900KF) are going to be GPU limited and will produce around 5% average FPS variance.
     
    Why_Me likes this.
  18. Valken

    Valken Ancient Guru

    Messages:
    2,416
    Likes Received:
    615
    GPU:
    Forsa 1060 3GB Temp GPU
You're not wrong, but I understand the case for using the SAME RAM kit.

    In an ideal world, we would have the SAME fastest RAM kit to test both CPUs in Gear 1 or 1:1 IF, but not all IMCs are created equal.
     
    tunejunky likes this.
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,025
    Likes Received:
    3,444
    GPU:
    HIS R9 290
    That's totally fine, so long as BOTH CPUs are getting the memory upgrade.
     
    mohiuddin, tunejunky and Valken like this.
  20. nizzen

    nizzen Ancient Guru

    Messages:
    2,241
    Likes Received:
    986
    GPU:
    3x3090/3060ti/2080t
Yep, that is why I compare a max-tuned 5950X vs a max-tuned 12900K at home. The 5950X has max 3866 CL14 tuned to 52 ns, and the 12900K has 7200 CL30 at 48 ns.

    Latency will be a bit lower with the "Ghost Spectrum" Win 10 build vs the standard Win 11 I use.
     
