
AMD Faked 9900K Benchmark During E3 Conference

Discussion in 'Processors and motherboards AMD' started by MegaFalloutFan, Jun 12, 2019.

  1. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,469
    Likes Received:
    2,082
    GPU:
    5700XT+AW@240Hz
    Wrong premise:
    Assuming that the settings shown in some other demo elsewhere are the same as the ones used here.

    What's real:
    You can create a simple preset that chokes an 8C/16T but runs well on a 12C/24T, and that's what AMD did. That's the unfair part, because you really can have reasonably good streaming on an 8C/16T. AMD simply picked a scenario where one fares much worse.

    There is no reason to think AMD used different settings under the hood for each of the tested processors.
     
  2. Only Intruder

    Only Intruder Maha Guru

    Messages:
    1,062
    Likes Received:
    108
    GPU:
    Sapphire Fury Nitro
    I have a problem with this whole argument (and with Gamers Nexus for only thinking about this from a quality perspective), and I'm very surprised so few people have taken the time to look at it from another perspective, which is efficiency.

    The encoder preset determines the quality of the video at the chosen bitrate, because it controls how much CPU time is spent on compression efficiency. It's advantageous to use slower presets where possible, as they preserve more quality within the bitrate you've set.

    Now, the truth of the matter is that at very high bitrates the compression efficiency is of little consequence, and there is little need to dedicate extra CPU time to preserving quality. As the bitrate is decreased, though, encoding speed starts to matter: faster presets introduce compression artefacts such as banding, smearing and macroblocking, while slower presets do their best to preserve quality within the narrower bitrate and so minimise those artefacts. Generally speaking, streamers with very high upload bandwidth aren't often concerned with encoding speed and the resulting quality, whereas those with limited bandwidth will often need to dedicate more CPU time to maintain quality within a limited bitrate (there's a small sketch at the end of this post that illustrates the trade-off).

    Handbrake (I know Handbrake isn't used for streaming, but it uses the same x264 encoder that OBS uses in its backend, which you can see because it exposes the same settings) actually has a very nice tooltip that explains this quite nicely: "You should generally set this to the slowest you can bear since slower settings will result in better quality or smaller files."

    [Image: Handbrake's tooltip for the x264 encoder preset]

    And here you can see the same settings exposed in OBS, mirroring Handbrake:

    [Image: the equivalent x264 encoder settings exposed in OBS]

    I don't really understand what the problem is. Unless AMD were using completely unrealistic settings (such as placebo, which is meant for professional use where quality is maintained on a per-frame basis with a high number of keyframes), the slow preset is a perfectly reasonable setting to use, and all I see here is AMD leveraging the fact that a 3900X can take on a heavier load, in this case by allocating more CPU time to compression efficiency.

    Is this a fake test? Absolutely not; fake implies fabricated results. Is it an unfair test? No, not if settings and conditions were kept the same across all platforms. Is it an unreasonable test? Possibly, as the difference between medium and slow can be anywhere from negligible to substantial, depending on the bitrate used. Misleading? To a certain extent, yes, but not enough to be concerned about, as the settings used are likely only applicable in a few scenarios (such as the bandwidth-limited situations I described above).
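    For anyone who wants to see the trade-off for themselves, here's the sketch I mentioned above. It's only an illustration under some assumptions: it assumes ffmpeg built with libx264 is installed, and the clip name and the 6000 kb/s target are placeholders you'd swap for your own material. It re-encodes the same clip at the same constrained bitrate with the veryfast and slow presets and times each run, so you can compare the outputs side by side.

    import subprocess
    import time

    # Hypothetical input clip and bitrate target - adjust to your own material.
    INPUT = "gameplay_1080p60.mp4"
    BITRATE = "6000k"  # a constrained bitrate, as a bandwidth-limited streamer would use

    for preset in ("veryfast", "slow"):
        output = f"test_{preset}.mp4"
        cmd = [
            "ffmpeg", "-y", "-i", INPUT,
            "-c:v", "libx264",
            "-preset", preset,     # the knob under discussion
            "-b:v", BITRATE,       # same target bitrate for both runs
            "-maxrate", BITRATE,
            "-bufsize", "12000k",  # VBV buffer, roughly 2x the max rate
            "-an",                 # drop audio, we only care about video quality
            output,
        ]
        start = time.time()
        subprocess.run(cmd, check=True)
        print(f"{preset}: encoded in {time.time() - start:.1f}s -> {output}")

    At the same bitrate, the slow output should show noticeably fewer artefacts (banding, macroblocking) in busy scenes, at the cost of far more CPU time - which is exactly the kind of load AMD's demo leaned on.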
     
  3. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,796
    Likes Received:
    1,136
    GPU:
    2 x GeForce 1080 Ti
    I think the issue that Gamers Nexus had was that it gave the impression that the 9900K was incapable of streaming, which is not the case - it's just not very capable at those particular settings. If they had labeled it as "streaming in highest/extreme quality" then I doubt they would have complained. Yes, it's a valid test, but it's also misleading.
     
  4. Only Intruder

    Only Intruder Maha Guru

    Messages:
    1,062
    Likes Received:
    108
    GPU:
    Sapphire Fury Nitro
    I perfectly understand that point of view, but the thing I have a grievance with is that Gamers Nexus called it misleading purely in the context of it being a quality setting and didn't consider how it relates to efficiency at the bitrates used.

    The argument they've provided isn't balanced against the implications and scenarios where these settings would actually be used - just "Game streaming, it's just odd to be intentionally misleading when your product is already seemingly good. The settings used did not represent any actual use case that is reasonable." and "Misleading because the average viewer won't understand those settings and those settings aren't really used in game streaming. Most people have no idea what "veryfast" or "slow" mean, and certainly don't know how they're used. All they know is "bigger bar better."" No examples, just statements built on "most people" assumptions - it's pure opinion on Gamers Nexus' part (which is fair enough, only opinions can be formed about AMD's performance claims at this point), but opinions based on assumptions nevertheless...

    I'd have to ask, what settings do "most people" use? Personally I would assume most people use the best quality available for their situation - hardware capability, bandwidth availability and target audience - but I could be very wrong in that assumption; most people might simply pick the default settings, or select "automatic" settings, and so on... See the problem? I find it very surprising, even frustrating, that Gamers Nexus, who are usually not only critical but objective in their analysis of products, would offer such a broad, open-ended opinion based on a speculative assumption.

    The best thing to do, imo, is not only to take AMD's performance results with a grain of salt but to take the opinions and speculation with salt as well, and just wait until the hardware is in the hands of consumers (reviewers and customers alike). We simply cannot evaluate the capabilities of the hardware until then; it's absurd to try otherwise.
     

  5. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,469
    Likes Received:
    2,082
    GPU:
    5700XT+AW@240Hz
    Just for laughs, this is how I set up my OBS for 1080p streaming of Path of Exile:

    2700X (CRF30 Placebo Profile=None Tune=Film):
    8x8dct=1 aq-mode=1 aq-strength=1.8 ipratio=1.8 bframes=3 pbratio=2.5 b-adapt=2 open-gop=normal fullrange=on colorprim=smpte240m transfer=smpte240m colormatrix=smpte240m deblock=-1:-1 direct=temporal min-keyint=30 keyint=180 level=4.1 me=umh merange=12 crf-max=48 min-keyint=auto mixed-refs=1 no-mbtree=0 partitions=p8x8,b8x8,i8x8,i4x4 psy-rd=0.5:0.0 rc-lookahead=0 ref=1 scenecut=50 subme=6 threads=12 vbv-maxrate=3600 ratetol=20 vbv-bufsize=0 trellis=1 weightb=1 weightp=2

    Adjusting CRF to one's liking is key here, but the values are heavily optimized to deliver maximum image quality per bitrate within the capability of the 2700X.
    I can't wait to see what I'll be able to do with 3900X.
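    As a rough way to experiment with this outside of OBS (just a sketch, and a simplified one: it only uses the preset/tune/CRF part of the setup above, not the full custom option string, and the clip name and CRF values are placeholder assumptions), you can run the same kind of CRF comparison on a recorded clip with ffmpeg and libx264 before committing to a value for the stream:

    import subprocess

    # Hypothetical recorded gameplay clip to experiment on.
    INPUT = "poe_1080p_recording.mkv"

    # Try a few CRF values around the streaming target; lower CRF = higher quality and bitrate.
    for crf in (26, 28, 30, 32):
        subprocess.run([
            "ffmpeg", "-y", "-i", INPUT,
            "-c:v", "libx264",
            "-preset", "placebo",  # matches the "Placebo" preset mentioned above
            "-tune", "film",       # matches Tune=Film
            "-crf", str(crf),
            "-an",
            f"crf_test_{crf}.mkv",
        ], check=True)
        # Watch the outputs and check their bitrates (e.g. with ffprobe) to pick
        # the highest CRF that still looks acceptable for the stream.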
     
  6. Extraordinary

    Extraordinary Ancient Guru

    Messages:
    19,327
    Likes Received:
    1,383
    GPU:
    GTX980 SLI
    Didn't this happen a couple years ago too with GPU benchmarks?
     
  7. las

    las Master Guru

    Messages:
    271
    Likes Received:
    30
    GPU:
    1080 Ti @ 2+ GHz
    You know the only reason the 3950X is releasing later is so AMD has time to cherry-pick fully working dies - the price is high so that demand stays low (limited supply). It will be a nice chip for sure, just not for gaming. There it will be "decent" but still lose to Intel's best gaming offerings, just like all the other Zen 2 CPUs.

    Both the 3900X and 3950X will deliver worse gaming performance than even the 8700K, which is two years old, and that's with AMD having the node advantage and a newer arch, since the 8700K uses Skylake - not really impressive if you ask me.

    Add to this the expensive X570 boards and active chipset cooling - I'd rather get X470 tbh...
     
  8. dorsai

    dorsai New Member

    Messages:
    4
    Likes Received:
    2
    GPU:
    Vega 56
    We don't know that though, because the CPUs haven't been released... If AMD's IPC improvements prove to be true then it's entirely possible AMD will at least equal Intel in gaming, which is really impressive IMHO considering where they were just three years ago. Add in that on pretty much everything besides gaming the Ryzen 7s were already beating Intel in many cases, and I think there's a lot to be excited about... At minimum, the 3000 series should push Intel to improve its own products.
     
  9. las

    las Master Guru

    Messages:
    271
    Likes Received:
    30
    GPU:
    1080 Ti @ 2+ GHz
    https://www.overclock3d.net/news/cpu_mainboard/amd_ryzen_5_3600_cpu_benchmarks_leak/1

    Single-thread perf is almost identical to the 2700X in CPU-Z. Of course they leave out the CB15 single-thread result. Typical of Ryzen users to leave this important part out. Must mean it's not worth talking about.

    And btw, outside of gaming the 8700K beats the 2700X in tons of tasks. Actually, the 8700K performs better overall (as in stable, good performance in every workload, compared to hit or miss).

    Ryzen falls short in many games and apps. There are tons of applications that run "badly" on Ryzen: Cemu emulation (most old i5s beat the 2700X here - if you want the worst experience possible, use Ryzen + an AMD GPU), Handbrake, the Adobe suite, etc.


    I can't wait to see Zen 2 in CPU-bound gaming scenarios; this is where both the 1000 and 2000 series choke and gimp fps a lot. Tested and tried.
    I hope Zen 2 will at least be able to match Skylake, considering the node advantage.
     
    Last edited: Jun 20, 2019
  10. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,796
    Likes Received:
    1,136
    GPU:
    2 x GeForce 1080 Ti
    Stop spreading gossip and rumor as fact! You have no idea how these chips will actually perform, nor does anybody else.
     
    dorsai likes this.

  11. las

    las Master Guru

    Messages:
    271
    Likes Received:
    30
    GPU:
    1080 Ti @ 2+ GHz
    Nothing hurts like the truth
     
  12. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,796
    Likes Received:
    1,136
    GPU:
    2 x GeForce 1080 Ti
    You're not interested in the truth. All you're doing is cherry-picking certain results that fit your beliefs (while ignoring everything else that doesn't) and claiming them to be the truth. Spewing this garbage over and over again isn't going to make it true - it just makes you look like a desperate Intel shill who wants to spoil the party before release day arrives.
     
    Aura89 and Fox2232 like this.
  13. las

    las Master Guru

    Messages:
    271
    Likes Received:
    30
    GPU:
    1080 Ti @ 2+ GHz
    Yes I am, and I know for a fact that Ryzen lacks in many aspects. They are cheap for a reason.

    Even the 8700K beats Zen 2 in gaming, especially when CPU bound - you'll see in a month or so.
     
  14. norton

    norton Member Guru

    Messages:
    176
    Likes Received:
    33
    GPU:
    GTX970 G1 Gaming
    Overclock3D said that the source of this fake crap is videocardz.net, and when I go to videocardz.net I can't find any of it, so stop spreading these useless fake websites here.
     
  15. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,796
    Likes Received:
    1,136
    GPU:
    2 x GeForce 1080 Ti
    Like I said, you are not interested in the truth - spreading unsubstantiated rumor and speculation as fact is not what fact-based people do. You are looking desperately for any and all reasons to bash Ryzen so that Intel's Core looks better by comparison. The most sensible thing to do is to take any pre-release numbers with a truckload of salt and wait for the official benchmarks for real data.
     

  16. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    570
    Likes Received:
    63
    GPU:
    RTX 2080Ti 11Gb
    We know one thing for sure and it makes me sad: Zen 2 or Zen 3 or whatever will SUCK BALLS for emulation, that's a fact, especially PS3 emulation, which uses Intel TSX to get an IMMENSE performance boost.
    And that makes me SAD.
    I game at 4K and use an OLED as my PC monitor, so I'm limited to 60 Hz. As soon as Nvidia adds HDMI 2.1 I will get a 2019 OLED [or probably a 2020 model] and will have the option to do 4K/120, but even with my 2080 Ti and a future 3080 Ti, 4K/120 is a dream, so I'll be gaming at 4K/60 with VRR. Honestly, most of the benchmark wars are irrelevant to me - I'm GPU limited.
    I'll maybe be getting the 12-core Ryzen, BUT I would prefer the X590 mobo over X570 and the 16-core. I will be very angry at AMD if they release Ryzen 3000 and don't say anything about X590, and then just two months later, when the 16-core comes out, they release it alongside X590.
    I want AMD to at least be honest with us and, on the day of the Ryzen 3000 release, let some big web sites test the 16-core, just so WE the buyers can make a choice: go with the 12-core [which potentially has a higher overclock], or wait for the 16-core if Ryzen 3000 doesn't overclock more than an extra 100-200 MHz and people would rather have more cores than such a lame overclock.
    I could wait an extra two months, I won't die, but if I'm buying the 12-core I want to know if the X590 is real - I NEED MORE PCIE lanes.
     
  17. las

    las Master Guru

    Messages:
    271
    Likes Received:
    30
    GPU:
    1080 Ti @ 2+ GHz
    Yeah... I remember trying Cemu / Zelda BOTW on my Ryzen 1700 @ 4.15 GHz; performance was crazy bad. Same settings on the 8700K: smooth as butter.

    I think X590 will launch with the 3950X late in Q4.

    I don't think we'll see HDMI 2.1 on GPUs before Ampere on 7nm next year, hopefully Q2.
     
  18. Astyanax

    Astyanax Ancient Guru

    Messages:
    2,463
    Likes Received:
    610
    GPU:
    GTX 1080ti
    Shill just keeps shilling.
     
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,469
    Likes Received:
    2,082
    GPU:
    5700XT+AW@240Hz
    Every time you present your blatant lies as fact, you look more like a fool.
    I noticed you as one a long time ago, but you're trying hard to let everyone else notice too.

    I think staying out of AMD-related threads could do you some good.
     
  20. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    570
    Likes Received:
    63
    GPU:
    RTX 2080Ti 11Gb
    You can google it yourself: Intel is faster for emulation, especially PS3, since the PS3 emulator uses TSX, so even Intel CPUs without TSX get lower performance.
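    If you want to check whether your own CPU actually exposes TSX, here's a tiny sketch (assuming Linux, since it just reads /proc/cpuinfo; "rtm" and "hle" are the flag names the kernel reports for the TSX features):

    # Quick check for Intel TSX support by looking at the CPU flags in /proc/cpuinfo.
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                print("TSX RTM supported:", "rtm" in flags)
                print("TSX HLE supported:", "hle" in flags)
                break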
     
