AMD Faked 9900K Benchmark During E3 Conference

Discussion in 'Processors and motherboards AMD' started by MegaFalloutFan, Jun 12, 2019.

  1. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Wrong premise:
    Assuming that the settings shown in some other demo, somewhere else, are the same as the ones used here.

What's real:
    You can create a simple preset that will choke an 8C/16T chip yet run well on a 12C/24T one. And that's what AMD did. That's the unfair part, because you really can have reasonably good streaming on 8C/16T. AMD simply built a scenario where one fares much worse.

    There's no reason to think AMD used different settings under the hood for each of the tested processors.
     
  2. Truder

    Truder Ancient Guru

    Messages:
    2,400
    Likes Received:
    1,430
    GPU:
    RX 6700XT Nitro+
I have a problem with this whole argument (and with Gamers Nexus for only looking at it from a quality perspective), and I'm very surprised that so few people have taken the time to look at it from another perspective, which is efficiency.

The encoder preset determines the quality of the video at the set bitrate, because it is the setting that allocates CPU time to encoding efficiency. It's advantageous to use slower presets where possible, as they ensure better quality within the bitrate you've set.

Now, the truth of the matter is that at very high bitrates the compression of the video is of little consequence, and there is little need to dedicate more CPU time to preserving quality. But as the bitrate is decreased, encoding speed starts to matter: faster encoding will introduce compression artefacts such as banding, smearing and macroblocking, while slower encoding will do its best to preserve quality within the narrower bitrate and so minimise those artefacts. Generally speaking, streamers with very high upload bandwidth aren't often concerned with encoding speed and the resulting quality, whereas those with limited bandwidth will often need to dedicate more CPU time in order to maintain quality within a limited bitrate.
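    To put that in concrete terms (the numbers here are purely illustrative, not what AMD used on stage), the same stream could be set up in OBS two ways:

    Rate Control: CBR, Bitrate: 6000 kbps, CPU Usage Preset: veryfast -> light CPU load, more banding/blocking at this bitrate
    Rate Control: CBR, Bitrate: 6000 kbps, CPU Usage Preset: slow -> far heavier CPU load, noticeably cleaner image at the same 6000 kbps

    Same bitrate out of the pipe in both cases; the preset only decides how much CPU time is spent squeezing quality into it.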

Handbrake (I know Handbrake isn't used for streaming, but it uses the same x264 encoder that OBS does under the hood, which is why you can see the same settings in both) actually has a very nice tooltip that explains this quite nicely: "You should generally set this to the slowest you can bear since slower settings will result in better quality or smaller files."

[Image: Handbrake's x264 preset setting and its tooltip]

And here you can see the same settings exposed in OBS, mirroring Handbrake:

[Image: OBS advanced output settings exposing the same x264 CPU usage presets]

I don't really understand what the problem is. Unless AMD were using completely unrealistic settings (such as placebo, which is meant for professional use to maintain quality on a per-frame basis), the slow preset is a perfectly reasonable setting to use, and all I see here is AMD leveraging the fact that a 3900X can take a heavier load, in this case by allocating more CPU time to encoding efficiency.

Is this a fake test? Absolutely not; fake implies fabricated results. Is it an unfair test? No, not if the settings and conditions were kept the same across all platforms. Is it an unreasonable test? Possibly, as the difference between medium and slow can be negligible at worst and substantial at best, depending on the bitrate used. Misleading? To a certain extent, yes, but not enough to be concerned about, as the settings used are likely applicable in only a few scenarios (such as the bandwidth-limited situations I described above).
     
  3. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    I think the issue that Gamers Nexus had was that it gave the impression that the 9900K was incapable of streaming, which is not the case - it's just not very capable at those particular settings. If they had labeled it as "streaming in highest/extreme quality" then I doubt they would have complained. Yes, it's a valid test, but it's also misleading.
     
  4. Truder

    Truder Ancient Guru

    Messages:
    2,400
    Likes Received:
    1,430
    GPU:
    RX 6700XT Nitro+
I perfectly understand that point of view, but the thing I have a grievance with is that Gamers Nexus called it misleading purely in the context of it being a quality setting, and didn't consider how it relates to efficiency at the bitrates used.

The argument they've provided isn't balanced against the implications and scenarios where these settings would actually be used - just "Game streaming, it's just odd to be intentionally misleading when your product is already seemingly good. The settings used did not represent any actual use case that is reasonable." And "Misleading because the average viewer won't understand those settings and those settings aren't really used in game streaming. Most people have no idea what "veryfast" or "slow" mean, and certainly don't know how they're used. All they know is "bigger bar better."" No examples, just a statement built on "most people" assumptions - it's pure opinion on Gamers Nexus' part (which is fair enough; there can only be opinions about AMD's performance claims), but nevertheless, opinions based on assumptions...

I'd have to ask: what settings do "most people" use? Personally, I would assume most people would use the best quality available for their situation - hardware capability, bandwidth availability and target audience - but I could be very wrong in that assumption; most people might simply keep the default settings, or pick "automatic" settings, etc... See the problem? I find it very surprising, even frustrating, that Gamers Nexus, people who are usually not only critical but objective in their analysis of products, would offer such a broad, open-ended opinion based on a speculative assumption.

The best thing to do, imo, is not only to take AMD's performance results with a grain of salt, but to take the opinions and speculation with salt as well, and just wait until the hardware is in the hands of consumers (reviewers and customers alike). We simply cannot evaluate the capabilities of the hardware until then; it's absurd to do otherwise.
     

  5. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Just for laughs, this is how I set up OBS for 1080p streaming of Path of Exile:

    2700X (CRF30 Placebo Profile=None Tune=Film):
    8x8dct=1 aq-mode=1 aq-strength=1.8 ipratio=1.8 bframes=3 pbratio=2.5 b-adapt=2 open-gop=normal fullrange=on colorprim=smpte240m transfer=smpte240m colormatrix=smpte240m deblock=-1:-1 direct=temporal min-keyint=30 keyint=180 level=4.1 me=umh merange=12 crf-max=48 min-keyint=auto mixed-refs=1 no-mbtree=0 partitions=p8x8,b8x8,i8x8,i4x4 psy-rd=0.5:0.0 rc-lookahead=0 ref=1 scenecut=50 subme=6 threads=12 vbv-maxrate=3600 ratetol=20 vbv-bufsize=0 trellis=1 weightb=1 weightp=2
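
    A rough reading of the knobs doing most of the work, in case anyone wants to adapt it (my notes, so treat them as approximate): vbv-maxrate=3600 caps the stream at roughly 3.6 Mbps, crf-max=48 sets the worst quality the rate control is allowed to drop to when that cap bites, and ref=1, subme=6 and rc-lookahead=0 pull the Placebo preset back down to something a 2700X can sustain in real time. threads=12 presumably leaves a few threads free for the game itself.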

Adjusting CRF to one's liking is key here, but the values are heavily optimised to deliver maximum image quality per bitrate within the capability of a 2700X.
    I can't wait to see what I'll be able to do with a 3900X.
     
  6. Extraordinary

    Extraordinary Guest

    Messages:
    19,558
    Likes Received:
    1,638
    GPU:
    ROG Strix 1080 OC
    Didn't this happen a couple years ago too with GPU benchmarks?
     
  7. dorsai

    dorsai Guest

We don't know that though, because the CPUs haven't been released... If AMD's IPC improvements prove to be true, then it's entirely possible AMD will at least equal Intel in gaming, which is really impressive IMHO considering where they were just three years ago. Add in that, on pretty much everything besides gaming, the Ryzen 7s were already beating Intel in many cases, and I think there's a lot to be excited about... At minimum, the 3000 series should push Intel to improve its own products.
     
  8. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Stop spreading gossip and rumor as fact! You have no idea how these chips will actually perform, nor does anybody else.
     
    dorsai likes this.
  9. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    You're not interested in the truth. All you're doing is cherry-picking certain results that fit your beliefs (while ignoring everything else that doesn't) and claiming them to be the truth. Spewing this garbage over and over again isn't going to make it true - it just makes you look like a desperate Intel shill who wants to spoil the party before release day arrives.
     
    Aura89 and Fox2232 like this.
  10. norton

    norton Master Guru

    Messages:
    214
    Likes Received:
    56
    GPU:
    GTX970 G1 Gaming
Overclock3D said that the source of this fake crap is videocardz.net, and when I go to videocardz.net I can't find any of it, so stop spreading these useless fake websites here.
     

  11. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Like I said, you are not interested in the truth - spreading unsubstantiated rumor and speculation as fact is not what fact-based people do. You are looking desperately for any and all reasons to bash Ryzen so that Intel's Core looks better by comparison. The most sensible thing to do is to take any pre-release numbers with a truckload of salt and wait for the official benchmarks for real data.
     
  12. MegaFalloutFan

    MegaFalloutFan Maha Guru

    Messages:
    1,048
    Likes Received:
    203
    GPU:
    RTX4090 24Gb
We know one thing for sure, and it makes me sad: Zen 2 or Zen 3 or whatever will SUCK BALLS for emulation. That's a fact, especially for PS3 emulation, which uses Intel TSX to get an IMMENSE performance boost.
    And that makes me SAD.
    I game at 4K and use an OLED as my PC monitor, so I'm limited to 60 Hz. As soon as Nvidia adds HDMI 2.1 I will get a 2019 OLED (or probably a 2020 model) and will have the option of 4K/120, but even with my 2080 Ti and a future 3080 Ti, 4K/120 is a dream, so I'll be gaming at 4K/60 with VRR. Honestly, most of these benchmark wars are irrelevant to me; I'm GPU limited.
    I'll maybe get the 12-core Ryzen, BUT I would prefer an X590 board over X570 and the 16-core. I will be very angry at AMD if they release Ryzen 3000 and say nothing about X590, and then just two months later, when the 16-core comes out, release it alongside X590.
    I want AMD to at least be honest with us and, on the day of the Ryzen 3000 release, let some big web sites test the 16-core, just so we buyers could choose between the 12-core (which potentially overclocks higher) and waiting for the 16-core, in case Ryzen 3000 doesn't overclock more than an extra 100-200 MHz and people would rather have more cores than such a lame overclock.
    I could wait an extra two months, I won't die, but if I'm buying the 12-core I want to know whether X590 is real. I NEED MORE PCIe lanes.
     
  13. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,381
    GPU:
    GTX 1080ti
    Shill just keeps shilling.
     
  14. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Every time you present your blatant lie as fact, you look more like a fool.
    I noticed you as one a long time ago, but you try hard to make everyone else notice too.

    I think staying out of AMD-related threads could do you some good.
     
  15. MegaFalloutFan

    MegaFalloutFan Maha Guru

    Messages:
    1,048
    Likes Received:
    203
    GPU:
    RTX4090 24Gb
You can google it yourself: Intel is faster for emulation, especially PS3, since the PS3 emulator uses TSX, so even Intel CPUs without TSX get lower performance.
     

  16. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,381
    GPU:
    GTX 1080ti
There's nothing in Cemu where Ryzen would fall significantly behind the Core i architecture.

    Now, if he had said RPCS3, then sure, TSX is used there - but there are multiple Ryzen owners in the Discord with no problems with the emulator.
     
  17. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,952
    Likes Received:
    1,244
    GPU:
    .
C'mon, RPCS3 is still in alpha stage...
     
  18. MegaFalloutFan

    MegaFalloutFan Maha Guru

    Messages:
    1,048
    Likes Received:
    203
    GPU:
    RTX4090 24Gb
How is this relevant? It uses TSX to speed up emulation; for some reason it's an Intel feature that isn't shared with AMD (all the others are).
     
  19. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
Oh, I'm so sorry that higher core counts, which tend to come with lower frequencies than lower-core-count parts (up to a point), make for a worse gaming system on AMD; it's totally not what Intel does........

    Oh wait, never mind, that's exactly what Intel does, yet you don't fault Intel for it. In fact, you can't fault either company for it, since obviously, as core counts go up, wattage and temperatures go up, and base and boost frequencies have to come down. It's a trade-off: higher overall CPU performance at the cost of lower frequencies. Again, both companies do this.

    So for you to come in here and spout the nonsense you're spouting only shows your true colors: you work for Intel.

    Unless you want to explain to me how unspeakably impossible it is that the Core i9-9980XE (a $2,000 processor) doesn't perform in games as well as the 9900K (a $499 processor), and how horrible Intel must be since they can't even match their own lower-end processors yet charge so much more for them.....
     
  20. guachi

    guachi Guest

I think AMD even said in the presentation that the encoding choices were silly, but that even at silly settings the AMD chip had no problem. Basically, the ludicrous settings weren't ludicrous, because their CPU is just. that. good.
     
    Fox2232 and dorsai like this.
