Intel Core i9-9900K 5GHz with Cinebench R15 test

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 14, 2018.

  1. beta-sama

    beta-sama Member Guru

    Messages:
    139
    Likes Received:
    12
    GPU:
    AORUS GTX1080Ti WF
  2. asturur

    asturur Maha Guru

    Messages:
    1,376
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Again those statements :(
     
    AlmondMan likes this.
  3. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    Jesus, this abhorrent horseshit again?

    First off, you didn't even read the article properly. This is already running CB at 5GHz, which makes your overclock score estimate hilariously inaccurate.

    Secondly, 5.2-5.4 on all cores is extremely optimistic, as I told you last time. This CPU boosts to 4.7GHz all-core at stock, which means it is already overclocked to 5GHz in this test, and if it could do better than that frequency this leak would probably show it (see the rough scaling sketch at the end of this post).

    Thirdly, the 2700X is way ahead of Sandy Bridge in single-thread performance: http://hwbench.com/cpus/amd-ryzen-7-2700x-vs-intel-core-i7-2600k

    And lastly, I want to put you on a pedestal for this particular piece of bullshit:
    "Gaming is one of the only workloads where the 8700K is above the 2700X because of its slightly better single-thread performance."

    Dayum dude, stop fanboying over a piece of hardware.
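    As a rough sketch of the scaling point above (assuming, generously, that the Cinebench R15 score scales linearly with core clock; the baseline score is a placeholder, not the leaked number):

    Code:
    # Naive linear clock-scaling estimate for a Cinebench R15 score.
    # Real scaling is sub-linear (memory does not speed up with the cores),
    # so treat this as an upper bound.
    def scale_score(score: float, clock_from_ghz: float, clock_to_ghz: float) -> float:
        return score * (clock_to_ghz / clock_from_ghz)

    baseline = 2000.0  # placeholder score at 5.0 GHz all-core, for illustration only
    for target in (5.2, 5.4):
        print(f"{target:.1f} GHz -> ~{scale_score(baseline, 5.0, target):.0f} cb")
    # Even 5.4 GHz all-core would add at most ~8% under these assumptions.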
     
    AlmondMan, Embra and -Tj- like this.
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,759
    Likes Received:
    9,649
    GPU:
    4090@H2O

  5. jura11

    jura11 Guest

    Messages:
    2,640
    Likes Received:
    707
    GPU:
    RTX 3090 NvLink
    That's a great result on the 2700X. My 5960X at 4.5GHz would do 1795 in CB, and I would probably need to run well above 5.0GHz to achieve such a CB result.

    Would you be able to do a Corona Render benchmark?

    Thanks, Jura
     
    AlmondMan likes this.
  6. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    Bullshit. The vast majority of productivity workloads scale vertically with both frequency and core count, and often (edit) horizontally by networking machines together. Just check any review and look at the productivity software they choose (the Amdahl's law sketch at the end of this post shows how core count pays off once the parallel fraction is high).
    It's true that Intel's solutions might be better for some single-threaded scenarios, but you're vastly overestimating the number of productivity apps which can't leverage multiple cores.
    Why don't you enlighten us with some examples?

    Not to mention that even in gaming, DX12 and Vulkan frequently come and save the day, and this will get better with time. There's a big difference between current and two-year-old games; check out Shadow of War or Shadow of the Tomb Raider. Game developers can't afford to ignore AMD's growing market share; they are expected to optimize their games as well as they can. And Intel being forced into the core-count war will make sure those same developers optimize well for both sides' sake.
    You should get used to seeing such small differences because that's where we're headed with these new APIs, even though it bothers you.
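    To put rough numbers behind the scaling claim above, here is a minimal Amdahl's law sketch; the parallel fractions are assumed for illustration, not measured from any particular application.

    Code:
    # Amdahl's law: the upper bound on speedup from n cores when a fraction p
    # of the work can run in parallel. The p values below are hypothetical.
    def amdahl_speedup(p: float, n_cores: int) -> float:
        return 1.0 / ((1.0 - p) + p / n_cores)

    for p in (0.50, 0.90, 0.99):
        print(f"p={p:.2f}: 8 cores -> {amdahl_speedup(p, 8):.2f}x, "
              f"16 cores -> {amdahl_speedup(p, 16):.2f}x")
    # Well-threaded workloads (high p) keep gaining from extra cores; only
    # workloads with a large serial fraction are stuck waiting on clock speed.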
     
    Last edited: Sep 15, 2018
  7. Webhiker

    Webhiker Master Guru

    Messages:
    751
    Likes Received:
    264
    GPU:
    ASRock Radeon RX 79
    So it's official: I'm a casual gamer / scrub.

    Are you seriously suggesting I should lower the resolution to 1080p and buy an i5 to play a game?
    A game like Shadow of the Tomb Raider, for example?
    Please explain to me how a game running at 1440p with a 144Hz refresh will be CPU- and memory-bound on a Ryzen system?
    Why are you so focused on Samsung b-die? I have an 1800X with Hynix RAM and a 2700X with b-die RAM. The 1800X performs just as well with the Hynix RAM as it did with the b-die RAM.
     
    AlmondMan and chispy like this.
  8. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    It's true that QEMU heavily prefers Intel.
    But Adobe suite performance on Ryzen is beyond crap? Holy jesus, have some Photoshop comparisons: https://www.pugetsystems.com/labs/a...zen-2-vs-Intel-8th-Gen-1136/#BenchmarkResults
    The stock 2700X is within 15% of the 8700K, and this is software which heavily favours single-threaded performance. And I'm in denial? Holy jesus, the hypocrisy.

    Yeah? And which Intel CPU gives you this at the current time? The 8700K is moderately slower than the 2700X in multithreaded workloads: http://hwbench.com/cpus/intel-core-i7-8700k-vs-amd-ryzen-7-2700x
    Once again you prove that you're biased as all hell.
     
    AlmondMan and chispy like this.
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    It is literally less than 6% slower than the 8700K in Photoshop (which is notoriously single-threaded).
    In Premiere, it wrecks the 8700K.

    Are we talking about the CPU that lost 12% in "responsiveness" performance, according to Intel, after a couple of security patches, as the "hit and miss" one?

    With NVIDIA you mostly get good performance as long as NVIDIA keeps remembering to write a driver for your card for each game release. This has happened across multiple generations now, and I do not believe it is even subject to debate at this point. AMD has their sucky DX11 driver (although they have improved it a lot over the last three years), but most AAA games come with DX12 these days, so AMD is usually better than NVIDIA at those.
     
    Embra, xIcarus, carnivore and 2 others like this.
  10. mtrai

    mtrai Maha Guru

    Messages:
    1,183
    Likes Received:
    374
    GPU:
    PowerColor RD Vega
    Really... all this back and forth... Seriously, my best score on my Ryzen 2700X in Cinebench R15 is 2045, and I am not an extreme overclocker; I use a Corsair H110i GT AIO for my CPU. Here are a couple of examples. There are many others who have much higher Ryzen Cinebench scores than mine.

    Take a look at this HWBOT competition from this summer and look at these Ryzen scores in various benchmarks.

    Also, one has to know whether all the Spectre and Meltdown patches were applied or not. Yes, it really does matter (a quick way to check on Linux is sketched at the end of this post).

    http://oc-esports.io/#!/round/roadtopro_challenger_season4_division4_round2

    https://d1ebmxcfh8bf9c.cloudfront.net/u117888/image_id_2031878.jpeg

    Seriously, take a look at just the top Cinebench R15 score from this competition: it was 2630 on a Ryzen 2700X. The user is Oldscarface.

    At the end of the day, unless you're just chasing numbers or have some specific application need, it does not really matter.
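    For what it's worth, here is a minimal sketch of how to check mitigation status, assuming a Linux box with kernel 4.15 or newer (the sysfs path below only exists there):

    Code:
    # Print the kernel's reported CPU vulnerability mitigations (Linux 4.15+).
    from pathlib import Path

    vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
    for entry in sorted(vuln_dir.iterdir()):
        print(f"{entry.name}: {entry.read_text().strip()}")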
     
    Last edited: Sep 15, 2018
    Embra and jura11 like this.

  11. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,451
    Likes Received:
    3,130
    GPU:
    PNY RTX4090
    I get 1967 points in CB with my 2700X OC'd to 4.3GHz across all 16 threads and with my RAM OC'd to 3466MHz at CL14 1T timings.

    I could push for 4.4GHz, but clock speed isn't key with Ryzen; it's memory speed, which is why I have OC'd my RAM from 3200MHz to 3466MHz. It should do 3600MHz, which should push me over 2k points easily.

    I have such a hard time recommending Intel to anyone nowadays. Sure, their IPC is king, and their CPUs are the gaming kings overall at 1080p. But Ryzen is not far behind, and when raising the resolution to 1440p or 4K that gap closes big time. AMD has better thermals, better socket support, more cores, a better upgrade path, cheaper hardware, and a heatsink included with most CPUs.

    The only reason to recommend Intel is for gamers who only care about fps at 1080p.

    This is coming from someone who had been on Intel since 2006 and the Conroe days, until Ryzen came out. Intel now need to seriously get their act together; these 9000 series chips don't really look that great. With rumours that yields are bad as well, prices will have to rise. The only good thing is they are now using solder rather than toothpaste, so I expect these chips to overclock very well with the right cooling. AMD just seem to have come out of the gate with arms swinging, while Intel were sitting there sipping a cold beer when AMD knocked it out of their hands.

    Competition, folks... we have been waiting for this for a long time and it's finally here! :D
     
    AlmondMan, Embra, chispy and 4 others like this.
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Yes, ultimately that game is a snake chasing a pixel across the screen. I think I am right and you really are implying that, in the chase for high fps, everyone will turn all details to minimum, then tweak the driver and finish with a shader wrapper to remove the details entirely.

    Finally, 240fps of puke on the screen. Oh no, because I have a Ryzen it is only 220fps of puke.
     
    xIcarus likes this.
  13. mtrai

    mtrai Maha Guru

    Messages:
    1,183
    Likes Received:
    374
    GPU:
    PowerColor RD Vega
    Twice now you have stated it as fact... but not provided any reference. It reminds me of someone claiming "it is fake news" all the time.
     
  14. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    So, can you name 99 games where Ryzen does not achieve 240fps for each game I name where it does reach it?
     
    xIcarus, Amigafan35 and chispy like this.
  15. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Dude, it literally showed 5 GHz across all cores in the video. Did you even watch it?

    No, you cannot. I have a 4K monitor and even on low settings my 1080 Ti can barely do 60 FPS @ 2160p in AAA games. The idea that you can just lower settings and achieve a CPU cap at any resolution is demonstrably false.
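    As a toy illustration of that point (all frame rates below are invented): the delivered frame rate is the minimum of what the CPU and the GPU can each produce, so once the GPU's ceiling at a given resolution sits below the CPU's, lowering settings never exposes a CPU cap.

    Code:
    # fps = min(cpu_fps, gpu_fps): the slower component sets the frame rate.
    # All figures here are hypothetical, for illustration only.
    cpu_fps = 180.0  # CPU-side frame rate, roughly resolution-independent
    gpu_fps = {"1080p": 240.0, "1440p": 140.0, "2160p": 60.0}

    for res, g in gpu_fps.items():
        bound = "CPU" if cpu_fps < g else "GPU"
        print(f"{res}: {min(cpu_fps, g):.0f} fps ({bound}-bound)")
    # Only at 1080p does the CPU become the limit in this toy example.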

    Saying things that are blatantly false means that nobody is going to believe you.
     
    Last edited: Sep 15, 2018
    AlmondMan, xIcarus and chispy like this.

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    The thing is, when some of us posted facts about stupid crap you said before (like for Adobe), you chose to behave as if we hadn't posted specific numbers.
    I am not a Ryzen owner, and I probably will not be unless they go above 8/16 on the mainstream, but please define "sucks" with regard to Ryzen gaming performance. Channels like Hardware Unboxed have found an aggregate of the 8700K being 9% faster at 1080p.
    Interestingly enough, the AMD driver seems to treat Ryzen much better, and since Turing seems to be a much more DX12-friendly architecture, I wonder if Ryzen will see better days there too.
    Bonus video from Hardware Unboxed: if you remove ancient engines and outliers, the actual difference is closer to 5%.
     
    xIcarus and chispy like this.
  17. Webhiker

    Webhiker Master Guru

    Messages:
    751
    Likes Received:
    264
    GPU:
    ASRock Radeon RX 79
    I never said I accept 60 FPS. You implied that Ryzen owners shouldn't expect more, when in fact the difference at 1440p between architectures is virtually non-existent.
    I'm not a casual gamer, which is the whole point. Sorry you didn't get the irony. I simply don't understand what you mean by: "Ryzen has shown AGAIN and AGAIN to limit fps when the target is high fps".
    How does Ryzen limit FPS? When the previous Tomb Raider game was optimized for Ryzen it showed a HUGE boost in FPS, and if Adobe would get off their ass I'm sure we would see a boost in performance on Ryzen too. Just remember the Intel compiler tricks and how they disabled certain features on competing platforms.

    I went from an i7 4790K to a Ryzen 1800X and never looked back. I'm now on a Ryzen 2700X + X470 chipset, and when Ryzen 2 comes out I will be on that too.
    When I owned the i7 4790K I was begging Intel to release a 6-core CPU, which they never did, even though I was standing there with money in hand.
    Then Ryzen came out and suddenly everything was possible. Sure, you could get a 6-core or 8-core Intel CPU prior to that, but not without selling one of your kidneys.
     
  18. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    #don't feed it


    I saw him on TechPowerUp making the same stir.
     
    Embra, xIcarus, Fox2232 and 2 others like this.
  19. HWgeek

    HWgeek Guest

    Messages:
    441
    Likes Received:
    315
    GPU:
    Gigabyte 6200 Turbo Force @500/600 8x1p
    Considering that the 2700X comes with a nice stock cooler, that means the 9900K will cost even more once you add an average cooler. I think the delta will be ~200 USD.
    Now, do you think the 9900K is still good value for the money?
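    As a back-of-the-envelope sketch of that delta (every price below is a placeholder, not a confirmed figure):

    Code:
    # Hypothetical platform-cost comparison; all prices are assumed.
    RYZEN_2700X = 329.0     # assumed CPU price, decent cooler included in the box
    CORE_I9_9900K = 489.0   # assumed CPU price, no cooler included
    AVERAGE_COOLER = 40.0   # assumed price of an average aftermarket cooler

    delta = (CORE_I9_9900K + AVERAGE_COOLER) - RYZEN_2700X
    print(f"Delta: ~{delta:.0f} USD")  # ~200 USD under these assumptions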
     
  20. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    If AMD sold the 2700X in a variant without the boxed cooler at a reduced price, that would be good. While the cooler they give you is not bad, very few people use it.

    Cooler Master did a good job on that cooler; it feels solid. But I can only say that because I can open the box where I keep it, in pieces, since I took it apart.
    One day I may use its RGB fan for some lovely thing... an RGB beer pedestal with cooling.
     
    xIcarus likes this.
