Review: Core 9th Series 9900K, 9700K and 9600K Processors

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 19, 2018.

Thread Status:
Not open for further replies.
  1. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    How? ....by simply making friends with other people?
     
  2. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    Love you @Fox2232 .....he will eventually learn how to sneak in the back door like we do....
    then he will see the light,,,,, ( . ) ( . )
     
  3. Killian38

    Killian38 Guest

    Messages:
    312
    Likes Received:
    88
    GPU:
    1060
The first-gen Ryzen 1600 is on par with or faster than the 9600K. That is my point. Intel can do better, and they often do. But the 9600K is a joke at that price. I can see why folks would buy the 9900K; it's the best of the best. The 9600K is a ripoff at its price.
     
    airbud7 likes this.
  4. Nealster

    Nealster Guest

    Messages:
    14
    Likes Received:
    15
    GPU:
    EVGA GTX 1080 Ti
    "Tell your chest to stop staring at my eyes." - Rodney Dangerfield


    Yeah, France & Co. tanked NASCAR. I'm afraid F1 is getting there too, but not quite yet. I hit the ceiling when the FIA made them get rid of the high-revving V8 screamers and go with the blown V6 powerplants. Which, by the way, even Mercedes is still struggling with on reliability, with all their money.

    Okay, I'll leave it at that and get back on topic. I don't want to get wrist-slapped on my first day.
     

  5. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    hey I'm right there with you bro!

    But!...this is still an Enthusiast website! ....I can only dream of having a 9900K setup!

    I'm still going to read it here every day...just like I have always done.
     
  6. Nealster

    Nealster Guest

    Messages:
    14
    Likes Received:
    15
    GPU:
    EVGA GTX 1080 Ti
    Most who want an Intel rig with a balanced flavor will go for the 9700K (as I plan on doing). And methinks you should look at those game benchmarks again.
     
    Last edited: Oct 21, 2018
  7. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    The 9700K is gonna be the go-to gaming CPU....



    Only Fox2232 can see this!
    All other members: please, under no circumstances, click on the spoiler tab.
    I said only Fox2232! Why you click?...:p
     
    Last edited: Oct 21, 2018
  8. Nealster

    Nealster Guest

    Messages:
    14
    Likes Received:
    15
    GPU:
    EVGA GTX 1080 Ti
    ^^Punk! LOL.
     
    airbud7 likes this.
  9. metagamer

    metagamer Ancient Guru

    Messages:
    2,596
    Likes Received:
    1,165
    GPU:
    Asus Dual 4070 OC
    I'm not going to upgrade, will wait for the next generation of Ryzen CPUs to show their hand. But, the 9900k is actually quite a bit faster than the 2700x at 1080p, which is what the "controversial" previews hinted at.

    Not many will buy these to game at 1080p though, unless they're competitive gamers who need consistent high fps. At 1440p the gap is closer, of course.

    AMD just have to work on higher clocks for their next generation; a guaranteed 4.5GHz-and-above 8c/16t part at a lower price will be a great little CPU.
     
  10. user1

    user1 Ancient Guru

    Messages:
    2,746
    Likes Received:
    1,279
    GPU:
    Mi25/IGP
    I feel like the naming scheme could use some work... 9600K, 9700K, 9900K, 9800X, 9820X, 9900X, 9920X, 9940X...

    That's not confusing at all! o_O
     

  11. S V S

    S V S Member

    Messages:
    41
    Likes Received:
    13
    GPU:
    Nvidia GTX 1080 Ti
    That was me.

    That is actually why I was so shocked Hilbert stated he was going to remove the only CPU *gaming* benchmark data he collects where the CPU isn't bottle-necked by the GPU, only because a certain segment of fanboys don't like what the data says.

    Unsurprisingly, the same group complaining that 720P benchmarks aren't relevant because no one games at 720P doesn't also complain about the many synthetic benchmarks for non-gaming tasks Hilbert runs, even though many of those benchmarks are also not reflective of real-world PC usage. I'm sure that discrepancy has nothing to do with the AMD fanboys cherry-picking only the data that supports AMD. Look here, AMD fanboys: not everyone on this site comes here to have their fandom reaffirmed. I come here for unbiased data so I can make purchasing decisions. I don't care which company makes the parts; I care about knowing if the part suits my needs. It's absurd how much you cheerlead for corporations that don't give a sh*t about you. AMD, Intel, Nvidia, etc., are all in it to make a profit. None of them cares about you as an individual more than any of the others. So please, stop insisting Hilbert censor his results so your "team" can look like it is winning. It looks pathetic.




    Here is the thing, Hilbert wasn't only providing 720P resolution benchmarks... he was also providing 1080P and 1440P. Hilbert provides all the data and provides CONTEXT for what the data means in every review.

    The 720P data gives us valid data on the relative performance of CPUs while gaming, unconstrained by the GPU; this is no different from how the synthetic non-gaming CPU benchmarks provide relative performance for synthetic non-gaming tasks where the CPU is not constrained. Would you be happy if Hilbert evened the playing field for Intel by locking Threadripper parts to the same number of cores as Intel's corresponding processors? No? It really isn't any different; you are demanding Hilbert only show CPU gaming benchmarks where the CPU is constrained. That is misleading.

    All the non-720P CPU gaming benchmarks are there to show that when gaming at higher resolutions you are almost always GPU-bottlenecked and may be able to get away with a slower CPU. Hilbert provides that data and even explains it in his reviews.
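    The bottleneck logic above can be sketched as a toy model (all numbers below are hypothetical, just to illustrate why the GPU cap hides CPU differences at higher resolutions; `observed_fps`, `gpu_cap`, `cpu_a` and `cpu_b` are made up for illustration, not measured results):

    ```python
    def observed_fps(cpu_fps, gpu_fps_at_res):
        # A frame must pass both the CPU and GPU stages; the slower one sets the pace.
        return min(cpu_fps, gpu_fps_at_res)

    # Hypothetical GPU frame-rate caps at three resolutions, and two CPUs
    # with different CPU-limited frame rates.
    gpu_cap = {"720p": 300, "1080p": 160, "1440p": 100}
    cpu_a, cpu_b = 220, 170

    for res, cap in gpu_cap.items():
        print(res, observed_fps(cpu_a, cap), observed_fps(cpu_b, cap))
    # At 720p the two CPUs separate (220 vs 170); at 1440p both read 100, the GPU cap.
    ```

    At 720p the GPU cap sits above both CPU limits, so the benchmark shows the real CPU difference; at 1440p both chips report the same GPU-limited number, which is exactly why removing the low-resolution run removes the information.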

    The 720P numbers don't hurt anybody (as Hilbert always provides 1080P and 1440P and CONTEXT for what the data means), except apparently the feelings of a small portion of vocal cry babies in the AMD fan club.

    On the other hand, removing the 720P data makes the *CPU* gaming benchmarks worthless. There is zero point in even having CPU gaming benchmarks if they are all GPU-constrained. Hilbert could just state that for gaming at 1440P any modern CPU will do, and not run any gaming benchmarks at all. I'm happy to settle on that as a compromise, AMD fans. It would also save Hilbert time.

    I think we all know why AMD fanboys are demanding Hilbert remove the only benchmarks that show non-GPU bottlenecked CPU gaming performance. The actual result of Hilbert conceding to the demand of one biased segment of his membership is to render his CPU gaming benchmarks misleading and worthless because all gaming CPU benchmarks will be GPU bottlenecked. This is self-serving to the AMD fans and it is ABSURD Hilbert would agree to it.

    Further, by doing this, Hilbert essentially has admitted he will post biased and misleading benchmarking numbers because a segment of his members demand he be biased.

    I will not be able to trust his reviews going forward if the contents can so easily be biased purely by one vocal segment crying. All Hilbert has to do to actually continue posting unbiased reviews is to continue providing all relevant data and context where needed. Exactly what he was doing before.
     
    Last edited: Oct 22, 2018
    Nealster likes this.
  12. rl66

    rl66 Ancient Guru

    Messages:
    3,924
    Likes Received:
    839
    GPU:
    Sapphire RX 6700 XT
    I think it's a nice move to remove 720P; even smartphones are over 1080P resolution now... So it fits real-life conditions, as no one would ever use a high-end GPU at that resolution any more.
     
  13. rl66

    rl66 Ancient Guru

    Messages:
    3,924
    Likes Received:
    839
    GPU:
    Sapphire RX 6700 XT
    I might test my Xeons with a Quadro at 240P to see if I get some bottleneck... ((joking of course))
     
  14. user1

    user1 Ancient Guru

    Messages:
    2,746
    Likes Received:
    1,279
    GPU:
    Mi25/IGP
    Don't be so hyperbolic; 720p testing is pointless. No one in their right mind would do such a thing with a $1200 graphics card.
    It's fair to remove them; they are pretty much a waste of time, since 1080p achieves the same effect but could be considered not a waste of time.

    I think most people would agree that tests with more hardware to compare would be better than wasting time on 720p. Seriously, I'd much rather see a 2600K on the results page than 720p results.

    That said, more data doesn't hurt anyone. The only thing I would suggest is maybe putting the 720p results on a separate page or something if you really must have them, since they are not reflective of reality for anyone and can be considered "synthetic", as you said.

    If you want to see the latency advantage of Intel, look at the DDR4 page; that is pretty much a direct analog for 720p results.
     
  15. S V S

    S V S Member

    Messages:
    41
    Likes Received:
    13
    GPU:
    Nvidia GTX 1080 Ti
    Did you actually read my post?

    Whether or not a CPU gaming benchmark is run at a resolution one would actually game at has no bearing on the validity of the data; this is the same situation as the numerous non-gaming synthetic CPU benchmarks Hilbert runs, which have no real-world significance *at all*.

    The non-gaming synthetic CPU benchmarks compare unconstrained CPU performance, just as the 720P CPU gaming benchmarks show unconstrained CPU performance in games.

    If you remove the 720P data, all gaming CPU benchmarks will be GPU-constrained. That is stupid. Again... at that point, why are we even including gaming CPU benchmarks? A single statement that all modern CPUs are "good enough", because games are most likely GPU-constrained, is sufficient.

    It is so funny how a website full of supposed tech "enthusiasts" is really arguing that we shouldn't get to know the relative performance of CPUs when gaming, we should just treat all CPUs as being equal. This is really a sad discussion.
     

  16. S V S

    S V S Member

    Messages:
    41
    Likes Received:
    13
    GPU:
    Nvidia GTX 1080 Ti
    This is a *gaming* oriented hardware website, the synthetic non-gaming benchmarks are more "pointless" by far than the single benchmark Hilbert provides that actually shows unconstrained CPU performance when gaming. Note, I am not actually arguing to remove those benchmarks... I like that Hilbert runs them. Just that they actually are pointless to judging how a CPU will perform at gaming, which is the use case of the majority of the members here.

    Again, it is pretty clear who is arguing to remove 720P *CPU* benchmarks and why (AMD fanboys, because it makes AMD processors look bad).
     
  17. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    Let it go, @S V S. Been there, tried to show the meaning of 720p before, and yet people just want to see REAL gaming numbers, relying on the highest-end GPU to deliver frames, and echo-chambering about how CPUs are all the same. "Gee, guess my Pentium 4 is good enough for 4K ULTRA" :D

    Gamers Nexus at least is sane enough to set 1080p settings to medium (which is simply the same GPU load as 720p on highest settings), so it deserves credit for that.

    This is the direction CPU benchmarking should go: not 720p, but higher resolution with lower video settings.
    That way the community isn't triggered by 720p, yet doesn't notice the settings aren't "normal".
     
    Last edited: Oct 22, 2018
    airbud7 likes this.
  18. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    And you want Intel to look better when everyone already knows they're faster at gaming...
    Even hard-core AMD fanboys here will tell you that...


    I see nothing wrong with 1080p being the new low benchmark.

    Per-core performance will be AMD's weak point until... when? Well, I don't know, but if they ever pass Intel you can bet your azz it will be front-page news everywhere!

    PS: Your viewpoint is a valid one though, and it does make AMD look slow... very slow.
     
    Last edited: Oct 22, 2018
  19. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    And the opposing point is that you want to hide that AMD is *slower* in gaming? That seems dishonest.
     
    airbud7 likes this.
  20. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    They are slower...I said that.

    You drinking again, Hemi?...:p
     