Intel Core i9-9900K, i7-9700K, i5-9600K specifications also exposed

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 24, 2018.

  1. HonoredShadow

    HonoredShadow Ancient Guru

    Messages:
    4,280
    Likes Received:
    13
    GPU:
    Gigabyte 1080 ti
    So if these specs are true, which one would be best for gaming only?

    Also, does anyone have a rough idea of the release time frame?
     
  2. jwb1

    jwb1 Master Guru

    Messages:
    725
    Likes Received:
    156
    GPU:
    MSI GTX 2080 Ti
    For gaming, 6 cores is all you need.

    The new CPUs should arrive around August/September.
     
  3. HonoredShadow

    HonoredShadow Ancient Guru

    Messages:
    4,280
    Likes Received:
    13
    GPU:
    Gigabyte 1080 ti
    Thanks for the reply.
    So the 9600K then? HT not needed for gaming, I guess.

    Would I see much of an increase in performance compared to my 2700K @ 4.6GHz at 1080p?
     
    Last edited: Jul 26, 2018
  4. jwb1

    jwb1 Master Guru

    Messages:
    725
    Likes Received:
    156
    GPU:
    MSI GTX 2080 Ti
    Yes.

    And you should see a big difference coming from a 2700K, especially if you overclock the 9600K.
     

  5. TLD LARS

    TLD LARS Member Guru

    Messages:
    198
    Likes Received:
    58
    GPU:
    Asus Vega 64 DIY

    The 95W TDP is at the 3.6GHz base clock, making it 100MHz slower than the Ryzen 2700X base clock, so that is how they managed to keep it at 95W TDP.
    The all-core 4.7GHz full speed would be around 130-150W, yes.
    5.0GHz would be around 200-250W.
    Best guesses from an AMD fanboy with a 100W overclocked Ryzen 1700.
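
    A rough sanity check of those guesses (a minimal Python sketch; the voltages and the P ≈ f·V² scaling rule are my own assumptions, not Intel figures):

    Code:
    # Dynamic power scales roughly with frequency * voltage^2.
    # Baseline power and all voltages below are assumptions for illustration only.
    def scaled_power(p_base_w, f_base_ghz, v_base, f_ghz, v):
        """Estimate package power at a new frequency/voltage from a known baseline."""
        return p_base_w * (f_ghz / f_base_ghz) * (v / v_base) ** 2

    # Assume ~95 W at the 3.6 GHz base clock around 1.10 V (hypothetical values).
    base_w, base_ghz, base_v = 95.0, 3.6, 1.10

    print(f"4.7 GHz @ ~1.22 V: {scaled_power(base_w, base_ghz, base_v, 4.7, 1.22):.0f} W")
    print(f"5.0 GHz @ ~1.35 V: {scaled_power(base_w, base_ghz, base_v, 5.0, 1.35):.0f} W")
    # Prints roughly 153 W and 199 W, in the same ballpark as the guesses above.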
     
    Last edited: Jul 27, 2018
  6. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,531
    Likes Received:
    278
    GPU:
    GTX1070 @2050Mhz
    Well, it depends on how long you want to keep your CPU for - how many years. I personally think it's a bit foolish buying a CPU with only 6-thread capability (i.e. the 9600K), because yes, it's good for gaming today, but perhaps not in 2 or 3 years' time - I'd buy a CPU today with at least 8 threads. I think the 9700K would be the one to get (8 real cores and no HT), and the 8700K is good too because that's 12 threads.

    EDIT: And given that you've already got an 8-thread CPU in the form of your 2700K @ 4.6GHz, I'd probably be tempted to get the i9-9900K, to get a bump in thread count as well as a bump in frequency & IPC compared to your 2700K. Or maybe the 8700K, to retain a bump in thread count (12) while not being as expensive. BAH(!), any way you look at it the 8700K/9700K/9900K are & will be pretty awesome gaming CPUs - check your budget! (The 8700K benefits if you delid it; if you're not doing that, then get one of the others.)
     
    Last edited: Jul 26, 2018
    HonoredShadow likes this.
  7. HonoredShadow

    HonoredShadow Ancient Guru

    Messages:
    4,280
    Likes Received:
    13
    GPU:
    Gigabyte 1080 ti
    True. I'm just looking at the 8700K as we speak ;)
     
    Robbo9999 likes this.
  8. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,531
    Likes Received:
    278
    GPU:
    GTX1070 @2050Mhz
    I edited my post above while you were typing. If you're willing to delid & overclock the 8700K, then I think that's a solid choice. If you don't want to delid, then I'd consider the 9700K (soldered) - I just can't get over the fact that it's the same number of threads as you currently have!
     
    HonoredShadow likes this.
  9. Dazz

    Dazz Master Guru

    Messages:
    916
    Likes Received:
    102
    GPU:
    ASUS STRIX RTX 2080
    Of course, with manual overclocking you will be pushing past the recommended TDP, and that's at the end user's own risk. At stock, though, the CPUs are meant to adhere to the TDP that's set; if the CPUs run outside that specification, then the TDP is useless and there is no point in marketing something that is an outright lie. If you can't judge it, running the CPU could mean pushing your motherboard's VRM to the point of throttling, or in the worst case the motherboard failing, or the cooler's rated TDP no longer being sufficient, or, if you have a smaller case, clock speeds spiking all over the place. As an overclocker you take the risk of exceeding the TDP, but bear in mind that motherboards typically have boost enabled by default without user intervention, so the average user will always exceed what's set in place - and I am not talking about MCE.

    My point is there is no point in marketing something as a 95W TDP when it's really 150W the moment you slap it into a board without making any changes whatsoever.

    For people using a compact system, TDP is very, very important, regardless of whether they use Intel or AMD. It's like buying a car with an advertised MPG at a specific speed, but the engine runs at higher RPM than it should, burning more fuel than claimed - at which point there is no point in advertising a fake MPG, or in this case TDP. *cough*BMW*
     
  10. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    There is no better time than now to wait for the next release of CPUs. With all the information that's floating around, one should expect better behavior and longevity from the next chips from both Intel and AMD.
     
    airbud7 and HonoredShadow like this.

  11. HWgeek

    HWgeek Master Guru

    Messages:
    441
    Likes Received:
    315
    GPU:
    Gigabyte 6200 Turbo Force @500/600 8x1p
    Any info on when Intel is changing to a multi-die design like Ryzen? How long after the Epyc 7nm/Zen 2 release?
    I am asking to get an idea of how much time AMD would have to take market share with their 48~64 core 7nm design before Intel starts to "glue together" their dies to release CPUs with the same high core counts.
     
  12. TLD LARS

    TLD LARS Member Guru

    Messages:
    198
    Likes Received:
    58
    GPU:
    Asus Vega 64 DIY
    (Quote does not seem to work.)

    I totally agree that it is false marketing to write 95W and 4.7GHz in the same sentence, and I am sure that there will be burned ITX boards out there, if the 9900K is compatible with them.
    It could be the same kind of "success" as some of the first X299 boards, if this is not done correctly.

    It is the same problem on my Ryzen 1700: if it runs full XFR, the TDP is higher than the advertised 65W, but that is only 200MHz of extra boost speed, not 1100MHz as on the 9900K.
    I hope the Intel chip downclocks itself to base clock at 80 degrees or something like that, so it does not keep boosting to 4.7GHz all the way up to 95-100 degrees.
    Looking forward to seeing what coolers are needed to cool the 9900K. My Noctua has some headroom left, but with the 32 degree ambient we have in Denmark right now, I think I would need watercooling for something like a 9900K at full speed.
     
  13. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    I've read somewhere that Intel thinks chillers are popular with the PC crowd these days.

    /just their 2c
     
  14. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    You are spinning the same trash over and over again, but now you are spinning out of control. Last time you claimed that Ryzen is not suitable for 100fps, and as proof you used a benchmark where it did a minimum of 101 and an average of around ~140.

    Now your use of "fine" implies that it is barely enough for 60~75 fps? Please, kindly pull the top of your backside from your chair interface.
     
  15. Embra

    Embra Maha Guru

    Messages:
    1,145
    Likes Received:
    344
    GPU:
    Vega 64 Nitro+LE
    You do not need 150+ fps, you want 150+ fps. There is a difference.

    The only real need in gaming is playability.
     
    Last edited: Aug 1, 2018
    HonoredShadow likes this.

  16. Goiur

    Goiur Maha Guru

    Messages:
    1,097
    Likes Received:
    378
    GPU:
    ASUS TUF RTX 3080
    Buy a G-Sync monitor and you won't need 150+ fps, lol.

    Overall Intel has better fps in games, BUT is there a huge gameplay experience difference going from 120 to 140, or from 150 to 200 fps? I don't think so, or I'm lucky enough not to have some high-tech bionic eye.
     
  17. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,404
    Likes Received:
    921
    GPU:
    ASUS 3080 Strix H20
    Unless you have a refresh rate that will support 200 fps, then no, you will not notice a difference from a static 140 to 200fps.

    But FPS oscillating between 100fps and your max refresh rate of, say, 144Hz is quite annoying to the eye when you are used to a fluid 144fps.

    VRR displays help with this, but when you are used to 144fps or higher, drops to 100fps are still quite annoying.

    It's a similar effect to 60fps dropping to 40fps.

    The 'lag' term is pretty generic.

    There are multiple latencies that add up to the 'total' lag in a PC setup.

    The ms lag you are referring to correlates directly with FPS.

    Higher fps reduces that lag; that 'lag' is the time it takes to draw a single frame.

    For a locked 60fps, it takes 16.67ms to draw a single frame.

    The higher the fps, the less time it takes to render a single frame.
    This helps the visual perception of smoothness and reduces image persistence (i.e. the amount of blurring on screen).
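
    Quick frame-time arithmetic for reference (a small Python sketch; the fps values are just examples):

    Code:
    # Frame time in milliseconds for a given frame rate: 1000 ms / fps.
    def frame_time_ms(fps):
        return 1000.0 / fps

    for fps in (40, 60, 100, 144, 200):
        print(f"{fps:3d} fps -> {frame_time_ms(fps):5.2f} ms per frame")

    # A drop from 144 to 100 fps adds ~3 ms per frame (6.94 -> 10.00 ms);
    # a drop from 60 to 40 fps adds ~8 ms (16.67 -> 25.00 ms).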

    So with high fps, you get reduced lag because the game is able to respond to input more quickly.

    As for your question about 'can you see less than 5ms of lag', the answer is yes.

    If it were possible, 0ms lag would mean zero blurring when you pan your crosshair across the screen. It would look as if there were perfect pictures without the 'motion blur' effect.
     
    Last edited: Aug 1, 2018
  18. Goiur

    Goiur Maha Guru

    Messages:
    1,097
    Likes Received:
    378
    GPU:
    ASUS TUF RTX 3080
    I'm glad I have no high-tech bionic eye to see an annoying difference when frames drop from 144 to 120 or 100, or maybe it's just FreeSync that helps... *shrug*
     
  19. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,531
    Likes Received:
    278
    GPU:
    GTX1070 @2050Mhz
    If you're playing online competitive multiplayer shooters, then you can notice the difference at refresh rates above 120Hz. I have an overclockable 144Hz monitor and I've run it at 180Hz from time to time - I can notice a difference between 144Hz and 180Hz, but only during fast camera panning in close-quarters combat situations. It's a competitive advantage - you can see more accurately what is happening in the game and respond to it more accurately.
     
    Fox2232 likes this.
  20. Goiur

    Goiur Maha Guru

    Messages:
    1,097
    Likes Received:
    378
    GPU:
    ASUS TUF RTX 3080
    We are talking about 0.2% of the user base here.
     
