Intel Core i9-10900 10-core Processor Poses for Camera

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 14, 2020.

  1. squalles

    squalles Maha Guru

    Messages:
    1,006
    Likes Received:
    106
    GPU:
    Galax RTX 4070ti
    Caesar likes this.
  2. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
They don't even state (unless I missed it) what quality settings they play those games at, and you're worried about playing games at 1080p with an RTX 2080? There's your problem, not what CPU you use.

    Plus, they have a relative difference here:

[image: relative performance chart]

So you're just cherry-picking results to suit your needs; the relative performance, i.e. the average across games, shows it doesn't really matter unless you care about literally a few FPS. Good for you.

[image]

    "Hey guys lets go find one game that really has an issue and lets make it appear that there's an issue everywhere!" - Smart.
     
    HandR likes this.
  3. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
Tell me what game is a "glorious stutter fest" and I'll compare the 9900K vs the 3900X :) Or did you mean the 7600?

Some said the 3900X is faster than the 9900K in games. Why can't I get 200+ FPS all the time in Battlefield V with the 3900X at 1080p (240 Hz), when my 9900K can? Using a 2080 Ti on both systems.

PS: I'm using the 3900X and the 9900K for different tasks ;)
     
  4. squalles

    squalles Maha Guru

    Messages:
    1,006
    Likes Received:
    106
    GPU:
    Galax RTX 4070ti
Yeah sure, it's the site's fault, the game configuration's fault, the game's fault, and nobody uses a 144 Hz monitor.

I need to play a specific game at 4K resolution targeting 60 Hz just because the AMD guys want to say AMD wins, amazing kkkkkk
     

  5. beedoo

    beedoo Member Guru

    Messages:
    149
    Likes Received:
    126
    GPU:
    6900XT Liquid Devil
According to TechPowerUp and their article on tuning the Threadripper 3960X system, 221 FPS (187 min.) is achievable at 1080p ultra settings. This is up from 170 FPS at stock settings.

Obviously, you have to be a real overclocker, as opposed to a tinkerer, in order to realise this.

As far as gaming is concerned, it seems there may be a better option than the 9900K(KS), if you have the money.
     
    Last edited: Feb 16, 2020
  6. IchimA

    IchimA Maha Guru

    Messages:
    1,346
    Likes Received:
    280
    GPU:
    7800XT Asus TUF
@vbetts Right now we need programs that take advantage of so many threads...
but I'm afraid it will be a few more years until then!
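For what it's worth, a minimal sketch (not from the article; the workload and chunking are invented for illustration) of what "taking advantage of so many threads" looks like in code: spreading independent work across one std::thread per hardware thread the CPU reports.

    // Illustration only: spread an embarrassingly parallel workload across
    // however many hardware threads the CPU reports.
    #include <algorithm>
    #include <cstdint>
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
        const std::uint64_t n = 100'000'000;            // arbitrary amount of work
        std::vector<std::uint64_t> partial(workers, 0); // one slot per worker, no shared writes
        std::vector<std::thread> pool;

        for (unsigned w = 0; w < workers; ++w) {
            pool.emplace_back([&partial, w, workers, n] {
                // Each worker handles its own stride of the range.
                for (std::uint64_t i = w; i < n; i += workers)
                    partial[w] += i % 7;                // stand-in for real per-item work
            });
        }
        for (auto& t : pool) t.join();

        std::cout << "workers: " << workers << ", result: "
                  << std::accumulate(partial.begin(), partial.end(), std::uint64_t{0}) << '\n';
    }

The catch, as the post says, is that real workloads, games especially, rarely decompose this cleanly across 20 threads.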
     
    Last edited: Feb 16, 2020
  7. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
    Did they test BF V multiplayer?

edit: LOL, they used 0.5 resolution scaling! Half of 1080p :D
Single-player, and still under 200 min FPS :p
Looks like we have to wait for the next generation...

Try again....

edit 2: I just tried it for fun on my workstation, 7980xe + 2080ti. Getting 250-320 FPS as infantry with 50% res scaling LOL. Average in a plane was 320 FPS :p

    Knowing is better than thinking ;)
     
    Last edited: Feb 16, 2020
  8. beedoo

    beedoo Member Guru

    Messages:
    149
    Likes Received:
    126
    GPU:
    6900XT Liquid Devil
All I heard is that there are numerous better options than the 9900K, if you have the money ;)

I think a lot, and know a lot, but I never assume I know everything...

Edit: Your 7980xe + 2080ti is doing well then. I've just found one site that had the 3960X and 7980XE in BF V (High) at 1080p on a 2080 Ti, both doing 175 FPS.
     
    Last edited: Feb 16, 2020
  9. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
If you play Cinebench 24/7, there are clearly many better options than the 9900K :)

I'm using the Threadripper as my main server. F@H when "idle"
     
  10. Mundosold

    Mundosold Master Guru

    Messages:
    243
    Likes Received:
    108
    GPU:
    RTX 3090 FE
Who are these people with an RTX 2080+ and a 9900K but playing on a 1080p monitor? And then using HALF of 1080p (960x540), which is literally fewer pixels than the 1024x768 I was using in 1998? (quick pixel math below) "eSpOrTs gAmErs!!!"

Intel is not better for gaming. That is a meme. Might as well compare Quake 3 at 640x480.
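A quick check of the pixel math in that comparison (the resolutions come from the posts above; note that 50% resolution scale halves each axis, so it is actually a quarter of 1080p's pixels):

    // Pixel-count comparison for the resolutions mentioned above.
    #include <iostream>

    int main() {
        const long full_1080p = 1920L * 1080; // 2,073,600 px
        const long half_scale = 960L * 540;   // 50% per-axis scale of 1080p
        const long xga        = 1024L * 768;  // 786,432 px

        std::cout << "1080p:    " << full_1080p << " px\n"
                  << "960x540:  " << half_scale << " px ("
                  << 100.0 * half_scale / full_1080p << "% of 1080p)\n"
                  << "1024x768: " << xga << " px\n";
    }

So 960x540 really does push fewer pixels than 1024x768, roughly a quarter of native 1080p.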
     
    Fediuld likes this.

  11. Tiny_Clanger

    Tiny_Clanger Guest

    Messages:
    333
    Likes Received:
    347
    GPU:
    igpu
I've got a 2c/4t, one 4c/4t, and two 6c/6t. 2c/4t is OK for light stuff, 4c is enough, and 6c is really overkill, but with enough overhead for whatever tomorrow brings. I don't need, and can't justify the price of, 8c/8t.
     
  12. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
So what are YOU talking about? Do you know what we are talking about, or are you one of the Russian Facebook trolls?

Gotta love all the raging people just puking out words without knowing what's going on...

PS: who are those people playing with a high-end CPU and GPU, but at 60 Hz with v-sync? ;)
...or the people playing Cinebench 24/7 :D
     
    Last edited: Feb 16, 2020
  13. Chess

    Chess Guest

    Messages:
    390
    Likes Received:
    57
    GPU:
    ASUS GTX1080Ti Stri
I used to think so with my 3930K as well. When I bought it in April 2012 I felt really stupid because a mate's 4770K had the same performance for a far cheaper price.

Meanwhile, I can still game comfortably on it (not top performance, obviously) and he has gone through 2-3 systems by now.
6c/12t, 12 MB L3 and quad-channel memory help a bunch with future-proofing. So no, I'll not go 3600X / 10600, but I'll splurge on the lower-end HEDT CPUs and hope they can run for 8 years as well.
Looking at 8-10c, min 20 MB L3, quad-channel DDR5 in a year :). Expensive, but less so than buying a new motherboard and CPU every 2-3 years.

    On the other hand, I have a Win98 and WinXP system for some light retro gaming as well. Sooooo :D
     
  14. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
I don't do stock CPU or memory ;)
You can get literally huge gains from overclocking X299, both CPU and memory. Memory latency is the main key for gaming on X299: stock is around 80 ns and overclocked is around 50 ns.
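Figures like those usually come from a pointer-chasing latency benchmark (AIDA64 and similar tools report them). Here is a rough DIY sketch of the underlying idea, with the buffer size and hop count picked arbitrarily for illustration; it is not the tool quoted above.

    // Rough memory-latency sketch: chase a single random cycle of indices
    // through a buffer much larger than the caches, so almost every hop is a
    // dependent DRAM access. Numbers are illustrative, not calibrated.
    #include <chrono>
    #include <cstddef>
    #include <iostream>
    #include <numeric>
    #include <random>
    #include <utility>
    #include <vector>

    int main() {
        const std::size_t n = std::size_t(1) << 25;   // 32M entries * 8 B = 256 MiB
        std::vector<std::size_t> next(n);
        std::iota(next.begin(), next.end(), std::size_t{0});

        // Sattolo's shuffle: yields one big cycle, so the chase never gets
        // stuck in a short loop that fits in cache.
        std::mt19937_64 rng{42};
        for (std::size_t i = n - 1; i > 0; --i) {
            std::uniform_int_distribution<std::size_t> pick(0, i - 1);
            std::swap(next[i], next[pick(rng)]);
        }

        const std::size_t hops = 20'000'000;
        std::size_t idx = 0;
        auto t0 = std::chrono::steady_clock::now();
        for (std::size_t i = 0; i < hops; ++i) idx = next[idx];  // serial dependency chain
        auto t1 = std::chrono::steady_clock::now();

        double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / hops;
        std::cout << "approx. " << ns << " ns per hop (checksum " << idx << ")\n";
    }

Tightening memory timings and raising the mesh/uncore clock is what lowers that per-hop number on X299, which is the mechanism behind the kind of gaming gains described above.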
     
  15. Tiny_Clanger

    Tiny_Clanger Guest

    Messages:
    333
    Likes Received:
    347
    GPU:
    igpu
I don't do gaming or video rendering.
     

  16. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
Posting from your phone on Facebook and forums only? :p
     
  17. Tiny_Clanger

    Tiny_Clanger Guest

    Messages:
    333
    Likes Received:
    347
    GPU:
    igpu
    bugger off :p
     
  18. H83

    H83 Ancient Guru

    Messages:
    5,512
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
Ah, okay. My PC is mainly for gaming so I guess PCIe 4 is not that important for me, at least for now. But for those who need it, it's (very) bad that Intel is still lagging behind AMD when it comes to the adoption of PCIe 4. Many will just skip Intel because of this.
     
    tunejunky likes this.
  19. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
    We are all waiting for next gen ;)
     
  20. Geek

    Geek Guest

    Messages:
    5
    Likes Received:
    1
    GPU:
    Asus R9 290x
We don't want that either. We don't need another processor monopoly like the one Intel held for so long. What we need is what we have: some decent competition. It's good for product development, it's good for prices, it's good for the end users.

Intel have already been forced to bring their prices down, by a good margin. However, they should have done this years ago. They were simply capitalizing on their monopoly. I don't know what the real 10nm story is, but I imagine they are trying to milk every possible penny out of 14nm. Is it so they have something to sell next year?
     
