Is my i7-5820k too old for the 2080 Super/Ti?

Discussion in 'General Hardware' started by Thebrave85, Apr 22, 2020.

  1. Thebrave85

    Thebrave85 Active Member

    Messages:
    72
    Likes Received:
    2
    GPU:
    GTX 690 4GB
    In other words, will my CPU bottleneck a 2080 Super or Ti too badly if I upgrade at this point, with the CPU at stock settings? I could try to OC it as well, but I really don't want to push it too hard because it's also my work computer. I'm possibly upgrading to a 1440p/2K monitor, and I want to know what you guys think.

    Current setup:
    i7-5820k (STOCK)
    16gb ddr4 3200
    Corsair AX850 Platinum Certified
    EVGA GTX 690 4gb (NEEDS UPGRADE)
     
  2. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    Bottleneck in games depends on several factors:
    1) Game engine
    2) Resolution
    3) Graphics settings

    I think you can bottleneck your 5820K with the GTX 690 if you set the resolution to 720p and graphics settings to LOWEST.
    Boom, the 5820K now struggles to deliver 150fps+, since you've lifted all the load off the GPU.

    If you plan to play modern AAA titles at the highest graphics settings @1440p and above, the 2080 Super might easily be the bottleneck.
    However, CPU-intensive games might still cause lag spikes due to the CPU's lower clock speed.

    You can upgrade the CPU and other components after trying the 2080 Super with your current rig.
    The safest way is to actually use your 5820K with a 2080 Super and see if there are actual frame drops due to the CPU.
    Nobody can accurately tell you whether the 5820K will be enough for your use case.
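    A rough way to picture it, as a minimal sketch (every number below is made up for illustration; real engines are messier): the framerate you see is approximately the slower of what the CPU and the GPU can each deliver, and resolution mostly moves only the GPU side.

    ```python
    # Toy bottleneck model: FPS is roughly min(CPU-limited FPS, GPU-limited FPS).
    # All numbers are hypothetical, not benchmarks.

    def effective_fps(cpu_fps, gpu_fps_at_1080p, pixel_scale):
        """pixel_scale = pixels relative to 1080p (720p ~0.44x, 1440p ~1.78x, 4K = 4x)."""
        gpu_fps = gpu_fps_at_1080p / pixel_scale  # GPU cost grows with resolution
        return min(cpu_fps, gpu_fps)              # CPU cost is roughly resolution-independent

    cpu_fps = 150        # hypothetical CPU-side limit of a stock 5820K
    gpu_fps_1080p = 250  # hypothetical 2080 Super at 1080p
    for name, scale in [("720p", 0.44), ("1080p", 1.0), ("1440p", 1.78), ("4K", 4.0)]:
        print(name, round(effective_fps(cpu_fps, gpu_fps_1080p, scale), 1))
    # 720p/1080p: stuck at 150 (CPU is the wall); 1440p: ~140.4, 4K: 62.5 (GPU is the wall).
    ```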
     
  3. Undying

    Undying Ancient Guru

    Messages:
    25,473
    Likes Received:
    12,881
    GPU:
    XFX RX6800XT 16GB
    @XenthorX is the user here with a 2080 Ti and a 5820K, and I think it was fine. I was watching his SS streams, but he's at 4K. He can probably tell you more if he sees this.
     
    XenthorX likes this.
  4. Thebrave85

    Thebrave85 Active Member

    Messages:
    72
    Likes Received:
    2
    GPU:
    GTX 690 4GB
    My current CPU and GTX 690 work well together; no issues, other than that I'm starting to feel the age of my card and need more FPS in newer titles. I'm also considering upgrading to 1440p/2K, which is why I'm asking whether a 2080 Super or Ti would really be bottlenecked by my CPU. Is it basically a waste to go for a 2080 Ti with my current CPU? Should I just get a 2080 Super for 1440p? I don't need the HIGHEST graphics possible, but I definitely want high FPS with everything looking pretty good. I'm not going 4K, just 2K, so I can get smooth, high-FPS performance with most settings turned up.
     

  5. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    You will be OK according to this article:
    https://www.gpucheck.com/gpu/nvidia-geforce-rtx-2080-ti/intel-core-i7-5820k-3-30ghz/ It talks about pairing a 2080 Ti with a 5820K and says games run fine at stock speeds, so you should be fine with either a 2080 Super or a 2080 Ti.
     
  6. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Overclock to 4.2-4.6GHz and it will be OK.
     
  7. Thebrave85

    Thebrave85 Active Member

    Messages:
    72
    Likes Received:
    2
    GPU:
    GTX 690 4GB
    I overclocked my CPU a bit, to 4.4GHz at 1.29V right now, which is apparently fine for a 24/7 OC, so I should be good?
     
  8. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,750
    Likes Received:
    9,641
    GPU:
    4090@H2O
    Currently running a 5930K @ 4.5GHz, and it's pretty rad so far at 1440p. I'll still upgrade the CPU at some point because the itch is there, and if the cards/screens have the right price, I'll maybe consider jumping to 4K somewhere down the line.
     
  9. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,012
    Likes Received:
    1,532
    GPU:
    GB 3080Ti Gaming OC
    Well, I went from a 4.2GHz 5820K to a 3900X last year, and in some games the difference was night and day with the same 1080Ti (@1440p/max settings/G-Sync), in titles like BFV, SWBF2, and GTAV. In these, not only were the stutters/framedrops fixed, but the actual FPS went up. In other, less CPU-intensive games the difference was less visible, but I did feel a "smoothness" I didn't have before. Mind you, I had that 5820K build for over 5 years; the OC was kinda stable at best (I could do 4.4 too in the first 2-3 years), and the memory might have had some issues as well.

    But yeah, it really depends on the game. If it's a GPU-bound game and you go for max settings all the time, you will most likely be fine with that 4.4GHz 5820K. But honestly, if you manage to get the money for such a GPU, I'd recommend upgrading CPU+RAM+mobo sometime this year as well. The CPU market is so good these days (competition is gold); you'll get especially good deals on the AMD 3xxx series and Intel 9xxx when the new stuff comes out later this year. So if there's a good time to upgrade, it sure is now (or, even better, later this year, as said).
     
  10. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    7,270
    Likes Received:
    600
    GPU:
    Geforce RTX 3090 FE
    Similar to Solfaur, I switched from a 5820K @4.3GHz to a 9900KS (still using my 1080 GTX), and my impression so far:
    • If you're into eye candy and 60 fps is sufficient for you, then you can probably sit tight and wait for Ryzen 4000.
    • If you're less into eye candy and more into higher FPS (120+) and higher Hz, then yes, you're getting bottlenecked.
    Short term, look into squeezing at least 4GHz out of it; 4GHz should be fairly easy. Beyond that, results on 5820Ks vary wildly. I've seen 4.8GHz 24/7 users, and I've seen people who can't even get past 4.2, ... It's the silicon lottery, I'm afraid.

    I get where you're coming from, though: a lot of newer games actually utilize 6 cores, which keeps the 5820K quite viable, but at the same time anything pre-Skylake seems to lack the IPC these days to reach really high framerates.
     
    Last edited: Apr 22, 2020
    Solfaur likes this.

  11. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Compared to the newest processor and memory on the market, you will lose between 20-35% in FPS, depending on the title, game settings, and resolution. While resolution is the big one for FPS, shadows are often what pound and expose a weaker CPU, along with game AI.
     
    Solfaur likes this.
  12. Thebrave85

    Thebrave85 Active Member

    Messages:
    72
    Likes Received:
    2
    GPU:
    GTX 690 4GB
    So maybe I should do what I originally planned: stick with 1080p for now and just get a 2070 Super. Would that make more sense?
     
  13. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    It all depends on your timeline for upgrades. If the GPU is supposed to last you through another processor upgrade, then you'd be silly to settle for a 2070, IMO. Get the best you can afford now if you'll be able to carry it into a future upgrade.
     
  14. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,012
    Likes Received:
    1,532
    GPU:
    GB 3080Ti Gaming OC
    Ideally you should wait until autumn and get a next-gen card. The RTX 2000 cards are soon to be "obsolete", so buying a new 2080Ti right now (the card is almost 2 years old) would be like throwing money away when, less than half a year from now, you'll get it much, much cheaper. However, if you can get a really good deal on a used card now, then sure.
     
    GarrettL likes this.
  15. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Yeah.

    Maybe don't go completely overkill and just get a 2080 Super. It's still a great card and doesn't cost over 800€/$.
     

  16. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,057
    Likes Received:
    3,435
    GPU:
    MSI 4090 Suprim X
    The 5820K is perfectly fine for a 2080-ti; the higher the resolution, the better, of course. I've been streaming/playing games at 4K on a single PC powered only by the 5820K and a 2080-ti since the card released. I would also argue that the 2080-ti really is a 4K gaming card and is largely underused at lower resolutions.


    Let's do the theory math here with highly conservative numbers and find the worst-case scenario: a 4.5GHz 5820K vs a 5GHz 8700K (both 6c/12t):
    • Frequency: (5.0-4.5)/4.5 -> 11.1% more computation per second.
    • IPC improvements: (Intel's claims in this department are pure marketing statements) let's add 5% to be conservative.

    CyberFunk 2077 is an incredible imaginary game; unfortunately there are a lot of objects on screen, and the game uses an older API like DX11 or OpenGL 4.6, so for each object on screen the CPU has to send a draw command to the GPU in a single-threaded manner.
    At 4K your 5820K is your bottleneck, because your imaginary GPU can eat anything up to 18K before starting to worry, and your computer takes 16ms to compute the game thread. Meanwhile, your best buddy Bobby has a more recent 8700K overclocked to 5GHz, which takes 16.1% less time (11.1% + 5%) to compute each frame, so Bobby computes the frame in 13.424ms.

    Bobby plays at ~74.5FPS, and you play at 62.5FPS.
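    If you want to check that arithmetic, here it is as a few lines of Python; the 16ms game-thread time and the additive 11.1% + 5% speedup are just the assumptions from above, nothing more:

    ```python
    # Worst-case comparison: 4.5GHz 5820K vs 5.0GHz 8700K, both 6c/12t.
    base_ms = 16.0                      # assumed game-thread time on the 5820K
    freq_gain = (5.0 - 4.5) / 4.5       # +11.1% computations per second
    ipc_gain = 0.05                     # generous +5% IPC, per the assumption above

    speedup = freq_gain + ipc_gain      # added together: 16.1% less frame time
    bobby_ms = base_ms * (1 - speedup)  # 13.42ms

    print(f"You:   {1000 / base_ms:.1f} FPS")   # 62.5 FPS
    print(f"Bobby: {1000 / bobby_ms:.1f} FPS")  # 74.5 FPS
    ```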


    Now, realistically, the retail 8700K has a 4.7GHz max turbo frequency, and I would argue that if there's any IPC improvement, we're talking 1% tops. Moreover, higher frequency does not provide a 1:1 performance improvement, not at all:

    If you're CPU-bottlenecked at 60FPS (not a single game on the market today does that; the hardest cases are dense open-world games with a lot of simulation happening), then Bobby gets (4.7-4.5)/4.5 -> 4.4% more computations per second, which scales to only ~2.2% more actual performance for the given program; add a 1% IPC improvement and that's ~62FPS.
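    The realistic case as the same kind of sketch, using only the assumptions just stated (4.7GHz turbo, half of the frequency gain showing up as performance, 1% IPC):

    ```python
    # Realistic comparison: retail 8700K at 4.7GHz turbo vs 5820K at 4.5GHz.
    base_fps = 60.0                # hypothetical CPU-bound starting point
    freq_gain = (4.7 - 4.5) / 4.5  # +4.4% computations per second
    freq_scaling = 0.5             # assume only half of that shows up as FPS
    ipc_gain = 0.01                # ~1% IPC at best

    bobby_fps = base_fps * (1 + freq_gain * freq_scaling + ipc_gain)
    print(f"{bobby_fps:.1f} FPS")  # ~61.9 FPS -- barely above 60
    ```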


    Of course, framerate doesn't scale linearly with computation time: the higher the framerate, the more noticeable the bottleneck becomes, and as far as modern games go, this bottleneck starts to be really noticeable if you're looking for 120/144 frames per second.
    Meanwhile, it's entirely possible that modern APIs like Vulkan and DX12 will make this bottleneck less and less of an issue.

    Edit: As far as bottlenecks go, since you already own a 5820K you can do some testing yourself to find your CPU bottleneck: pick a reference game (I just did it with BF5, for instance) and set it to a lower resolution so that a clear bottleneck appears:

    [Screenshots: BF5 benchmark overlays showing the CPU-bound frametimes quoted below]

    Clearly, with my 5820K at 4.539GHz, I'm CPU-bottlenecked around 7ms | 143FPS on a 64-player Battlefield V map, while at 1440p the 2080-ti could in theory reach 4.24ms | 235FPS, and at 1080p 3.34ms | 321FPS.

    Now take this video of a 2080-ti + 9900K at 5.1GHz:

    From the benchmark in the video, you can see that the average works out to around 6.41ms | 156FPS.

    Meaning it's safe to say that at 1440p, every CPU on the market bottlenecks the 2080-ti (in BF5).
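    (For anyone following along with the numbers: frametime and FPS are simply reciprocals of each other, which is where all the "X ms | Y FPS" pairs in this post come from.)

    ```python
    # Frametime <-> FPS conversions behind the "X ms | Y FPS" pairs above.
    def fps(frametime_ms: float) -> float:
        return 1000.0 / frametime_ms

    def frametime(fps_value: float) -> float:
        return 1000.0 / fps_value

    print(f"{fps(7.0):.0f} FPS")   # ~143: the measured 5820K CPU limit in BF5
    print(f"{fps(6.41):.0f} FPS")  # ~156: the 9900K average from the video
    print(f"{fps(4.24):.0f} FPS")  # ~236: the theoretical 1440p ceiling of the 2080-ti
    ```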


    Edit2: One interesting thing to do is to reverse the process to figure out a more realistic scaling of performance with frequency on the latest Intel chips. Earlier, in the realistic approach, I assumed a 1% higher frequency provides 0.5% better performance. Sadly, the 9900K has 2 more cores, so our results will be biased.

    5820K 4.5GHz: 7.1ms
    9900K 5.1GHz: 6.41ms

    The 9900K computes only 9.7% faster, despite Intel's IPC-improvement claims and a frequency increase of 13.3%.

    IPC improvement of 1%: 1% higher frequency provides 0.65% better performance
    IPC improvement of 2%: 1% higher frequency provides 0.58% better performance
    IPC improvement of 3%: 1% higher frequency provides 0.50% better performance
    ...
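    Spelled out in code, with the two measured frametimes as inputs and the assumed IPC gain as the variable (and, as said, the 9900K's two extra cores bias this):

    ```python
    # Reverse-engineering how performance scales with frequency, from the data above.
    t_5820k, t_9900k = 7.1, 6.41               # measured frametimes in ms
    perf_gain = (t_5820k - t_9900k) / t_5820k  # 9.7% faster overall
    freq_gain = (5.1 - 4.5) / 4.5              # 13.3% higher clock

    for ipc_gain in (0.01, 0.02, 0.03):
        # Subtract the assumed IPC share; attribute the remainder to frequency.
        ratio = (perf_gain - ipc_gain) / freq_gain
        print(f"IPC +{ipc_gain:.0%}: 1% higher frequency -> {ratio:.2f}% better performance")
    ```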


    Edit3: My 5820K is fairly close to its best possible overclock; I'll refer to my CPU-Z validation as a best-case scenario. Also, take this benchmark with a grain of salt: as far as stability is concerned, you're gonna have to take my word for it.
    This computer is also my workstation; I do game development, lengthy code compilation, video encoding from time to time, and the occasional 7h+ stream using only this computer's CPU and the 2080-ti, no capture card.

    https://valid.x86.fr/mib9a5
     
    Last edited: Apr 23, 2020
    -Tj-, GarrettL, Undying and 1 other person like this.
  17. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Compared to the newest processor and memory on the market, you will lose between 20-35% in FPS, depending on the title, game settings, and resolution. While resolution is the big one for FPS, shadows are often what pound and expose a weaker CPU, along with game AI.
     
    XenthorX likes this.
  18. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,057
    Likes Received:
    3,435
    GPU:
    MSI 4090 Suprim X
    Actually, shadows are mostly a GPU issue. In games, the GPU renders textures from the light's point of view called "shadow maps"; those maps are then stamped onto the game world and projected into the player's point of view. The CPU isn't involved.

    The CPU workload is all about real-time physics/collisions/destruction..., AI simulation, gameplay code (you fire a weapon in direction X, etc...), managing player inputs, and so on.

    I think it's all about having the right tool for the job, as always; your CPU defines the maximum level of performance you can achieve. If you want the highest performance in competitive 1440p games then, as shown above, every CPU on the market bottlenecks the 2080-ti at 1440p, so you're gonna have to buy the best CPU out there.
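    To make the shadow-map idea concrete, here is a tiny CPU-side sketch of the two passes. On real hardware both passes run on the GPU, and points here are given directly in light space (x, y, depth in [0,1]) to keep the example self-contained; a real renderer would get there via the light's view-projection matrix:

    ```python
    # Minimal shadow-map sketch: pass 1 records depth from the light's point of
    # view, pass 2 compares each visible point against that recorded depth.

    def build_shadow_map(occluders, size=8):
        """Pass 1: keep the nearest depth to the light per texel (1.0 = far plane)."""
        shadow_map = [[1.0] * size for _ in range(size)]
        for x, y, depth in occluders:
            px, py = int(x * (size - 1)), int(y * (size - 1))
            shadow_map[py][px] = min(shadow_map[py][px], depth)
        return shadow_map

    def in_shadow(shadow_map, point, bias=0.002):
        """Pass 2: a point is shadowed if something sat closer to the light."""
        x, y, depth = point
        size = len(shadow_map)
        px, py = int(x * (size - 1)), int(y * (size - 1))
        return depth - bias > shadow_map[py][px]

    wall = (0.5, 0.5, 0.3)                   # an occluder near the light
    smap = build_shadow_map([wall])
    print(in_shadow(smap, (0.5, 0.5, 0.7)))  # True: behind the wall
    print(in_shadow(smap, (0.1, 0.1, 0.7)))  # False: the light reaches it
    ```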
     
  19. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    I'm still on a 4770K at 4.6GHz, cache at 4.2 (had 4.5, no difference), with enhanced 2400MHz memory (40.5GB/s efficiency in mem-z), and I don't really see any CPU bottleneck, even when I try to create one.

    Only GTA5 Online is more CPU-bound (excluding older, pre-2012 titles); I guess they used some AVX stuff and it somehow isn't efficient enough on my CPU, or I don't know (downtown there are now parts where it's stuck at ~70-90fps with the GPU at 70-80% and the CPU at 40-70% per thread). It started a while ago and has been like that ever since. I think DX12 would help a lot here.

    That said, if I had the money now I would definitely get a 2080Ti and DSR the crap out of it :D

    Later I'd get a new system with DDR5; I saw it can do up to 8.4GHz effective. 6.4GHz would be OK for starters, with an 8-core/16-thread CPU... this should be turbo overkill for the next 5-7 years :)
     
    XenthorX and fantaskarsef like this.
