2500K or 2600K for gaming?

Discussion in 'Processors and motherboards Intel' started by Infinity, Jul 29, 2011.


Which is better for gaming?

  1. Core i7 2600K

    34 vote(s)
    31.8%
  2. Core i5 2500K

    73 vote(s)
    68.2%
  1. BlackZero

    BlackZero Guest

    All I see is you trying to justify why someone who can afford it shouldn't buy a 'better' CPU... as for caches: threads compete for cache, so the more cache you have, the more efficient the operation... is that basic enough?

    I don't see the need to go into much detail, but what's so difficult about percentages that people must use hypothetical numbers like 10 fps?

    lol, most games don't even use 4 cores, so why buy a 2500K, why not a dual core?

    I think this discussion has run its course :)

    PS: Go check the Folding@Home section; the 2600K scores more than 30% over the 2500K on average. Is that theory as well?
     
    Last edited by a moderator: Aug 2, 2011
  2. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,628
    Likes Received:
    1,119
    GPU:
    4090 FE H20
    I think i5 users and AMD users are against the 2600K, as another poster said. :3eyes:

    On another note, I just ordered a 2600K an hour ago :nerd:
     
  3. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    The latter.

    HT only provides an actual benefit to a subset of tasks, and I've yet to see any game qualify. Even then, that extra throughput comes at the cost of reduced per-core performance, since the core's resources are shared.

    If one wanted to determine whether HT actually has an impact on performance, one would have to run the same set of game benchmarks with HT on and off and compare the FPS, not merely assume there's a benefit because all virtual cores show load.
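
    Something like this minimal sketch would do for that comparison, assuming you've already logged an average FPS per title for the same benchmark pass with HT on and off (all numbers below are purely illustrative):

        # Compare the same benchmark suite run with HT on vs. off.
        # The FPS figures here are made up for illustration only.
        fps_ht_off = {"game_a": 112.0, "game_b": 87.5, "game_c": 140.2}
        fps_ht_on  = {"game_a": 110.5, "game_b": 88.0, "game_c": 138.9}

        for game in fps_ht_off:
            delta = (fps_ht_on[game] - fps_ht_off[game]) / fps_ht_off[game] * 100
            print(f"{game}: {delta:+.1f}% with HT on")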

    In addition, I don't know how load on a virtual core is calculated, since a physical core is always under load as soon as either of its two virtual cores is.
    That's only true if you don't use any applications that make use of it.

    For me personally the 2600K was definitely worth it, though that likely isn't true for the general consumer. HT does have its uses, gaming just isn't one of them.
     
    Last edited: Aug 2, 2011
  4. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    13
    GPU:
    GIGABYTE Radeon R9 280
    All I've been saying is that the price difference for the 2600K isn't worth it when the performance gain is so application-dependent. Personally I can't really comment on Bulldozer's performance: on paper it looks great, but I suspect there might not be real-world gains from the architecture. I know it'll work better than SMT, but whether it will work better than Sandy Bridge? I really don't know. I'm hoping it does, but it wouldn't surprise me if all it does is match it, since Bulldozer is essentially a quad core with 2 dedicated schedulers and integer units.

    We're talking about gaming; that has been the whole issue here. "2500K or 2600K for gaming?" is the topic, which is why I've been adamant about the framerates while you've been either saying "Intel's engineers are smarter than you" or throwing 30% around when you won't see that across the board. Bringing in Folding@Home doesn't really help your argument, as it's not a game either... and if you're the OP and maybe strapped for cash, which would you rather take, $100 off or 30% faster folding? Folding can't justify the cost tbh, because it's not productive to the owner of the 2600K, only to the F@H project. Your GPU will fold far better than that extra 30% anyway. You can always spend that $100 on a better GPU rather than on the 2600K.

    Lastly, like I've been saying this whole time, the 30% increase is application-dependent; you won't always see that 30%, and most of the time you'll see no performance increase, or sometimes even a decrease. It can't justify the cost unless the OP is using applications that are known to take full advantage of HT.

    deltatux
     

  5. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    lol, I'm a 2600K owner who isn't really backing it up because of HT. Tbh, I don't really back it up at all; for most users, a 2500K will perform more or less the same.

    The only apps where I've really gained performance with HT are 3DMark Vantage and 3DMark 11. In Vantage I gained about 50% on the CPU score. If you calculate backwards, that means each logical core is roughly 25-30% slower while HT is running, but overall the CPU is faster. But these are benchmarks, so it's really a bit of a useless gain for a gamer like me. In Handbrake there is very little to gain from HT even when all cores are stressed (maybe a 10-20 fps improvement in encoding rate when you're already encoding at 300-400 fps), so I'd say there's no real benefit.
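
    A rough back-of-the-envelope version of that calculation, assuming a flat 50% overall gain going from 4 physical to 8 logical cores (the numbers are illustrative):

        # How much slower each logical core runs with HT on, given a ~50%
        # overall CPU-score gain when going from 4 physical to 8 logical cores.
        score_ht_off = 1.0                # normalised CPU score, HT off (4 cores)
        score_ht_on = 1.5                 # ~50% higher score, HT on (8 logical cores)

        per_core_off = score_ht_off / 4   # throughput per physical core
        per_core_on = score_ht_on / 8     # throughput per logical core

        slowdown = 1 - per_core_on / per_core_off
        print(f"each logical core is ~{slowdown:.0%} slower with HT on")  # ~25%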

    But with regard to the games showing a performance loss from HT, that only really occurred with the first-gen i7s, because with those, if you switched on HT it stayed in permanent HT mode. With my second-gen i7, if my PC use doesn't demand more than 4 cores, it just behaves as if HT were switched off and runs 4 full cores instead of 4 hyperthreaded cores. I can't really explain it too well, this is just from my observations. Long story short, anything that runs on fewer than 4 cores no longer sees a notable performance loss from leaving HT on (there was a thread about this with actual solid framerates recorded).

    And as for the other argument about cache, the extra 2MB doesn't really give the i7 an edge in much of anything. From my experimenting, if you turn HT on and run the i7 at the same clock speed as the i5, you get more or less the same performance; it's nothing game-breaking. Delta understands more about cache and SMT than I do, so I'll leave all the complex stuff to him.

    But more or less, at the moment there is little to gain from HT in gaming. Not enough games use the Frostbite engine for that to be relevant to a lot of gamers.
     
  6. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    The reason I use World of Warcraft as a reference is that it can actually run 8 threads. This is an observation I've made: per Blizzard's devs and my own experience, if WoW is permitted to run 8 threads, performance is actually worse than running only 4 threads on the 4 physical cores, roughly a 20% difference (100 fps with 8 threads vs 120 fps with 4 threads on 4 physical cores). I have an attribute set in WoW's config file to force it onto the 4 physical cores, which reduces the performance loss from having HT enabled. Blizz decided with a previous patch that WoW should be able to make use of all available cores (including HT), so the more cores available, the more threads it can and will run. If HT is enabled on an i7 2600K, it will incur the same performance penalty from HT as my i7 870 does, for the same reason.
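
    For reference, a minimal sketch of one generic way to pin a process to the physical cores only; the even/odd logical-CPU mapping and the "Wow.exe" process name are assumptions, and this is an alternative to whatever attribute WoW's own config file actually uses:

        import psutil  # third-party: pip install psutil

        # Assumption: on this HT quad core, logical CPUs 0, 2, 4 and 6 are the
        # first thread of each physical core (the mapping varies by OS/BIOS).
        physical_cores = [0, 2, 4, 6]
        mask = sum(1 << cpu for cpu in physical_cores)
        print(f"affinity mask: {mask:#04x} ({mask})")  # 0x55 (85)

        # Pin a running game process to those cores; "Wow.exe" is illustrative.
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] == "Wow.exe":
                proc.cpu_affinity(physical_cores)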
     
  7. BlackZero

    BlackZero Guest

    Interesting quote from hardocp:

    "We have kept the first Lost Planet benchmark around simply because it is one of the best scaling gaming benchmarks in terms of threading. It does a tremendous job up to the 8-core mark and even beyond.

    The one big thing that sticks out to me here is how the stock 2600K comes in almost neck-in-neck with the overclocked 2500K. "

    http://www.hardocp.com/article/2011/01/03/intel_sandy_bridge_2600k_2500k_processors_review/4



    And I think a poster mentioned Handbrake; according to Hilbert, the 2600K performs exactly 32.6% faster than a 2500K.

    http://www.guru3d.com/article/core-i5-2500k-and-core-i7-2600k-review/15


    @deltatux

    Seeing how incoherent some of the arguments have gotten, especially considering that I've already answered almost everything regarding what I think about gaming on a 2600K and completely separated it from my views on applications, I don't see why I need to go around in circles when people can simply read what was said. Also, FYI, it would take 2-3 GTX 580s to score as highly as a 2600K running big WUs, not to mention the enormous increase in power usage, and I don't see where I related F@H to gaming; it was a reply to another poster's comments and was clearly taken out of context.
     
    Last edited by a moderator: Aug 3, 2011
  8. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    If you look carefully at the benchmark for Lost Planet in reference to the comment you quoted... it's benched at 640x480. Sure, it's a valid argument that the i7 2600K is faster at 640x480, but given that the OP has a GTX 580, I highly doubt anyone is silly enough to buy a GTX 580 to game at 640x480, so I don't think the OP cares which is faster at 640x480.

    http://www.anandtech.com/show/4083/...core-i7-2600k-i5-2500k-core-i3-2100-tested/20

    This would be a more useful comparison, since it's at a resolution closer to what the OP is probably gaming at. Performance is similar.

    I guess Handbrake depends on what kind of file you're converting, but for me, I haven't noticed any benefit in encoding from leaving HT on, so I don't see much of a benefit in choosing a 2600K over a 2500K.
     
  9. BlackZero

    BlackZero Guest

    Before pulling out the old "640x480 is too low a resolution" argument, perhaps it would be wise to consider that buying a processor is a 2-3 year investment for most people. Comparing one processor to another at a lower resolution not only demonstrates the actual difference between the two without the graphics card interfering, but also, and more importantly, indicates the differences that can be expected with newer, more powerful graphics cards which are yet to be released. After all, people upgrade graphics cards a lot more often than their CPU.
     
  10. Xtreme1979

    Xtreme1979 Guest

    Am I the only one who figures that if CPU A performs better than CPU B at a CPU-bound low resolution (that's why you bench CPUs at low res, to take the GPU out of the equation), it will continue to perform better than CPU B as games mature and become more demanding of the CPU, regardless of resolution? It's not rocket science! I am tired of people throwing out CPU benchmarks because "duh, no one games at that resolution." It's not about current titles, it's about the future of gaming moving forward, and which processor will be faster when needed. /Rant OFF

    P.S. Well said BlackZero your post was quicker than mine. :)
     
    Last edited by a moderator: Aug 3, 2011

  11. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    This argument is pointless. If you want a 2500K, buy one. If you want a 2600K, buy one. I happen to own both. Tbh the 2600K is only better for me when benching. What I am seeing here is 2600K bashing by 2500K owners. Wannabe system-builder advisors, etc. Enjoy your chip, whichever one you have, and remember: buy INTEL :thumbup:
     
    Last edited: Aug 3, 2011
  12. Xtreme1979

    Xtreme1979 Guest

    Oh boy! You've gone and done it now. Hears distant AMD battle cries. :behead:
     
  13. BlackZero

    BlackZero Guest


    Lol, yeah, that sums it up nicely :grin:
     
  14. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    No need for Intel owner infighting, BZ, remember who the enemy is...
     
  15. BlackZero

    BlackZero Guest

    I hear ya loud and clear, TK
     

  16. mmicrosysm

    mmicrosysm Guest

    Messages:
    743
    Likes Received:
    0
    GPU:
    Cirrus Logic GD5430 1Meg
    Cyrix?

    hehe
     
  17. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    I thought it was VIA, the second-best CPU maker in the world.
     
  18. ZiiiP

    ZiiiP Member Guru

    Messages:
    153
    Likes Received:
    0
    GPU:
    Gigabyte R9 380 4GB
    Hey guys, I got 15153 points in the Vantage CPU test for the 2500K at stock speed. Isn't this a little low? The Guru3D review says it can do 17000.

    Btw, the preset is Performance.
     
  19. The_Fool

    The_Fool Maha Guru

    Messages:
    1,015
    Likes Received:
    0
    GPU:
    2xGIGABYTE Windforce 7950
    I wish I could say the same for my Phenom II :p Hopefully Bulldozer delivers the performance and satisfies the expectations that Sandy Bridge set.
     
    Last edited: Aug 20, 2011
  20. flow

    flow Maha Guru

    Messages:
    1,023
    Likes Received:
    17
    GPU:
    Asus TUF RTX3080 oc
    At the end of the day we are left with opinions and impressions. I can tell you from experience: the higher-end CPU is the better one, period.
    When I bought the E8500, it needed a certain voltage for a certain overclock; at the time the cheaper E8400 could run the same speeds, thus being just as fast but cheaper.
    The thing people don't talk about is that those chips needed more voltage for stability. This is normal, since Intel would have sold such a chip at a higher rating if it were stable there. I found the more expensive E8600 overclocked better, with less voltage.
    So now we have the 2500K and 2600K, and with some exceptions I'm already seeing many 2500K users who need much more vcore to get the same overclock I need. Furthermore, there are 6- and 8-core chips in the pipeline from Intel.
    You can bet games and apps will start utilizing HT and more than 4 cores from next year on.
    So while you could be satisfied today with a 2500K, and it would certainly be the better bang for the buck, next year could paint a different picture, especially with the AMD chips standing at the door.
    HT is going to be utilized; denying this is shortsighted.

    So as to the thread's question: for a low budget, the 2500K wins hands down.
    Long term it's the 2600K, which will stand in the shadow of the IB and SB-E chips.
    How far in the shadow remains to be seen; time will tell, but I'm guessing they will still do fine for gaming in general for the next three years.
     
