Gaming benchmarks: Core i5 2500 vs Core i5 6500 vs Core i3 6320

Discussion in 'Benchmark Mayhem' started by Artas1984, Aug 12, 2017.

  1. Artas1984

    Artas1984 Active Member

    GTX 780
    Since this is the benchmark section, I will post some CPU battle results, no matter that it falls under the video card category — this seems to be the only subforum with "benchmarks" in its title.

    I had the chance to test the Skylake Core i3 6320 and Core i5 6500 chips against the legendary Sandy Bridge Core i5 2500.

    Open Core i5 2500 vs Core i5 6500 vs Core i3 6320 gaming benchmarks to see the results.

    For those who have a Sandy Bridge chip like the Core i5 2400/2500 on a basic motherboard, this is good news (I guess). If you have a Z motherboard with a Core i5 2500K overclocked to 4.5 GHz, it will beat the Core i5 6500 all over the place.

    I found the Core i3 6320 to be a great chip: it has enough frequency to pull games along and can be faster than the Core i5 6500 in games that use only two threads. The Core i5 6500 I found a bit disappointing; the fact that it can lose to the Core i5 2500 in more than a few games does not justify it as a gaming CPU.
  2. RealNC

    RealNC Ancient Guru

    EVGA GTX 980 Ti FTW
    Most of the results do seem to make sense, since these games aren't known to be memory-bandwidth bottlenecked. For example, if you bench Fallout 4 or Watch Dogs 2, you should see the 2500K falling behind.

    But some of the results do not make sense. The Metro Last Light Redux and Rainbow Six Siege results look weird. Why would the 2500K be faster than the 6500 here?

    Another thing: as we move towards 1440p as "the new 1080p", it might make more sense to benchmark at that resolution. This is of special interest to Sandy Bridge users, since 1440p means a GPU bottleneck in most cases, leaving the CPU at lower loads (meaning less reason to upgrade from Sandy). In other words, if you need to choose between a new platform and a new, large, shiny 1440p monitor, the monitor might make more sense (if you have the GPU for it, or plan to get one).
    Last edited: Aug 12, 2017
