AMD FX-8150 vs Core i7 2600K

Discussion in 'Frontpage news' started by Xzibit, Aug 26, 2011.

  1. Xzibit

    Xzibit Banned

    Messages:
    4,382
    Likes Received:
    0
    GPU:
    7970 Windforce
  2. Stukov

    Stukov Ancient Guru

    Messages:
    4,899
    Likes Received:
    0
    GPU:
    6970/4870X2 (both dead)
    Very interesting, showing it trading blows with the 2600K; this may be actual legit testing.
     
  3. scoter man1

    scoter man1 Ancient Guru

    Messages:
    4,930
    Likes Received:
    217
    GPU:
    MSI GTX 1070ti
    I'm wondering if it's got embedded GPU cores....
     
  4. PhazeDelta1

    PhazeDelta1 Guest

    Messages:
    15,608
    Likes Received:
    14
    GPU:
    EVGA 1080 FTW
    Chiphell has been known to post FUD in the past, so take everything they put out with a grain of salt. But if that is true and that AMD chip is running wPrime 32M on all 8 cores, then it is a total FAIL.

    Here's my score from my other rig running wPrime 32M at the same clocks with all 6 cores.
    [IMG]

    I really hope they ran wPrime on just the default 4-core settings.


    As for the Intel, I have very little experience with them, so your guess is as good as mine.

    edit:

    What 3DMark is that, Vantage? And are they using a discrete GPU, or does that particular CPU have an IGP?
     
    Last edited: Aug 26, 2011

  5. Xzibit

    Xzibit Banned

    Messages:
    4,382
    Likes Received:
    0
    GPU:
    7970 Windforce
    It's Vantage, but for the Core i7 2600 they used the updated version (1.10) - you can tell because Aero works (the VC2010 interface). 1.10 disables GPU acceleration (that was in the release notes for the 1.10 version), while for the AMD FX an outdated version of 3DMark Vantage, 1.02 or 1.01, was used...
    The first CPU test result (CPU Test 1 = ~2500) is very low, but the 2nd test and 150 OPS.... OMG
    Very interesting. :nerd:
     
    Last edited: Aug 26, 2011
  6. PhazeDelta1

    PhazeDelta1 Guest

    Messages:
    15,608
    Likes Received:
    14
    GPU:
    EVGA 1080 FTW
    I clicked that link you posted. All the tests were done using 8 cores on that AMD chip. Pitiful. I REALLY hope that this is fake :wanker:
     
  7. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    Hmm, from what I can see, the AMD setup is using a different GPU (as evidenced by the different scores). The second CPU test is the PhysX test IIRC, so if the AMD setup is using an Nvidia GPU, it could skew the score a fair bit.

    It would be a lot more useful if the scores were submitted to 3DMark and we were linked to that page instead. The test wasn't run using any of the default presets (hence no E, P, H, X scores), so we also have no idea about possible differences in settings between the setups (with regard to which GPU is being used, the resolution for the GPU test, and whether or not PhysX is enabled). The 3DMark page would tell us everything about the tests being run and the system setup.

    I want to see a better benchmark so I can actually tell whether it's worth building my brother a Bulldozer system, or giving him my system instead and building myself an X79 setup later.
     
  8. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    13
    GPU:
    GIGABYTE Radeon R9 280
    Something's not right here: the CPU-Z detection on the 8130P states that the L3 cache is 64-way, but according to the latest slides the L3 cache should be 16-way associative.

    [IMG]

    If this is to be believed, the CPU-Z detection may be wrong, and the same goes for its max TDP. I highly doubt it'll be 223 W.

    Source: http://www.computerbase.de/bildstrecke/35954/7/
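
    For what it's worth, here's a rough sketch (my own illustration, not CPU-Z's actual code) of how you could read the same L3 fields straight from CPUID leaf 0x80000006, assuming a GCC/Clang toolchain on an AMD x86 chip; the bit layout is per AMD's CPUID documentation. In AMD's encoding an associativity code of 0x8 means 16-way and 0xC means 64-way, so dumping it directly would show whether the 64-way reading comes from the chip itself or from CPU-Z's interpretation:

    /* Rough sketch: query AMD's extended cache info (CPUID leaf 0x80000006)
       and print the raw L3 size/associativity fields.
       Assumes GCC or Clang on an x86 AMD CPU. */
    #include <stdio.h>
    #include <cpuid.h>

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;

        if (!__get_cpuid(0x80000006, &eax, &ebx, &ecx, &edx)) {
            puts("CPUID leaf 0x80000006 not supported");
            return 1;
        }

        /* EDX[31:18] = L3 size in 512 KB units, EDX[15:12] = associativity code */
        unsigned int l3_kb      = ((edx >> 18) & 0x3FFF) * 512;
        unsigned int assoc_code = (edx >> 12) & 0xF;

        /* AMD encoding: 0x8 = 16-way, 0xA = 32-way, 0xC = 64-way, 0xF = fully assoc. */
        printf("L3: %u KB, associativity code 0x%X\n", l3_kb, assoc_code);
        return 0;
    }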

    deltatux
     
  9. thatguy91

    thatguy91 Guest

    It's also an engineering sample, and the processor name says FX-8130 whereas the specs report it as an FX-8150P!

    Bulldozer will be a great performer, I have no doubt about that. Interestingly enough, it won't be long after Bulldozer's release before people are talking about the next version of Bulldozer, a revised edition that runs on a different socket (Socket FM2). The first release of Bulldozer does not support PCI-E 3.0 while the Radeon HD 7000s do, so it's something they'll want fairly soon. No doubt the PCI-E controller will become integrated with the 2nd-gen Bulldozer. Not long after that, the 'in' topic will probably be the third-gen Bulldozer, which is meant to be heavily updated and still run on Socket FM2. The 3rd gen is also likely to be where DDR4 support comes in...

    This is pretty much the best knowledge of what the future holds that's out there at the moment. It looks like AMD don't want to slip again like they did after the original Athlons/Athlon 64 etc.

    AMD also have an advantage if their Radeon 7000 series is as good as it sounds, especially since it is likely to perform better on Bulldozer than on Intel.
     
  10. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    Ignoring everything else you said for a moment, why exactly would an AMD 7xxx GPU work better on an AMD system rather than on an Intel one?
     

  11. alanm

    alanm Ancient Guru

    Messages:
    12,269
    Likes Received:
    4,472
    GPU:
    RTX 4080
    The Vantage scores were a pathetic comparison. What were the system specs for each test? What GPUs were used? As mentioned, if it was Nvidia - which it looks like it was, with PhysX enabled - it would give such a lopsided CPU score, which automatically puts everything else into question. Just pathetic. Wait till the damn thing comes out. Looking at how tight AMD has been with giving out pre-release performance figures to at least instill some faint hope in those waiting, other than the completely useless Dirt 3 multi-monitor examples, I really don't expect to be impressed by BD. Hope I'm wrong.
     
  12. IcE

    IcE Don Snow

    Messages:
    10,693
    Likes Received:
    79
    GPU:
    3070Ti FE
    Garbage, fake, etc.
     
  13. The Chubu

    The Chubu Ancient Guru

    Messages:
    2,537
    Likes Received:
    0
    GPU:
    MSi GTX560 TwinFrozrII OC
    This. Those benchmarks are ETC.
     
  14. pimp_gimp

    pimp_gimp Ancient Guru

    Messages:
    6,702
    Likes Received:
    98
    GPU:
    RTX 2080 Super SLI
    I don't believe that, and won't believe it until Bulldozer is released.
     
  15. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    Yea..

    And it's still an engineering sample, so those results aren't very legit anyway..
     

  16. thatguy91

    thatguy91 Guest

    Potentially, but only if some of the new instructions in the Bulldozer-only instruction sets XOP, FMA4, and CVT16 prove useful. Intel does not support these and has no plans to. Future Intel CPUs will include FMA3, an instruction set very similar to FMA4 but not as powerful and not instruction-compatible with it (there's a rough sketch of the difference at the end of this post).

    Bulldozer supports all the instruction sets the i7-2xxx etc. support, including AVX and SSSE3.

    So if there are instructions in XOP, FMA4, or CVT16 that prove useful in the drivers, one would think they will be supported, seeing that both Bulldozer and the Radeons are AMD.

    Of course, this alone does not mean they'll perform better on Bulldozer in comparison; I'm just making the point that it's a possibility. Even if it's not true now, it probably will be with the 2nd-gen Bulldozer with the integrated PCI-E 3.0 controller etc...

    All speculation of course, but it looks like the 2nd-gen (especially) and 3rd-gen Bulldozers, each supposedly a year apart from the first, are going to be just as interesting.
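
    To make the FMA4 vs FMA3 point concrete, here's my own sketch (not from any of the leaks; FMA4 needs -mfma4 on GCC/Clang, FMA3 needs -mfma and Intel hardware that isn't out yet): both fuse a multiply and an add, a*b + c, into one instruction, but FMA4 is a four-operand, non-destructive form while an FMA3 instruction has to overwrite one of its source registers.

    #include <immintrin.h>   /* AVX + FMA3 intrinsics */
    #include <x86intrin.h>   /* pulls in the FMA4 intrinsics on GCC/Clang */

    /* Bulldozer-style FMA4: four-operand vfmaddps, all inputs preserved */
    __m256 madd_fma4(__m256 a, __m256 b, __m256 c)
    {
        return _mm256_macc_ps(a, b, c);    /* a*b + c */
    }

    /* Intel's planned FMA3: same maths, but the instruction reuses one source register as the destination */
    __m256 madd_fma3(__m256 a, __m256 b, __m256 c)
    {
        return _mm256_fmadd_ps(a, b, c);   /* a*b + c */
    }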
     
  17. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,793
    Likes Received:
    1,396
    GPU:
    黃仁勳 stole my 4090
    An older 4-core CPU being compared to a newer 8-core CPU... how thrilling. Sigh, it's a sad state of affairs when there are only two major mainstream CPU companies and one is severely behind. The only glimmer of hope left is the possibility of heavy overclocks on Bulldozer, but it seems unlikely to top SB's 5GHz on air.

    Just think, Intel got into the CPU business because the RAM business was too cutthroat. What would the world's computers be like if they hadn't entered the CPU industry?

    What would the world be like if the CPU and GPU businesses weren't severely limited as they are now? A hell of a lot more advanced, that's for sure.
     
  18. Psychlone

    Psychlone Ancient Guru

    Messages:
    3,686
    Likes Received:
    2
    GPU:
    Radeon HD5970 Engineering
    More spin, more speculation...
    God(s) I wish this would stop. You guys have either got to wait until release day, or hope that someone with verifiable credentials and actual screenshots screws the pooch and breaks their NDA.
    I'll tell you from experience, I've seen people break their NDA - and they'll never work in pre-media beta testing ever again. The chances of talking someone into showing you their prize are nil, so the only thing you all have left is to wait. Period.

    But please, stop with the conjecture. All it does is raise everyone's concerns. I cringe every time I see a new thread stating any type of BD benchmark - you're NOT going to see any legitimate benchmarks until media release... so why keep posting things like this at all, let alone in the news section??!


    Psychlone
     
  19. thatguy91

    thatguy91 Guest

    When you compare CPUs, just like with graphics cards:

    COMPARE EQUIVALENTLY PRICED MODELS, not what you think is the equivalent. Comparing a $300 CPU to a $300 CPU, for example, is completely valid. But if the FX-4100 cost $170, comparing it to a $300 CPU doesn't really make much sense, does it?!

    Now if the FX-4100 performed a little slower than an i7 2600K, it would not mean Bulldozer sucks; in fact it would suggest it's absolutely brilliant, because a $170 CPU would be performing almost as well as a $300 CPU. Imagine what the FX-8150P could do, seeing it has twice the number of cores! :)
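
    To put some toy numbers on that argument (the scores and prices below are made up purely for illustration, not leaked results), performance per dollar is just score divided by price:

    /* Toy performance-per-dollar comparison; the figures are hypothetical
       and only illustrate the point about comparing CPUs at equal price points. */
    #include <stdio.h>

    int main(void)
    {
        double fx4100_score = 90.0,  fx4100_price = 170.0;   /* hypothetical */
        double i2600k_score = 100.0, i2600k_price = 300.0;   /* hypothetical */

        printf("FX-4100 : %.3f points per dollar\n", fx4100_score / fx4100_price);  /* ~0.529 */
        printf("i7-2600K: %.3f points per dollar\n", i2600k_score / i2600k_price);  /* ~0.333 */
        return 0;
    }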
     
  20. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    13
    GPU:
    GIGABYTE Radeon R9 280
    lol, while on shift one day (I work at a McD's that's very close to AMD Markham) I was talking to an AMD engineer, asking when the Radeon HD 7000 would see the light of day. I knew I wouldn't get an answer, but at least I tried :p. All he said was "It's NDA man" and then I sadfaced :p.

    A few days later I asked another engineer a non-NDA question: do you see us getting GPUs with (real-time) ray tracing within the next 5 years? The engineer was like, yeah, the tech is here now; the problem is that the hardware requirements would probably be too high for consumers and the heat output would be insane. I then asked, what if we kept shrinking the chips? He was like, you can only shrink so much. So at least I got a short discussion out of it, which lasted about a minute lol.

    deltatux
     
    Last edited: Aug 27, 2011
