AMD A10 6800K benchmarked, tested

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 5, 2013.

  1. scoter man1

    scoter man1 Ancient Guru

    Messages:
    4,930
    Likes Received:
    217
    GPU:
    MSI GTX 1070ti
    What the heck? I just bought a 5800K like 2 weeks ago and I had no idea this was coming out. Thankfully it doesn't look like it has much of a leg up.
     
  2. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    The point is - AMD can't.
    AMD selling its silicon for only a fraction of what Intel gets is not by choice.
    Think about that when you hear that Richland is competing with the i3, FX is competing with the i5, and AMD does not compete with Iris Pro at all.


    WTF... I mention power consumption in a CPU thread, and you're thinking screw the polar bears and the oceans? We're talking about competitiveness, not a green Earth.
    Not to mention that the selfish, rugged attitude isn't even funny anymore; it's long past annoying. And I'm not very eco-conscious :)
     
  3. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    15
    GPU:
    GIGABYTE Radeon R9 280
    Generally speaking, it's flexible in that it can be changed easily, since it was designed to be modular, following AMD's M-SPACE design principle. The PS3 had one PPE and 8 SPEs (1 reserved for the OS). The PPE was really just a smart controller. The SPEs did all the heavy lifting; they were not "cores" per se, but specialized processing units, just like the "cores" in your graphics card, which aren't really cores either yet are sometimes marketed as such.
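    A rough way to picture that controller/worker split is this plain C + pthreads sketch (an analogy only, not the actual Cell SDK; the worker count, chunk size, and math are made up for illustration):

    /* Analogy only: one "controller" thread (think PPE) dispatches chunks of
       number crunching to worker threads (think SPEs) and does almost no math
       itself. Build with: gcc -pthread workers.c */
    #include <pthread.h>
    #include <stdio.h>

    #define WORKERS 6      /* arbitrary worker count for the example */
    #define CHUNK   1024

    static float data[WORKERS][CHUNK];

    static void *worker(void *arg)
    {
        float *chunk = arg;
        for (int i = 0; i < CHUNK; i++)       /* the heavy lifting */
            chunk[i] = chunk[i] * 2.0f + 1.0f;
        return NULL;
    }

    int main(void)
    {
        pthread_t t[WORKERS];

        for (int w = 0; w < WORKERS; w++)     /* controller: dispatch the work... */
            pthread_create(&t[w], NULL, worker, data[w]);
        for (int w = 0; w < WORKERS; w++)     /* ...then just collect the results */
            pthread_join(t[w], NULL);

        printf("all %d workers done\n", WORKERS);
        return 0;
    }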

    It's likely the GPU cores taking up most of that power consumption. Remember, AMD's iGPUs are pretty powerful; they are Radeon parts with up to 384 streaming processors, while Intel's architecture only has 16 such units. For Kabini, you get up to 128 stream processors.

    We all know that AMD is graphics heavy while Intel is serial-processing heavy. With proper hUMA support, I'm sure people can squeeze a lot more out of Kabini than they can now.

    I didn't say ARM does at the moment, but they are making headway, and at a pretty fast pace, while still consuming less power than Intel's current offerings. Once ARMv8 comes out along with the Cortex A53 and A57 next year, ARM is expected to jump in performance. Of course, these are just expectations, with no publicly available solid data.


    I was talking about mobile; all of those are desktop processors, which are not tweaked for mobile use. It's like me taking Haswell desktop chips and saying they're not as power efficient as their laptop counterparts.

    deltatux
     
    Last edited: Jun 6, 2013
  4. Titan29

    Titan29 Master Guru

    Messages:
    306
    Likes Received:
    12
    GPU:
    Vega 64 | 7900XTX
    I think Kaveri (coming later this year) will be the real deal: a Steamroller CPU with GCN cores.
     

  5. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    Here is an excellent graph to show AMD's problem:

    [image: energy used to complete a workload]

    That shows the amount of energy used to complete a workload.

    This means that not only does the Intel CPU run faster and use less peak power, but, because it finishes the workload sooner, it ends up using even less total energy than the usual power graphs suggest.
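    To put rough numbers on that "energy to finish the job" point: energy is just average power multiplied by the time the task takes, so a chip that draws more watts but finishes much sooner can still come out ahead. A quick C sketch (the wattages and times are invented for illustration, not read off the graphs above):

    /* Back-of-the-envelope: energy = average power * time to completion.
       All numbers are made up for illustration. */
    #include <stdio.h>

    int main(void)
    {
        double power_a = 84.0, time_a = 100.0;   /* chip A: higher draw, finishes sooner */
        double power_b = 65.0, time_b = 160.0;   /* chip B: lower draw, takes longer */

        double energy_a = power_a * time_a;      /* watts * seconds = joules */
        double energy_b = power_b * time_b;

        printf("A: %.0f J   B: %.0f J\n", energy_a, energy_b);   /* A: 8400 J   B: 10400 J */
        return 0;
    }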

    Not to mention that the new Intel Iris Pro (5200) runs circles around AMD's top APU, including the GPU portion, while using far less energy.

    [images: Iris Pro 5200 vs AMD's top APU benchmarks]
     
    Last edited: Jun 6, 2013
  6. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,803
    Likes Received:
    1,402
    GPU:
    黃仁勳 stole my 4090
    I don't know anything about M-SPACE (unless it's the shared memory space; what are the odds that's what M-SPACE stands for :wanker:). All I've learned about Jaguar, however, points to it being significantly weaker than even ancient CPUs.

    As for the PS3 part...

    That's a load of crap though, and that's my problem: by that terminology, an ALU or pretty much any other core component is a specialized processing unit that can be "sometimes marketed" as a core. The SPEs were nothing more than off-core mini components. And the PS3 might as well have had 5 in best-case scenarios and 0 in 99% of cases, because that was the reality: 1 disabled so defective chips could still be sold, 1 reserved for "security", and 1 for outright spying on you, sending logs of EVERYTHING it can log to Sony's servers every time you're connected online, whether or not you log in. That's why PS3s with custom firmware would get banned even without logging in.

    Sony abused the ignorance and stupidity of the general public and perpetuated the myth that the PS3 has 8 cores, while the 5 available vector units were almost never used. I can count on one hand the number of games I know for a fact used all 5 available, and they're all exclusives. It would sound awfully bad if the general public knew the 360 used the same core type but had 3 of them and 6 threads versus 1 core and 2 threads.

    The PS3 had a third of the cores of the 360 and a significantly weaker GPU, yet almost every console owner believes the PS3 has more powerful hardware that was never fully utilized. :puke2:

    I know you know this stuff, I just felt like writing it out.

    The killer part is that those horribly weak consoles now seem relatively reasonable (for their time) compared to what's coming out. I have a toaster with more processing power than the upcoming consoles.
     
    Last edited: Jun 6, 2013
  7. chojin996

    chojin996 Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    AMD 5870 1GB
    You are the one writing nonsense and false info about the PS3's Cell.

    The Cell SPE units are not only fully functional, they are full DSP units for very fast vector processing.
    The Cell architecture was derived from IBM's expensive high-end Power CPUs but used a PowerPC core because it was cheaper to include.
    Sony and IBM spent billions of dollars and quite a few years on R&D for Cell.
    When Cell got released, Intel didn't have anything able to compete with it; it was far behind. And AMD was already going to lose big time with Intel's Centrino-derived CPUs on the rise, after the Tejas/Prescott mess and billions lost on fake projects in India, with some managers inside Intel who stole a lot of money and almost caused Intel to go bankrupt over it. But the Centrino R&D team in Israel not only saved the corporation but gave it such a boost that AMD still has no chance of even remotely achieving the same results that Intel now can.
    Still, the Cell was far too advanced when it hit the market. Sony and IBM managers were idiots for not being able to market it properly, and even inside the PS3 the Cell is used at less than 70% of its true performance.
    Anyone can look at the upcoming "The Last Of Us" PS3 game, which, thanks to Cell and proper coding, achieves high-quality graphics better than anything seen on expensive PCs with multiple GPUs in SLI mode. And that on old hardware with an old GPU. Cell allows programmers to offload a lot of complex math onto the SPEs and get massive speed gains, much more than even the latest Intel AVX2 vector units can do.
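    For what it's worth, "vector units", whether SPEs or AVX2, all do the same basic thing: apply one operation to a whole batch of values per instruction. A minimal AVX2 sketch in C (toy data, not from any game engine; needs a Haswell-class CPU, built with gcc -mavx2 -mfma):

    /* One fused multiply-add instruction processes 8 floats at once. */
    #include <immintrin.h>
    #include <stdio.h>

    int main(void)
    {
        float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        float c[8] = {10, 10, 10, 10, 10, 10, 10, 10};
        float out[8];

        __m256 va = _mm256_loadu_ps(a);            /* load 8 floats per register */
        __m256 vb = _mm256_loadu_ps(b);
        __m256 vc = _mm256_loadu_ps(c);
        __m256 vr = _mm256_fmadd_ps(va, vb, vc);   /* out[i] = a[i]*b[i] + c[i] for all 8 lanes */
        _mm256_storeu_ps(out, vr);

        for (int i = 0; i < 8; i++)
            printf("%.0f ", out[i]);               /* 18 24 28 30 30 28 24 18 */
        printf("\n");
        return 0;
    }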
    With Cell, IBM once again proved that, had they only ever wanted to, they could have smashed Intel and caused AMD to go bankrupt in the blink of an eye.
    IBM's expensive Power CPUs are extremely advanced. IBM has the R&D resources to release better products than Intel. The fact is that IBM's managers are narrow-minded, and they lost a huge opportunity with Cell to invade the desktop and server markets for real.
     
  8. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,803
    Likes Received:
    1,402
    GPU:
    黃仁勳 stole my 4090
    What did I write that was false?
     
  9. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,486
    Likes Received:
    3,178
    GPU:
    PNY RTX4090
    http://www.escapistmagazine.com/new...Us-Squeezes-Every-Last-Drop-of-Power-From-PS3

    Neo Cyrus is right in what he said. Cell was very powerful, but it could never run on the desktop as it's not an x86/64 CPU, it's an IBM PPC; IBM would have to pay for an x86 license. It was very powerful, but to say that games done on the PS3 could not be done better and more efficiently on a PC and its hardware is just ludicrous. Even mid-range PCs from two years ago were leaps and bounds above the Cell in the PS3. Cell was a bombshell for Sony and the rest involved: it cost them far too much money, and it's taken seven years for a first-party company to get 100% out of it. I don't know about you, but to me that just shows it was built wrong in the first place. It was far too complex for most devs. Sony also shot themselves in the foot with the instructions they sent out to help devs better understand the Cell: they sent them out in Japanese to EVERY dev, even those in the UK and USA.

    Also, I have played The Last Of Us demo on PS3 on a VERY good 50" plasma TV and, to be honest, it looks pretty dated: low textures, basically zero antialiasing, and a pretty poor draw distance. The AI also gets pretty confused from time to time. It's a stellar game in terms of acting, script, and gameplay, but technically it looks pretty washed out. Still pretty decent for seven-year-old tech.
    The part about the toaster was a little white lie ;)
     
    Last edited: Jun 6, 2013
  10. chojin996

    chojin996 Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    AMD 5870 1GB
    The Last Of Us looks dated to you? Low textures?
    Expensive, power-hungry PCs with three 200 W GPUs in SLI can't run better 3D engines than the one The Last Of Us runs on the Cell.
    That is reality.
    Then one can be blind and go around claiming that, thanks to antialiasing, their expensive multi-SLI PCs are more powerful because they are expensive, and that the games look better because the hardware is expensive and new... BUT that is not the case. It's just not true.
    Games running on those expensive configurations are far from optimized in any possible way; the 3D engines are outdated.
    Using huge textures with less or even no compression doesn't automagically mean that everything looks more photorealistic, new, and not dated.

    If you seriously believe that Sony didn't make a profit on the PS3 for 7 years due to Cell R&D costs, then you surely are naive, or you are writing nonsense following an agenda.
    So do you also believe the claims that Warner Bros. told the press about the Harry Potter franchise?

    http://www.cinemablend.com/new/Leak...st-Money-On-2007-Harry-Potter-Film-19433.html
    Leaked Report Claims Warner Bros. Lost Money On 2007 Harry Potter Film
    Author: Eric Eisenberg
    | published: 2010-07-06 22:17:15

    Seriously? Yeah, sure... WB can claim to have lost money BUT anyone trusting such an obvious lie must be really naive, blind and living under a rock... indeed.

    Sony lost money on the PS3 as much as WB lost money on Harry Potter.

    Then what is all the babbling about Sony or IBM needing an x86/x64 license for the desktop market?
    If they had ever tried to do what they just didn't, competing against Intel and AMD by invading the desktop, server, and notebook markets with Cell... why would they have needed an x86/x64 license?
    They could have just built their own UNIX OS for their own hardware, just like Apple's OS X. Also, IBM has plenty of UNIX experts, so delivering a new OS on par with OS X to attack Microsoft could have been done without that much effort; they just needed to put up the money to pay designers and coders and to sign agreements with hardware manufacturers.
    They had, and still would have, the money to do it, but when the managers are just stupid cowards not willing to take the risk... they really can't make a group expand or win back lost market segments.
     
    Last edited: Jun 6, 2013

  11. NAMEk

    NAMEk Guest

    Messages:
    658
    Likes Received:
    5
    GPU:
    Gainward/GTX560/2GB
    Or we need new battery technology, something better than li-ion or li-polymer: higher capacity in a small area with high current. Somehow I don't see any battery advancement since the first li-polymer hit the market. I'm not talking specifically about this processor.
     
  12. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    You pretty much just answered your own statement. Newer hardware will easily outdo the Cell and Xenon, no issue. The Cell has what it's good at: vector computing. But in the general computing the general public uses, it's behind. For visuals, look at the difference between BF3 on the PS3 versus the PC. That's a newer, optimized engine, and even my low/mid-range 5800K runs miles around the PS3, both CPU- and GPU-wise.

    Also, there's a reason why Apple switched from PPC to x86/64. PPC is harder to code for, uses more power, has a high cost for stable core yields, and is less efficient for general computing versus x86/64. It has its niche in supercomputers, but with AMD's server designs I wouldn't be surprised to see more APU-based servers. If IBM ever wanted to make an x86/64 platform, they would have to get licenses from Intel, since Intel owns the rights to it.

    You make it out to be one giant conspiracy, but it's not. It's simple aging and hardware limitations. That's just basic computing evolution.
     
  13. blkspade

    blkspade Master Guru

    Messages:
    647
    Likes Received:
    34
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
    I was about to make a post about everything wrong with your post. Then I looked at your name. Seeing as you're likely the same chojin that posts on ExtremeTech, I'll choose not to feed you.

     
  14. blkspade

    blkspade Master Guru

    Messages:
    647
    Likes Received:
    34
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
    Hell with it. Because x86 is the entire desktop market. It would take way more than a theoretically powerful chip and a UNIX kernel to unseat the guys at the top of this space. Apple is barely successful there compared to the entirety of PC OEMs and Microsoft, and they abandoned Xserve. The lack of third-party software and hardware support, and likely slow adoption, would leave them DOA. Their area will probably forever be specialized systems.
     
