AMD A10 6800K benchmarked, tested

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 5, 2013.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,717
    Likes Received:
    19,185
    GPU:
    AMD | NVIDIA
    In this review we look at the new AMD A10 6800K APU. Based on Piledriver cores, this processor/graphics hybrid called an APU remains hard to beat in terms of features, performance and, well ...

    AMD A10 6800K benchmarked, tested
     
  2. killer_939

    killer_939 Guest

    Messages:
    2,597
    Likes Received:
    0
    GPU:
    Radeon 7950 @ 1100/1500
    Doesn't make me feel bad about buying the 5800k for my backup rig a few months back. Phew... :)
     
  3. thatguy91

    thatguy91 Guest

    DRAM timings of 13-13-14-32?

    Also, it appears the A10-6800K supports DDR3-2133 natively; it's the lower models that only support DDR3-1866 natively. I'm sure that with a couple of sticks of DDR3-2133 with decent timings (13-13-14-32 is quite disgusting at 1866!) it would give somewhat better results.

    A10-6800K: http://products.amd.com/en-us/DesktopAPUDetail.aspx?id=79
    A10-6600K: http://products.amd.com/en-us/DesktopAPUDetail.aspx?id=81
    A10-6700: http://products.amd.com/en-us/DesktopAPUDetail.aspx?id=80
     
    Last edited by a moderator: Jun 5, 2013
  4. Multi-threaded Video Transcoding looks funny.

    I still think DDR3 is way too limited for heavy APUs. You have to spend good money on fast RAM to achieve good performance, and at that point you're into discrete GPU territory.

    But as a number-crunching cluster, yeah, it rocks. I hope Linux and Windows kernels will eventually include native OpenCL acceleration for certain dumb tasks, not via the GPU driver layer.
     

  5. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,416
    Likes Received:
    428
    GPU:
    Sapphire Fury
    Looking at one of these for a home server. A85X is an awesome chipset and one of these I could well leave running 24/7 quietly in another room.
     
  6. WAROQ

    WAROQ Guest

    Messages:
    48
    Likes Received:
    0
    GPU:
    Club Radeon HD 6870
    I think I'm gonna buy this one too... for my gaming rig.
     
  7. Speed Weed

    Speed Weed Guest

    Messages:
    1,066
    Likes Received:
    0
    GPU:
    GTX 260+
    Will you be able to play Crysis 3 on your new rig, WAROQ?
     
  8. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    Wouldn't it be enough for puzzle or Indie game? or minecraft? or for Source engine games? etc...? no?

    Every single person playing Crysis 3 these days?
     
  9. AdeelEjaz

    AdeelEjaz Member

    Messages:
    17
    Likes Received:
    0
    GPU:
    XFX Radeon HD 6890 1GB BE
    I agree. I was gutted that the reviewer just didn't bother to do the test at 2133 MHz (I assume the board didn't support it). Still happy with the performance and ordered it!
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,066
    Likes Received:
    4,448
    GPU:
    Asrock 7700XT
    Damn, 2 new processors released within the same week and both of them were disappointing. At least the 6800K seems to have more performance gains over its predecessor than the i7-4770K has over the 3770K (though only Windows seemed to get no performance gains with the 4770K; Linux makes the upgrade more worthwhile).


    Also, I personally would find it interesting if there were to be a dual-socket FM2 board. Crossfire the 2 IGPs, get a 6670, and you've got yourself a decent system. Probably not that cost effective though.
     
    Last edited: Jun 5, 2013

  11. mystvearn

    mystvearn Guest

    Messages:
    25
    Likes Received:
    0
    GPU:
    GT730,GTX770ti,GTX 780...
    Nice article. Can you please do a test with the discrete GPU that the 6800K supports? How much of a performance increase would we see from a CrossFire combo?

    Looks like I'll be waiting for Kaveri then.
     
  12. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    15
    GPU:
    GIGABYTE Radeon R9 280
    Unlike Haswell, Richland is meant to be a slightly tweaked version of Trinity. If Kaveri turns out like Haswell in terms of relative performance increases, then ya, you can say that. However, since this is not the case, don't think you can make that statement.

    Honestly, there isn't really a need for Richland, but I guess AMD wanted something "new" to tie into the back-to-school sales as Kaveri would not be ready by then for sure...

    deltatux
     
  13. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    It's not the performance that AMD needs to focus on, it's the out of control power consumption of their chips.

    That, more than anything else, is killing their market share and preventing OEMs from going with them.

    I'm still shocked that they don't show a phone SoC on their roadmap at all. Their tablet SoC is already dead in the water (no major OEM wins), and is in further trouble now with the 7 W Haswell chip and the upcoming Silvermont.

    Their selling of Imageon and BitBoys was perhaps the most suicidal move by AMD ever considering that they are exactly what AMD needs right now.

    Intel is really hammering the mobile market right now, which is what it needs to do. This is reflected by the fact that ARM stock over the past months dropped 20% while Intel went up 20% in the same time frame. People have confidence that Intel is on the right track to start to dominate the mobile market.

    Intel already owns 85% of the shrinking desktop market, it doesn't need to do anything there. But it has almost no representation in the ultra mobile market, and thus is missing out on almost all the money in that growing market.

    The laptop market is still growing, but it too is pretty much dominated by Intel.

    The worst part for AMD is that during all this, they are hemorrhaging $146 million a quarter (!), while Intel made a $2.04 billion net profit in the same period.
     
    Last edited: Jun 6, 2013
  14. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    The performance is actually pretty reasonable for entry-level and budget markets... Can't wait for Kaveri though....
     
  15. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    15
    GPU:
    GIGABYTE Radeon R9 280
    If you've looked at Trinity, it has very respectable power consumption for mobile; it competes with Ivy Bridge in terms of battery life, so I'm not sure what you're talking about. Power consumption remains an issue for their enthusiast platforms, but honestly, the FX series was never meant to be low power either, so it's really a non-issue. The Kabini APUs also do very well on power consumption.

    AMD just recently released a true tablet SoC with Temash; it'll take a bit before those hit the market. Previous designs based on Zacate were like tablet manufacturers forcing Android 2.x onto tablets: it just didn't work well.

    I personally do wish Intel a lot of luck in trying to break the ARM dominance, but at the same time, it would be an uphill battle since ARM is gaining performance while Intel is trying their best to push the power usage down. I wish both great luck to diversify the market.

    The Imageon sell-off was short-sighted to say the least, but at the time AMD didn't have the resources to convert those TV image chips to mobile and Qualcomm did. What AMD sold became Qualcomm's prowess, along with their mobile radios; Qualcomm wouldn't be as successful without AMD selling off the Imageon line. AMD needed the cash too. Yes, they shot themselves in the foot, but who knows, they might return to the cell phone game in the future via their Radeon line. It's already in tablets; if they keep pushing GlobalFoundries to shrink their processes down further, along with better power-management design, I don't see why AMD couldn't get into the mobile chip game.

    AMD has been recovering from many missteps of its past, and with the new CEO and the return of AMD veterans (who were missing during AMD's critical time) the company is slowly turning the boat around, though there's still much to do. AMD has drastically improved power consumption with the advent of Piledriver and now with Kabini. Things are slowly looking up for AMD. With the design wins from both Microsoft and Sony, the Jaguar architecture proves that it's flexible, relatively powerful and low-power enough to fit in a console. Remember, the 7th-generation consoles only had a 150 W - 175 W power supply; low-power chips are expected for consoles as well, not only the mobile market. It's just that they don't need ultra-low power.

    You seem to be very pro-Intel. Personally, I like market diversification, so I'll root for anyone who can break market dominance: AMD is the only one who can in the x86 market, and Intel is the only viable challenger in the mobile chip market. I follow the same philosophy with the service providers I use.

    deltatux
     

  16. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,416
    Likes Received:
    428
    GPU:
    Sapphire Fury
    Well said mate.
     
  17. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,805
    Likes Received:
    1,405
    GPU:
    Jensen Huang stole my 4090
    ... Relative to what? A PowerPC-based CPU with 1 core and 2 threads that was used in 2006 (and earlier, in 2005, in the Xbox 360)? I always find it hilarious when people think the PS3 is this powerhouse with 9001 cores; they don't seem to grasp what a core is, or that the PS3 only has a single one. Anyway, relative to anything used in a desktop, it's abysmal.
     
  18. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    I like to be realistic. I've had almost as many AMD systems as Intel, I don't particularly care which color logo my chips have. Like I said, my second to last build was an AMD.

    Kabini (Jaguar) uses more power than an Intel Ivy Bridge i7-3517U while offering nowhere near the performance:

    http://www.anandtech.com/bench/Product/604?vs=823

    Trinity fares little better: it uses more power than an i7-3517U while also offering inferior performance:
    http://www.anandtech.com/bench/Product/729?vs=600

    ARM still doesn't have "performance" per se. Their top-of-the-line A15 chips perform near the level of Intel's 2006 Core Duo chips; what they do have is lower power usage for that performance (at the moment), while still lacking x86. Atom as it stands (a horribly dated design) is still competitive with the latest ARM designs (except for the GPU), and Silvermont looks set to completely destroy that performance in every way.

    The console win for AMD is a victory not over Intel, but over IBM and Nvidia. Besides, the margins there are horrible.

    I still have high hopes for Temash, but the lack of any major OEM wins is horribly worrying.
     
    Last edited: Jun 6, 2013
  19. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super

    It remains an issue EVERYWHERE. Power consumption is everything.
    If you command power consumption, the easiest thing you can do is trade it for some performance.

     
  20. Elder III

    Elder III Guest

    Messages:
    3,737
    Likes Received:
    335
    GPU:
    6900 XT Nitro+ 16GB
    ^^^ Considering the A10 has a graphics chip you can actually do something with, whereas the Intel HD 4000 (let alone the HD 2500) is pretty poor for any kind of gaming at all... I can live with a higher TDP.

    *For the record, I personally don't care about power consumption. If my PSU can handle it then I'm perfectly content. I don't want to hear about saving the polar bears either; mankind will kill each other long before the oceans rise to swallow the world. :realmad:
     