AMD responds to FX line negativity!

Discussion in 'Frontpage news' started by maleficarus™, Oct 14, 2011.

  1. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
For all you haters out there: here's an interesting topic over at xtreme-systems
that shows a good performance increase with BD in single-threaded apps (old games and programs).

You basically disable one of the two cores in each module, so you end up with 4 logical cores, each with exclusive use of its module's shared resources (including the L2 cache).

    See here for more info:
    http://www.xtremesystems.org/forums...ew-(4)-!exclusive!-Excuse-for-1-Threaded-Perf.
     
  2. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
We already saw this; even with this you are still running well below the i5-2500K in performance, and just about equal to Phenom II performance per core.
     
  3. k1net1cs

    k1net1cs Ancient Guru

    Messages:
    3,783
    Likes Received:
    0
    GPU:
    Radeon HD 5650m (550/800)
    That's great!
    So whenever I want to use single-threaded apps effectively, I could just restart my PC, go to BIOS, turn off the cores, and then reboot the PC!
    And whenever I want to use multi-threaded apps effectively again, I could just restart my PC, go to BIOS, turn on the cores again, and then reboot the PC!
    Simple!

    Wait-a-minute...
     
  4. Enmity

    Enmity Maha Guru

    Messages:
    1,309
    Likes Received:
    0
    GPU:
    EVGA 1080ti FE SLI
The performance increases in most of the multithreaded apps too - so it's like having a 2600K and turning HT off, like a lot of people do due to the negative effect that can have in some situations. Same, but different lol.
     

  5. fr33k

    fr33k Ancient Guru

    Messages:
    1,982
    Likes Received:
    55
    GPU:
    ASUS STRIX RTX2080
But we shouldn't need such drastic measures to make things work right.
It's not even financially sound to BUY the chip that costs $100 more than the chip that lacks that feature already. If M$ announced a fix to the shared-resource issue right now, then I'd say go for it, it's totally worth it. But nowhere do we have any indication that it's an issue with Windows or with the BIOS, other than people trying to solve AMD's problems for them.
     
  6. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    7,924
    Likes Received:
    357
    GPU:
    Zotac GTX1080Ti AMP
Oh great, is this the best that they can come up with???

Their brand new processors, with a brand new architecture, more cores, and faster stock clock speeds, are only a frame or two better than previous models!!!

If they cannot sort this out through BIOS updates, firmware, or some form of patch, then I'm afraid to say it, but AMD are gone. They are dead in the water now. And we, the consumers, are the ones who are going to be stuck with Intel, who are just going to milk their current lineup for all it's worth.

They even held back the launch of these CPUs to tweak and fine-tune them. What on earth were they doing!!!???? Some tests even show the old Phenom II 1090T beating these chips!

I soooooo wanted these chips to wipe the floor with Sandy Bridge, but they do not. Intel must be sitting back in their solid gold chairs, ironing million-dollar bills, smoking fat Cuban cigars, and chuckling to themselves intently, whilst unemployed ex-AMD staff grovel at their feet for a job!

It angers me, but it also saddens me, to think that I once loved AMD (Socket 939!!) and recently loved how they offered great bang per buck. But when you come to market with something brand new from the ground up that is beaten in some tests by your own previous-gen tech, there's nothing left to do but... FACE PALM!!!
     
  7. IcE

    IcE Don Snow Staff Member

    Messages:
    10,693
    Likes Received:
    73
    GPU:
    Zotac GTX 1070 Mini
    It's tempting to think of Intel as some evil bad guy, but they're really just a successful business. The people in charge are intelligent and know exactly what they're doing. The same can't be said for AMD. I'm pretty sure Intel will in no way be tempted to just stop making new processors, even though they could. They have to pay their brilliant engineers to do something after all.
     
  8. WaroDaBeast

    WaroDaBeast Ancient Guru

    Messages:
    1,963
    Likes Received:
    0
    GPU:
    Gigabyte HD7950
    'Course it is − they'll resort to anticompetitive practices whenever they can.
     
  9. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
Maybe the chip is a little too future-oriented right now. Once software devs start supporting it, things might look better.

The thing I dislike the most about BD is the power draw. For me, that's the deal-breaker.
     
  10. thatguy91

    thatguy91 Ancient Guru

    Messages:
    6,648
    Likes Received:
    98
    GPU:
    XFX RX 480 RS 4 GB
No, but what it does mean is Intel can bring out a new CPU that's only 10 percent better per clock than the previous gen rather than 20 percent, and charge more for it.

So, if Piledriver is 10 percent faster per clock than Bulldozer, and Ivy Bridge is 20 percent faster per clock than Sandy Bridge (and yes, these are AMD's and Intel's claimed amounts respectively), then AMD are going to be even more screwed as time goes on.

    Example:
Say Bulldozer is designated the number 100, and Sandy Bridge 140 (as in 40 percent faster per thread).

    Bulldozer --> Piledriver
    100 --> 110 (10 percent)

    Sandy Bridge --> Ivy Bridge
    140 --> 168 (20 percent faster)

Now if you divide the performance figures of Piledriver and Ivy Bridge by 1.1 (same ratio between the two, but it makes Piledriver's designation 100), you have Piledriver at 100 and Ivy Bridge at approximately 152 (actually 152.73).

So, in this example Sandy Bridge is 40 percent faster than Bulldozer per thread, while Ivy Bridge is close to 53 percent faster than Piledriver.

Now for multithreaded apps: if 8 threads on Bulldozer are 7 times faster than a single thread (not 8, due to scaling), and with Hyper-Threading Sandy Bridge is 5 times faster running 8 threads compared to running 1, you end up with:

    Bulldozer = 700
Sandy Bridge = 700 (5x the 140 from above)

    Meaning they are approximately on par.

    Now, if the same figures are transferred to Piledriver and Ivy Bridge:
    Piledriver = 700 (using the Piledriver base figure of 100)
    Ivy Bridge = 760 (5x the Ivy Bridge figure of 152)

    So dividing this by 7 for ease of comparative purposes:
    Piledriver = 100, Ivy Bridge = 108.6

So, if the same scaling applies between the generations, then Piledriver falls 8.6 percent behind Ivy Bridge in a well-threaded app, where Bulldozer was at parity with Sandy Bridge.

What's worse is that most apps don't make full use of 8 threads. In these examples, Bulldozer only matches Sandy Bridge when making full use of its 8 threads; otherwise it falls behind. For Piledriver versus Ivy Bridge it's even worse.

So, unless Piledriver is significantly more than 10 percent faster per thread than Bulldozer, it's going to be even more of a disappointment than Bulldozer. Even if they migrate to 10 cores, sticking with an AM3+-compatible socket is a bad idea, since the scaling will be affected - instead of the ~8.75x you'd expect from 10 cores at the same per-core scaling efficiency, you may get only, say, 8 to 8.5x single-core performance as you push the limits of what the socket is capable of.

Really, keeping Socket AM3+ for backwards compatibility was just plain stupid; I really wish AMD had moved to a new, well-designed socket and made proper use of it. With faster integer cores, and the PCI-E 3.0 controller built into this 'new' socket (as they plan to do in the future), they could have really made an impact on sales, even at $50 more.
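The arithmetic above can be sanity-checked with a few lines of Python. All numbers are the post's assumptions (10 percent Piledriver uplift, 20 percent Ivy Bridge uplift, 7x and 5x thread scaling), not measured benchmarks:

```python
# Per-thread baselines from the post's example
bulldozer = 100.0           # designated baseline
sandy_bridge = 140.0        # 40 percent faster per thread

# Claimed generational uplifts
piledriver = bulldozer * 1.10     # AMD's claimed 10 percent
ivy_bridge = sandy_bridge * 1.20  # Intel's claimed 20 percent

# Normalise so Piledriver = 100 (divide both by 1.1)
pd_norm = piledriver / 1.1        # 100.0
ib_norm = ivy_bridge / 1.1        # ~152.73, i.e. ~53 percent faster per thread

# Multithreaded totals: 8 BD threads scale 7x; 8 SB threads (with HT) scale 5x
bd_mt = bulldozer * 7             # 700
sb_mt = sandy_bridge * 5          # 700 -> parity with Bulldozer
pd_mt = pd_norm * 7               # 700
ib_mt = ib_norm * 5               # ~763.6

print(round(ib_norm, 2))          # per-thread gap: Ivy Bridge vs Piledriver
print(round(ib_mt / pd_mt, 3))    # multithreaded gap (post rounds 152.73
                                  # down to 152, giving its 8.6 percent)
```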
     

  11. IcE

    IcE Don Snow Staff Member

    Messages:
    10,693
    Likes Received:
    73
    GPU:
    Zotac GTX 1070 Mini
    And what business doesn't do this?
     
  12. FearFactory

    FearFactory Master Guru

    Messages:
    815
    Likes Received:
    0
    GPU:
    MSI RX570 X 4GB
Adam Kozak, product marketing manager, can say whatever he wants... and you can disable cores, and you can wait for a software patch, or you can wait for Windows 8... but this will remain a failure. Why? Just forget the flagship model for a while and look at the FX-4100: this thing is slower than my Phenom II X4! If it's a new-generation CPU that replaces the current X4 models, it should never be slower!
     
  13. PhazeDelta1

    PhazeDelta1 Moderator Staff Member

    Messages:
    15,616
    Likes Received:
    14
    GPU:
    EVGA 1080 FTW
Come on guys, no need to get anal about this. Let's be civilized.


AMD fu*ked it up. That's no shocker. They are trying to save face with this PR bullsh!t. Everyone knows just as much as I do that AMD overhyped and under-delivered with BD. It's time to get over it.
     
  14. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,794
    Likes Received:
    18
    GPU:
    Gigabyte RX580 8GB
I believe BD will run better in the future... For now, who cares if you get 100 fps or 130 fps in games? Not me.
It's not the best CPU, but it has the potential to get better.
     
  15. PhazeDelta1

    PhazeDelta1 Moderator Staff Member

    Messages:
    15,616
    Likes Received:
    14
    GPU:
    EVGA 1080 FTW
    goodwill :p
     

  16. Texter

    Texter Ancient Guru

    Messages:
    3,145
    Likes Received:
    233
    GPU:
    Club3d GF6800GT 256MB AGP
    Yep it's real...performance per thread per clock is over twice as high for the 2600K so without a GPU bottleneck you get that result in tri-fire. I expect to see the same significant differences when HD79xx and GTX6xx come out. Bulldozer may be tomorrow's CPU, but it's for tomorrow's gamer. You'll probably see relatively poor performance with all the classic single threaded or bi-threaded games as well. I wonder how Morrowind and Oblivion run on a Bulldozer.
     
  17. allesclar

    allesclar Ancient Guru

    Messages:
    5,641
    Likes Received:
    90
    GPU:
    GeForce GTX 1070
I was disappointed; I was hoping for a great comeback from the FX range. Sadly it's a win for the i7 YET again.
     
  18. hallryu

    hallryu Don Altobello

    Messages:
    11,386
    Likes Received:
    14
    GPU:
    2x HD7970
    This thread has been owned. :thumbup:

It's an AMD failure for me. You don't buy a CPU for the future, not really, you buy it for the here and now. You buy it for the benchmarks, the gaming and the tweaking, NOW. Not on the off chance that in the future it may OWN. We need it to OWN now, and it does not OWN; it has been OWNED.
     
    Last edited: Oct 15, 2011
  19. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,371
    Likes Received:
    21
    GPU:
    Zotac 1070 Amp
Thing is, IB chips will be even more power-efficient because of the FinFET design; they're getting rid of quite a lot of the leakage that hurts any chip when it comes to power usage. I wonder how AMD's engineers are going to deal with leakage as they keep shrinking the process.
     
  20. davetheshrew

    davetheshrew Ancient Guru

    Messages:
    4,089
    Likes Received:
    0
    GPU:
    some green some red
Considering you're a cop, you don't half spell badly, mate haha
     