AMD FX-4350 and FX-6350 Piledriver CPUs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 1, 2013.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,564
    GPU:
    AMD | NVIDIA
    AMD issued a small refresh on its lineup of FX processors with the launch of the FX-4350 and FX-6350. Both are based on the "Piledriver" architecture, use the AM3+ socket, and support the ...

    AMD FX-4350 and FX-6350 Piledriver CPUs
     
  2. MadGizmo

    MadGizmo Guest

    Messages:
    1,396
    Likes Received:
    0
    GPU:
    MSI R9 290X 8GB 2560*1440
    Just a heads up, Hilbert. Looks like the links don't work. Both on the site and the forum.

    "Error. The story does not exist."
     
    Last edited: May 1, 2013
  3. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,564
    GPU:
    AMD | NVIDIA
    thanks man, fixed ;)
     
  4. SmileMan

    SmileMan Guest

    Messages:
    78
    Likes Received:
    0
    GPU:
    AMD hd7950 3GB
    12 & 14 MB of L2 and L3 cache...
    Don't think that's right, or am I wrong?
     

  5. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,402
    Likes Received:
    421
    GPU:
    Sapphire Fury
    That's total cache...
     
  6. SmileMan

    SmileMan Guest

    Messages:
    78
    Likes Received:
    0
    GPU:
    AMD hd7950 3GB
    Yeah, I thought that already. Just 4 MB + 8 MB and 6 MB + 8 MB (L2 + L3).
     
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    What are their stock speeds? I was considering getting a 6300, which I intend to overclock. Whichever overclocks better on air is likely the one I'll be getting.
     
  8. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    Protip: Don't.

    Haswell, June 4th.
     
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    The only thing that interests me about Haswell is the power consumption. Otherwise, I'm not paying that much for something I have little need for. I'm actually choosing the 6300 over the 8350 because I don't even need the extra performance of the 8350, and the 6300 seems to be the best-value AMD CPU today in terms of processing performance. It'll also make a nice last upgrade to my AM3 system (I have a beta BIOS that supports AM3+ CPUs).

    Also, considering that the PS4 and likely the new Xbox are both going to be Piledriver-based, it won't surprise me if games perform better on a 6300 than on any Haswell i5, maybe even an i7. I doubt the first year of games will show that, though, since devs likely won't have worked out the kinks of micro-optimizing yet.

    Anyways, I currently own an Athlon II X3 at 3.7GHz and so far it's difficult to justify upgrading it. It doesn't max out in any live tasks such as gaming. It's relatively slow at other things like compression or encoding, but I can deal with the wait since I don't do those very often. I have no need for 8 cores, but I'm likely going to need more than 4 pretty soon. Also, I'm mostly a Linux user, and BD/PD processors tend to perform a little better in Linux than they do in Windows.
     
  10. k1net1cs

    k1net1cs Ancient Guru

    Messages:
    3,783
    Likes Received:
    0
    GPU:
    Radeon HD 5650m (550/800)
    The PS4 and the new Xbox are likely to have hUMA implemented.

    Your system and Piledriver-based desktop CPUs, however, won't have it.
    So the games are unlikely to perform better on a 6300 than on a similarly spec'd i5 system, much less an i7.
     

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    I don't see how hUMA would make that black-and-white of a difference. It's still based on the same general CPU architecture, so any performance I'd lose by not having hUMA, Intel would lose as well. So I don't see how that's a valid point. Also note that hUMA is not ideal for discrete GPUs - you would lose performance if you made a discrete GPU use your system memory.
     
    Last edited: May 1, 2013
  12. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,153
    Likes Received:
    12
    GPU:
    MSI Suprim X 4090
    In games like BF3 or Crysis 2, the difference between an i5 and a 6300 is barely noticeable. In some games the AMD will be faster by a few frames and in some the i5 will have a few frames more; it's not like having an Intel will give you 50fps more lol.. So unless you're playing old games that only use 1 core, I wouldn't spend the extra $100+ bucks on the Intel. Plus old games that only use 1 core will probably run 200+fps on any newer system.. ;)
     
  13. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    Very valid point, which is one of the reasons I'm doing a CPU upgrade rather than a system upgrade.
     
  14. k1net1cs

    k1net1cs Ancient Guru

    Messages:
    3,783
    Likes Received:
    0
    GPU:
    Radeon HD 5650m (550/800)
    My point was that the games are designed to run on systems likely to be based on hUMA, not on systems (with a discrete GPU) like yours.

    You said it yourself that you'd lose whatever advantage(s) hUMA would provide, just like any Intel system would, so why would a 6300 have an advantage over a similarly spec'd Haswell i5, or even an i7, when the games aren't likely to be optimized for non-hUMA systems?
    Besides, your first argument was that games would run better on (vanilla) AMD systems because the PS4 and Xbox are based on AMD hardware, but that was before hUMA entered the picture.
    Care to elaborate?

    And it seems you kinda miss the point of hUMA.
    It's not just about the GPU sharing the same memory the CPU uses; it's about the CPU being able to read what the GPU has already processed (and vice versa) without having to wait for the result to be copied back and forth between VRAM and system RAM.

    On a traditional shared-memory system, which is what you were thinking of, the CPU and GPU still can't see what's in each other's share of the memory space.
    So there's still the overhead of copying whatever is in the GPU's share over to the CPU's whenever there's work that needs both, and you end up with duplicate copies of the data in the process.
    With hUMA, you both reduce memory usage and remove the overhead of copying values around.

    Also, hUMA would still be relevant on systems with a discrete GPU.
    The graphics card can still keep its VRAM for graphics processing, but CPU-GPU workloads would be faster if the GPU could just read from and write to system RAM, which means no more wasted cycles copying values back and forth between system RAM and VRAM.
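
    As a rough illustration, here's a minimal CUDA sketch of the two models just described (using CUDA's managed memory as a stand-in for hUMA; on a discrete card the driver still migrates pages under the hood, but the programming-model difference is the point): explicit host/device buffers with copies both ways, versus one allocation both sides address through the same pointer.

    Code:
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Toy kernel: double every element in place.
    __global__ void double_elems(float *data, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= 2.0f;
    }

    int main()
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Split-memory model: the data exists twice (system RAM + VRAM)
        // and every CPU<->GPU hand-off pays an explicit copy.
        float *host = (float *)malloc(bytes);
        float *dev = nullptr;
        cudaMalloc(&dev, bytes);
        for (int i = 0; i < n; ++i) host[i] = 1.0f;
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // copy in
        double_elems<<<(n + 255) / 256, 256>>>(dev, n);
        cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);   // copy back before the CPU can read it
        printf("split memory: %f\n", host[0]);
        cudaFree(dev);
        free(host);

        // Unified view: one allocation, one pointer, no explicit copies.
        float *shared = nullptr;
        cudaMallocManaged(&shared, bytes);
        for (int i = 0; i < n; ++i) shared[i] = 1.0f;           // CPU writes...
        double_elems<<<(n + 255) / 256, 256>>>(shared, n);      // ...GPU works on the same pointer
        cudaDeviceSynchronize();                                // wait before the CPU reads the result
        printf("unified:      %f\n", shared[0]);
        cudaFree(shared);
        return 0;
    }

    In the first half the data lives in two places and every round trip pays two cudaMemcpy calls; in the second half there's one logical buffer and no explicit copies, which is the overhead hUMA removes in hardware.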
     
  15. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    Because I said it's the same CPU architecture. When you optimize a program for Ivy Bridge, whether you have an i3 or an i7, you'll see a noticeable performance gain on both, but an AMD CPU with the same instruction sets likely won't get that performance bonus because it's architecturally different.
    Again, hUMA is not going to make that immense of an impact. On an APU it would give a noticeable performance improvement, but you're acting like PS4 or Xbox games will lose all their performance gains on an AMD system solely because it lacks hUMA. That's like saying a sprinter gets nearly all his speed from his shoes. Shoes make a difference, but it's the legs that do all the work.

    I know the point of hUMA, but I don't think it's going to have as dramatic an impact as you think it will on higher-end systems. At best it fixes latency problems - which are a big deal, but there's probably a point where hUMA can't improve performance any further.

    Agreed. But I don't see that happening any time soon, since such a scenario would be way too restrictive or variable. hUMA works nicely on an APU because the CPU and GPU are fused together and you don't get another option. Suppose there were a CPU, northbridge, and discrete GPU that were all hUMA-compatible. It wouldn't surprise me if the next generation of any one of those parts broke compatibility.
     

  16. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    I dunno what games you're playing, but even a first-gen i7 (my CPU) at 4GHz gets maxed out in some games. You probably think your CPU is not a bottleneck because you see it at 12 or 25% usage spread across all the cores. That's 1-2 threads maxed out (1 or 2 of 8 hardware threads is ~12.5-25%) with the time slices spread among all the cores; you're still limited by what 1 or 2 cores can pull off.

    Do whatever works for you, but the single-threaded performance of AMD CPUs is too much of a handicap for my uses. And it's just too weak in anything that requires heavy FPU usage; take a look at the mighty FX-8350 in this FPU benchmark compared to my CPU at the same frequency with screwed-up RAM timings (no idea why, don't care, upgrading to Haswell):

    [image: FPU benchmark comparison]
    Wondering how it did in all the other ones? Abysmal.
     
    Last edited: May 1, 2013
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    That's a possibility, but I also have an HD 5750, which could be the real bottleneck. As long as I get 45-60FPS at near-full detail, I don't really care if something is pushed to its limits. Any more than that on my single 1080p 60Hz screen is pointless. I'm sure if I got a 2nd 5750, or something to replace it with, my current CPU might become the real bottleneck.

    As of right now my computer is almost head-to-head in gaming performance with a PS3, with equal or better visual detail. The FX-6300 may have 2 fewer cores than the APUs in the PS4 and Xbox, but each core is considerably more powerful. Since they're based on the same architecture, even if I only benefited from 75% of the optimizations, an overclocked 6300 ought to perform very similarly, albeit while consuming considerably more power.
     
  18. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    Oh yes the 5750 would be the bottleneck before the CPU in any new/intensive game. My entire point was that in some things your CPU is going to be a bottleneck and if you upgrade to a Bullpile CPU you're going to have the same problem.

    You really shouldn't compare to consoles, for various reasons. As the person before said, you're not going to gain any advantage over an Intel CPU, at all, and it's not just because of the hUMA he pointed out. I can say with certainty Intel CPUs will still have the obvious lead in any gaming until it becomes a thread race. So unless new games are 8-threaded when ported to PC, Intel CPUs will still win by a landslide, and then it'll only be a matter of time before Intel just makes octo-cores the standard to match AMD. Throw in HT and Intel wins harder.

    Anyway, you got my advice already (to avoid Bullpile like it's a disease), I don't need any convincing or justification of why you're choosing a Bullpile CPU. Just remember that we told you it won't hold any advantage in future games due to its architecture.
     
    Last edited: May 1, 2013
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    I compare to consoles in terms of what's considered acceptable gameplay. Aside from hUMA, what makes you think modern AMD processors won't gain an advantage? Tests have shown that BD/PD can potentially be as good as an i7, IF the software is optimized for it. Most software isn't, so the architecture is considered generally crappy in the PC world.

    In Intel's terminology, AMD has more of a tick-tick-tick-tock strategy, so I highly doubt the APUs are going to vary too drastically (architecturally) from their regular x86-64 products; therefore, optimizations for consoles should carry over nicely. Not perfectly, but enough to make a noticeable difference for AMD users. I'm not saying going Intel will be a bad choice; what I'm saying is that, due to the console optimizations, AMD users could probably pay considerably less than an Intel user and get a near-identical experience. Obviously you need to consider more than just gaming on a PC, but since Linux is very BD/PD-friendly, I don't have much to worry about.
     
  20. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    I stopped reading there. I tried to help and you don't want to listen, that's fine. If you actually need some answers you can Google from here, I don't have the patience for this.
     
