AMD FX-4350 and FX-6350 Piledriver CPUs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 1, 2013.

  1. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,973
    Likes Received:
    4,341
    GPU:
    HIS R9 290
    You tried to help me by simply telling me I'm wrong and to go for a CPU that's unnecessary. You have yet to prove why or how optimizations won't work when they make all the difference. IIRC, you had this same problem in another topic on these forums - a topic I wasn't as involved in.

    Look at Resident Evil 4 on the GameCube, for example. Abysmal hardware, but it can run a game that looks decent even by today's standards. A game like that on a console that crappy can only be achieved through really system-specific development. PC games demand significantly higher specs because they need to be generalized across all hardware, but if the new-gen consoles are roughly x86-based, devs won't have to work so hard on generalizing their code since it's already nearly halfway there. Since the new consoles have similar pipelines to AMD's current models, AMD users therefore get many of the optimizations the consoles get, on the CPU side anyway. Not sure why that's such a hard concept to grasp. It's obviously a lot more complicated than that, but I didn't say it'd be a seamless transition.
     
  2. mohiuddin

    mohiuddin Maha Guru

    Messages:
    1,007
    Likes Received:
    206
    GPU:
    GTX670 4gb ll RX480 8gb
    Are these just overclocked versions of the FX-4300 and FX-6300? Nothing else?
     
  3. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    13
    GPU:
    GIGABYTE Radeon R9 280
    schmidtbag has an 890FX motherboard; it doesn't support any AMD FX-series CPU...

    deltatux
     
  4. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    I don't have to prove anything, you're defensive and don't want to listen, that's your problem. Prove optimizations won't work? LOL, if you don't realize what's wrong with that sentence right there you really should stick with consoles. Not to mention you're slinging dirt at the wrong person. Enjoy your Piledriver CPU, you deserve it.
    I know.
     
    Last edited: May 2, 2013

  5. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    13
    GPU:
    GIGABYTE Radeon R9 280
    I wasn't noting it for you, I was referring to schmidtbag directly. Should have phrased my sentence better lol.

    I think it'll be cheaper if schmidtbag upgrades to a Thuban. Even if he buys it used, it'll still be a worthwhile upgrade (depending on how he uses his rig). Else, just overclock the darn thing and wait until he can rebuild it.

    EDIT: Something like this: http://www.ebay.com/itm/USED-AMD-Ph...LER-/330916582148?pt=CPUs&hash=item4d0c2a5304

    deltatux
     
    Last edited: May 2, 2013
  6. k1net1cs

    k1net1cs Ancient Guru

    Messages:
    3,783
    Likes Received:
    0
    GPU:
    Radeon HD 5650m (550/800)
    And yet, unless you're running an application that can truly utilize a multi-core CPU with parallelism, an AMD CPU doesn't really stand out much compared to an Intel.
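    To make the "truly utilize a multi-core CPU with parallelism" point concrete, here's a minimal sketch (my own illustrative example, not from the thread; the function names are made up) of a CPU-bound task that only spreads across cores when the work is explicitly split into worker processes:

```python
# Illustrative sketch: a CPU-bound sum only benefits from extra cores
# when split across processes (Python threads won't help here due to the GIL).
from multiprocessing import Pool

def partial_sum(bounds):
    # Sum of squares over the half-open range [lo, hi).
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def sum_of_squares_parallel(n, workers=2):
    # Split [0, n) into one chunk per worker process.
    step = n // workers
    chunks = [(w * step, n if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    n = 100_000
    assert sum_of_squares_parallel(n) == sum(i * i for i in range(n))
```

    A single-threaded game gains nothing from this kind of split, which is the scenario where the extra AMD cores sit idle.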

    Moreover, you strictly compared the 6300 to the upcoming i5 & i7 Haswell.
    If you have a source that has performance comparisons on games between 6300 and current i5 & i7, I'd like to read that.


    And hUMA isn't exactly the shoes.
    It's an optimization technique/standard, i.e. how to work out the legs so you can run efficiently.

    And again, I'm not 'acting like games for PS4 or Xbox will lose all performance gains on an AMD system solely because of hUMA'; it's more 'if ported games can't/don't utilize hUMA, why would they run faster on an AMD CPU than on an Intel CPU'.
    The point of contention here is your statement regarding the performance advantage of AMD CPUs against Intel's on games ported from next-gen consoles.
    Feel free to keep nitpicking about 'unnoticeable performance difference with/without hUMA', but that's not something I wanted to discuss in the first place.


    Many articles have shown that AMD's advantage in core count and parallelism over current Intel CPUs doesn't really bring improvements outside specific applications that use parallelism extensively.
    My point is, unless the game devs utilize hUMA extensively on consoles, the ported games won't have much of a difference in terms of performance compared to the console version.

    And since the cat hasn't even been let out of the bag yet, neither of us can know how much impact hUMA will make (if it's really going to be implemented on said consoles, mind you).
    That's exactly why I don't discuss it, and rather focus on your statement about the AMD CPU's advantage on ported games.


    APUs can be paired with a discrete GPU; most laptops with A8 or A6 APUs are usually configured this way, like Samsung's 5 series IIRC.
    hUMA will work nicely on that setup as well if it's being implemented properly.


    Break compatibility as in what?

    Hardware-wise, Intel has changed sockets more often than Taylor Swift breaks up, and AMD has also changed sockets for their CPUs, though (way) less often; not many people seem to mind that.
    We also had ISA to make way for PCI, AGP for PCIe, and so on.

    Software-wise, apps are often hardware-agnostic, unless you compile them with specific optimizations and/or extensions for a CPU generation, in which case they usually either run unoptimized or fail to run.

    All I'm saying is, hUMA will 'break compatibility' about as much as AMD or Intel adding specific instructions to their CPUs.
    It's also no different from using AHCI for your HDD/SSD or not.
    Apps run on a hUMA'd platform would simply not use the hUMA features if not coded specifically for them, rather than stopping working altogether.
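    The graceful-fallback behaviour being described is the same probe-then-fall-back pattern used for optional CPU features. A minimal sketch of the pattern (my own illustrative example, not actual hUMA code; the function names are made up):

```python
# Illustrative sketch: probe for an optional fast path once at startup,
# and keep the generic path if the feature isn't available.
def fast_dot(a, b):
    # Stand-in for a path that needs an unsupported platform feature.
    raise RuntimeError("pretend this needs an unsupported feature")

def generic_dot(a, b):
    # Portable fallback that works everywhere.
    return sum(x * y for x, y in zip(a, b))

def detect_dot():
    # Try the fast path once; on failure, fall back instead of crashing.
    try:
        fast_dot([1.0], [1.0])
        return fast_dot
    except Exception:
        return generic_dot

dot = detect_dot()
print(dot([1, 2, 3], [4, 5, 6]))  # 32
```

    An app built this way keeps working on hardware without the feature; it just never takes the faster path, which matches the point about hUMA-unaware ports.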
     
  7. signex

    signex Ancient Guru

    Messages:
    9,061
    Likes Received:
    303
    GPU:
    RTX 2080 Ti
    FX-6300 is really great, it also doesn't get that hot.

    I'm using Scythe Grand Kama Cross Rev.B and my temps are really low. :)

    It only idles at 26c, full load 52c.
     
  8. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    This claim is 100% false. Since 1999 when AMD released the "Slot A" cartridge processor, they have had a total of 12 different sockets for the desktop market. Intel has only had 7.

    I don't fault you for the inaccurate statement because I'm sure you're just repeating what you've heard a hundred times over on various forums from people who just make sh*t up. :bang:

    btw... I counted from 1999 when AMD entered the market for obvious reasons.
    http://www.hardwaresecrets.com/article/A-Complete-List-of-CPU-Sockets/373/2
     
  9. Pill Monster

    Pill Monster Banned

    Messages:
    25,211
    Likes Received:
    9
    GPU:
    7950 Vapor-X 1100/1500
    The difference is that AMD CPUs have retained backward compatibility since AM2.

    EDIT- FM1/FM2 are designed exclusively for APUs, while Skt 940/F are for servers.... so they can't really be counted in that list.
     
    Last edited: May 3, 2013
  10. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,153
    Likes Received:
    12
    GPU:
    MSI Suprim X 4090
    LOL, you really hate AMD, don't you? It makes me want to puke sometimes that you can't even see that using an AMD system versus an Intel system for just "gaming" is maybe a 5% difference in almost all games out there. Well, maybe in 4-year-old games that are single-threaded Intel would be better. Pff. We are talking about gaming here, not using the PC to make arts and crafts and edit porn. If I wanted the best single-threaded performance I'd definitely sidegrade to an Intel. But since I don't do anything regarding editing and whatnot, I'll stick to something I can have fun with and end up blowing up for half the cost of a new top-end Intel. I do like AMDs; I've never had a problem with one, and I've done nasty things to every one of them. Sometime I'm sure I'll upgrade to an Intel system... but only if there's a true reason to. So far I see no reason to spend $100s more.

    schmidtbag

    Do whatever you see fit for you. If you go AMD you'll have $100s left over for upgrading to a better video card.
     
    Last edited: May 3, 2013

  11. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    Remember the last time we were having a discussion and you said that... and posted a biased GPU limited benchmark (where the FX chips were still causing frame loss)... then you got about a thousand benchmarks in response proving otherwise and you had nothing to say? Yeah, I remember.

    Edit: Here you go -
    5% difference and HUNDREDS of dollars saved? Thanks for the lulz man, I needed it.
     
    Last edited: May 4, 2013
  12. Pill Monster

    Pill Monster Banned

    Messages:
    25,211
    Likes Received:
    9
    GPU:
    7950 Vapor-X 1100/1500
    Don't be a dick. I've said this already, but do you have any idea how old those games are? Even Skyrim is based on a 5-year-old engine.

    And as for Crysis3 - since when was Piledriver competing with the i7??
     
  13. Pill Monster

    Pill Monster Banned

    Messages:
    25,211
    Likes Received:
    9
    GPU:
    7950 Vapor-X 1100/1500
    Btw I should add I'm actually playing Skyrim as we speak, everything maxed with all HD mods etc and my fps haven't dropped below 65.

    And I'm capped at 65fps using afterburner.....

    OK, they did drop to 50fps, but my CPU cores are only hitting 60% max on 2 cores; the others are not being used.
     
    Last edited: May 4, 2013
  14. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    He throws out accusations and makes up numbers, I gave him some hard facts. Those games are relatively new, and ones which are regularly played. Personally I never got around to finishing Skyrim, waiting for a new card, my current one fails too hard. But post some more recent benchmarks if you want, I just pulled those ones up since they were already posted.

    Those games are far from the worst case scenario. What will happen when something is demanding on the FPU? In those cases he'd wish the results were equal to those benchmarks posted, the FX chips only have 1 weak FPU per module and that includes the upcoming Steamroller or whatever it's called. That's why they're destroyed in those FPU benchmarks. Best we can hope for is that future PC ports are 8 threaded. We'll see by early next year hopefully since the consoles should be out by the end of the year.
    Are you sure that's actually what's happening and it's not distributed time slices? How many threads does Skyrim use anyway?

    Something seems wrong since even being GPU limited never resulted in dips to 50 for me. I just want it to always be at 60 (or higher) since it's such a long game, if I'm going to spend so much time on one thing I want it to be the best experience possible (on a reasonable budget).
     
    Last edited: May 4, 2013
  15. Kaleid

    Kaleid Ancient Guru

    Messages:
    2,826
    Likes Received:
    353
    GPU:
    7900xt
    Gamecube doesn't even run in HD.
     

  16. k1net1cs

    k1net1cs Ancient Guru

    Messages:
    3,783
    Likes Received:
    0
    GPU:
    Radeon HD 5650m (550/800)
    Hahah, yeah, "100% false".
    I also don't fault you for the inaccurate statement because I'm sure you're just repeating what you've heard a hundred times over on various forums from people who just make sh*t up. :bang:

    It's good you didn't fault me because you'd be wrong.
    That statement, which you regard as copypasted across several beeellion forums, was actually from my experience tinkering with PCs from before 1999.
    So yeah, you could actually blame me for not specifically starting from 1999, but sorry, you missed that opportunity.

    For obvious reasons?
    Is there more than just the one reason, that AMD actually started their own socket design in that year?
    Tell me, tell me; I wanna know!

    Btw, you probably missed the fact that AMD had entered the market before 1980.
    Or the fact that the K6 was released around... 1996, I think? Or was it 1997?
    Of course they didn't use their own socket, but I just found that claim about '1999 when AMD entered the market' to be 100% false.


    Shush, give the guy a break, eh?

    =b

    To be honest, LGA 1156 and 1366 can also be counted as one because they're targeting different performance segments within the same generation.
    But yeah, I concur with your statement that AMD sockets (in the last few years) do have more backward compatibility than Intel's.


    @schmidtbag
    Personally, I'm not going to insist you're better off going Intel as I never intended to.
    I'm just disagreeing with your claim about the 6300's performance compared to the Haswell i5's (and i7's), when you have no hard numbers to compare them with.
    If you're still going to get AMD for your upgrade then by all means; get what's more sensible to your current needs and available funds.
    Just don't carelessly spew out questionable statements and be so defensive about them the next time around.
     
  17. Pill Monster

    Pill Monster Banned

    Messages:
    25,211
    Likes Received:
    9
    GPU:
    7950 Vapor-X 1100/1500
    I don't know, probably. I'd have to open up Process Explorer to see.

    Anyway, Skyrim has fps drops on the best PCs at certain places on the map; it's just the game.

    That's what happens to me, 95% of the time I'm running at my 65fps cap then if I look at a particular building or something it'll drop to 50. Still not really noticeable though...it doesn't lag or stutter.

    I'm over the game anyway, mainly due to the poor voice acting. Not only that but I think Bethesda used about 4 actors to play 100 different characters. lol

    Gimme Mass Effect anyday....
     
    Last edited: May 4, 2013
  18. IPlayNaked

    IPlayNaked Banned

    Messages:
    6,555
    Likes Received:
    0
    GPU:
    XFire 7950 1200/1850
    Lol..what.

    You're oversimplifying it, you know you're oversimplifying it, and you're misrepresenting it in an effort to "be right".

    Counting sockets, yes, AMD has more sockets. That said, AMD's upgrade path for each segment of the market has also been much cleaner. The sockets also tend to be separated by computer types. APUs work in the APU socket, and desktop CPUs work in their own socket. Rarely would you need an upgrade path between them.

    While each Intel iteration requires a socket upgrade, AMD's have not.

    AM2 could socket AM2+ and AM2+ could socket AM2.

    Once you had your AM2+ system, AM3 processors would work on an AM2+ system.

    So now you can upgrade to an AM3 motherboard at your leisure. And finally, there is also some interoperability between AM3 and AM3+. AM3+ is expected to support Bulldozer, Piledriver, and Steamroller by the time all is said and done.

    Flexibility. You knew this. Come on.
     
    Last edited: May 6, 2013
  19. nhlkoho

    nhlkoho Guest

    Messages:
    7,755
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    zing!
     
