Yeah, B-die. I'm around that VCCSA, OK a little lower, 1.20 V for both, and RAM at 1.45 V, and yeah, max I saw was 43°C. OK, I tested tRFC 450 and it passed; also got better AIDA64 memory and Cinebench single-core scores as well, best so far.
Honestly I don't understand how this waste of sand could get a recommended rating. It makes absolutely no sense as a product, is terrible value and consumes a ton of energy.
Does it consume a ton of energy in games? No. If you don't run Cinebench 24/7 it's totally fine. There is no problem at all cooling an 11700K/11900K; even an 11900K at 5300 all-core is easy to cool with a 360 AIO. People who don't own the products tend to have the most problems with them. Power draw in games with a 3090 is pretty much the same for my 5900X vs. my 11900K, just saying. For playing Cinebench 24/7, the 5900X/5950X is the way better choice.
The 5800x is $100 cheaper, performs basically identically and uses less power when idle, less power in gaming and way less power when doing any kind of rendering. There is no argument to be made for this CPU. You sitting here defending it is eroding any credibility you have.
Is the 5800X $200 cheaper than the 11700K? I'm defending the fact that it is better than people say. Not my fault you noobs don't know how to test. The 11700K/11900K is not the best product in the world, but the 5800X isn't either. The power draw is not an argument for most people; look at the power draw of 2020/2021 GPUs. When you can easily cool the hardware, a 20 W difference in power usage is nothing in gaming.
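To put a rough number on that "20 W is nothing" claim, here is a minimal back-of-envelope sketch. The 20 W figure is from the post above; the 4 hours of gaming per day and $0.13/kWh electricity price are my own assumptions, so scale to taste:

```python
# Rough annual cost of a 20 W gaming power-draw difference.
# Assumed inputs (not from the thread): 4 h/day of gaming, $0.13/kWh.
watts_diff = 20
hours_per_day = 4
price_per_kwh = 0.13

kwh_per_year = watts_diff * hours_per_day * 365 / 1000  # Wh -> kWh
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.1f} kWh/year, ${cost_per_year:.2f}/year")
# -> 29.2 kWh/year, $3.80/year
```

Even doubling the hours or the rate keeps it in pocket-change territory per year, which is the point being argued either way.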
Why would you think I'm comparing it to the 11700K? I found a Best Buy price of $550, so I edited my post from the initial $200 to $100, but the same argument applies. It's 18% cheaper for basically the same processor and a lower TCO. TechPowerUp has it as only 2% faster than the 5800X at any real resolution, and Guru3D is barely higher at 4%.
I think we are in a bit better situation today. Back then, Prescott's heat and power issues were made worse by the lack of available cooling solutions and of motherboards with VRMs that could withstand its power draw. I don't remember exactly, but in 2004 AIOs that could cool such a thing were not sold; at best some dudes made their own waterblocks at home. Even high tower air coolers were not mainstream until about 2008 or so, and AIOs only became common around 2012/13. You can't have high hopes of cooling 130 W and higher TDPs with just a low-profile downward-draft heatsink like the ones used back then, in smaller cases with at best 2 case fans. Sure, the performance was bad, but if it didn't fail due to overheating or power problems, someone might have had a use for it, just like they do now with the 11900K.
You're running Gear 2 mode? It depends on your SPD and VCCSA/VCCIO. I got 4133 16-16-16-31 with tRFC 400 stable at 1.45 V.
It's the limit on my 11700K. Haven't tested max 1:1 on the 11900K yet, because I'm tweaking high-speed 1:2 now. I've seen 2933 MHz 1:1 from Cstkl1 at OCN, but high-speed memory is better for performance. Bandwidth is king for Rocket Lake.
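Side note for anyone comparing tRFC values across kits: tRFC in clock ticks only means something relative to the data rate, since the DDR4 memory clock is half the transfer rate. A quick conversion sketch, using the 4133 MT/s / tRFC 400 numbers from this thread as the example (the function name is just mine):

```python
# Convert a tRFC value in clock ticks to nanoseconds for DDR4.
# Memory clock (MHz) = transfer rate (MT/s) / 2, so:
#   tRFC_ns = ticks / (MT/s / 2) * 1000 = ticks * 2000 / MT/s
def trfc_ns(ticks: int, mts: int) -> float:
    return ticks * 2000 / mts

# Example from the thread: tRFC 400 at DDR4-4133
print(round(trfc_ns(400, 4133), 1))  # -> 193.6 (ns)
```

Same idea in reverse lets you work out what tick count hits a target nanosecond value at a different speed, which is handy when moving a profile between 1:1 and 1:2 frequencies.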
If only I knew how good those DFI P3 440BX boards were back in the day. We used them for OFFICE computers :x To their credit, they were nice computers, and those were good boards - they survived for a few years until we replaced them with Willamette 1.7GHz Pentium 4s (should have been AMD, but this was before AMD was *CLEARLY* on top). I was too busy working 60~70 hours a week and saving up for my 1st house to really look into it. I had an Asus P3BF, then a P4 board (whatever it was for Willamette w/ RD-RAM) after it.

Secret of Mana had this saying in the intro in the early 90's on SNES/SuperFami: time flows like a river and history repeats. It couldn't be more true now. If people don't learn from the past, they are doomed to repeat history's mistakes. That's why we're taught this in school. Evidently Intel wasn't doing any homework on the subject, regardless of having a pile of gold-plated pens and fancy paper. Lucky for them they are 1.) still riding on the coat-tails of their reputation, and 2.) there's just no way AMD could have ever planned to sell this many processors, even if they had PLANNED to overshoot their previously-expected sales by double or triple.

Everything online is flying off virtual shelves as fast as it's replenished, and everything in town flies off shelves every payday. The only stores that had PC stock for a while were places like Staples, where absolutely no one except a few office workers go (because they don't carry anything anymore for the hardware folks; 10 years ago they used to have sound cards, power supplies, RAM, and other various goodies - not anymore). To think, I bought a 3.5GB MAXTOR HDD back in 1996 from them IIRC; never could keep that drive alive though.
Eventually, as the pandemic dies off and the demand is satiated, maybe this will die down and I can replace my 2070 Super with something with MOAR video memory (I need* it; designing Los Injurus for BeamNG Drive = 8GB is not enough for max detail). That said, it's a bit of a first-world issue here and I'd have to be 'spoiled' to truly complain, as at this rate it's not killing my work-flow yet - just more of a minor inconvenience.

I was worried when I got my R9 3950X last summer that maybe I should have waited 6 months for a 5950X. Not feeling a bit of buyer's remorse though, and that's before anything pandemic-supply-issue related is put into play. Worth every penny. I can't brag as I keep it working daily, but if you're looking at a 3900X/3950X/5900X/5950X for doing said work, they're the Bee's Knees as they say, & you really can't go wrong with these. You can get all threads going, then tab over to YouTube and watch videos and read news, and you'd never be the wiser (aside from any increased fan noise or heat) that something else is using the CPU. Will be nice when this is over in a year (I HOPE!) or so.

*For those who want to parrot 'oh, you don't NEED 8GB, it's just allocating it all': save it, sit on it, and spin a bit for good measure. Just because that's what you hear in a YouTube video doesn't make you a tech expert on someone else's work-flow. Just relish the fact that I have yet to figure out how to punch parrots through the internet. *sigh* When things go from 60~100fps where they should be (40~50fps would be fine honestly), and suddenly my 500MB tunnel model comes into view and it drops to 10~20fps or something equally jarring, it NEEDS it. Currently I have texture detail set to medium until I can lasso my current development version back below 8GB. On max texture detail it needs around 9~9.5GB, plus whatever the OS needs for overhead.
Textures + models = lots of VRAM use, plus more if your game uses DLSS or ray-tracing goodness (mine does not). That said, I knew 8GB wouldn't be enough for long when I got it in Dec 2019; I was having some AMD driver issues at that moment which necessitated the upgrade, and didn't feel like spending on a Radeon VII 16GB if it risked driver issues, though I would have entirely preferred that card otherwise. To that end, the Nvidia 'studio ready' drivers still have issues - not always as drastic, but they're still there, just in different places. Playing Quake II RTX on my 'work' card was fun though.
I think the biggest problem for this CPU is not just AMD's chips; it's the fact that Intel is trying to promote it as an amazing gaming chip when it's hardly faster than a 10900K, with fewer cores. And I presume the RTX 3090, CPU, and NVMe were running at PCIe 4.0, while the 10900K runs PCIe 3.0. I recently upgraded to a 10900K at a good price with an Asus Maximus XIII Hero, hoping this new CPU would be great for games and planning to upgrade when the price went down later. Will be sticking with the 10900K for a couple of years now.