Discussion in 'Operating Systems' started by Danny_G13, Aug 5, 2022.
ok then show me where, cause it doesn't. and no review has ever found that it does.
Wanna play that game? There are way more examples of e-waste cores lowering game performance, even on your favourite Polish site...
margin-of-error stuff, while cpu load is consistently down even on a 13900k that doesn't need e-cores
and where in that video do e-cores cause frametime spikes? you didn't answer
pointless conversation as you don't even understand what it's about.
121 fps with e-waste cores disabled and 111 fps with them enabled is margin-of-error stuff? I think we'll just leave the conversation at that.
where are the spikes in the video I apparently "didn't watch"?
you're so one-dimensional and obtuse it's just hilarious. I too think e-cores do little for the 12900k in gaming, but
1. not every cpu is a 12900k running on top-of-the-line ddr5
2. you're limiting application performance at the same time while games lose almost nothing, while cpu usage will be way down for core/thread-limited processors, like 6/12 parts in new games.
3. rpl-s will have improved e-cores too, with double the cache size, and even the 13500 will have 8, same as the 12900k. that was my point: further down the road, a 6/12 will choke while e-cores will help the 13400/13500 in heavily mt games. that scenario just flew right over your head, cause the 12900k is the only cpu that exists. what a waste of time you are in these cpu-related threads. you have one catchphrase for earning likes from other Debbie Downers who like everything that's unfavourable for intel. it's almost like you're counting them.
This is true, but we have to remember that each P-core will have 2MB of L2 to itself, while each cluster of four e-cores shares a single L2, so even with the doubled 4MB cluster that's just 1MB per e-core.
I'm not going to go through the vid to give you all the time stamps... if you want the time stamps, watch the vid you linked. But the fact that nearly all comments on the vid are about it is evidence enough. And I like that you posted a vid that shows the opposite result of what you thought xD
Ram has no impact on e-waste core performance contribution, or rather the lack thereof... so rather a moot point. Aside from that, only a fool would buy the top-of-the-line cpu without getting equivalent ram.
Here you go again with the cpu usage... yes, there are a ton of e-waste cores barely being used, cause you don't under any circumstance want the game to use them... they give lower overall cpu load, but it means absolutely NOTHING - zip, zero, nada. They do not contribute to gaming performance in any possible way for the vast majority of games - on the contrary, they lower performance, increase power consumption, increase chip cost, and lower how far you can OC the chip.
As for e-waste cores increasing productivity performance... yes, but I'm talking gaming chips here, and they are absolutely, unconditionally a waste for gaming.
The simple solution is for intel to make separate gaming chips, like amd is doing with the 5800x3d... no e-waste cores, just as many P-cores as can be crammed onto it (be it 10 or 12).
I did and found nothing; you didn't, yet you advise me to watch the videos I post. how pathetic is that.
pointless to continue, I'm going to move on since you don't know what you're talking about.
I don't care about the rest of your s***post, since lower cpu usage with e-cores enabled is the whole reason I wrote here.
Dunno what you are trying to show with this? Ddr5 being faster than ddr4? Intel 12th gen arch being faster than ryzen 3? Neither has anything to do with the fact that e-cores do NOT improve gaming performance, and that intel cpus would be faster in games as dedicated gaming cpus with more P-cores and/or more cache.
Totally agree, but I'm still on 7 now. I plan on switching eventually, but I just like 7; I never have any problems with it, and my current install is 12 years old now.
so much crying about the fastest cpu in both gaming and applications
the 12900k would be better off with more cache
now think about an i5 that's hitting near full usage on six cores, would you rather have more cache and more fps, with stutter and hitching too?
Here we go again with the cpu usage... having high cpu usage is not a bad thing per se, as long as the cpu isn't the bottleneck. And having e-cores does not solve it, cause all it does is show you lower overall cpu usage... however, the P-cores are still getting hammered, and as soon as any load hits the e-waste cores, you will see a dip in performance.
And with dx12 there isn't a fixed amount of cores games use / need... what matters is the total amount of cpu power, assuming the cpu isn't being limited by anything else (like cache, interconnect, etc). So theoretically, a 4-core cpu with twice as much performance per core should see the same performance as an 8-core - everything else being equal. And this goes for frametimes as well.
Best example of this is the 12100, which is a 4-core with vast improvements in single-thread performance vs previous gens.
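The "lower overall cpu usage" point in the post above is just averaging, and a minimal sketch makes it obvious. The 90%/10% load figures below are hypothetical numbers for illustration, not measurements from any review:

```python
# Why "lower overall CPU usage" with e-cores enabled can be misleading:
# Task Manager-style overall usage is just the average across all logical
# cores, so idle e-cores drag the number down even while the P-cores stay
# saturated. All load values here are made up for illustration.

def overall_usage(per_core_loads):
    """Average utilisation across all cores, as an overall-usage readout would show it."""
    return sum(per_core_loads) / len(per_core_loads)

# 8 P-cores near saturation: the game is effectively P-core bound.
p_cores = [90.0] * 8
print(overall_usage(p_cores))            # 90.0 - reflects the real bottleneck

# Add 8 mostly idle e-cores: same hammered P-cores, but the headline
# "CPU usage" figure drops to 50% without anything actually improving.
e_cores = [10.0] * 8
print(overall_usage(p_cores + e_cores))  # 50.0
```

So a lower overall percentage on its own doesn't tell you whether the cores doing the work have any headroom left; you'd have to look at per-core usage for that.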
wrong as always, e-cores do a lot of work too and there are no frametime inconsistencies
What's this e-core crap again here? Intel had to go this route to be competitive again, and it's only worthwhile for specific tasks like cinebench and the like, yes great
anyway, win11. I like it, but the taskbar is funky... sometimes it works, sometimes I have to click twice on stuff...
The only reason I can think of is if you were a die-hard super fan of ME, Vista, and 8, and want to continue that trend.
anyone know when the HDR calibration will be out?
So far, the new features didn't convince me to upgrade. Maybe if I build a new pc from scratch or am forced to replace my storage drives or something. Then I might get win11 pro from a cheap-ass key store.
Other than that, I doubt my pc needs the "upgrade", especially with the ugly phone-OS looks that remind me of win 8, which killed the whole OS for me.
Ok, so I'm yet to switch from W10 to W11, but I used my brother's PC with W11 installed back in August. Hated it. Maybe it's just because I found it convoluted and different.
I'm going to stick with W10 for now. W11 does HDR much better than W10, apparently, but I don't have a decent HDR panel hooked up to my PC anyway (Odyssey G9, absolute shocker).
I switched to W11 due to the improved look (I personally like it), and the more up-to-date code base, better multicore support, better core scheduling, better memory management, improved fullscreen optimisations (especially when alt-tabbing), and of course more frequent updates.
Windows 10 was and is still a great OS, just use what works for you. But don't knock it until you've tried it.