I would rather buy an Xbox and go all-in on console gaming at this point than move to 11 just for DirectStorage. I have no numbers, but I didn't notice anything obviously different between 10 LTSC and 11 with quick glances at Task Manager. With 11 I wipe out all the UWP apps, use a local account, and disable Defender, so I'm not sure whether any of that changes the memory requirements compared to 10.
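For anyone wondering how that UWP wipe is usually done, here's a minimal sketch, assuming a Windows 11 box with Python and PowerShell available. The "xbox" pattern is only a placeholder example, and the actual work is done by the standard Get-AppxPackage / Remove-AppxPackage cmdlets.

import subprocess

def remove_uwp_apps(pattern: str) -> None:
    # Pipe the matching Store/UWP packages into Remove-AppxPackage.
    # Removal here only affects the current user; some packages may
    # need an elevated prompt.
    ps_command = f"Get-AppxPackage *{pattern}* | Remove-AppxPackage"
    subprocess.run(
        ["powershell", "-NoProfile", "-Command", ps_command],
        check=True,
    )

if __name__ == "__main__":
    remove_uwp_apps("xbox")  # placeholder pattern, not a recommendation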
When Vista came out, everybody said it was poor and XP was so much better. When W7 came out, everybody said it was poor and Vista was so much better. When W8 came out... OK, let's skip that one. When Windows 10 came out, everybody said it was poor and W7 was better. Now W11 is out and, guess what, people are still hating on something new, until something newer comes out. Imagine if people refused to upgrade their CPU/GPU/mobo; how would the drive for greater innovation work? Seriously, some people just want to dislike something for no reason other than it's a new thing.
I noticed today on my W10 64 Pro that when I download a game at, say, 50/60 Mbps, my RAM usage jumps to 9/10 GB. Maybe that's why W11 comes with a 16 GB recommendation: simply because of fast internet speeds and big file downloads? (See the sketch after this post for a quick way to check what that memory actually is.)
Also, if I have 16 GB, 32 GB, or even 64 GB of RAM, I'm more than happy for the OS to use it. It's not the days of Windows XP and dreadful RAM utilisation.
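A minimal sketch for checking that, assuming the third-party psutil package is installed (pip install psutil): it samples roughly the same totals Task Manager shows once per second, so you can watch whether a big download actually inflates "used" memory or just eats into "available" (i.e. the file cache).

import time
import psutil

def sample_memory(seconds: int = 30) -> None:
    # Print used vs available memory once per second while a download runs.
    gib = 1024 ** 3
    for _ in range(seconds):
        vm = psutil.virtual_memory()
        print(
            f"used: {vm.used / gib:5.1f} GiB | "
            f"available: {vm.available / gib:5.1f} GiB | "
            f"{vm.percent:4.1f}% in use"
        )
        time.sleep(1)

if __name__ == "__main__":
    sample_memory()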
Well, for now there's no pressing reason to install W11, but there's also no reason to stay stuck on W10 for 125 years. If you have any modern CPU/PC from 2018 or later, there's no point sticking with W10. Honestly, games are better on W11 (fps, stability, etc.), which is surprising, because benchmark scores are better on W10; not by much, but noticeable to benchmarkers. Even a 12-year-old laptop with 4 GB+ of RAM managed W11 very well. I fully agree with mr. pegasus1: new OS, new hate for no reason. All OSes have bugs, more or less... XP was full of crap from the start, and the x64 version was simply crap. BTW, I've been using W11 since the early builds; this is just my own opinion and experience, that's all.
It only solves compatibility issues: https://www-purepc-pl.translate.goo..._sl=pl&_x_tr_tl=en&_x_tr_hl=pl&_x_tr_pto=wapp
I honestly pity you, ol' man. Repeating the same thing over and over again in hopes that someone listens must be tough.
Go watch the videos and you'll see CPU usage go down even if performance is the same. Now picture 6/12 vs 6/12 + 4/8 E-cores in something like Spider-Man, where even an 8/16 is getting very high usage. i5s are going to benefit most from it further down the road, and even now in some cases. I wouldn't buy a plain 6/12 now, but a 6/12 with 8 E-cores like the 13500 will be fine for heavily multithreaded games.
Oh wow, how amazing... lower CPU usage... at exactly the same fps, but more hardware complexity, power usage and cost... wait, were you arguing for or against e-waste cores? And no, you do under no circumstances want the game threads to hit the e-waste cores. You will never benefit from e-waste cores in games, because as soon as the game starts to use them, you are going to see slowdowns (aka frametime spikes) and, generally speaking, lower performance... so no.
This chart is unrelated to what you're claiming; I've never seen a 12900K test that points out inconsistent frametimes with E-cores enabled and improved frametimes with them disabled. Do you have to resort to making stuff up now? BTW, you're still not getting what I'm writing about; are you sure you're up to it intellectually? I'm talking about a 6/12 CPU vs 6/12 + E-cores in an MT-heavy game like Spider-Man, not a 13900. You're like a broken record.
The chart I linked was simply pointing out that enabling e-waste cores lowers performance. As for the rest, have you actually looked at the video you linked yourself? As virtually all the comments on it also point out, the frametime spikes are gone with e-waste cores disabled and present with them enabled... Perhaps you should watch the videos before posting them instead of pushing them for an agenda when they show the opposite result...
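Side note for anyone who would rather just keep a game off the E-cores than argue about it: here's a minimal sketch, assuming psutil is installed and assuming the P-cores' logical CPUs are enumerated first (worth verifying on your own system with Task Manager or CPU-Z). "game.exe" and the core count are placeholders, not real values.

import psutil

# e.g. 8 P-cores with Hyper-Threading -> logical CPUs 0..15 (assumption)
P_CORE_LOGICAL_CPUS = list(range(16))

def pin_to_p_cores(process_name: str) -> None:
    # Restrict matching processes to the P-core logical CPUs so the
    # scheduler cannot migrate their threads onto E-cores.
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == process_name.lower():
            proc.cpu_affinity(P_CORE_LOGICAL_CPUS)
            print(f"pinned PID {proc.pid} to {P_CORE_LOGICAL_CPUS}")

if __name__ == "__main__":
    pin_to_p_cores("game.exe")  # placeholder process name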
Careful, he might bring up the grammar guns and then you're out. // Old man, you still didn't answer my question. I know you had time to read it.