What reason to go to Windows 11?

Discussion in 'Operating Systems' started by Danny_G13, Aug 5, 2022.

  1. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    7,188
    Likes Received:
    4,210
    GPU:
    RTX 3060 Ti
    ok then show me where, cause it doesn't. and no review has ever found that it does.
     
  2. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,326
    Likes Received:
    3,399
    GPU:
    RTX 4090 Gaming OC
  3. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    7,188
    Likes Received:
    4,210
    GPU:
    RTX 3060 Ti
    margin of error stuff, while cpu load is consistently down even on a 13900k that doesn't need e-cores.
    and where in that video do e-cores cause frametime spikes? you didn't answer.
    pointless conversation, as you don't even understand what it's about.
     
  4. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,326
    Likes Received:
    3,399
    GPU:
    RTX 4090 Gaming OC
    121 fps with e-waste cores disabled and 111 fps with them enabled is margin of error stuff? I think we'll just leave the conversation at that.
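    For scale, that gap works out to (121 - 111) / 111 ≈ 9%, which is well beyond the percent or two of run-to-run variance benchmarks typically show.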
     

  5. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    7,188
    Likes Received:
    4,210
    GPU:
    RTX 3060 Ti
    where are the spikes in the video I apparently "didn't watch"?
    you're so one dimensional and obtuse it's just hilarious. I too think e-cores do little for a 12900k in gaming, but
    1. not every cpu is a 12900k running on top of the line ddr5
    2. you're limiting application performance at the same time, while games lose almost nothing, and cpu usage will be way down for core/thread limited processors, like a 6/12, in new games.
    3. rpl-s will have improved e-cores too, with double the cache size, and even the 13500 will have 8, same as the 12900k. that was my point: further down the road, a 6/12 will choke while e-cores help the 13400/13500 in heavily mt games. that scenario just flew right over your head, cause to you the 12900k is the only cpu that exists. what a waste of time you are in these cpu related threads. you have one catch phrase for earning likes from other Debbie Downers who like everything that's unfavorable for intel. it's almost like you're keeping count.
     
    Last edited: Aug 28, 2022
  6. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    5,850
    Likes Received:
    7,104
    GPU:
    RX 6800 XT
    This is true, but we have to remember that each P-core will have 2MB of L2, while each cluster of 4 E-cores will share 2MB of L2, meaning just 512KB per E-core.
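    Spelled out on those figures: 2MB / 1 = 2MB of L2 per P-core, versus 2MB / 4 = 512KB of effective L2 per E-core, i.e. a quarter of the per-core cache.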
     
  7. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,326
    Likes Received:
    3,399
    GPU:
    RTX 4090 Gaming OC
    I'm not going to go through the vid to give you all the time stamps... if you want the time stamps, watch the vid you linked. But the fact that nearly all the comments on the vid are about it is evidence enough. And I like that you posted a vid that shows the opposite result of what you thought xD

    Ram has no impact on e-waste core performance contribution, or rather the lack thereof... so that's a rather moot point. Aside from that, only a fool would buy the top of the line cpu without getting equivalent ram.

    Here you go again with the cpu usage... yes, there are a ton of e-waste cores barely being used, cause you don't under any circumstance want the game to use them... they give a lower overall cpu load, but it means absolutely NOTHING - zip, zero, nada. They do not contribute to gaming performance in any possible way for the vast majority of games - on the contrary, they lower performance, increase power consumption, increase chip cost, and lower the amount you can OC the chip.

    As for e-waste cores increasing productivity performance... yes, but I'm talking gaming chips here, and they are absolutely, unconditionally a waste for gaming.

    The simple solution is that intel ought to make separate gaming chips, like amd is doing with the 5800x3d... no e-waste cores, just as many P-cores as can be crammed onto it (be it 10 or 12).
     
    Last edited: Aug 28, 2022
    Wolverine2349 likes this.
  8. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    7,188
    Likes Received:
    4,210
    GPU:
    RTX 3060 Ti
    I did and found nothing; you didn't, but you advise me to watch the videos I post - how pathetic is that.
    pointless to continue, i'm going to move on since you don't know what you're talking about.
    I don't care about the rest of your s***post, since lower cpu usage with e-cores enabled is the whole reason I wrote here.
     
  9. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    7,188
    Likes Received:
    4,210
    GPU:
    RTX 3060 Ti
    [Image: 5800X3D vs. Core i7-12700K & i9-12900K comparison]
     
  10. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,326
    Likes Received:
    3,399
    GPU:
    RTX 4090 Gaming OC
    Dunno what you are trying to show with this? Ddr5 being faster than ddr4? Intel 12th gen arch being faster than zen 3? Neither has anything to do with the fact that e-cores do NOT improve gaming performance, and that intel cpus would be faster in games if they were dedicated gaming cpus with more P-cores and/or more cache.
     
    Wolverine2349 likes this.

  11. SplashDown

    SplashDown Master Guru

    Messages:
    937
    Likes Received:
    265
    GPU:
    EVGA 980ti Classy
    Totally agree, but I'm still on 7 now. I plan on switching eventually, but I just like 7; I never have any problems with it, and my current install is 12 years old now.
     
    Espionage724 likes this.
  12. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    7,188
    Likes Received:
    4,210
    GPU:
    RTX 3060 Ti
    so much crying about the fastest cpu in both gaming and applications.
    a 12900 would be better off with more cache.
    now think about an i5 that's hitting near full usage on six cores - would you rather have more cache and fps, but with stutter and hitching too?
     
    Last edited: Aug 29, 2022
  13. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,326
    Likes Received:
    3,399
    GPU:
    RTX 4090 Gaming OC
    Here we go again with the cpu usage... having high cpu usage is not a bad thing per se, as long as the cpu isn't the bottleneck. And having e-cores does not solve it, cause all it does is show you a lower overall cpu usage... however, the P-cores are still getting hammered, and as soon as any load hits the e-waste cores, you will see a dip in performance.

    And with dx12 there isn't a fixed amount of cores games use / need... what matters is the total amount of cpu power, assuming the cpu isn't being limited by anything else (like cache, interconnect, etc). So theoretically, a 4-core cpu with twice as much performance per core should see the same performance as an 8-core - everything else being equal. And this goes for frametimes as well.

    Best example of this is the 12100, which is a 4-core with vast improvements in single-thread performance vs previous gens.


    https://www.techpowerup.com/review/intel-core-i3-12100f/19.html
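
    As a toy sketch of that "total cpu power" point (an illustration only, assuming per-frame work splits perfectly across cores, which real games only approximate):

        # toy model: fps limited only by total cpu throughput
        def fps(cores, perf_per_core, work_per_frame=1.0):
            return cores * perf_per_core / work_per_frame

        print(fps(4, 2.0))  # 4 cores at 2x per-core speed -> 8.0
        print(fps(8, 1.0))  # 8 cores at 1x per-core speed -> 8.0, same fps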
     
    Last edited: Aug 29, 2022
  14. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    7,188
    Likes Received:
    4,210
    GPU:
    RTX 3060 Ti
    wrong as always, e-cores do a lot of work too and there are no frametime inconsistencies
     
  15. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,801
    Likes Received:
    2,362
    GPU:
    3080TI iChill Black
    What's this e-core crap again here? Intel had to go this route to be competitive again, and it's only worthwhile in specific tasks like cinebench and the like, yes great :D



    anyway, win11. I like it, but the taskbar is funky.. sometimes it works, sometimes I have to click twice on stuff..
     
    Last edited: Sep 14, 2022
    Wolverine2349 and Dragam1337 like this.

  16. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    8,479
    Likes Received:
    2,481
    GPU:
    PNY 4070 Ti XLR8
    The only reason I can think of is if you were a die-hard super fan of ME, Vista and 8, and want to continue that trend.
     
    pegasus1, Sylencer and 386SX like this.
  17. machete

    machete Member Guru

    Messages:
    105
    Likes Received:
    11
    GPU:
    ASUS GTX 1080/AiO
    Anyone know when the HDR calibration tool will be out?
     
  18. Sylencer

    Sylencer Master Guru

    Messages:
    231
    Likes Received:
    85
    GPU:
    ASUS 3090 Strix OC
    So far, the new features haven't convinced me to upgrade. Maybe if I build a new pc from scratch or am forced to replace my storage drives or something. Then I might get win11 pro from a cheap-ass key store.
    Other than that, I doubt my pc needs the "upgrade", especially with the ugly phone OS looks that remind me of win 8, which killed the whole OS for me.
     
  19. metagamer

    metagamer Ancient Guru

    Messages:
    2,388
    Likes Received:
    1,019
    GPU:
    Palit GameRock 2080
    Ok, so I'm yet to switch from W10 to W11, but I used my brother's PC with W11 installed back in August. Hated it. Maybe it's just because I found it convoluted and different.

    I'm going to stick with W10 for now. W11 apparently does HDR much better than W10, but I don't have a decent HDR panel hooked up to my PC anyway (Odyssey G9, absolute shocker).
     
  20. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    9,716
    Likes Received:
    2,196
    GPU:
    PNY RTX4090
    I switched to W11 due to the improved look (I personally like it), the more up to date code base, better multicore support, better core scheduling, better memory management, improved fullscreen optimisations (especially when alt-tabbing), and of course more frequent updates.

    Windows 10 was and still is a great OS; just use what works for you. But don't knock 11 until you've tried it.
     
