Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 4, 2019.
Holy crap is what the pope does
@Hilbert Hagedoorn Asus motherboard bios update link?
127W at idle, right? Because the Core architecture is getting old.
The problem is that older CPUs will stutter, new ones won't. Experience ruined.
Plus you can't have anything running in the background, like streaming.
Probably 127 at base clock...
I upgraded from a 2600K OC'd at 4.4 GHz to a 9900K at 5.1 GHz all-core. I have a GTX 980 and had trouble always staying over 60 fps with all settings maxed out in most games. When I upgraded to my 9900K I also bought a new 1080p 144 Hz monitor (I don't see a need for 1440p because 1080p looks good enough for me), and now I am always over 60 fps and most of the time locked at 144 fps or close to it. I am very happy with my upgrade to a 9900K. If I was asked whether I would get it again over, say, a Ryzen 3000, I would go 9900K again.
My $420 9900KF arrives today. Hoping to get 5.1-5.2 GHz.
you know my next question buddy!
What you gonna do with the 8700k? .....you still got my address?.....
You got a z370 motherboard and $200?
Please put it on a shelf, brother! ....that would be a really good deal for me!....I gotta get some DDR4 too ....give me a week please!
I’m in no rush. It’s here when you want it. It’s delidded too. Great temps, and it runs 5.0 at 65°C under an H100i.
I just ran in the bedroom and kissed my wife after reading your reply.....
She said, why you so happy today!.....
Funny how the different brands are having similar performance problems, stemming from different causes.
Intel ships highly binned CPUs (like this one), taking away any potential added user OC on lower models, while AMD seems to have a bit lower max clocks than they anticipated reaching, limiting OC as well.
I can do 140ish fps at 1440p running a GTX 1080 with a stock 3600 (no PBO) on the balanced profile, boosting to 4.1 GHz, while CPU/MB/RAM cost me $420, and even if I add the $50 I saved on the combo (MC sale), it's still less than what Intel wants for just the CPU.
I'm looking to get the most for my money, and you spend significantly more to run at a higher clock with higher ST performance just to see zero gains over a cheaper AMD-based system.
And a good friend is running 1080p at 120/144 Hz (1440p cost a lot more when he got his monitor a while ago) without ever dropping below 120.
You would have been better off spending the added cost on a gpu upgrade.
But the fact that you stated 1440p won't look any better already told me a lot.
Put a 26" (or bigger) screen on your desk running 1080p vs the same model at 1440p/2160p, and tell me there is no difference...
Aahh, now I have to downclock to 5.0 just to see how many watts my "old" 9900K draws on all cores.
Edit: 150 W @ 1.264 V according to AIDA64.
Ryzen is a fine choice for high refresh rate gaming, but there's no doubt that Intel is the better choice here. I wouldn't blame anybody for going with a 9900K/S for their 1080p/144hz+ monitor.
1440p certainly looks a lot better than 1080p, but I noticed even more of a difference going from 1440p to 2160p - which makes sense considering that the pixel ratio is a lot higher (2.25x, compared to 1.78x going from 1080p to 1440p). Although I like high refresh rates, the kind of detail you can see in 4K is often breathtaking - I can't wait for affordable 4K/144 Hz monitors to arrive (not to mention a GPU that can power them). You can't really appreciate IQ mods (like 8K CBBE skin textures in Fallout 4) on a 1080p or 1440p monitor.
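The pixel-ratio figures in that post check out; here's a quick sketch verifying them (resolution names assumed to mean the standard 16:9 modes 1920x1080, 2560x1440, and 3840x2160):

```python
# Total pixel counts for the common 16:9 resolutions discussed above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Ratio of total pixels between each step up.
ratio_1080_to_1440 = pixels["1440p"] / pixels["1080p"]  # ~1.78x
ratio_1440_to_2160 = pixels["2160p"] / pixels["1440p"]  # exactly 2.25x

print(f"1080p -> 1440p: {ratio_1080_to_1440:.2f}x")
print(f"1440p -> 2160p: {ratio_1440_to_2160:.2f}x")
```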
I was just going with my friend's info, as he's a competitive PvP player (his console clan was top 10 in Germany / top 50 in the world), and he switched from an OC'd 7700K to an 8C Ryzen (1st gen); his priority has always been refresh rate (vs size and/or eye candy like me).
And given that he didn't see any improvement running a 2080 Ti, I know his rig isn't GPU-limited..
I stopped looking at 4K monitors, as they just get too expensive (past 27"), so I'll most likely get a 43/49" FALD 120 Hz TV.
Especially since all the games I play run fine with V-Sync at 60/75, even at 4K; Siege and the others I'll just run at a lower res and have the TV upscale..
I wish you people understood how human vision works and how nonsensical 4K is on anything smaller than 50" with 20/20 vision, but it's your wallet...
You are talking nonsense and have no proof of how human vision works, sorry. If you translate 20/20 = 1 and multiply it by 100, you get the sweet-spot PPI. An acceptable margin either way is around ±10, so ~90-110 PPI are the acceptable limits. In other words, a 4K TV is in the sweet spot from 40" to 48.5".
Smaller would be overkill with no difference to human eyes; bigger would lead to pixelation. That's all, really.
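For what it's worth, that poster's "100 PPI sweet spot" formula can be turned into a quick calculation (a sketch of their claimed rule, not an endorsement of it; the 90 PPI end actually lands nearer 49" than the quoted 48.5"):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a display given its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def diagonal_for_ppi(width_px: int, height_px: int, target_ppi: float) -> float:
    """Diagonal (inches) at which a display hits a target pixel density."""
    return math.hypot(width_px, height_px) / target_ppi

# 4K UHD (3840x2160) mapped onto the claimed 90-110 PPI "acceptable" band:
dense_end = diagonal_for_ppi(3840, 2160, 110)  # ~40" (densest acceptable)
coarse_end = diagonal_for_ppi(3840, 2160, 90)  # ~49" (coarsest acceptable)
print(f'4K stays in the 90-110 PPI band between {dense_end:.1f}" and {coarse_end:.1f}"')
```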
Actually, I do. I did my research a long time ago, regardless of 4K. Your eyes have central vision and peripheral vision, which differ dramatically in the detail perceived. I'm not going to give a lecture here since you can find all this yourself in medical papers, science papers, and even videos YouTubers made on this topic. Try Vsauce, for example. 20/20 vision refers to eyesight, not PPI.
But sure, feel free to impose your imaginary greatness by doing some calculations you made up.
I can't even begin on how an 80" 4K screen looks different, and much worse to our eyes, than a 20" Full HD one. That's OK. End of off-topic. Those medical papers, science papers and YouTubers apparently don't know anything about how screens work at all. To each his own.