Discussion in 'Frontpage news' started by anticupidon, Nov 12, 2019.
That's where I stopped reading.
Every time I read news like this I have to chuckle, despite owning a Swiss Cheese myself.
Oy, another one. Intel probably needs to completely redesign their CPUs to avoid these issues. Makes me glad I switched to the red team.
You'd be surprised how easily people can gain access to a restricted workplace. The office where I work deals with sensitive client data and needs a keypass to enter but people regularly allow tailgating - that is, employees hold the door open for someone else. I also see a lot of people leave their computer unlocked when they walk away, allowing anyone in the building access to the system (both of these are against the company's security policy, which we take annual courses for). People do them because a) they want to be nice and b) they're lazy.
I guess it's a good thing most people are willing to ignore security.
Like I said before, Intel gained a huge performance boost by not adding proper security measures in the first place.
Honestly, it's looking more & more likely that I'm right and Intel took shortcuts to gain performance advantages over AMD.
I was looking forward to upgrading my i9-7900X to an i9-10920X or so. I'm not sure it's worth ripping out my X299 MB and putting in AMD right now when I can just do a CPU socket swap. Honestly though, my 7900X isn't even working hard for most stuff right now, along with my 2080 Ti, since I run 2560x1440@144Hz VRR. So maybe I will just wait to upgrade when PCIe 4.0 becomes mainstream and I have a graphics card that actually will use it, then make the jump to a 4K VRR display like that new Predator 43in 144Hz or whatever else is good at the time.
So these CPUs were fast because they used a lot of shortcuts and cheaty code?
Hope better times come for AMD. Intel needs to learn the hard way.
Not really, more like what happens when you keep building on top of an existing design for 20+ years without really checking whether things need to be changed.
Phoronix benchmarked FF and Chrome
Taking the geometric mean of all these web browser benchmarks from Firefox and Chrome, yesterday's MCU led to an average of 4.2% lower performance. That's just looking at the microcode update pertaining to the JCC Erratum, while -- separately -- Zombieload TAA tests are also being worked on for publishing soon.
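For anyone wondering what "geometric mean" means here: it's the n-th root of the product of the per-benchmark ratios, which is the standard way to average slowdowns across unrelated benchmarks. A minimal sketch (the ratios below are made up for illustration, not Phoronix's actual numbers):

```python
import math

# Hypothetical per-benchmark performance ratios (patched / unpatched).
# These values are invented for illustration only.
ratios = [0.97, 0.95, 0.96, 0.94, 0.97]

# Geometric mean: n-th root of the product of the ratios.
geo_mean = math.prod(ratios) ** (1 / len(ratios))

print(f"average slowdown: {(1 - geo_mean) * 100:.1f}%")
```

The geometric mean is used instead of the arithmetic mean because ratios multiply rather than add, so one outlier benchmark can't dominate the average the same way.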
Looks like older chips that have already been patched for MDS (like 8-series Coffee Lakes) are also due for this new crap.
Kinda stopped caring long ago. In games there's basically no impact on my fully patched 8700K. Benchmarks like Cinebench, CPU-Z's, or AIDA64 are also within statistical error margins. What took a pretty sizable hit is NVMe speed in some specific testing, like 4KQ32T1 in Crystal Disk Mark. Almost cut in half if I remember correctly.
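"Within statistical error margins" just means the before/after difference is smaller than the run-to-run noise of the benchmark itself. A rough sketch of that check, using made-up Cinebench-style scores (not my actual results):

```python
import statistics

# Hypothetical repeated benchmark scores before/after the microcode
# update; the numbers are invented for illustration.
before = [3050, 3062, 3041, 3058, 3049]
after = [3044, 3055, 3039, 3051, 3047]

diff = statistics.mean(before) - statistics.mean(after)
noise = statistics.stdev(before) + statistics.stdev(after)

# If the mean difference is smaller than the combined run-to-run
# spread, the "slowdown" can't be distinguished from noise.
print(diff <= noise)
```

With numbers like these the few-point drop disappears into the spread between runs, which is why single-run benchmark comparisons after a patch don't prove much either way.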
I expect these bugs to keep coming out for many years to come.
You're wrong. For 10 years Intel has had a considerable lead over AMD; they did not "cut corners" to pound them into the sand even deeper. They got lazy. They knew the flaws were there, but it wasn't until the last few years that people learned to exploit them and kept searching for new exploits. If AMD had been competitive 5 years ago, Intel would have been forced to bring a newly designed CPU to market instead of building on the same flawed design generation after generation.
Say what you will... Nvidia has continued to compete, albeit with itself, and released much more powerful hardware over the years. Intel...not so much. If it was still up to Intel...anything over 4 cores would be over a thousand dollars. Intel doesn't care about consumers or gamers. That's just marketing. They make all their money from OEMs and servers. Their former CEO didn't know what he was doing.
You really have to respect Jensen's passion for gaming. The GeForce line has always been his gift to gamers and they've always put out great products that changed the market. I don't think anyone at Intel plays PC games or cares about gaming. They're all old businessmen who just care about one thing. Not that thing...I mean money.
Until Intel actually hires some young talent who are passionate about computers and gaming, they'll never bring a product to market that will truly change the game. I have zero hope for Intel in the next few years...
AMD's got passionate people working on products and it shows. They may be going a little Apple now with developing a culture around their tech, but I'll take that over "dead" inside Intel.
End of rant.
No one here cares about physical exploits unless you have strange people using your PC, but to say that some of these exploits were never out in the wild until recently is 100% conjecture -- "an opinion or conclusion formed on the basis of incomplete information".
I mean, how on earth can you say for sure the exploits were not exploited? I'm sure we would never know, but I'm also sure China/Russia/North Korea and maybe Iran will be investing in supercomputers to help them crack computers full stop, not to steal people's info but to find ways to kill hardware or take it over.
I know it's crazy that we supply supercomputers to anyone with cash, even if we know there's a possibility they'll use them against us while we use ours to solve "medical issues". ^^
Making the case the real security threat isn't the hardware, but the user.
I bet so did Intel.
Well, if physical access doesn't mean actual contact with one's PC, then the whole terminology used by Intel is wrong and shady too, and that's worrying tbh, even though it doesn't affect me as a Ryzen owner. I'm not an Intel hater at all, but it seems the last 10 years of Intel CPUs are like Swiss cheese, with what's now an ongoing issue for them.
AMD CPUs were already clawing back some of the gap in desktop sales, and this stuff only benefits AMD. I've lost count of the patches/fixes they've had to push in the last year alone.
Here we go again. Next time I upgrade it will probably be AMD.
Why mess with supercomputers when it is much easier and faster to threaten people, blackmail people, or brainwash people so that they will tell you everything? Or plant a mole. You watch too many movies like "Hackers", I guess.
PS: Also, supercomputers are needed only to break cryptography, not to use exploits.