Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 15, 2019.
I'm so happy with my Athlon 200GE right now, soon to be as fast as the 9900k
You should re-read, then.
Here are a few choice ones from people.
BTW, we all seem to be forgetting that the day this was announced, Intel had patches available. But nah, they're just an evil business who only cares about money. Cause ya know, AMD doesn't care about money, or, you know, AMD didn't have its own share of security issues. This whole press piece by AMD is fanboy-trash PR talk.
But let's all just focus on how much we hate evil Intel.
None of that is hate. Sorry you feel personally attacked because a company that isn't you (or is it?) has people who are, rightly so, upset about its practices, security flaws, and prices.
Now when you start throwing around factless, blatantly wrong information, that is what is wrong.
Yes, you blatantly and factually did this.
Learn what facts are before you post.
The topic is not for discussion, if I saw something of the sort happening on both sides I would take care of it. Easy as that.
But don't listen to warnings, don't follow rules. That will help you out.
Very well. I'm done with this thread after this. I just find it really rich for people who attack Intel to defend AMD when AMD has had its own share of security flaws. And the defense simply seems to be: well, Intel has more. And Intel charges more. Whatever works for you, I guess, to legitimize your purchase. But it's plain hypocrisy, IMHO. If security is of utmost importance to you, then you shouldn't have a computer at all.
BTW thanks for being patient with people/topics in your moderation, goes a long way towards making this a great place to hang out
I ended up leaving another forum I used to like a lot because of a regularly, overly authoritarian mod there who really upset me with their treatment of people (though they never had to moderate me).
Wish I could say that too, but I got my comment deleted a while back for something like that. Anywho, WELL SAID!
I'm starting to think Intel knew about all these issues/flaws but kept silent about them just for the extra performance benefits.
Kool64 brought up an interesting point: "Seems to me like Intel may have been "ignoring" security in the name of speed."
Why is it that AMD always seems to be ahead in this aspect, even if there's a performance cost?
AMD have demonstrated real innovation...
...something Intel have not done for way too many years; instead they have incremented as little as they can possibly get away with, knowing full well the competition at the time was lacking. From a profit sense, I fully understand why Intel did what they did for the last number of years; however, it's a pretty pathetic way to behave. Deliberate under-innovation carries absolutely no justification when advancement is factored in, and taking the obvious importance of 'advancement' into account, there are a hell of a lot more adjectives that could justifiably be applied.
While I'm/we're a wee bit ahead of ourselves at this time, logic dictates that AMD fully deserve everything they are due.
Nvidia, after recently and unjustifiably raising the price bar to eye-watering levels, is another company that has very rapidly fallen out of public favour. I'd imagine AMD will have something to say here, on cost initially and then next gen.
Intel's stock is higher today than it was prior to the announcement. I'm sure that has a lot to do with overall market conditions and the health of the company, but it does make you wonder what non-technical people (most people and, by extension, most investors) think when they see something like this.
Personal observations from working in IT - people care FAR too little about security of their devices until something terrible happens. Most users don't even grasp the concept that their devices could be compromised on a hardware level or through no fault of their own.
While I really dislike nVidia, I'll have to correct you a bit. Turing delivers more transistors per $ than older generations.
The fact that those cards are not much faster (gaming-wise) per $ than the older generation, despite having many more transistors, is not pleasant.
But that's because nVidia added proper FP16 performance, which AMD has had for quite some time. And they added special new functions which were not cheap in terms of transistor count either.
They added so many transistors that Turing's gaming performance per transistor at the same clock as Vega is practically the same. Except that AMD still has ~30% higher compute performance and nVidia still holds ~30% lower power consumption.
(I personally waited a very long time for nVidia to deliver decent compute, because without that, game studios would not utilize compute. Now the doors are open to a whole new world of magic.)
Well, I can say from personal experience that the guys around me (server business, with thousands of systems running middleware and databases) always start their reply to my announcement of a vulnerability of this kind with "F*!".
You can imagine the client's happiness when an architect designs a system for a certain use and then the application server becomes 30% slower on storage access, or the database server loses 15% performance. (Cases where a query which took 20 ms now takes 1.5 seconds and has to be rewritten into something that doesn't hit the issue on Intel.)
This is becoming a problem. And if it ever gets to the point where we start disabling HT...
Can you imagine having 4 application systems with 8C/16T doing the same type of operations in a cluster (for redundancy in case one goes down) now needing 6 systems after all the previous patches?
If HT goes away, 6 systems will not be enough. It will likely be 8 systems with 8C/8T then.
Not funny at all, because the client has to pay for it. So he pays more just to get his performance back.
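The cluster sizing above can be sketched as back-of-the-envelope arithmetic. Everything below is illustrative: the 30-40% throughput-loss figures and the single redundancy spare are my assumptions, not measured numbers.

```python
import math

def nodes_needed(base_nodes, throughput_loss, spares=1):
    """Nodes required to keep the original capacity after a mitigation.

    base_nodes: nodes that carried the load before patching
    throughput_loss: fraction of per-node throughput lost (e.g. 0.30)
    spares: extra nodes kept for redundancy (so one can go down)
    """
    per_node = 1.0 - throughput_loss  # relative capacity per patched node
    return math.ceil(base_nodes / per_node) + spares

# 4 x 8C/16T nodes, assumed ~30% loss from the prior mitigations:
print(nodes_needed(4, 0.30))  # -> 7
# Assumed ~40% cumulative loss once HT/SMT is disabled as well:
print(nodes_needed(4, 0.40))  # -> 8
```

The point stands regardless of the exact percentages: every extra point of mitigation overhead eventually rounds up to a whole extra machine the client has to buy.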
Working in IT and having been on both sides: most security guys are out of touch with reality, asking you to roll out everything to every system in a matter of days.
I've had too many bloody fights with them. It's important to keep security up, but at the same time you need to keep the business running, so a balance is always needed between security and business continuity.
More transistors, sure, but:
- it's not something the customer really cares about
- the price increase is not proportional to the chip size increase; it's actually far, far off. The difference in cost to make the bigger chip would be below $50, but Nvidia decided to hike the price by $700. There is no sympathy at all from me for that.
My comment is directed to a company that is known for price gouging and suppressing competition. This is fact.
Karma is a bitch... Also fact?
This is not directed to anyone here.
I actually have 3 systems, 2 of which are Intel, and my main one is AMD.
So no fanboy stuff here, just an observational comment.
You realize that's not how that works, right?
Just take the GTX 1080 Ti vs the RTX 2080 Ti: a $300 price difference at MSRP (not $700 like you imply; it'd be $400-500 if you use non-MSRP prices, which have nothing to do with Nvidia, as that hike goes to the third-party manufacturer such as MSI, or the retailer such as Newegg, or both, so I'm going to stick with the $300 difference).
GTX 1080 Ti: 471 mm²
RTX 2080 Ti: 754 mm²
That's a 60% increase in die area.
But that doesn't mean it costs 60% more to make, or that you get 60% fewer dies per wafer. No, not even remotely.
Being 60% bigger means the likelihood of defects rises fairly dramatically, and this affects costs. Between the lower yield and how many fewer dies they get per wafer, the die itself likely costs 2-3 times more, possibly even more than that. That being said, we can't know the exact number, since it isn't even publicly known.
Then you have to factor in the fact that GDDR6 is more expensive than GDDR5X.
But I want to make an extra point on die cost. One of the biggest reasons AMD has been so competitive with its high-core-count processors is that they don't have a monolithic die like Intel does. This helps TREMENDOUSLY with AMD's costs and their ability to undercut Intel yet still make good money. Their dies are so small that the rate of defective ones isn't a cost problem, whereas Intel's much larger dies carry a much larger cost due to defects.
Yes, I understand that's AMD and Intel, not GPUs, but the principle is the same: large dies have bad yields = higher costs, beyond just the increase in physical die material.
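The yield argument above can be sketched with a standard dies-per-wafer approximation and a simple Poisson defect model. The defect density D0 below is an assumed, purely illustrative value; real fab numbers are not public.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    # Common approximation: gross dies on a round wafer minus edge loss.
    r = wafer_diameter_mm / 2.0
    return math.floor(math.pi * r ** 2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    # Fraction of dies with zero defects under a Poisson defect model.
    return math.exp(-die_area_mm2 * defects_per_mm2)

D0 = 0.001  # assumed defect density (defects per mm^2) -- illustrative only
for name, area in [("GTX 1080 Ti", 471.0), ("RTX 2080 Ti", 754.0)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area, D0)
    print(f"{name}: {gross} gross dies/wafer, ~{good:.0f} good dies")
```

With these assumed numbers, the 471 mm² die gives roughly 74 good dies per wafer versus roughly 32 for the 754 mm² die, so a good large die costs a bit over 2x as much, in line with the 2-3x estimate above, and the gap widens as the assumed defect density rises.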
And lastly, none of this takes into consideration R&D for the features in the RTX lineup that are not in the GTX lineup. You can say "oh, that doesn't matter" all you want, but it does. A company doesn't put millions, possibly billions, into R&D and not factor that into the cost of a product. A CUSTOMER may not care about this cost, but if the company wants to survive, it HAS to worry about it even if the customer refuses to. It's not JUST the physical material costs that need to be taken into consideration, but also the expected number of units sold, at which margins, to pay for the R&D. As consumers, we will never fully know either of these numbers.
That we know of, and that's the beauty of these types of exploits: they leave no evidence.
Fanboy to the max. Intel deserves any bashing they get from this.
Time to ditch my intel server. Threadripper here I come.
What hate towards Intel? This forum is so heavily Intel/NVidia biased it's embarrassing. AMD could release products that have twice the performance of anything Intel or NVidia have in their respective markets, at half the power consumption of the competing products, and people on this forum would still bash AMD.....
I have an i3 7130U based laptop and an i3 380m based laptop. The 7130U is "patched" and noticeably slower than the i3 380m..... In fact, the i3 7130U is comparable in (perceived) performance to the Celeron N4100 based laptop it was bought to replace.... I noticed the same performance loss on my i3 7100U based laptop a few months ago. I'm scared to see what the new patches will do to the performance of the 7130U....
My last couple Intel cpu's (including my current 4770K) have served me well but I am pretty certain my next PC will be back to AMD when new Ryzen comes out.
If you were a hacker and wanted to hack as many computers as you could, would you first find a flaw in AMD-based computers or Intel-based ones?
intel = 80%
amd = 20%