Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 3, 2020.
Unless AMD ends up competing with the RTX 3090 out of the box and beats it.
Yup, going from almost 0% to 20%, it's not much of a difference between a 5700 XT and a 2080 Ti anyway. It did bring the 2070 Super up to 5700 XT level in the game, which is of course good.
That's a whole lotta TDP.
Wasn't Nvidia super TDP-friendly the last few gens, always pointing that out over AMD?
The tables have turned a lot, I guess, and now they're trying to brute-force their way out of it, no matter the cost.
Yeah I haven't seen many people take notice or talk about TDP yet. People got used to Nvidia cards running cool and quiet, but they will be in for a rude awakening once their PCs start sounding like a jet after upgrading to Ampere.
1080 Ti: 250W TDP
3080 Ti (3090): 350W TDP
1070: 150W TDP
3070: 220W TDP
All those people who bought the smallest possible PSU for their systems in the last few years, trying to save $10-20, are going to cry.
Tbh, my 1000W PSU will still be more than enough. I share your sentiment @Glottiz that a bigger PSU is generally good for headroom, especially if one plans to overclock. (And when I bought my PSU I was still thinking about, and running, SLI too.) And we're usually talking about low single-digit efficiency differences between 50% and 75% draw with Gold or higher rated PSUs, right?
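To put rough numbers on that "low single digit" point, here's a quick Python back-of-the-envelope using the 80 Plus Gold spec points (87/90/87% at 20/50/100% load at 115V); the linear interpolation between spec points and the 500W example draw are my assumptions, not measurements:

# Efficiency at 50% vs ~75% load on an 80 Plus Gold PSU.
# Spec points (115V): 87% @ 20%, 90% @ 50%, 87% @ 100% load.
# Linear interpolation between points is an assumption, not the spec.
def gold_efficiency(load_fraction: float) -> float:
    points = [(0.20, 0.87), (0.50, 0.90), (1.00, 0.87)]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= load_fraction <= x1:
            t = (load_fraction - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("load outside 20-100% spec range")

draw_w = 500  # hypothetical DC-side system draw
for psu_w in (1000, 667):  # 50% load vs ~75% load for the same draw
    load = draw_w / psu_w
    eff = gold_efficiency(load)
    print(f"{psu_w}W PSU -> {load:.0%} load, ~{eff:.1%} efficient, "
          f"~{draw_w / eff:.0f}W at the wall")

With these assumed numbers it comes out to roughly 556W vs 565W at the wall, i.e. about 1.5 efficiency points between the two load levels, which backs up the "low single digits" claim.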
Lately, overclocking on Nvidia has always been a matter of temperature and TDP... so people flashed higher-TDP (wattage) BIOS files to get more headroom for good cooling to make use of. Now Nvidia is doing this themselves (to reach their own performance targets), so I wonder: will the higher-TDP BIOS files for cards like the Kingpin feature 400W or more for watercooled hardware?
Honestly, TDP is a thing, and yes, higher power usage does heat up the room more; no one stating that is wrong, of course. But... I'd rather have flashed my 1080 Ti and 2080 Ti with a higher-wattage BIOS to get more FPS out of them than worry and wonder about 1°C of room temperature. And a lot of fellow gurus have ACs anyway...
Keep in mind, a rising tide lifts all boats. The 2080 and 2080 Ti both increased performance too. I still think there is no way an 80 CU Navi 2 with increases in IPC, frequency, and memory bandwidth could only muster a 50% performance increase over the 5700 XT (see the naive math sketched below). I'll bet big Navi is 40-45% faster than the 2080 Ti, or more, in ALL DX12 titles (at resolutions above QHD, because FHD is going to be severely CPU-bound). If it can't, then don't release the 80 CU part; release just a 64 CU part, which should be able to get 25-30% more performance than the 2080 Ti while keeping power consumption under control.
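For what it's worth, the naive arithmetic behind that expectation looks something like this in Python; every multiplier here is an assumption for illustration (wide GPUs never scale linearly with CU count), not a leak:

# Naive scaling estimate for an 80 CU part vs the 40 CU 5700 XT.
cu_ratio  = 80 / 40   # double the compute units
freq_gain = 1.10      # assumed ~10% higher clocks
ipc_gain  = 1.05      # assumed ~5% IPC uplift
scaling   = 0.70      # assumed CU-scaling efficiency (never linear)

uplift = (1 + (cu_ratio - 1) * scaling) * freq_gain * ipc_gain
print(f"~{(uplift - 1):.0%} over a 5700 XT")  # ~96% with these numbers

Even with a pessimistic 70% scaling factor you land near double the 5700 XT, well past a mere 50% uplift.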
CRUSH the market!!!! Like they are doing with their processors!!!
1500 watt PSUs or bust! I never buy anything less than that anymore... that is, ever since buying the 1500 watt SilverStone Strider that's in my 3900X setup. And two 1500 watt EVGA SuperNOVAs in my 3960X build.
"Feel the POWAH of the DARK SIDE.!.!.!.!"
If that were to happen then somebody is going to get hurt......
The kind of zen I feel when building a PC is something that can never be replaced. And if these prices keep climbing the way they have been, then I'm going to be priced out of a serious hobby/trade...
I can't stand that Nvidia has been raising prices the way they have, just because there's no true competition. Guess Intel learned from that mistake. Just hope the same will be said for Nvidia as well...
Come on, AMD, slam some seriously performing hardware down at a decent price and make Nvidia eat their pricey hardware themselves...
I hope we see the day when Intel and Nvidia learn from their "mistakes" in the competitive market. Last time I checked, they made way more money with their mistakes than their competition did without them, at least for now.
I still think the 3080 Ti will be cheaper than the 2080 Ti. But if they release a 3080 and a 3090, the 3080 will come in at about the same price as the 2080, +/- a few dollars, while the 3090 will cost more than the 2080 Ti did.
Since we are going into 300W+ territory (which I know mine is in):
1) A PSU is like a car engine: while it can run at the rev limiter, it's not supposed to. Like with a car, unless you want to break the engine, it's best to run at around 50% capacity, not only to spare a few watts but to prevent bad things from happening.
2) And bad things will happen. When Battlefield 4 was released, I had no idea what "200% scaling" was (rendering at 4K and displaying at 1080p). I had SLI at that time; my 850W PSU made the sound of a helicopter taking off and EXPLODED. Loud boom, smoke, the lot. It damaged one of my GPUs and the motherboard, which still worked but glitched and had non-working ports etc... it destroyed $1000 of hardware, plus the price of the PSU.
Was it worth cheaping out and buying "only the watts you need", as websites recommend, for an SLI setup with a ton of fans, HDDs etc.? Obviously not.
3) After switching from a single GPU + Z390 9900K OC to a TRX40 Threadripper and finding that I was drawing up to 850 watts, I again had to upgrade, this time to a 1200 watt PSU.
My OC'd 1080 Ti already uses 300+ watts, and I mentioned the chipset on purpose, because those recent workstation/enthusiast motherboards with a billion VRM phases also need a lot more power than they used to 5-10 years ago.
More power isn't free. Please don't buy "just enough" PSUs (a rough sizing sketch follows below), and FYI, 1000+ watt PSUs are annoyingly long; you'll have to take that into account.
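If anyone wants to sanity-check their own build against that ~50% rule, here's a minimal Python sketch; the component wattages are placeholders I made up, so substitute your own:

# Minimal PSU-sizing sketch following the ~50% load rule of thumb above.
# Component draws are placeholder estimates -- plug in your own numbers.
components_w = {
    "CPU (OC, worst case)": 220,
    "GPU (OC, worst case)": 330,
    "motherboard/VRM/chipset": 60,
    "drives, fans, pumps, USB": 60,
}

peak_draw = sum(components_w.values())
target_load = 0.5                    # run the PSU near half capacity
recommended = peak_draw / target_load
print(f"peak draw ~{peak_draw}W -> look for a ~{recommended:.0f}W PSU")

With those example numbers a ~670W peak draw points at a ~1340W PSU, which is exactly why "just enough" units end up screaming under combined loads.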
All of this is just an indication of a CPU limit (even at 1440p). The 4k results are probably much closer to what the differences between these cards really are.
I agree, you're preaching to the choir here
Still running my 1000W PSU right now for an old 5930K and a 2080 Ti, because headroom is safe room.
But damn, that destruction... sorry to hear it, fellow guru!
Need some of that 1.5k juice flowing through that chassis baby!!
Still have me a SilverStone Strider 1.5kW running smoothly in a 3900X setup today, with an OC'd 5700 XT placing it sixth in the world in Fire Strike Extreme. She's over fifteen years old now, so I'm happy.
And extremely sorry to hear of the POWAH issues in the past. I bought three 280 1GB editions years ago and never thought not to install them, even though a MOSFET or something had come off one of the cards in the box.
I saw it (actually, heard it in the box) only after the fact, once my motherboard and all three cards had gone up in smoke!
CPU and PSU were okay, but the mobo and all three 280s were kaput.
Never did get the chance to outdo Trubritar on his YouTube channel back then!!!
Sure do miss that crazy Brit!!
The difference between the 2080 Ti and the 5700 XT gets a tad smaller at 4K.
I think people stopped caring about TDP and power consumption a few years ago and that's why we're seeing every company raising the bar in those aspects; even the consoles are going the same route! People want powerful stuff and they don't care too much about power consumption because electricity is "cheap" enough...
I think they still care when it's a direct comparison. You see it all the time with Intel vs AMD, where, with similarly priced/performing CPUs, people make the argument that AMD's processors use half the power. I think it's definitely a factor at some point; it's just not everyone's primary factor. I also think people dislike it more when it's converted to temperature. If Nvidia launched a 400W GPU that ran at 65°C, I think most people would be fine with it. If they released that same GPU and it ran at 100°C, I think most people would call it a furnace and hate it. Which is funny, because obviously both GPUs are releasing 400W of heat, but it's been ingrained in most enthusiast consumers' heads that anything above 90°C is probably bad.
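That "same 400W, different temperature" point falls straight out of the steady-state relation T_die ≈ T_ambient + P × R_th: die temperature reflects the cooler's thermal resistance, not how much heat is being dumped into the room. A toy Python illustration, where the thermal resistance values are invented for the example:

# Same 400W of heat, different die temperatures -- only the cooler differs.
# Steady-state approximation: T_die = T_ambient + power * thermal_resistance.
power_w   = 400
ambient_c = 25

coolers = {
    "beefy triple-slot cooler": 0.10,  # degrees C per watt (assumed)
    "undersized blower":        0.19,  # assumed
}
for name, r_th in coolers.items():
    t_die = ambient_c + power_w * r_th
    print(f"{name}: ~{t_die:.0f}C")    # ~65C vs ~101C

Both cards heat the room identically; only the cooler decides whether the forum calls it "fine" or "a furnace".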
I was pretty stoked for the new GPU releases from both nVidia and AMD but very disappointed at the same time.
Of course nothing is certain yet, but nVidia having screwed up with TSMC and not being able to get volume for their 7nm process will be quite the pain in the ass for consumers who were looking to do an upgrade now that would last a while (a 1080 Ti bought just after release, or even a 2080 Ti bought at or after release, was good for 2 years at least).
Now they will release Ampere on 8nm (10nm+++), which obviously isn't what it could have been on 7nm, and they will highly likely refresh the line on TSMC 7nm sometime early next year, naming them Supers... That's probably why the 3080 Ti is named 3090 now... "3080 Ti Super" sounds a bit silly... but "3090 Super" is perfectly viable.
So we've been waiting all this time since Pascal for something decent to upgrade to, and we will still get half-assed products that will be replaced within 6 months by what they should have been in the first place...
I'm on a 2070 Super (3440x1440, just cutting it at 50-60 fps) just because I wanted to wait it out for next gen and not waste my bucks on a 2080 Ti. Now it would seem I'd still be better off holding onto my bucks, at least until the refreshes on TSMC 7nm arrive... Sad stuff if you ask me. But it is what it is. I'm tired of gaming at these framerates, so I'll probably gun for a 3080 now and go all out on the top model for the refresh, unless AMD comes with something really funky... and can compete with DLSS 3.0, raytracing and driver stability, which I'm sceptical about too.
Then there is pricing... nVidia rushes out to go first so they can set pricing (as they always do), then AMD will just follow and undercut slightly. On release these cards will be overpriced for sure, until AMD gets their stuff to market, and probably even then. The hope is for the consoles to put more pressure on pricing.
All in all, this release was probably overhyped as usual, and we just need to wait a bit to see what they come up with.
(Sorry for my slightly pessimistic approach to this launch.)
Another comment on wattage/PSU power:
Just ran Cinebench R20 + Kombustor together (which is way harsher than your usual benchmarks/power tests) and got these max values:
207 watts on the 9900K @1.36v (this is a good/very good bin; I have one that does 220+ watts at 4.8GHz ><)
So everything else (motherboard, PCIe, soundcard, drives, watercooling fans, etc.) comes to around 167 watts.
Got to say I didn't expect that much from Z390. Oh, and 335 watts on the GPU too, yikes.
It only reinforces my habit of building my computers with no cooling above the video card (front/side rad, or front + bottom rad).
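For anyone wanting to reproduce that breakdown, the arithmetic is just the sum of those readings; note the assumption that they're DC-side figures, and the efficiency number here is mine, not measured:

# Adding up the figures from the post above.
cpu_w  = 207   # 9900K under Cinebench R20
gpu_w  = 335   # OC'd 1080 Ti under Kombustor
rest_w = 167   # motherboard, PCIe, soundcard, drives, watercooling, fans

dc_total = cpu_w + gpu_w + rest_w         # ~709W at the components
efficiency = 0.90                         # assumed; PSU- and load-dependent
print(f"~{dc_total}W DC, ~{dc_total / efficiency:.0f}W at the wall")

That's roughly 709W of component draw, call it high-700s at the outlet, on a mainstream Z390 box; the 50% rule from earlier in the thread would put you in four-digit PSU territory.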