Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 13, 2015.
Not everyone lives in the US, and some of us have to pay over £0.14 per kWh
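To put that rate in perspective, here's a rough sketch of what a 300W card costs to run at £0.14/kWh. The hours-per-day and full-load figures are assumptions for illustration, not from the thread:

```python
# Rough monthly running cost of a 300 W GPU at £0.14/kWh.
# 4 h/day at full load is an assumed gaming pattern, not a measurement.
gpu_watts = 300
price_per_kwh = 0.14          # GBP
hours_per_day = 4
days = 30

kwh = gpu_watts / 1000 * hours_per_day * days   # 36 kWh over the month
cost = kwh * price_per_kwh
print(f"{kwh:.0f} kWh -> about £{cost:.2f} per month")
```

So even at UK rates, the GPU alone is only a few pounds a month under this usage pattern.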
I also don't care about power consumption from a financial point of view, and I agree it does get thrown around as bragging rights by Nvidia users, the same way AMD fans did back in the Fermi days.
It's heat and noise that concern me.
They could put a really good cooler on it, but that would bump up the price of a card that might already be quite expensive.
Hoping it's something great though; tech-wise, things have been a touch boring over the last few years.
I'm concerned about heat too. I don't think about a single cent on my electricity bill, but I don't want to heat up my living room in summer as if I had the stove on all day...
Same bullsh#$ with heat. Like video cards haven't been putting out 60-80 C, or even more, FOREVER.
I game with headphones on, so noise isn't an issue.
Edit: Some heat numbers for a GTX 580.
This card ran 42 degrees C in IDLE which is very normal. When the GPU is stressed out 100% for several minutes the card reaches roughly 85 to 87 degrees C. For a GeForce GTX 580 these are rather normal numbers. Also, we measure at a room temperature of 21 degrees Celsius.
Noise and heat are an issue, unless you live in a cave and a single 300W GPU will keep you warm.
Uhm well, just to tell you a secret, not everybody plays with headphones.
Lucky for me, I have air conditioning. So still not an issue.
I realize not everyone uses headphones, but if the noise bothers you that much, it's an option to think about.
Yeah, I hate headphone audio so only use them if I'm not at home.
If you do like headphones then it's a non-issue, but a noisy card would have to be surprisingly cheap or fast for me to consider it.
Well, HBM should reduce power consumption somewhat, but not massively. What it does mean is a 512-bit bus with fewer memory chips, which = a less complex PCB = lower overall costs, making it cheaper to produce.
300W is fine by me..... people who have already got their hands on the samples swear things like even a possible 980 Ti will lag far behind the reference 380X.....
If it's faster than 2x 980s though
Won't be like
Here u go buddy, u will need one for all those warm gaming sessions.
Ok, let me put it this way: 290X vs 780 Ti vs GTX 980 at maximum peak
290X (512-bit) can use up to 324W
780 Ti (384-bit) can use up to 266W
GTX 980 (256-bit) now uses up to 190W (stock), and can still reach 80°C with the stock cooler..
This "300W is 300W" argument is kinda far-fetched then. I mean, the 780 Ti heats up about the same with the stock cooler, ok 82°C, so if 266W were simply 266W of heat, it should run at 85-90°C, no? It's an extra +80W compared to the GTX 980 after all, and both use the same cooler with a similar noise ratio (~4dB louder on the 780 Ti).
They didn't save a total of ~80W just by improving the core design; they obviously had to sacrifice bus width and come up with a new compression algorithm to compensate for the lower bandwidth, and also to lower power output and heat.
Anyway, I still believe this new 380X will run cooler even if it has a 300W TDP spec; at least 10°C lower with the same stock cooler as on the 290X (94°C)..
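The peak draws quoted in that post line up like this; the figures are the poster's, not verified measurements, and the GTX 980 is used as the baseline just for comparison:

```python
# Peak power draws as quoted in the post above (watts, unverified)
peak_draw = {"R9 290X": 324, "GTX 780 Ti": 266, "GTX 980": 190}

baseline = peak_draw["GTX 980"]
for card, watts in sorted(peak_draw.items(), key=lambda kv: kv[1], reverse=True):
    delta = watts - baseline
    print(f"{card}: {watts} W (+{delta} W vs GTX 980)")
```

That +76W gap between the 780 Ti and the 980 is the extra heat the shared stock cooler has to shed, which is the crux of the disagreement above.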
Yes, and it's really very weird when you consider we're only talking the power consumption of a few light bulbs and we're *not* talking about a dire need to conserve power to get the most out of our batteries... (Thank goodness! I get really tired of battery-powered, have-to-recharge devices which seem more trouble than they're worth, lately. Ugh.)
Of course, no real info yet...but it's certainly interesting. ~4k monitor prices are preparing to drop through the floor and the real GPU performance race will be to see who can drive them best, first... There's a real market there (~4k monitors), no mistake about it, and it's going to mature pretty quickly because initially there will be price wars of the kind we all like to see....
I have an FX-9590 and two 7970s, well over 300W, yet I never feel any heat coming from my computer, or notice the few hours of gaming where the PC works the hardest as a big strain on the electricity bill..
I mean, if you want a less-than-200W card, go buy a 980 or 970 then; they are great cards. Some of us, though, just like going with maximum performance. In my case my watercooling will handle several 300W cards while staying quiet, and as long as the idle draw is good too, it likely won't show on your electricity bill unless you're some economical green serial PC gamer who can't function with less than 8 hours of gaming every day.
I did once experience my PC heating up my apartment, but it was an insanely insulated shoebox of an apartment, and the PC at the time consumed maybe the same as an entire 980.
I'm sure there will be some other 300-series card that won't consume 300W though.
Note, the numbers you are shown use Furmark at 1280x1024... I don't even know how they avoid triggering the driver profile that detects Furmark and clamps down the power limit.
Well, the 300W might be the total power target or something, but my guess is that there won't be much missing, especially if you're going to OC that GPU. The cooler, a hybrid one as far as I know, will be able to handle more than 300W, so I don't expect that 300 number is tied to its cooler.
And well, I'm living in a rather nice apartment, with a 30m² living room, and there are two factors actually heating it up in the summer: the TV, and my gaming rig. With both on, I see a rise of 2 to 4°C after a few hours in summer, when the temperature is already high (above 28°C). With the TV off it's only a 1 to 2°C rise, but I still notice it. Adding to that, my rig sits in a fairly cornered spot (which I will change), so maybe that makes me overly sensitive to it. But the difference between 28 and 32°C can be felt, so I won't go for the heat-production factories.
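A back-of-the-envelope check shows that rise is plausible. This is a no-loss upper bound only; the ~400W combined draw, the 2.5m ceiling height, and the standard air properties are all assumptions, and real walls and ventilation shed most of the heat:

```python
# Upper bound on room temperature rise if NO heat escaped the room.
# 400 W (rig + TV) and 2.5 m ceiling are assumptions; 30 m^2 is from the post.
power_w = 400
hours = 2
room_m3 = 30 * 2.5                  # room air volume, m^3
air_kg = room_m3 * 1.2              # air density ~1.2 kg/m^3
c_air = 1005                        # specific heat of air, J/(kg*K)

energy_j = power_w * hours * 3600   # 2.88 MJ dumped into the room
dT = energy_j / (air_kg * c_air)    # ~32 K with zero losses
print(f"No-loss upper bound: {dT:.0f} °C rise over {hours} h")
```

Since the lossless bound is ~32°C, an observed 2-4°C rise just means the room leaks most, but not all, of that heat.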
I think it'll be faster than a 980, I have little doubt about it, but more like 1.5 times as fast, not twice, at least not at 1440p and below. This might behave differently at 4K with all the eye candy though.
I'll clarify my earlier post.
Doesn't this seem a bit suspicious to anyone else?
Identical news item from 2012, identical LinkedIn profile with a different first name and gender.
Maybe transgender surgery took place somewhere along the line? Make of it what you will.
The 290X had 6.2B transistors, the 780 Ti 7.1B transistors; that's approximately the difference in performance between them.
Yes, Maxwell is better. It's back down to 5.2B transistors with the 980, and that's the sole reason for the lower power consumption. Still a good feat on nV's side; what's sad is the price they set compared to the 780 Ti.
Huang marketed it as twice the performance per transistor and per watt. While that was a big exaggeration, it showed that nV saves a lot per GPU in manufacturing, and I consider the 980 a slap in the face of customers. And they gladly took that slap and encouraged nV to deliver even bigger slaps in the future.
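The transistor-count argument above can be checked directly. The counts are the ones quoted in the post; whether performance really scales with them is the poster's claim, not a given:

```python
# Transistor counts quoted above, in billions (poster's figures)
transistors = {"R9 290X": 6.2, "GTX 780 Ti": 7.1, "GTX 980": 5.2}

ratio = transistors["GTX 780 Ti"] / transistors["R9 290X"]
print(f"780 Ti has {ratio:.2f}x the transistors of the 290X "
      f"(~{(ratio - 1) * 100:.0f}% more)")
```

So the 780 Ti's ~15% transistor advantage is roughly the performance gap the poster is pointing at, while the 980 does more with fewer transistors.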
I for one am glad for a full 300W PCIe graphics card, as that's within the standard.
HBM saves a few more watts, which means they get eaten by the GPU = more power for the core.
I do not think AMD has the same performance-per-transistor boost in store as Maxwell, but let's see real-world benchmarks first.