Discussion in 'Videocards - NVIDIA GeForce' started by Rodman, Sep 13, 2006.
Interesting read here
The Inquirer lol, I'll believe it when I see it
lolz, I'm with the above poster.
Neither the G80 nor the R600 is going to be on a smaller process (65 nm) when first introduced, so they are going to run hot and need a ton of power. It's not just the INQ; every tech rumor monger online has been saying the same thing.
Where did you hear this? I heard all DX10 cards were going to be smaller dies except NV was going to use a slightly bigger one than Ati.
How would smaller dies mean bigger power consumption? That's counterintuitive unless I'm completely crazy. They will have bigger power consumption because they'll have more transistors. Don't take my word for it, though; I might be a lunatic.
I actually believe it..... and I've been a second-hand witness to somewhat of the truth of it. I think there were pictures from the aforementioned Computex, where OCZ showed an auxiliary power supply that would fit in an optical drive bay to power the video cards, since the earlier DX10 cards to be released got a tad out of hand and may require 100-300W per card (depending on the card). That could have changed in the past few months, but you never know. Look at the one Asus card (I forget.... was it the 7800GT Dual?).... it had an external power jack. I don't know if it was REQUIRED, but there was a wire running from the 6-pin PCI Express power socket on the card to the bracket that holds the card in the case.
I think it would be great to have a separate power supply for the GPU. That should ensure greater system stability and would lessen problems of power peaks overburdening one PSU. A 200 watt unit in a drive bay should be enough for the GPU and would probably not be costly at all.
Unless of course you read my previous comment..... it could require between 100W and 300W PER CARD. And besides.... the extreme power requirements were supposedly for the first sets of DX10 cards from both ATI and NVIDIA...... for all we know that might be fixed now..... or will be fixed after the first set. I think I personally am going to hold off on buying into DX10 hardware..... DX9 didn't have that magnificent of an entrance to the hardware market when it came out...... I don't want to be the victim of a repeat performance.
Edit: also..... a power supply in a drive bay probably WOULDN'T be that good of an idea. It would require either 2-3 very small, high-RPM fans on the front blowing air through it, causing lots of noise, or basically a standard PSU setup, with a 120mm fan on the top/bottom and vents to let the air go out of the case, or into the case and exhausted out the back. The reason the latter of the two ideas would be bad.... is because that'd require at LEAST a case with 5 drive bays if using two optical drives, or an optical drive and a fan controller, or an optical drive and a front-mounted audio bay (like the high-end Audigys come with) so the supply would have enough space to pull some air from.... even though that wouldn't be great either, since it'd have limited space to pull air from, and it'd probably be warm air.
I seriously doubt they will have their own power supply. It would be smart, however, to buy one of those 1-kilowatt power supplies.
Interesting thing to note:
I was at a major LAN recently and they were raffling off these:
They said they haven't been released yet. Seems like this will be the future if power requirements do go up.
...... how would that be smart exactly? if you ask me..... that's a worse idea than an auxiliary PSU/external power.
You'd even be better off buying a case that has space for two power supplies and using one 500W unit to power your video cards alone. I'm sure they'd be quieter than any bay-mounted PSU..... and probably cost a bit less than the $300-500 for a 1KW PSU. You could probably even just buy two power supplies for a larger-than-average case and rig them both up inside your case somehow.
Owlsphone: I actually just saw that now.... I was cruising Thermaltake's site because I could've sworn I saw a dual-power-supply Thermaltake case, and that aux. PSU is shown on their front page.
I also wanted people to notice what was said about the die and card design. They mentioned it was so 'odd' that they couldn't even explain it yet. Mmm...
Smaller dies, eventually. But the first releases of the G80 and R600 have been rumored to be at 90 nm or at most 80 nm. Maybe, with all the delays, including those associated with Vista, both ATI (or is it AMD these days?) and nVidia will just scrap the plan for the 90 nm, high-power-consumption versions of the new GPUs.
Yeah, apparently... but it is the INQ, and there's been a recent trend of bad news.
Anyways, yeah, next-gen GPUs are going to be power suckers, but it will at least take a large amount of load off my crapping-out PSU.
On the other hand... how much could this really increase the electricity bill? And shouldn't people be worrying about global warming?!?! We're looking at a 1-6 degree centigrade increase by 2100... (if I remember right)
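For what it's worth, the bill impact is easy to ballpark. Here's a minimal sketch; the extra wattage, hours per day, and electricity rate are my own assumed example numbers, not figures from the thread:

```python
# Rough estimate of the extra electricity cost of a power-hungry GPU.
# Assumed inputs (hypothetical): the card draws an extra 150 W over a
# current-gen card, runs 4 hours/day, and power costs $0.10 per kWh.

def yearly_cost(extra_watts: float, hours_per_day: float, dollars_per_kwh: float) -> float:
    """Return the yearly electricity cost in dollars for the extra draw."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * dollars_per_kwh

if __name__ == "__main__":
    cost = yearly_cost(extra_watts=150, hours_per_day=4, dollars_per_kwh=0.10)
    print(f"Extra cost per year: ${cost:.2f}")  # prints "Extra cost per year: $21.90"
```

So even a big jump in card wattage adds tens of dollars a year, not hundreds, under these assumptions; heavy 24/7 use or higher local rates would scale the number up proportionally.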
So conclusion: Don't buy first gen DX10 cards! Wait for the second or third gen cards
I wonder how much of this will be true, more interesting stuff..
I take it that Nvidia wants to dominate the DX9 area, as I think they are betting on DX10 not being that popular, since Vista really won't be mainstream till '08. I mean, Crysis will be optimized for DX10, but if you have a DX9 monster like the G80, it will power through not-so-well-optimized DX9 stuff. In other words, Crysis will probably play just as smoothly in DX9 on a G80 as it will in DX10 on the R600.
Most of us will continue to use XP and DX9, so the G80 is looking to be a killer card for a while, even if it is only a DX9 card for the most part. What does everyone else think?
i think ATI are going to kick arse on the DX10 cards.
It's gonna be a while before DX10 cards mature enough for me to buy one, I think.
INQ is at it again today with R600 rumors. They are now saying that it will only be an 80 nm card (not 65 nm) and it will draw 250W. Take it for what it's worth. The good news is that INQ believes the R600 could be twice as fast as today's fastest cards. The article has a funny typo in it, but I'm never sure whether the comedians at INQ do this on purpose or not. They said that the R600 will run hot and need a "super doper" cooler. Ummm, where can I get one of those things?:smoke:
I agree. I remember the nVidia 5 series were not so good in DX9. When they came out I bought a 5600, but it was rubbish in DX9, so I went for the 5900, which was pretty good. I think I'll buy the second-gen DX10 cards.
I'm gonna focus on processors at the minute; I'm thinking of the Core Duo, but I'm not sure if it will perform much better in games than my Athlon 64. Does anyone know?