G80 to have its own PSU?

Discussion in 'Videocards - NVIDIA GeForce' started by Rodman, Sep 13, 2006.

  1. Rodman

    Rodman Ancient Guru

    Messages:
    1,721
    Likes Received:
    0
    GPU:
    GTX285 SLI@756-1580-2750
  2. morbias

    morbias Don TazeMeBro Staff Member

    Messages:
    13,445
    Likes Received:
    37
    GPU:
    -
    The Inquirer lol, I'll believe it when I see it
     
  3. senior98

    senior98 Ancient Guru

    Messages:
    2,095
    Likes Received:
    0
    GPU:
    xfx 8800GT 512mb
lolz I'm with the above poster.
     
  4. jabloomf

    jabloomf Master Guru

    Messages:
    310
    Likes Received:
    0
    GPU:
    EVGA GTX 580
Neither the G80 nor the R600 is going to be on a smaller process (65 nm) when first introduced, so they are going to run hot and need a ton of power. It's not just the INQ. Every tech rumor monger online has been saying the same thing.
     

  5. InfDaMarvel

    InfDaMarvel Ancient Guru

    Messages:
    3,218
    Likes Received:
    0
    GPU:
    HD3850 256mb
    Where did you hear this? I heard all DX10 cards were going to be smaller dies except NV was going to use a slightly bigger one than Ati.
     
  6. Satan

    Satan Ancient Guru

    Messages:
    6,684
    Likes Received:
    0
    GPU:
    some
How would smaller dies mean bigger power consumption? That's counterintuitive, unless I'm completely crazy. They will have bigger power consumption because they'll have more transistors. Don't take my word for it though, I might be a lunatic.
     
  7. ElementalDragon

    ElementalDragon Ancient Guru

    Messages:
    9,318
    Likes Received:
    10
    GPU:
    eVGA RTX 2070
I actually believe it... and I've been a second-hand witness to some of the truth of it. I think they had pictures from the aforementioned Computex, where OCZ showed an auxiliary power supply that would fit in an optical drive bay to power video cards, since the early DX10 cards to be released got a tad out of hand and may require 100-300W per card (depending on the card). That could have changed in the past few months, but you never know. Look at the one Asus card (forget... was it the 7800GT Dual?)... it had an external power jack. Don't know if it was REQUIRED, but there was a wire running from the 6-pin PCI Express power socket on the card to the bracket that holds the card in the case.
     
  8. alanm

    alanm Ancient Guru

    Messages:
    11,253
    Likes Received:
    3,350
    GPU:
    Asus 2080 Dual OC
    I think it would be great to have a separate power supply for the GPU. That should ensure greater system stability and would lessen problems of power peaks overburdening one PSU. A 200 watt unit in a drive bay should be enough for the GPU and would probably not be costly at all.
     
  9. ElementalDragon

    ElementalDragon Ancient Guru

    Messages:
    9,318
    Likes Received:
    10
    GPU:
    eVGA RTX 2070
Unless of course you read my previous comment... it could require between 100W and 300W PER CARD. And besides... the extreme power requirements were supposedly for the first sets of DX10 cards from both ATi and NVidia... for all we know that might be fixed now... or will be fixed after the first set. I think I'm personally going to hold off on buying into DX10 hardware... DX9 didn't make that magnificent an entrance into the hardware market when it came out... don't want to be the victim of a repeat performance.

    edit: Also... a power supply in a drive bay probably WOULDN'T be that good an idea. It would require either 2-3 very small, high-RPM fans on the front blowing air through it, causing lots of noise, or basically a standard PSU setup, with a 120mm fan on the top/bottom and vents to let the air go out of the case, or into the case and exhausted out the back. The reason the latter idea would be bad is that it'd require at LEAST a case with 5 drive bays if using two optical drives, or an optical drive and a fan controller, or an optical drive and a front-mounted audio bay (like the high-end Audigys come with), so the supply would have enough space to pull some air from... even though that wouldn't be that great either, since it'd have limited space to pull air from, and it would probably be warm air.
     
    Last edited: Sep 13, 2006
  10. Slam

    Slam Ancient Guru

    Messages:
    3,228
    Likes Received:
    23
    GPU:
    EVGA RTX 2070 XC
I seriously doubt they will have their own power supply. It would be smart, however, to buy one of those 1-kilowatt power supplies.
     

  11. Owlsphone

    Owlsphone Master Guru

    Messages:
    395
    Likes Received:
    0
    GPU:
    eVGA 7800GT 490/1150
    Interesting thing to note:

    I was at a major LAN recently and they were raffling off these:
    http://www.dailytech.com/article.aspx?newsid=1115

    They said they haven't been released yet. Seems like this will be the future if power requirements do go up.
     
  12. ElementalDragon

    ElementalDragon Ancient Guru

    Messages:
    9,318
    Likes Received:
    10
    GPU:
    eVGA RTX 2070
...... How would that be smart exactly? If you ask me... that's a worse idea than an auxiliary PSU/external power.

    You'd even be better off buying a case that has space for two power supplies and using one 500W unit to power your video cards alone. I'm sure they'd be quieter than any bay-mounted PSU... and probably for a bit less than the $300-500 for a 1kW PSU. You could probably even just buy two power supplies for a larger-than-average case and rig them both up inside your case somehow.

    Owlsphone: I actually just saw that now... I was cruising Thermaltake's site because I could've sworn I saw a dual-power-supply Thermaltake case, and that aux. PSU is shown on their front page.
     
  13. Rodman

    Rodman Ancient Guru

    Messages:
    1,721
    Likes Received:
    0
    GPU:
    GTX285 SLI@756-1580-2750
I also wanted people to notice what was said about the die and card design. They mentioned it was so 'odd' that they couldn't even explain it yet. Hmm...
     
  14. jabloomf

    jabloomf Master Guru

    Messages:
    310
    Likes Received:
    0
    GPU:
    EVGA GTX 580

Smaller dies, eventually. But the first releases of the G80 and R600 have been rumored to be at 90 nm or at most 80 nm. Maybe with all the delays, including those associated with Vista, both ATI (or is it AMD these days?) and nVidia will just scrap the plans for the 90 nm, high power-consumption versions of the new GPUs.
     
  15. fungry

    fungry Master Guru

    Messages:
    253
    Likes Received:
    0
    GPU:
    8800GTS 320mb
Yeah, apparently... but it is the INQ, which has been on a bad run of bad news recently.
    Anyways, yeah, next-gen GPUs are going to be suckers, but it will at least take a large amount of load off my crapping-out PSU.

    On the other hand... how much could this really increase the electricity bill? Shouldn't people be worrying about global warming?!?! We're looking at a 1-6 degree centigrade increase by 2100... (if I remember right)
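A quick back-of-the-envelope check of the bill question above (using my own assumed figures, not numbers from the thread or the INQ article): even a 300W card gamed on for a few hours a day adds surprisingly little per month.

```python
# Rough monthly cost of a power-hungry GPU's extra draw.
# All figures below are assumptions for illustration:
# 300 W extra draw, 4 hours of gaming per day, $0.10 per kWh.
extra_watts = 300
hours_per_day = 4
price_per_kwh = 0.10

kwh_per_month = extra_watts / 1000 * hours_per_day * 30  # energy used in 30 days
cost_per_month = kwh_per_month * price_per_kwh

print(f"{kwh_per_month:.1f} kWh/month, ${cost_per_month:.2f}/month")
# -> 36.0 kWh/month, $3.60/month
```

At those assumed rates the damage is a few dollars a month, so the heat and PSU-load concerns in this thread matter more than the bill itself.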
     

  16. Darkasantion

    Darkasantion Master Guru

    Messages:
    982
    Likes Received:
    0
    GPU:
    Club 3D HD5870 1035/1325
    So conclusion: Don't buy first gen DX10 cards! Wait for the second or third gen cards:p
     
  17. Rodman

    Rodman Ancient Guru

    Messages:
    1,721
    Likes Received:
    0
    GPU:
    GTX285 SLI@756-1580-2750
    http://www.theinquirer.net/default.aspx?article=34359

    I wonder how much of this will be true, more interesting stuff..

I take it Nvidia wants to dominate the DX9 arena; I think they are betting on DX10 not being that popular, since Vista really won't be mainstream till '08. I mean, Crysis will be optimized for DX10, but if you have a DX9 monster like the G80 it will power through not-so-well-optimized DX9 stuff. In other words, Crysis will probably play just as smoothly in DX9 on a G80 as it will in DX10 on the R600.

Most of us will continue to use XP and DX9, so the G80 is looking to be a killer card for awhile, even if it is only a DX9 card for the most part. What does everyone else think?
     
    Last edited: Sep 14, 2006
  18. SniperDaws

    SniperDaws Banned

    Messages:
    2,565
    Likes Received:
    0
    GPU:
    XFX7600GTXXX Zalman Vf900
I think ATI are going to kick arse on the DX10 cards.

    It's gonna be a while before DX10 cards mature enough for me to buy one, I think.
     
  19. jabloomf

    jabloomf Master Guru

    Messages:
    310
    Likes Received:
    0
    GPU:
    EVGA GTX 580

The INQ is at it again today with R600 rumors. They are now saying that it will only be an 80 nm card (not 65 nm) and that it will draw 250W. Take it for what it's worth. The good news is that the INQ believes the R600 could be twice as fast as today's fastest cards. The article has a funny typo in it, but I'm never sure whether the comedians at the INQ do this on purpose or not. They said that the R600 will run hot and need a "super doper" cooler. Ummm, where can I get one of these things?:smoke:
     
  20. Waffen-5

    Waffen-5 Banned

    Messages:
    310
    Likes Received:
    0
    GPU:
    XFX 8800GTX 768mb
I agree. I remember the nVidia 5 series were not so good in DX9. When they came out I bought a 5600, but it was rubbish in DX9, so I went for the 5900, which was pretty good. I think I'll buy the second-gen DX10 cards.

    I'm gonna focus on processors at the minute, thinking of the Core Duo, but not sure if it will perform much better in games than my Athlon64. Does anyone know?
     
