600 series pricing

Discussion in 'Videocards - NVIDIA GeForce' started by Moonbogg, May 13, 2012.

  1. Mikedogg

    Mikedogg Guest

    Messages:
    2,830
    Likes Received:
    0
    GPU:
    Radeon HD 4650M 1GB
    You want price drops? Stop buying their products, or at least delay your compulsive spending at the launch of their products. Maybe then they'll go, "WTF, why aren't people jumping all over this?" Maybe then they'll drop their pricing.

    Speaks words of wisdom, he does.
     
    Last edited: May 15, 2012
  2. Steve30x

    Steve30x Master Guru

    Messages:
    517
    Likes Received:
    11
    GPU:
    XFX RX 7900GRE
    I would love to know how the whiners know how much manufacturing the GTX 600 cards costs, or how they know the 680 is a mid-range card. In my opinion it's all speculation. You guys seem to want stuff at a lower price and don't mind Nvidia or ATI selling their cards at a loss.
     
  3. snowdweller

    snowdweller Guest

    Messages:
    492
    Likes Received:
    0
    GPU:
    Leadtek GTX 580 SLI
    This, to an extent. It's not so much the prices as that some people just don't have that much money to blow on fun, which is a shame, because once you get max eye candy at a constant 60... man, what's a console? :nerd:
     
  4. Steve30x

    Steve30x Master Guru

    Messages:
    517
    Likes Received:
    11
    GPU:
    XFX RX 7900GRE
    I don't have the money to buy a Ferrari, but you don't see me crying about it. Having a high-end GPU is like having a Ferrari: you pay a premium for the faster hardware. You can quote me all you want and say that the GTX 680 is not the GTX 680 it was supposed to be, but unless you have good evidence to back up what you are saying, you are wasting your time.
     

  5. snowdweller

    snowdweller Guest

    Messages:
    492
    Likes Received:
    0
    GPU:
    Leadtek GTX 580 SLI
    I complain about not having a Ferrari all the time haha
     
  6. Steve30x

    Steve30x Master Guru

    Messages:
    517
    Likes Received:
    11
    GPU:
    XFX RX 7900GRE
    I don't, because it would be too much money to keep running. The maintenance, insurance, tax and fuel costs of a Ferrari would be very expensive.
     
  7. mezball

    mezball Master Guru

    Messages:
    540
    Likes Received:
    36
    GPU:
    MSI Ventus RTX 4080

    Well, until they can outdo their previous-gen cards by at least 100-150% compared to what they do now, that would suit me just fine.

    As for all the comments about this gen using less power than the last: since when have people all of a sudden become concerned about this? If you want to go green, then maybe we shouldn't be buying these cards, period. Do people have any clue how much hazardous waste is produced in manufacturing semiconductors? I certainly do; I've been with three companies and it's not pretty.
     
  8. Frohman0905

    Frohman0905 Master Guru

    Messages:
    973
    Likes Received:
    0
    GPU:
    GTX680 2GB OC @ 1202/7000
    The chance of everyone boycotting Nvidia because of their so-called "high" prices is very, very unlikely. In case you didn't notice, a lot of people are buying Kepler cards right now. Actually, there are a lot of people who are happy about it and don't complain. At HardOCP there is a topic about how happy users are with the 680, and I read far more posts from happy users than unhappy ones. It has been like that for years; you only hear from the unhappy users, which is a rather small percentage.

    Also, no one forces you to upgrade every generation. Of course there are people who always want the best of the best, but I see these as a small percentage. Myself, I've upgraded from an 8800 GTX. The performance difference is huge and I'm a happy customer. For me, the $500 I paid for my card is totally worth it. I'm sure there are others who share my opinion, otherwise they wouldn't have bought it.

    What I'm trying to say is: get over it! The price tag for high-end cards will always hover around $500. It has been like that for years, so why would they kick themselves in the nuts?
     
  9. Ade 1

    Ade 1 Master Guru

    Messages:
    600
    Likes Received:
    0
    GPU:
    Gigabyte GTX 690
    I think this is called "The vocal minority and the silent majority"!
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Since they started running into thermal walls, which reduced their ability to increase performance. You can't just focus on performance and ignore everything else; the architecture has to be as efficient as possible, otherwise you run into walls.
     

  11. errorrrr

    errorrrr Master Guru

    Messages:
    362
    Likes Received:
    0
    GPU:
    MSI GTX 670
    For people like me who run their computer 24/7, gaming during active hours and seeding torrents at night while sleeping, power is VERY, VERY much on my mind.

    I also AFK quite a bit in several MMOs, like FFXI, where you set up a bazaar; I did that for over four years in the game. That bill adds up. Not to mention that electrical components fail quicker and more easily when run at high temperatures for prolonged periods of time.

    And let's not forget that in the summer the AC has to cool not only the room but ALSO the heat given off by your PC, which adds another layer of electrical cost. Though a hot GPU does make a good small heater during mild winters...

    Also, for people who want a mATX or even an HTPC all-in-one build, you want the lowest-power, coolest-running cards in there. So yes, it's very much on your mind.

    Here is another analogy: the GTX 6xx series isn't really for people who have a GTX 5xx card, but for people who have, say, a GTX 4xx, imo.

    Each iteration of graphics card is like a new car model every year. You don't buy a new car every year just because it "performs" better, "looks" better and is more "efficient". You buy it when you see a considerable upgrade.

    So if you have a GTX 5xx, just consider yourself as having last year's Camry/Lexus. You shouldn't, and don't need to, upgrade until next year's model.
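A rough sketch of the "that bill adds up" point, as back-of-the-envelope arithmetic. The wattage figures and the $0.12/kWh rate are illustrative assumptions, not measurements of any particular card:

```python
# Rough yearly electricity cost of a PC component left running 24/7.
# All input numbers below are illustrative assumptions.

def yearly_power_cost(watts, price_per_kwh, hours_per_day=24.0):
    """Return the yearly electricity cost in dollars for a constant draw."""
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. a card drawing an assumed 150 W around the clock vs one drawing
# 50 W, at an assumed $0.12/kWh:
hot_card = yearly_power_cost(150, 0.12)   # ~$158/year
cool_card = yearly_power_cost(50, 0.12)   # ~$53/year
print(f"difference: ${hot_card - cool_card:.2f}/year")
```

The gap widens further in summer, since every watt the GPU dissipates is another watt the AC has to pump back out.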
     
  12. The Chubu

    The Chubu Ancient Guru

    Messages:
    2,537
    Likes Received:
    0
    GPU:
    MSi GTX560 TwinFrozrII OC
    Simple really (if you relax and take the time to think about it, of course). All the mid-range chips of recent years have been around 250mm² in size; high-end chips have been around 400mm² or bigger.

    Usually both nVidia and AMD kept their high-end die size constant, but with smaller fabrication processes they could fit more transistors in the GPU.

    AMD stopped making massive chips after the 2900XT. They started to release smaller, more power-friendly high-end chips than nVidia, whereas nVidia always released the biggest chip they could afford (8800 Ultra, GTX 280, GTX 480, etc.).

    Now, with a smaller die size, they can fit more chips on a single silicon wafer. The bigger the chip, the more probable it is that it ends up with some defect in it. So having smaller chips isn't only more productive (you fit more chips on a single wafer) but more efficient: if you had a 500mm² chip and some part of it got borked, the whole chip might go in the trash, while if you had four 250mm² chips in the same space, probably only one of those would be defective and you'd still get three working GPUs.

    nVidia released a 290mm² chip that has all the characteristics of a mid-range card, except the performance relative (remember, relative) to last-gen nVidia cards and to current AMD cards.

    Both companies have always used the most recent fabrication process, so rather than being a variable in the price, it's a constant: we can't say the GTX 680 is more expensive than it should be because it uses a new fabrication process, since most cards have always used a new fabrication process.

    But we can say that they can't justify its price from the die-size side of things, since it isn't big at all. Producing a single GTX 680 GPU should cost more or less what producing a GTX 460 chip cost when it was new tech a few years ago (note that the GTX 560 wasn't produced on a new fabrication process by that time, so it's not a good comparison). Maybe a GK104 chip is actually less expensive, since the GTX 460 GPU has a bigger die than the GK104 and TSMC had lots of trouble with the 40nm fabrication process (lower production, higher price, etc.).
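The "four small chips beat one big chip" argument above can be sketched with the standard Poisson yield model, where yield = exp(-defect_density × die_area). The defect density and dies-per-wafer figures here are illustrative assumptions, not TSMC data:

```python
import math

# Sketch of why smaller dies waste less silicon, using the Poisson
# yield model: yield_fraction = exp(-defects_per_mm2 * die_area).
# Defect density and wafer capacities are illustrative assumptions.

def good_dies(die_area_mm2, dies_per_wafer, defects_per_mm2):
    """Expected number of defect-free dies from one wafer."""
    yield_fraction = math.exp(-defects_per_mm2 * die_area_mm2)
    return dies_per_wafer * yield_fraction

D = 0.004  # assumed defects per mm^2 (illustrative)

# One 500 mm^2 chip vs four 250 mm^2 chips in the same silicon area:
big = good_dies(500, 100, D)    # 100 big dies per wafer (assumed)
small = good_dies(250, 200, D)  # the same area holds ~200 small dies

print(f"big: {big:.0f} good dies, small: {small:.0f} good dies")
```

Under these assumed numbers the small die wins by far more than the 2× area ratio alone, because yield falls off exponentially with die area.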
     
  13. errorrrr

    errorrrr Master Guru

    Messages:
    362
    Likes Received:
    0
    GPU:
    MSI GTX 670
    You do know that a smaller fab process fits the same number of transistors in less space, right? If the GTX 680 were fabbed on 40nm it'd be close to a 500mm² die.

    The GTX 645 is fabbed on 40nm with a 332mm² die. Do you think that's the "flagship" of the GTX 600 series because it's got a bigger die?

    As I've said/speculated, the rarity of the GTX 6xx series isn't due to demand; it's the number of failed chips coming out of the 28nm fab process. It's more evident when you see the GTX 670 still in stock after a whole weekend, while the GTX 680 is STILL missing from supply.

    This leads me to believe a fair amount of GTX 680 chips didn't make the cut, resulting in a larger-than-normal number of GTX 670s.
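The "what would GK104 be at 40nm" claim can be sanity-checked with first-order area scaling, where area scales with the square of the feature-size ratio. This is an idealized upper bound (I/O, analog and SRAM don't shrink linearly, so real designs land lower, closer to the ~500mm² ballpark quoted above); the 294mm² GK104 figure is approximate:

```python
# Ideal (first-order) die-area scaling between process nodes: area
# scales with the square of the feature-size ratio. Real chips scale
# less than this, since I/O and analog blocks don't shrink linearly.

def scaled_area(area_mm2, node_from_nm, node_to_nm):
    """Idealized die area after porting between process nodes."""
    return area_mm2 * (node_to_nm / node_from_nm) ** 2

gk104_28nm = 294  # GK104 die area in mm^2 (approximate)
print(scaled_area(gk104_28nm, 28, 40))  # idealized 40nm equivalent
```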
     
    Last edited: May 15, 2012
  14. Texter

    Texter Guest

    Messages:
    3,275
    Likes Received:
    332
    GPU:
    Club3d GF6800GT 256MB AGP
    ATM nVidia just can't keep chucking wafers at it like they could with Fermi, which was 'worse' simply because they had sub-20% yields on a much larger chip. TSMC are ramping up production, but yields will have to be improved mostly from nVidia's side. A 20%-yielding GK104 that gets about 200 chips on a $5k wafer would cost about $125 per good chip. nVidia pay for working silicon and not on a per-wafer basis, so who knows what the chip actually costs them. But TSMC would like to get more out of a wafer, that's for sure. They've struck a deal for the nth time, but this continuing struggle to produce any working chips at all can't be good for their relationship lol.
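The per-chip cost estimate above is just wafer cost divided by good dies. All three inputs ($5k wafer, ~200 candidate dies, 20% yield) are the rough guesses quoted in the post, not published figures:

```python
# Cost per *good* die = wafer cost / (dies per wafer * yield fraction).
# The inputs below are the rough guesses from the post, not real data.

def cost_per_good_die(wafer_cost, dies_per_wafer, yield_fraction):
    """Amortize one wafer's cost over its defect-free dies."""
    return wafer_cost / (dies_per_wafer * yield_fraction)

print(cost_per_good_die(5000, 200, 0.20))  # dollars per good GK104
```

Note how sensitive this is to yield: at 40% yield the same wafer halves the per-chip cost, which is why yield, not wafer price, dominates the economics.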
     
  15. Desi26

    Desi26 Member Guru

    Messages:
    114
    Likes Received:
    0
    GPU:
    Sapphire Radeon 7950 OC

    I completely agree with you! Who wouldn't say going from a 4870X2 to a 680 is well worth it? You waited three generations of cards, while every enthusiast out there will drop $1000s at the drop of a hat to have the latest in SLI or CF every 2 to 4 years. It seems you got the most out of your old card, and you made your decision to upgrade at just the time you thought was right, which was now.

    I was happy paying the $400 going from a GTX 570 to my current card for the added 15-25 fps, and will also say it was well worth it for me, lol.

    You do the research before your purchase, the comparisons, reviews etc., to then decide *if* it's time to upgrade or not, and I knew EXACTLY what I was getting, every purchase.

    I can't remember ANY two generations of cards ever having huge leaps in performance from the basic cards up to the enthusiast ones. I think the biggest one for me was the 4870 to the 5870, mainly because it included DirectX 11 compatibility along with a big performance increase.
     

  16. The Chubu

    The Chubu Ancient Guru

    Messages:
    2,537
    Likes Received:
    0
    GPU:
    MSi GTX560 TwinFrozrII OC
    Yep. Second paragraph (did you read it at all?)
    Nope, I don't believe you read what I posted at all. Are you trolling me or something? Hey, I even said that it's easy if you relax first (second sentence). Take your time if you need it.
    It's on par with you so far, I guess. Speculating, jumping to conclusions, jumping over conclusions too...
     
    Last edited: May 16, 2012
  17. errorrrr

    errorrrr Master Guru

    Messages:
    362
    Likes Received:
    0
    GPU:
    MSI GTX 670

    The GTX 570/580 had a 520mm² die with 3,000 million transistors, on the 40nm fab.

    The GTX 470/480 had a 529mm² die with 3,200 million transistors, on the 40nm fab.

    The GTX 280 started on the 65nm fab despite the GTX 100 series already using 55nm. The GTX 285 then moved to the 55nm fab.

    In FACT, the GTX 285 SHRUNK in die size compared to the GTX 280 because of the move to a smaller fab process.

    Now the GTX 680/670 have a 295mm² die with 3,540 million transistors on the 28nm fab.

    Judging by die size alone, the generation-to-generation relationship would suggest the GTX 680/670 is not even a mid-tier card but a low-end one. Do you really believe that?

    Die size is not the real issue here. Transistor count is. And from that perspective, the GTX 680/670 is the successor to the GTX 580/570, with the expected increase in transistor count.

    Transistor count is the only thing that grows consistently with each generation.

    You really think Nvidia is able to make a 500mm² die on the 28nm fab and cram 6,000+ million transistors onto it on the first try?

    Get real, buddy.
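The transistor-count argument above comes down to density. Using the (approximate) figures quoted in the post, transistor density roughly doubled from the 40nm Fermi chips to the 28nm GK104, which is about what an ideal (40/28)² ≈ 2× shrink predicts, so a GTX-580-class transistor budget fits in a mid-size die:

```python
# Transistor density from the (approximate) figures quoted above,
# in millions of transistors per mm^2.

chips = {
    "GF110 (GTX 580, 40nm)": (3000, 520),
    "GF100 (GTX 480, 40nm)": (3200, 529),
    "GK104 (GTX 680, 28nm)": (3540, 295),
}

for name, (mtransistors, area_mm2) in chips.items():
    density = mtransistors / area_mm2
    print(f"{name}: {density:.1f} Mtransistors/mm^2")
```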
     
  18. The Chubu

    The Chubu Ancient Guru

    Messages:
    2,537
    Likes Received:
    0
    GPU:
    MSi GTX560 TwinFrozrII OC
    Great! You read the post, unbelievable! It takes a good day or two, but when I finally manage it, it's a pretty nice achievement. I knew you could! I had faith in you.

    Besides, you're changing the subject a lot from post to post. I was discussing why the GTX 680 looks like it was meant as a mid-range card, not whether nVidia could get a 500mm² chip on the market right now.

    So, talking about 500mm² dies... http://www.guru3d.com/news/gk110based-surfaces--has-2880-shader-processors/ Even Hilbert says that the GK104 is a mid-range card :p
    Um... yes? That is expected. A GTX 280 should be compared to the last whole new architecture, which was the G80 chip, also around 500mm², I presume (yep, 480mm² actually).

    So among the comparable architectures like G80, GT200 and Fermi (all unified-shader architectures, whole new designs on new fabrication processes, etc.), all of them had die sizes around 500mm². What nVidia learned from AMD is to launch low-end cards on a newer fabrication process as a test (the GT100 series you mentioned), which was unheard of before ATi's HD 4770, as far as I know.

    I'm guessing that the GK110 chip will have a surface area around the oh-so-unexpected 500mm² mark.

    Anyway, I'm not sure I'll keep posting here, since you seem to have a pretty bad attitude about the whole issue for some reason :/
     
  19. Nichtswisser

    Nichtswisser Guest

    Messages:
    263
    Likes Received:
    0
    GPU:
    ASUS GTX 670 DC2 4GB
    It's not just the die size; the memory interface also points in the direction of a mid-range card. And when you consider the 680 as the planned successor of the 560, then the cut-down memory interface makes sense.

    As far as I know the story, the 680 was planned to succeed the 560 and was then relabeled when benchmarks showed the card could keep up with the fastest AMD could offer. From a business perspective it makes perfect sense: why produce and sell a more expensive card when a cheaper-to-produce card will do and can be sold for the same price? There is also the poor yield of working chips, which a bigger die probably would not improve, to say the least.
     
    Last edited: May 17, 2012
