Nvidia kills GTX285, GTX275, GTX260 (rumor) abandoning High/Mid Market?

Discussion in 'Videocards - NVIDIA GeForce' started by pagusas, Oct 7, 2009.

Thread Status:
Not open for further replies.
  1. Infiniti3D

    Infiniti3D Master Guru

    Messages:
    345
    Likes Received:
    0
    GPU:
    8800 Ultra H²0
    The money they squeezed out of the GeForce 8 series was a huge amount, and they are planning to do that again with Fermi. Who cares about
    G200 with DX11 around the corner? Nvidia sells GPUs and bridge chips to board partners so they can turn them into whatever Black or Core edition they like. They contract with Asus, and Asus says: we want 6,000 of those and 10,000 of those...
    Of course Nvidia will make sure the chip and GPU numbers work out on paper first, to see if it's profitable. The people working there studied too, so don't worry, and watch out for Charlie :wanker:
     
  2. J STEEL

    J STEEL Guest

    Messages:
    394
    Likes Received:
    0
    GPU:
    EVGA GTX 680 FTW+ SLI 4GB
    I just read on TechReport that Nvidia has responded to this. Since I can't give the link to that site, you'll have to go there yourself.
     
  3. hallryu

    hallryu Don Altobello

    Messages:
    11,381
    Likes Received:
    15
    GPU:
    2x HD7970
    I lol'd!:D
     
  4. Karl 2

    Karl 2 Ancient Guru

    Messages:
    2,606
    Likes Received:
    0
    GPU:
    EVGA GTX 295
    But we can't use the same financial reasoning :)

    ATI is not in financial trouble; in fact, it's becoming AMD's cash cow. Market analysts who claim that ATI's retail sales are not as profitable as nVidia's usually fail to mention ATI's performance as an OEM supplier and as a designer/manufacturer (as ATI Technologies ULC) of non-computer electronic components and military technology. ATI uses profits generated in those sectors to finance its retail products, something nVidia cannot indulge in to the same extent because its assets are tied up by its recent (and some say ill-advised) acquisition of some of its former suppliers.

    Because they have different structures, comparing nVidia with ATi is more difficult than it appears at first glance, and limiting the comparison to retail products can be misleading.
     

  5. Karl 2

    Karl 2 Ancient Guru

    Messages:
    2,606
    Likes Received:
    0
    GPU:
    EVGA GTX 295
    I think this time nVidia would prefer not to spend as much money as they did on the G80 ;)

    It paid off in the end, but it was a very risky venture. Consider that ATi spent a lot less to catch up, though it cost them time.

    The people who work at GM also studied :D

    I'm not comparing nVidia with GM (rofl), but it's always been a roller coaster ride with them. I like roller coasters :)
     
  6. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    They actually spent more this time, technically. Jen-Hsun said at the conference that they had invested over $1B to make Fermi possible, but I think that covers more than just the architecture of the card: things like partnerships with certain companies, etc. Either way, they spent a metric ton again.
     
  7. Karl 2

    Karl 2 Ancient Guru

    Messages:
    2,606
    Likes Received:
    0
    GPU:
    EVGA GTX 295
    I assume you meant "not top of the line SLI". Thanks for reminding forum members that highest-end cards are unprofitable prestige products catering to a fringe market, and can't be used to gauge the market success or general availability of a particular brand. Ford doesn't live on Lincolns and GM sells a lot more Chevrolets than Cadillacs.
     
  8. Karl 2

    Karl 2 Ancient Guru

    Messages:
    2,606
    Likes Received:
    0
    GPU:
    EVGA GTX 295
    That would be the money that AMD would not let him spend for ATi? :D
     
  9. morbias

    morbias Don TazeMeBro

    Messages:
    13,444
    Likes Received:
    37
    GPU:
    -
    Seriously, can we stop posting 'news' written by Demerjian? The guy never cites any sources because most of what he writes is bullsh!t that exists only in his head.
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    Well here is the official word from Nvidia:

    http://www.hardocp.com/news/2009/10/07/nvidia_abandons_market6363636363
     

  11. Karl 2

    Karl 2 Ancient Guru

    Messages:
    2,606
    Likes Received:
    0
    GPU:
    EVGA GTX 295
    Me neither, I give them my ex-wife's info instead.
     
  12. rooisone

    rooisone Member Guru

    Messages:
    176
    Likes Received:
    0
    GPU:
    EVGA SC GTX 570
  13. Karl 2

    Karl 2 Ancient Guru

    Messages:
    2,606
    Likes Received:
    0
    GPU:
    EVGA GTX 295
  14. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
    The site's name should be MostlyBS.AMD.com :)
     
  15. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
    Good to know it's not true, but then how does Nvidia compete with the 5800 series using their 'current gen' cards? They're a little more expensive (unless they drop prices, but then they'd lose even more) and can't compete with the 5800 series performance-wise (other than the 295, but that card is way expensive). Fermi won't be out for months, while ATI/AMD will be releasing their refresh cards and Hemlock/Cypress (5870X2/5890??).
     
    Last edited: Oct 8, 2009

  16. Squall Leonhart

    Squall Leonhart Banned

    Messages:
    1,331
    Likes Received:
    0
    GPU:
    Geforce GTX 275 896
    Logic would have it that they are dumping those chips so they can increase Fermi production.

    Production lines not making last-gen parts can now be making next-gen parts.

    And when it comes to developers, nvidia's drivers work, while ATI's drivers work with hacks.
     
    Last edited: Oct 8, 2009
  17. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    nVidia has silently gone up for sale a few times, from what I understand. A silent sale usually means the man in charge simply wants to retire. Given that nVidia does business in the US, until GT300 hits the market they're required to continue production of GT200. If you check the laws for various countries, you find out real quick how much of that article is bull****. Also, I'm pretty sure everyone has heard about every attempt Jen-Hsun has made to sell nVidia. It's not like he's even tried to hide it. The man wants out, but I doubt he'd destroy the company to do so. He wants more than anyone is willing to pay for nVidia, and it doesn't help that he only tries to sell when the company looks bad (the G84/G86 coverup, for example). He tried to sell before nV stock tanked last summer, after nV admitted to having prior knowledge of the issues.

    I highly doubt that nV is backing out of the mid-range or high-end segment any time soon, if ever. The US Consumer Protection Act requires nV to continue production of GT200 until equivalently or better performing GT300 parts get to market, so no EOL for GT200 chips unless nV wants to worry about an antitrust suit next.
     
    Last edited: Oct 8, 2009
  18. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Sure he does. He cites his own articles as sources. I mean, if he can't trust himself, who can he trust? :3eyes:
     
  19. coolville

    coolville Active Member

    Messages:
    61
    Likes Received:
    0
    GPU:
    EVGA FTW COOP GTX295
    Whether Nvidia stays or goes, my next card will be a GTX 295:nerd:
     
  20. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Well, to an extent he is right... nVidia is going to EOL GT200; his timeframe is just completely wrong. nVidia would never back out of a market they've dominated just because of pricing. There are too many fanboys who don't worry about price.
     
