ATI X1950XTX (Specs, release date, pic) 56K warning

Discussion in 'Videocards - AMD Radeon' started by thecake90, Jul 22, 2006.

  1. MarkIII

    MarkIII Active Member

    Messages:
    55
    Likes Received:
    0
    GPU:
    ATi Sapphire x1800gto2

    O Christ...

    Either way, you do not make any sense. GDDR4 is actually a little overdue. It has been announced for some time now, and GDDR3 has been out for quite a while.
    As I read back on your post...
    where did you say there is no need for it??? :rolleyes: You said it smells fishy and you never heard of GDDR4 coming out so early. But please enlighten me on why you think there is no need for GDDR4. I would really like to know.
     
  2. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,505
    Likes Received:
    0
    GPU:
    GTX580
    No point putting it on cards that will be outdated when the G80 and R600 come out in a month.
     
  3. InfDaMarvel

    InfDaMarvel Guest

    Messages:
    3,215
    Likes Received:
    0
    GPU:
    HD3850 256mb
    This card will be good... for 9 months... so that's 9 months of well-spent money. Or I can wait 9 months and spend $500 or less on something that will last me through every game I want to play for another 2-3 years. So hmm, guys, what is the better choice?
     
  4. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,505
    Likes Received:
    0
    GPU:
    GTX580
    Wait for the G80/R600 in a couple of months. They will play today's games awesomely, and when DirectX 10 comes out you will support it.
     

  5. DGordon

    DGordon Active Member

    Messages:
    80
    Likes Received:
    0
    GPU:
    ASUS 3850 512mb
    How do you get into receiving samples of cards, and what is a sample, like a card in the beta testing stage? And do you get something out of it, like the card when it's finished? :dave:
     
  6. Saucy

    Saucy Master Guru

    Messages:
    389
    Likes Received:
    0
    GPU:
    6800 Silencer5 (16x1,6vp, 425/1147)
    Wait a minute... The 1550 MHz GDDR3 on the X1900 is by no means a bottleneck. So why is ATI adding faster memory to a card that has plenty of memory bandwidth already?

    Even with the initially rumored core clock speed of 700-750 MHz (which is doubtful, if you ask me; ATI had already squeezed just about as much as they could out of the R580 core with the X1900s), 1550 MHz is fast enough to eliminate any bottleneck that might occur.
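
    For reference, a minimal sketch of the bandwidth arithmetic behind this point, assuming the standard 256-bit memory bus of the X1900/X1950 series (the effective memory clocks are the ones quoted in this thread):

        # Peak theoretical memory bandwidth from the effective (DDR) memory clock.
        # Assumes a 256-bit bus, i.e. 32 bytes transferred per effective clock.
        BUS_WIDTH_BITS = 256

        def bandwidth_gb_s(effective_mhz):
            """Return peak bandwidth in GB/s for a given effective memory clock in MHz."""
            return effective_mhz * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9

        print(bandwidth_gb_s(1550))  # X1900 XT/XTX GDDR3 at 1550 MHz effective -> ~49.6 GB/s
        print(bandwidth_gb_s(2000))  # X1950 XTX GDDR4 at 2000 MHz effective -> ~64.0 GB/s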
     
  7. guardz

    guardz Master Guru

    Messages:
    763
    Likes Received:
    0
    GPU:
    XFX GTS250 1G OC
    ...because people go out and buy them for the sake of them being new. Instant cash in NVIDIA's and ATI's bank!
     
  8. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,505
    Likes Received:
    0
    GPU:
    GTX580
    And it's going to be priced 20% higher for a 3% increase...
     
  9. guardz

    guardz Master Guru

    Messages:
    763
    Likes Received:
    0
    GPU:
    XFX GTS250 1G OC
    ^ lol. I second that. ;)
     
  10. UZ7

    UZ7 Ancient Guru

    Messages:
    5,535
    Likes Received:
    72
    GPU:
    nVidia RTX 4080 FE

  11. SWEBarb

    SWEBarb Active Member

    Messages:
    65
    Likes Received:
    0
    GPU:
    Evga Geforce GTX 750 Ti
    Wow, 2 GHz DDR. I'm definitely buying this one.
     
  12. Copey

    Copey Guest

    Messages:
    10,703
    Likes Received:
    0
    GPU:
    960 2GB
    Apart from having slightly higher clocks and dongle-less CrossFire support, I don't think the X1950 Pro offers anything new. It will probably perform above the X1900 GT and below the X1900 XT. But the X1950 XTX is a different game altogether; GDDR4 is going to be really good.
     
  13. Vanadis

    Vanadis Master Guru

    Messages:
    927
    Likes Received:
    0
    GPU:
    EVGA 8800 Ultra
    I just bought an X1900 XT today. Oh well. It's still a great card.
     
  14. rpg711

    rpg711 Guest

    GDDR is the same as DDR... it just has a G to make it sound cooler.
     
  15. Gromuhl'Djun

    Gromuhl'Djun Ancient Guru

    Messages:
    5,452
    Likes Received:
    30
    GPU:
    4070ti
    Not exactly; GDDR isn't yet approved for use in anything other than graphics cards.
     

  16. Palerider

    Palerider Ancient Guru

    Messages:
    4,361
    Likes Received:
    2
    GPU:
    HIS X1950XTX
    The new card looks tempting, but I think I'll pass. It will be a great tool for bringing down the price of the X1900 XT, maybe into the $300 range. A perfect transition card until the DX10 cards come out. My current card is a senior citizen, and the games I play, HL2 and COD2, all run fine.
     
  17. samusXP

    samusXP Active Member

    Messages:
    77
    Likes Received:
    0
    GPU:
    HIS ICEQ X² HD 7950 3GB
    I certainly hope the X1900 crossfire edition cards go down. I'm broke. :rolleyes:
     
  18. zinc99

    zinc99 Master Guru

    Messages:
    605
    Likes Received:
    0
    GPU:
    ATI 5850
    This card will be fine for at least 2 years. Even with DX10 coming out, who is really going to move over to Vista for a little eye candy? Just to run DX10, that's $200 for Vista plus $400 for a new card; I don't think so. And game makers are not going to run head first into Vista. All of their customers are still on XP and DX9, so I don't think it's really going to have an impact for at least 2 years. If you don't think I am right, look how long it took for just DX9 to be used. The 6800 came out a couple of years ago, and only now is DX9 really being used. A lot of games out today are still SM2.0 (BF2). So really, who cares? Get the good card now and enjoy your gaming.
     
  19. MarkIII

    MarkIII Active Member

    Messages:
    55
    Likes Received:
    0
    GPU:
    ATi Sapphire x1800gto2

    QFMFT

    The X1950 Pro is $200.00!!! No master card or dongle, and 80 nm. A 600 MHz core and 1.4 GHz GDDR3 memory. Did I mention $200? :)
     
  20. Decane

    Decane Ancient Guru

    Messages:
    5,195
    Likes Received:
    21
    GPU:
    GTX 1060 6GB
    Ohhh... And I just got my X1900 XT a few months ago :(. For some reason this card really brings to mind the 7800 GTX 512 MB, although it doesn't even have an increased amount of VRAM... :rolleyes:
     