GeForce 8600 Ultra 512 MB 256-bit? Will it be or not?

Discussion in 'Videocards - NVIDIA GeForce' started by Gameslove, Jan 19, 2007.

  1. Gameslove

    Gameslove Master Guru

    Messages:
    650
    Likes Received:
    0
    GPU:
    GTX760 4GB OC/GTS450physX
    The recent rumors regarding the GeForce 8600 (G84) having a 256-bit memory interface are incorrect. We are able to confirm at this point in time that the GeForce 8600 series will be 128-bit and will have 256MB of GDDR3 memory onboard. On to the rumours: we hear there are two versions, the 8600GT and 8600GS, clocked at 650/1800MHz and 550/1400MHz respectively for core and memory. Both come with 64 shader processors.

    http://www.vr-zone.com/?i=4498
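
    For a rough idea of what those numbers mean for bandwidth, here's a minimal sketch in Python (my own assumption: the quoted 1800/1400MHz figures are effective DDR rates, which the article doesn't spell out):

        # Theoretical peak memory bandwidth from the rumoured specs.
        def bandwidth_gb_s(bus_bits, effective_mhz):
            # (bits/8) bytes per transfer x millions of transfers per second -> MB/s; /1000 -> GB/s
            return bus_bits / 8.0 * effective_mhz / 1000.0

        for name, bus, mem in [("8600GT (rumoured)", 128, 1800),
                               ("8600GS (rumoured)", 128, 1400),
                               ("same card on a 256-bit bus", 256, 1800)]:
            print(name, round(bandwidth_gb_s(bus, mem), 1), "GB/s")

        # 128-bit @ 1800MHz -> 28.8 GB/s, 128-bit @ 1400MHz -> 22.4 GB/s;
        # a 256-bit bus at the same clock would double that to 57.6 GB/s.

    So the whole 128-bit vs 256-bit argument is literally a factor of two in peak bandwidth.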
     
  2. getsuga12

    getsuga12 Ancient Guru

    Messages:
    4,313
    Likes Received:
    0
    GPU:
    Geforce GTX870M
    VR-Zone is the only site stating this. Unfortunately, I won't jump on their bandwagon just because they said, "We are able to confirm at this point of time". It also doesn't seem plausible to stick a 128-bit interface and 256MB of VRAM in a next-generation card. Then again, I'll just wait until they come out.
     
  3. rabidgoldfish

    rabidgoldfish Member Guru

    Messages:
    138
    Likes Received:
    0
    GPU:
    evga 7800gt
    They're not the only ones to say it; I know TCMag and another site I can't find again have said the same. It's always better to wait until the cards come out, but at this point I'd say 128-bit with 256MB is going to be it.
     
  4. Karl 2

    Karl 2 Ancient Guru

    Messages:
    2,606
    Likes Received:
    0
    GPU:
    EVGA GTX 295
    A next-generation mainstream card. It has to come with obvious limitations lest high-end models lose their market appeal.
     

  5. getsuga12

    getsuga12 Ancient Guru

    Messages:
    4,313
    Likes Received:
    0
    GPU:
    Geforce GTX870M
    So... what about the ROPs? Are they exactly the same too? Seriously, how can they release two cards with exactly the same specs except for different clocks? It's like an EVGA 7600GT and a 7600GT Fatality Edition. That's why I find it a bit hard to believe: it'd be stupid to release an 8600GT and an 8600GS that only differ in clock speeds.
     
  6. FlawleZ

    FlawleZ Guest

    Messages:
    4,279
    Likes Received:
    28
    GPU:
    EVGA GTX 1080Ti SC2
    Are we forgetting it's entirely possible they'll cut pipelines out of the GS models? We can probably safely assume they'll be laser-cut as they were with the 7 series, so at any rate that will put a damper on modifying them.
     
  7. scoutingwraith

    scoutingwraith Guest

    Messages:
    9,444
    Likes Received:
    9
    GPU:
    Tuf 3070Ti / P1000
    Yeah, I don't think NV will do the same as with the 6 series cards. GT and Ultra, anyone?

    My personal speculation is that the GS card will be missing some components from the GT (and there's a good chance the GT gets a 256-bit bus, given the higher-end cards' layouts). It makes sense (rough partition math below):

    8800GTX: 384-bit bus (256+128)
    8800GTS: 320-bit bus (256+64)
    8600GT: 256-bit bus
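
    A minimal sketch of the partition arithmetic behind those figures, assuming G8x builds its bus out of 64-bit memory partitions (6 on the GTX, 5 on the GTS); whether G84 shares that layout is pure guesswork on my part:

        # Bus width as a multiple of 64-bit memory partitions. Assumption:
        # G84 follows the same partition scheme as G80; nothing official yet.
        PARTITION_BITS = 64
        partitions = [("8800GTX", 6), ("8800GTS", 5), ("8600GT (speculated)", 4)]
        for card, count in partitions:
            print(card, ":", count, "x", PARTITION_BITS, "bit =", count * PARTITION_BITS, "bit bus")

        # 6x64 = 384, 5x64 = 320, 4x64 = 256 -- dropping a partition is how the
        # 8800GTS got its 320-bit bus, so a 4-partition 8600 wouldn't be crazy.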
     
  8. getsuga12

    getsuga12 Ancient Guru

    Messages:
    4,313
    Likes Received:
    0
    GPU:
    Geforce GTX870M
    Cut pipelines? They're using shader processors (unified shader architecture), and according to our good friend VR-Zone, the GT and GS models have the same number of shader processors.
     
  9. Goff

    Goff Master Guru

    Messages:
    408
    Likes Received:
    0
    GPU:
    EVGA GTX260 55nm SSC
    lol getsuga, the 7600GT and 7600GS only differ in clock speeds, so I think it's plausible they'll do the same for the 8600GT and 8600GS
     
  10. scoutingwraith

    scoutingwraith Guest

    Messages:
    9,444
    Likes Received:
    9
    GPU:
    Tuf 3070Ti / P1000
    A mistake: the GT has a different memory type (GDDR3), while the GS comes in DDR2 flavours, hence its bandwidth and speed are lower.
     

  11. getsuga12

    getsuga12 Ancient Guru

    Messages:
    4,313
    Likes Received:
    0
    GPU:
    Geforce GTX870M
    ok... so now I'm being treated like an idiot for scoffing at the oh-so-great VR-Zone.

    @Karl 2: You need to understand that the term "mainstream" is used for low-end cards. When the 6 series first came out, the 6600GT was the mid-range card. When the 7 series first came out (before the 7900GTs), the 7600GT was the mid-range card. With the 8 series, the 8600 cards should be the mid-range cards. A 256-bit interface and 512MB of VRAM are already considered mid-range now, so there's no problem with putting both into the 8600.
     
  12. scoutingwraith

    scoutingwraith Guest

    Messages:
    9,444
    Likes Received:
    9
    GPU:
    Tuf 3070Ti / P1000
    No worries m8. I'm a little skeptical myself until I see other trusted sites (X-bit Labs, Hexus, Beyond3D) acknowledge the same. Until we have proof from elsewhere on numerous occasions, I'd consider this speculation only.
     
  13. FlawleZ

    FlawleZ Guest

    Messages:
    4,279
    Likes Received:
    28
    GPU:
    EVGA GTX 1080Ti SC2
    Just because G80 uses unified shaders doesn't mean the rest of the 8 series will. We all know by now that unified shaders aren't Nvidia's forte the way they are for ATI. Unless it's stated somewhere specifically and officially, I don't see any reason to assume the mid-range will follow the same trend.
     
  14. FlawleZ

    FlawleZ Guest

    Messages:
    4,279
    Likes Received:
    28
    GPU:
    EVGA GTX 1080Ti SC2


    I think it's interesting to note that, given the current game selection, even the step from 256MB to 512MB is 100% pointless. There is absolutely no gain whatsoever from the added memory at this point in time. This could change with DX10, but how far are we from seeing even a handful of DX10 titles?

    http://www.anandtech.com/video/showdoc.aspx?i=2607&p=10
     
  15. John Dolan

    John Dolan Ancient Guru

    Messages:
    2,245
    Likes Received:
    0
    GPU:
    2x GTX 780 SLI
    Supreme Commander comes out on Feb 16; that's the first DX10 title I know of. I'm looking forward to this game. I think it's multi-threaded as well.
     

  16. FlawleZ

    FlawleZ Guest

    Messages:
    4,279
    Likes Received:
    28
    GPU:
    EVGA GTX 1080Ti SC2
    So we have Supreme Commander and of course Crysis; anything else? It's going to be a slow start this year for DX10 releases. I bet this time next year we'll have a much more promising list. Until then, I'll stick with my 7800GTX.
     
  17. dudecat64

    dudecat64 Ancient Guru

    Messages:
    3,794
    Likes Received:
    13
    GPU:
    Asrock rx 6750xt
    If they bring this out on a 128-bit interface, they might as well not even claim it will compete with ATI's X1900 series. That 128-bit interface will just kill the card once AA/AF/HDR is added, which gives ATI the advantage: they wouldn't need a super-fast DX10 card in the $180 market to beat Nvidia. Making an 8600 Ultra on a 128-bit bus would make no sense at all, and giving it 512MB of RAM wouldn't improve its performance either. If they do this, I'd expect the core and memory speeds to be a lot higher to make up for the 128-bit interface, sort of like how the 7600GT gets away with a 128-bit interface because of its high clock speeds. Also look at the 6800 NU vs the 6600GT: the GT is faster without AA/AF and at lower resolutions, but raise the resolution and turn up AA/AF and the 6800 NU walks all over the 6600GT. Catch my drift, people? Rough bandwidth numbers below.
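
    Putting rough numbers on that 6600GT vs 6800 NU example (clock figures quoted from memory, so treat them as approximate):

        # Peak memory bandwidth comparison; clocks are effective (DDR) rates
        # from memory, so the exact values are approximate.
        def bandwidth_gb_s(bus_bits, effective_mhz):
            return bus_bits / 8.0 * effective_mhz / 1000.0  # GB/s

        cards = [("6600GT (128-bit, ~1000MHz eff.)", 128, 1000),
                 ("6800 NU (256-bit, ~700MHz eff.)", 256, 700),
                 ("7600GT (128-bit, ~1400MHz eff.)", 128, 1400)]
        for name, bus, mem in cards:
            print(name, round(bandwidth_gb_s(bus, mem), 1), "GB/s")

        # ~16 GB/s vs ~22.4 GB/s: the 6800 NU has more bandwidth despite its
        # lower clocks, which is why it pulls ahead once AA/AF and higher
        # resolutions start eating bandwidth.

    The 7600GT only gets away with its 128-bit bus because the ~1400MHz effective memory closes most of that gap, which is exactly the trick a 128-bit 8600 would have to repeat.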
     
  18. John Dolan

    John Dolan Ancient Guru

    Messages:
    2,245
    Likes Received:
    0
    GPU:
    2x GTX 780 SLI
    I think we'll see a dozen or so DX10 titles within the next 6 months, some of which look to be awesome games: http://forums.guru3d.com/showthread.php?t=211622. All of these games will of course still play fine on a DX9 card and Win XP.
     
  19. getsuga12

    getsuga12 Ancient Guru

    Messages:
    4,313
    Likes Received:
    0
    GPU:
    Geforce GTX870M
    Yes, the step from 256MB to 512MB would be useless now, but as a manufacturer, wouldn't you want your cards to be more "future-proof"? At least once the DX10 games start pouring in, an 8600 Ultra with 512MB of VRAM won't have any trouble running them. I'm thinking long-term, so if you think I'm wrong, it's because you're thinking short-term.

    And yes, while we don't really know whether G84 will use unified shaders or not, it's better to assume it will (based on a lot of websites releasing the exact same info). If Nvidia sticks with pipelines, people won't be able to run HDR+AA on the rest of the 8 series cards. Without the ability to run HDR+AA, the cards can't be called "next generation".
     
