GeForce GTX 690 arrives in the Guru3D lab ...

Discussion in 'Frontpage news' started by Guru3D News, May 1, 2012.

  1. The_Fool

    The_Fool Maha Guru

    Messages:
    1,015
    Likes Received:
    0
    GPU:
    2xGIGABYTE Windforce 7950
    What are you talking about? If you're running an application that demands a lot of video memory, it'll perform MUCH faster with more video memory than with not enough. One 690 will completely stomp on your 3x 5870 setup for what you're doing. I can force Doom 3 to use all my video memory as well.
     
    Last edited: May 3, 2012
  2. Jcazz

    Jcazz Master Guru

    Messages:
    871
    Likes Received:
    0
    GPU:
    Tri-fire 5870
    But of course, what I am trying to make you realize is that the CPU/GPU starts to crap out on itself... funny how a 4GB 680 can't keep up with 2/3/4 cards, see my point. It's overall rendering, and you can't render without MEMORY. Why have so many engines if they're sucking through a ****ing straw? Example:

    Open the straw up and let more flow < memory...
    Engines are revving up and redlining < GPU

    Now take away the integrated BS and you can help eliminate this further. My previous post explains that.

    If anyone really does graphics modding, you will understand where I am coming from on an enthusiast level.

    I am just talking about DX10 where it needs to be, and that's not close to where we started. Just like DX9, DX11 is still 5+ years away from running hardcore. We still are not running pure HD... maybe HD/UHD resolution, but not HD textures, amongst other things.

    Metro 2033 is a great example of this. Low textures... post, shaders, well, everything. With DX11, the only thing that stands out is perception and of course a little higher quality in realism, but the true effects are still a long way off before DX11 fully shines.

    Then, to render the full effects, your original "I thought I was running games at 2560x1600 with or without AA" gets downgraded to a lower res once the textures, gameplay, physics, post, shaders, everything catches up in detail and quality. Guess it depends where your system stands as well, but the cycle of life goes on.

    My option? I bitch and complain to every man, woman and child until he or she understands that they could be getting a better deal, or we can play the integrated game with slow-ass cards and non-socketable GPU/memory, buying a new PCB with memory and GPU every time your memory or GPU goes **** U. Thank NVIDIA and AMD for acting like Apple on this one.

    I hope now you see my little dilemma.

    Or I could wait... and wait... and wait... and they can lie and lie and lie... guess that's the path we have all chosen to render.
     
    Last edited: May 3, 2012
  3. The_Fool

    The_Fool Maha Guru

    Messages:
    1,015
    Likes Received:
    0
    GPU:
    2xGIGABYTE Windforce 7950
    Let's say Crysis 3 at its worst will hypothetically use 3 GB of video memory. PCIe 3.0 has a bandwidth of 16 GB/s at best. The GTX 680 has a memory bandwidth of 192 GB/s. I'd much rather have all textures pre-loaded on a 4 GB card than have a 2 GB card and have to load textures in the middle of a game. It seems to me that performance is better with more video memory and craps out with less.
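
    A rough back-of-the-envelope sketch of that gap, using only the two bandwidth figures above and ignoring driver overhead, compression, and the fact that you'd rarely move the whole set at once:

        # Hypothetical numbers from the post above: a 3 GB texture set,
        # PCIe 3.0 x16 at ~16 GB/s peak, GTX 680 VRAM at ~192 GB/s.
        texture_set_gb = 3.0
        pcie_gbps = 16.0
        vram_gbps = 192.0

        pcie_ms = texture_set_gb / pcie_gbps * 1000   # streaming the set over the bus
        vram_ms = texture_set_gb / vram_gbps * 1000   # reading it from local VRAM

        print(f"over PCIe: {pcie_ms:.0f} ms (~{pcie_ms / (1000 / 60):.0f} frames at 60 fps)")
        print(f"from VRAM: {vram_ms:.1f} ms")

    So a mid-game reload of the full set would cost on the order of a dozen frames at 60 fps, versus a couple of milliseconds' worth of reads when it already sits in VRAM.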

    Yes, 2, 3, or 4 680s in SLI perform better than a single 680. However, that has nothing to do with memory, as each GPU has its own memory and each holds its own copy of the textures.
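
    A minimal sketch of that mirroring point, assuming the usual alternate-frame-rendering setup where every GPU keeps its own full copy of the working set:

        def vram_pool_gb(vram_per_gpu_gb, gpu_count):
            # The box advertises vram_per_gpu_gb * gpu_count, but under AFR the
            # working set is mirrored on every GPU, so the usable pool stays per-GPU.
            advertised = vram_per_gpu_gb * gpu_count
            usable = vram_per_gpu_gb
            return advertised, usable

        for gpus in (1, 2):
            advertised, usable = vram_pool_gb(2.0, gpus)
            print(f"{gpus} x 2 GB GPU(s): {advertised:.0f} GB advertised, {usable:.0f} GB usable")

    Which is also why the 690's "4 GB (2 x 2 GB)" works out to 2 GB of usable texture memory per frame.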

    I agree on one thing, though. It would be nice if the memory could be removed and upgraded on cards.
     
    Last edited: May 3, 2012
  4. Jcazz

    Jcazz Master Guru

    Messages:
    871
    Likes Received:
    0
    GPU:
    Tri-fire 5870
    Agreed. Same thing with GPU swaps... "hey, the new GPUs are out": 3-5 different models of chips to select from per generation, and a lot of different memory combos to work with and upgrade on. Socketing directly on the mainboard, with no more cards, is faster... why didn't Intel stay with Slot 1 then, if "cards" were so great... where is my Slot Core i7? Not socket. Software and chipsets would also have to change to accommodate something like this.
    OCing GPU specs in the BIOS like the CPU would be the ****...
    Having a GPU the size of a CPU in terms of chip size, or bigger than what is on those cards, would be great. Having one fat-ass GPU chip as a high-end model that equals the performance of, say, 3/4 cards would also be great. But then there is keeping cards as well... to upgrade your performance altogether, slowly, within cost ranges... I dunno, it's a headache.

    Wouldn't putting the GPU/CPU together create more of a headache than it's worth... in terms of heat and speed compared to today's generation of high-end cards, in performance per watt and what we are getting? Another problem is energy... the faster you want to go, the more energy and heat... chip size to wafer size... I dunno... I am not a certified engineer.
    Keeping the GPU and CPU separate seems to be better, but not on the card, and not Apple's lock-everyone-out-of-upgrading-in-order-to-upgrade-it-all scheme.

    Anyways, back to now. The 690 will still rock and be a good card. I will have to wait.

    Even though you can shrink... for a typical generation, there is no replacement for displacement. A desktop PC will always be here, a supercomputer will always fill rooms, cell phones and tablets will always be old desktop tech, and well, even though you can fit the GPU and CPU on one chip... the cores are still SMALLER than if they were kept separate.

    I also like a big CPU and a big GPU more than a small CPU/GPU crammed together. But in the end, if doing it this way helps with delay and latency timings, then I understand. It's just taking forever to create and shrink.

    Software... outguns hardware 10 to 1.

    Crysis 3, when it comes out, will most likely look and play like everyone wants, maybe not, maybe better for some... the deal is, it will be a great game visually. But that doesn't mean you are getting the Real Deal version of the game. I've learned this through Killer Instinct, the "SNES" vs. Ultra 64 versions... Crysis at release... and Crysis now. Have you ever watched the intro of Crysis? Even though it was rendered at low res, that's how the game should have looked and played in 2007. Thanks to computer hardware, energy and everything else, it takes time. Or you could have rendered it in a cabinet in an ARCADE somewhere and got more than the effects we have now, or the same, back in 2007.

    Only one console company... Nintendo. They helped me realize this a long time ago. Quantum3D, and 3dfx... and Nintendo's ULTRA 64. ENERGY... and shrinking before you get the real deal, Holyfield. It takes years. Cough, CRYSIS.

    I must apologize for my ramblings and getting the forums off topic.

    Back to the 690.
     
    Last edited: May 3, 2012

  5. Musouka

    Musouka Master Guru

    Messages:
    326
    Likes Received:
    0
    GPU:
    ZOTAC GTX970
    This needs to be the final packaging :)
     
  6. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,564
    GPU:
    AMD | NVIDIA
  7. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    13,138
    Likes Received:
    1,091
    GPU:
    MSI 2070S X-Trio
    Cheapest on Cockers is £840!!! The rest are about £900, lmfao. I'm absolutely crying here at those prices, hilarious. Awesome card though :D :D :D :D
     
    Last edited: May 3, 2012
  8. Veteran

    Veteran Ancient Guru

    Messages:
    12,094
    Likes Received:
    21
    GPU:
    2xTitan XM@1590Mhz-CH20
    Scan is cheapest.

    Edit... their prices have just gone up 20-30 quid in a matter of 45 minutes; that's why I got in early, as I knew prices would rise.
     
    Last edited: May 3, 2012
  9. Ade 1

    Ade 1 Master Guru

    Messages:
    600
    Likes Received:
    0
    GPU:
    Gigabyte GTX 690
    So it says 4GB (2 x 2GB) - am I right in thinking that it effectively has 2GB of VRAM to use rather than a full 4GB?
     
  10. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    13,138
    Likes Received:
    1,091
    GPU:
    MSI 2070S X-Trio
    Yes :)
     

  11. Paulo Narciso

    Paulo Narciso Guest

    Messages:
    1,226
    Likes Received:
    36
    GPU:
    ASUS Strix GTX 1080 Ti
    680 SLI vs 690 results are a bit odd, because the 690 should never be superior. I guess new drivers have improved performance in those particular games, like Hard Reset.
     
  12. morbias

    morbias Don TazeMeBro

    Messages:
    13,444
    Likes Received:
    37
    GPU:
    -
    Yes, but the 690 uses an onboard PLX bridge, which means next to no latency compared to a normal bridge.
     
