Would a GTX 580 improve my current system?

Discussion in 'Videocards - NVIDIA GeForce' started by NeoEnigma, Oct 16, 2011.

  1. mitzi76

    mitzi76 Ancient Guru

    Messages:
    8,722
    Likes Received:
    19
    GPU:
    MSI 970 (Gaming)
    i don't understand all the bandwidth stuff, but it's obvious to me when someone tries to slime their way out of an argument.

    pretty sad really kitch.

    anyways guys, all we know is SLI/CrossFire is great when it works but doesn't give you double the fps.. depends on the game/drivers.

    and tbh I'd avoid it, as my personal experiences, whilst good, always irritated me too much when one game had issues.

    shame really, as people spend a lot on CrossFire/SLI and are let down by either the code for the game or the drivers.
     
  2. NeoEnigma

    NeoEnigma Master Guru

    Messages:
    631
    Likes Received:
    0
    GPU:
    ---
    I read through this thread:

    forums.guru3d.com/showthread.php?t=227317

    I'm not saying it's scary because I think it's dangerous. I'm saying it looks SCARY HARD. Like... fidgeting with RAM timings and coming up with a stable voltage after increasing settings. Makes my head spin... but I guess it's because I've never messed with any of this before.

    I guess step 1 is to see how hot my CPU runs after I put in a new cooler and go from there. I have a new cooler ordered already so that should be a few days.
     
  3. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,366
    Likes Received:
    906
    GPU:
    1080Ti H20
    It's easy to learn, so don't worry. We are here to help and guide you through it if you need it. :thumbup:

    A good place to start is temps; make sure you use quality paste and apply it correctly.
     
  4. kitch9

    kitch9 Ancient Guru

    Messages:
    1,889
    Likes Received:
    0
    GPU:
    XFX 7990 3GB
    I'm done with the argument; it's full of dudes typing their opinion and providing no evidence to back it up.

    Anyway fella, I'm pretty impressed you've managed to form an opinion on who's right or wrong whilst admitting you don't understand the subject matter.

    Ever thought of becoming a politician?

    By the way, if a game is properly SLI optimised you can expect gains of well over 90%, and over 60% for games that are not so well optimised.

    http://www.guru3d.com/article/geforce-gtx-560-ti-sli-review/1
     

  5. kitch9

    kitch9 Ancient Guru

    Messages:
    1,889
    Likes Received:
    0
    GPU:
    XFX 7990 3GB
    Yes, in SLI the available bandwidth is doubled. I still stand by that, and you guys have still yet to provide evidence that isn't the case. I'm done with the discussion until that changes.

    I've edited my post to keep you happy, and to remove the straws you wanted to clutch at.

    *Edit* To clarify my argument one more time, as you guys appear to have forgotten:

    In an SLI environment, whilst the fixed memory bandwidth on each card stays the same, both cards share the workload equally *(Edit: or dynamically as the driver sees fit, it turns out)* (plus any SLI overheads), so the workload on each card's memory controller is effectively halved.

    This effectively doubles the work the memory controllers can do between them, and therefore effectively doubles bandwidth.

    I fully understand that the data in each card's RAM is mirrored and loaded with exactly the same game assets. What I don't accept is that both GPUs can only read or write the same data in their video RAM at the same time, especially when you consider each GPU may be rendering a completely different scene to the other, and may need completely different textures, geometry etc. from video RAM.

    Some of you guys mention "loading" to video RAM, but this loading is usually done via the PCI-E bus and has nothing to do with video card memory bandwidth. In most scenarios that don't involve streaming, all loading of textures, assets and geometry to both video cards' RAM is done before you enter a level, and the data then remains static whilst you play. (This needs to be the same for both cards so the CPU can provide the correct addresses to the graphics driver.) The GPU then accesses and manipulates this data (not all of the data currently in VRAM, only the data it needs for its current frame) to create each frame. This manipulation has to be kept to a minimum to limit SLI overheads, as the GPUs have to tell each other over the SLI bridge if they change anything in their respective RAM, which adds latency.

    That is my argument, and I'm yet to be proven wrong.

    *Edit* OK, after reading the article below, the more I think about it, the more I think you guys are wrong to insist that GPUs in SLI can only read and write the same data at the same time to create their respective scenes.

    http://www.yougamers.com/articles/13801_video_ram_-_how_much_do_you_really_need/

    How the hell can two GPUs create two different scenes, which may use completely different geometry, textures, effects, AA etc., whilst only being able to read and write the same data as the other GPU? The latency would be massive!
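
    To put rough numbers on the "effective bandwidth" claim being argued here, a back-of-the-envelope sketch (the 115.2 GB/s figure is the HD 4870's published spec; the overhead fraction is purely a hypothetical illustration, not a measured value):

```python
# Sketch of the claim: in AFR each GPU reads its own mirrored copy of
# VRAM for its own frames, so aggregate throughput scales with GPU count,
# minus whatever fraction is lost to SLI synchronisation traffic.

def effective_bandwidth(per_card_gbs: float, num_gpus: int,
                        sli_overhead: float = 0.0) -> float:
    """Aggregate memory throughput available to the workload, in GB/s.
    sli_overhead is the (hypothetical) fraction lost to sync traffic."""
    return per_card_gbs * num_gpus * (1.0 - sli_overhead)

single = effective_bandwidth(115.2, 1)         # one HD 4870: 115.2 GB/s
sli_ideal = effective_bandwidth(115.2, 2)      # perfect scaling: 230.4 GB/s
sli_real = effective_bandwidth(115.2, 2, 0.1)  # with a hypothetical 10% overhead

print(single, sli_ideal, sli_real)
```

    The per-frame bandwidth each GPU sees is unchanged; only the aggregate across frames doubles, which is the distinction both sides of this argument keep talking past.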
     
    Last edited: Oct 18, 2011
  6. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,111
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    you're like the captain of a sinking battleship contacting the enemy to see if they want to surrender. fatal character flaw you have. admit you're wrong and move on. a lot of smart people here, and it may be hard to pass off rubbish as fact regarding SLI.
     
  7. kitch9

    kitch9 Ancient Guru

    Messages:
    1,889
    Likes Received:
    0
    GPU:
    XFX 7990 3GB
    I'm happy to admit I'm wrong when I am wrong; in fact I've been looking for info that proves I am wrong, and found nothing apart from "a few people saying so", but loads that backs up my thoughts.

    *Edit* Found another article that backs up my thoughts at Xbitlabs:

    http://www.xbitlabs.com/articles/graphics/display/gf6800u-sli_4.html#sect0

    So the driver can allocate workloads to each card dynamically depending on complexity across the scene. Lemme guess, Xbitlabs are wrong, amirite?
     
    Last edited: Oct 18, 2011
  8. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,111
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    I don't think split frame rendering is used anymore. Too much of a performance hit in the past. Lol
     
  9. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    28,753
    Likes Received:
    1,595
    GPU:
    GTX1070 iChillx4
    when I had my 8800GTX SLI, the main render mode was AFR. I could set it to SFR with nHancer though, but in many games there would be tearing (a line right in the middle of the screen), so AFR was best for me back then.
    isn't there an SLI indicator to see the load? I vaguely remember something like this.
     
  10. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,111
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    There is one large vertical green line for AFR at the left, IIRC. Split frame had a horizontal line across the screen; I haven't used it since 7800GT SLI, and it caused a tremendous performance hit over AFR1/AFR2. I haven't seen it in the NCP in years.
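
    A toy sketch of how the two modes being discussed split the work (the round-robin and equal-band maths below are a simplification; as noted elsewhere in the thread, real drivers can size SFR bands dynamically by scene complexity):

```python
def afr_assignment(frame_idx: int, num_gpus: int = 2) -> int:
    """Alternate frame rendering: whole frames round-robin across GPUs."""
    return frame_idx % num_gpus

def sfr_split(height: int, num_gpus: int = 2) -> list:
    """Split frame rendering: each GPU gets a horizontal band of scanlines.
    Returns (start_row, end_row) per GPU, last band absorbing the remainder."""
    band = height // num_gpus
    return [(g * band, height if g == num_gpus - 1 else (g + 1) * band)
            for g in range(num_gpus)]

print([afr_assignment(f) for f in range(4)])  # [0, 1, 0, 1]
print(sfr_split(1080))                        # [(0, 540), (540, 1080)]
```

    The horizontal split line TK describes is exactly the boundary between the two SFR bands; AFR has no such boundary because each GPU owns whole frames.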
     

  11. kitch9

    kitch9 Ancient Guru

    Messages:
    1,889
    Likes Received:
    0
    GPU:
    XFX 7990 3GB
    Whether it's used or not it still confirms the drivers can allocate workload to each card as they see fit, and that was years ago.
     
  12. k1z

    k1z Member

    Messages:
    32
    Likes Received:
    0
    GPU:
    EVGA Gtx 980 SOC
    LOL at this thread. But this is G3D after all. Even 4chan's Spiderman derails don't quite match the derails going on here.
    @OP, I was at a similar spot, though I had a 1GB 5870. Anyway, I got the chance to sell it to a friend when the 580 came out, did exactly that, bought a 580, OC'ed my Q9550 to 3.6 GHz (easier and much more exciting than I thought), and it worked out just great; performance increased TONS in all games. Though with Ivy Bridge, and possibly the 7xxx series from AMD soon, I wouldn't upgrade just quite yet. I suggest you just OC for now and see what you can squeeze out of that 5870 (I had a stock Sapphire model and 960/1380 was very manageable and stable).
     
  13. mitzi76

    mitzi76 Ancient Guru

    Messages:
    8,722
    Likes Received:
    19
    GPU:
    MSI 970 (Gaming)
    +1, there you go OP. great advice!

    p.s. at kitch: I do actually understand most of it except for some nitty gritty, but I studied history for a degree, so the maths part is my weakest. However, one of my strong points is spotting when someone is subtly trying to deflect from the original argument because he is losing it. It's called misdirection (well, that's what the Yanks say in films, no? :)

    quit while you're ahead (not).
     
  14. kitch9

    kitch9 Ancient Guru

    Messages:
    1,889
    Likes Received:
    0
    GPU:
    XFX 7990 3GB
    Thanks man I'll take your advice.

    Actually I won't.

    I've just had it pointed out to me that 3DMark Vantage's texturing fill rate test is ENTIRELY DEPENDENT ON MEMORY BANDWIDTH, nothing else.

    So, let's have a look at some benchmarks for a 4870:

    [image: 3DMark Vantage texture fill rate result, single HD 4870]

    Good stuff, that was a decent card in its day. So now let's have a look at a 4870 X2, which, as per the name, is two 4870s on one card.

    [image: 3DMark Vantage texture fill rate result, HD 4870 X2]

    Well would you look at that.............. A benchmark for VIDEO CARD MEMORY BANDWIDTH showing a 4870x2 scoring TWICE the score that a single 4870 pulls.

    Guys, you don't have to apologise, but the fact you'll probably not post in this thread again will speak volumes.

    As a clueless person once said:

    CLASS DISMISSED!
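
    For reference, the per-card theoretical figure behind these scores can be worked out from the memory spec (the 4870 numbers below are the published ones, GDDR5 at 3.6 GT/s effective on a 256-bit bus; the formula is the standard simplification, ignoring controller efficiency):

```python
def mem_bandwidth_gbs(effective_rate_gtps: float, bus_width_bits: int) -> float:
    """Theoretical memory bandwidth in GB/s: effective per-pin transfer
    rate (GT/s) times the bus width in bytes."""
    return effective_rate_gtps * bus_width_bits / 8

hd4870 = mem_bandwidth_gbs(3.6, 256)  # GDDR5, 3.6 GT/s effective, 256-bit
print(hd4870)      # 115.2 GB/s per GPU
print(hd4870 * 2)  # 230.4 GB/s aggregate for a 4870 X2 (two GPUs, two buses)
```

    Whether "two buses, each with its own mirrored VRAM" counts as doubled bandwidth is of course the very thing this thread is arguing about.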
     
    Last edited: Oct 18, 2011
  15. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,111
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296

  16. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,111
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    it's 3DMark Vantage, nemrod; if they posted a Vantage score, 2 GPUs will score higher than 1 in the tests.
    you got some recent evidence, man? split frame rendering is not relevant in 2011. what are you going to post next, Voodoo3 benchmarks to support your claim? funny stuff
     
  17. kitch9

    kitch9 Ancient Guru

    Messages:
    1,889
    Likes Received:
    0
    GPU:
    XFX 7990 3GB
  18. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,111
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    lol, that doesn't prove your point, but it had nice pretty colors in the graphs
     
  19. kitch9

    kitch9 Ancient Guru

    Messages:
    1,889
    Likes Received:
    0
    GPU:
    XFX 7990 3GB
    Dude, that's the video memory bandwidth test from Vantage (texture fill rate).

    How much more do you need?

    http://techreport.com/articles.x/20088/3

    Read the article in the above link, and let it slowly sink in that you have no idea WTF you have been typing about for the last few pages.

    Come back when you've got a clue so we all know you are safe, yeah?
     
  20. kitch9

    kitch9 Ancient Guru

    Messages:
    1,889
    Likes Received:
    0
    GPU:
    XFX 7990 3GB
    How in god's name does that not prove 2 cards in SLI have double the video RAM bandwidth of a single card?

    It's nice and simple: in the bandwidth tests, the pretty graphs and colours show 2 cards in SLI have twice the bandwidth of one card.

    Really, it's that simple.

    Now, really, has it sunk in yet?
     