780 TI 6GB Version

Discussion in 'Videocards - NVIDIA GeForce' started by blakedj06, Jan 11, 2014.

  1. wheeljack12

    wheeljack12 Guest

    Not that it's off the wall. AMD max requires 600 watts, straight from the horse's mouth.
  2. eclap

    eclap Guest

    What's AMD max?
  3. wheeljack12

    wheeljack12 Guest

    600 watts for a single R9 280X card.
  4. scheherazade

    scheherazade Guest

    AFAIK games like BF4 use 2+ GB of VRAM maxed out at sub-surround resolutions.

    You could go surround/Eyefinity with AA and 3GB wouldn't be enough, or next gen 3GB might not be enough maxed out.

    But right now, 3GB should be OK.

    I believe you meant 1/24 FP64.

    FP32 is single precision.
    FP64 is double precision.

    It's also worth mentioning that rendering APIs (DirectX/OpenGL) are single precision.

    Double precision is a big deal for GPU compute applications that need lots of digits of accuracy, or have a lot of error-compounding matrix math. Scientific type stuff. Not gaming.

    As much as I don't like Nvidia artificially nerfing the 780's FP64 performance just to retain a market for the Titan (and the like), the weak double precision performance simply doesn't factor into the gaming experience.
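    The error-compounding point is easy to demonstrate. Here's a minimal sketch (pure Python, emulating FP32 by round-tripping through a 4-byte float, so no GPU or numpy needed) showing how rounding error piles up much faster in single precision than in double:

```python
import struct

def to_f32(x):
    # Round a Python float (binary64) to the nearest binary32 value.
    return struct.unpack('f', struct.pack('f', x))[0]

def repeated_sum(n, single_precision):
    # Add 0.1 (not exactly representable in binary floating point)
    # n times, optionally rounding to FP32 after every operation.
    total = 0.0
    for _ in range(n):
        total += 0.1
        if single_precision:
            total = to_f32(total)
    return total

n = 100_000  # true mathematical result is exactly 10000
err32 = abs(repeated_sum(n, True) - 10_000)
err64 = abs(repeated_sum(n, False) - 10_000)
# err32 comes out orders of magnitude larger than err64
```

    Same operations, same input; only the precision of the intermediate rounding differs. That's why compute workloads with long chains of dependent arithmetic care about FP64, while a game rendering one frame at a time doesn't.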

    Last edited by a moderator: Jan 14, 2014

  5. eclap

    eclap Guest

    Well, it depends on what you call maxed out. Even at 1080p with 200% resolution scaling you'll see over 2GB usage. But before you get anywhere near 200% resolution scaling you'll be out of GPU power. At 1080p maxed out (resolution scaling at 100%) you'll struggle to hit 2GB VRAM usage.

    Just played a quick round: 1440p, maxed out (no MSAA) with 130% resolution scaling, 2.1GB VRAM usage.
  6. ---TK---

    ---TK--- Guest

    ^^What's your VRAM usage with max AA? Got to be close to 3GB.
  7. Loophole35

    Loophole35 Guest

    What frames were you getting with those settings?
  8. eclap

    eclap Guest

    Hmmm, I can try maxing everything out (including 200% resolution scaling). I'll post back in a bit.

    EDIT: looks like I'm out of RAM with those settings, let me increase the pagefile and try again.
    On average I'd say in the 90s, with some drops here and there.
    Last edited by a moderator: Jan 14, 2014
  9. ---TK---

    ---TK--- Guest

    BF4 is screwed up with SLI; considering how long the game's been out, that's pretty bad. I can't run 1 card with the settings I like, so this one's on the back burner for now. When I set the game to single GPU, all the flashing junk is gone. Force AFR 1 or 2 and the game runs at around 20fps.
  10. GhostXL

    GhostXL Guest

    Ouch, is that for real? Then what the heck is a single R9 290X??

    Everything works fine in BF4 for me at 2K Ultra settings. No flashing, no artifacts, nada. I'm on 2 GTX 780s though, OC'd to 1150 core.

  11. eclap

    eclap Guest

    OK, tested with everything possible maxed out in game: around 2900MB VRAM usage (10fps). With my normal settings (Ultra, with mesh on high and SSAO instead of HBAO, no MSAA but ~130% resolution scaling) I'm getting around 1800MB VRAM usage.

    This is at 1440p. If resolution scaling indeed works as I assume it does, 200% resolution scaling of 1440p equals roughly 3.55K resolution. So there you have it folks: BF4 at roughly 3.55K with 4xMSAA and everything maxed out = roughly 2900MB VRAM usage.
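    The "roughly 3.55K" estimate follows if the resolution-scale percentage multiplies the total pixel count rather than each axis (an assumption, since the game doesn't document it); a quick sanity check of that interpretation:

```python
import math

# Assumption: BF4's resolution-scale percentage scales total pixel
# count, so each axis grows by the square root of the scale factor.
base_w, base_h = 2560, 1440  # 1440p
scale = 2.0                  # 200% resolution scaling

eff_w = base_w * math.sqrt(scale)  # ~3620 pixels wide
eff_h = base_h * math.sqrt(scale)  # ~2036 pixels tall
```

    That puts the effective width in the 3.6K neighbourhood, consistent with the ~3.55K figure. (If the percentage instead scaled each axis, 200% of 1440p would be a much larger 5120x2880.)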

    EDIT: don't have a screenshot to prove it, but I'm sure you'll take my word for it. I forgot that I have the MSI AB OSD disabled for screen capture :bang:
    Last edited by a moderator: Jan 14, 2014
  12. techie405

    techie405 Guest

    I laugh at people who say 3GB is enough. Maybe it is for current games, but I try to look forward when buying a new card, and considering some newer XBONE and PS4 games are already using 3GB+ just for graphics at 1080p and below, 3GB cards will be completely out of date in a year or two.
  13. eclap

    eclap Guest

    This would be a good time to post some proof. Or go away.
  14. ---TK---

    ---TK--- Guest

    2GB cards are not out of date yet; 3GB cards should be good for the foreseeable future.
  15. techie405

    techie405 Guest

    Why is this so hard to believe? What do you think they are going to do with the 8GB of RAM in those new consoles? Just not use it? When the Xbox 360 came out with 512MB of RAM, we had 512MB graphics cards, so it seems current cards are behind.

    Anyways, http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall
    Last edited by a moderator: Jan 14, 2014

  16. eclap

    eclap Guest

    I don't see any proof in there, sorry.
  17. ---TK---

    ---TK--- Guest

    Isn't Killzone a PS4 exclusive? It will never come out for PC, IIRC. If a 3GB card is so bad, why do you still have a 1.5GB 480?
  18. techie405

    techie405 Guest

    Yes, KZ is exclusive, but what's important is that it was designed for the PS4 hardware, unlike current multiplatform games, which were designed for the Xbox 360 and PS3 and ported to PC, XBONE, and PS4. As soon as devs dump the old consoles, RAM usage will skyrocket.

    Why do I use a card with 1.5GB? Because I haven't needed more for current games designed for the 360. When I do buy a card ("Maxwell") I will get the most VRAM I can get, so it will last a while.
  19. Illnino

    Illnino Guest

    I did the same and hit 4024MB.
  20. eclap

    eclap Guest

    What resolution? And is that with one or two 290Xs? Can you post a screenshot?
