780 TI 6GB Version

Discussion in 'Videocards - NVIDIA GeForce' started by blakedj06, Jan 11, 2014.

  1. wheeljack12

    wheeljack12 Guest

How they are clustered is a good observation. However, the clusters still add up to the same number of CUDA cores; they're just organized into SMX units. Yes, you're right that it's a full GK110, I forgot about that. Sorry about the farming. Anyway, what do you mean by farming? I know trolling, i.e. being a PITA to the users.
  2. jbmcmillan

    jbmcmillan Guest

Maybe a PITA, but it's not trolling; it's trying to up your post count instead of just using the edit button.
  3. Loophole35

    Loophole35 Guest

One SMX unit holds 192 CUDA cores; that's Kepler. Also, post farming is done by users to increase their post count so that other users will see them as a more established user on said forum.

    And this is why I called it post farming and not trolling.
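The SMX arithmetic above can be sketched in a few lines. The 192-cores-per-SMX figure is Kepler's published configuration; the SMX counts in the comments (15 for a full GK110 on the 780 Ti, 14 on the original Titan) are the known shipping configurations:

```python
# Kepler: each SMX unit holds 192 CUDA cores.
CORES_PER_SMX = 192

def total_cuda_cores(smx_units: int) -> int:
    """Total CUDA cores for a Kepler GPU with the given SMX count."""
    return smx_units * CORES_PER_SMX

print(total_cuda_cores(15))  # 2880 - full GK110, as on the 780 Ti
print(total_cuda_cores(14))  # 2688 - original Titan
```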
  4. wheeljack12

    wheeljack12 Guest

Don't worry, my message is more important to me than my post count. I'm not here for self-image, if you get my drift.

  5. oOEvil1Oo

    oOEvil1Oo Guest

Are you serious? I have two 770 Lightnings with a 3770K, and under full load my PC uses 920 watts. At idle it only sips around 130. Overclocked, of course, but still. So yes, if you use a smaller power supply and never overclock anything, you may be able to run a 750-watt PSU. :nerd:
  6. eclap

    eclap Guest

    Yes I'm serious
  7. cyclone3d

    cyclone3d Guest

    6GB per card.. heh.

    Complete waste for now. Maybe when the next generation consoles come out it might be closer to being "needed".

    I don't really see this as a huge seller, especially once the review sites do their thing. The only people that will "need" it are people running 3+ displays in Surround, or people with 2+ displays at 1440p. Even then, I would think that 4GB of VRAM per card is more than enough.

    I am just now upgrading to 2GB cards. Even now 1GB cards are plenty for most things at 1080p.
  8. Fox2232

    Fox2232 Guest

Measured where? At the wall socket? Then 920W × 0.85 (PSU efficiency) = 782W drawn by the components.

Yes, that's overloading a 750W unit, and I would not advise anyone to do such a thing, short or long term.
But it's a very real thing and should work until it doesn't.

And as a side note, MSI Lightnings draw more than standard cards and can likely overclock further.
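The wall-socket conversion above is a one-liner; note the 0.85 efficiency is an assumed figure for the PSU, not a measured one:

```python
def component_draw(wall_watts: float, efficiency: float = 0.85) -> float:
    """Estimate DC-side component draw from an AC wall-socket reading.

    A wall meter (e.g. a Kill A Watt) sees the PSU's input; the
    components only receive roughly wall_watts * efficiency.
    """
    return wall_watts * efficiency

print(component_draw(920))  # ~782 W of component load on a 750 W unit
```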
  9. oOEvil1Oo

    oOEvil1Oo Guest

I read my usage from my Kill A Watt whenever I replace or upgrade parts. I guess with a stock-clocked CPU and reference, stock-clocked 770s you might skirt under 750 watts, but sorry, I own this setup and you can't tell me it's fine. Also, I'm using 900 watts running BF4; I haven't checked with stress tests. Specs may look OK, but unless you actually test it yourself, you don't know. Also, if you're using an AMD CPU, you're seriously not going to make it. And why would you want to play so close to your power limit? I always go big on my PSU, and at the point I'm at now, my next will be the 1200W Corsair.
  10. eclap

    eclap Guest

Yes, yes, yes, we know. You need to read up on the context and why what was said was said.

  11. Solfaur

    Solfaur Ancient Guru

This. It will only make sense for 2-way or 3-way SLI setups. Also, I'm pretty sure the 800 line will come out this year, so throwing $700-800 (or whatever the price will be) at a 780 Ti 6GB may not be worth it.
  12. Fox2232

    Fox2232 Guest

Yes, that thing (the Kill A Watt) measures what the PSU draws from the wall, not what the components draw, so the components are pulling close to 800W, which is doable but not advisable on a good 750W unit.
  13. AcidSnow

    AcidSnow Master Guru

    When playing BF4 @ 1080p, I can't get my 290 to use more than ~1.5GB of VRAM, even when up-scaling to 150%.

    More VRAM is utterly pointless for 90% of users. ...So unless you're using 4K, why would anyone even consider 6GB at this point in PC advancement?

My 5870s would disagree; the 1GB of VRAM on them was getting eaten up by BF4 (for whatever reason). ...They dominated BF3 without any problem, but when BF4 came around they were like "nope."
    Last edited: Jan 12, 2014
  14. Fox2232

    Fox2232 Guest

Because the PS4 has quite a lot of it available? And that means developers will take advantage of it, but on PC, optimizations will target the amount of VRAM the mainstream has.

And btw, I got to 2.6GB of VRAM in BF4, and a test on a Titan reached 5.4GB. So unless BF4 thinks you have only 2GB of VRAM, there is no reason your usage should be that low.
  15. Koniakki

    Koniakki Guest

    I'm sorry guys, I've been reading the thread since the beginning and this is too much! LOL!

    Some replies here are utterly hilarious/ridiculous! :p

1st, the whole AMD vs. NVIDIA thing is here again, as always, but that 770 SLI? 900+W for 770 SLI?? lol!!!

My previous 780@1241/7300 with my 3770K@4.6, 16GB of RAM at 2400MHz, an H100i, 4 Megaflows, 2 Scythes at 110CFM, an Aerocool Shark 140mm, and 3 HDDs (2/3/4TB) was using about 400W (with VSync off) measured from the wall, and about 340W with VSync on.

Assuming my PSU is at 88% efficiency, that is 350W and 300W respectively. My 3770K alone, without a GPU installed, @ 4.6-4.9 draws about 200-220 watts. Can't remember if that 200+W was for the 4.6 or the 4.9 OC, tbh. :p

So even if I add another 780@1241/7300 drawing even 300W, I'm looking at what? 700W while gaming without VSync? And 650W with it on?

And someone here says he draws 900+W? From 770 SLI?? Even if they're Lightnings?

    Oh please....
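Koniakki's back-of-the-envelope estimate can be laid out as a short sketch. The 88% efficiency and the 300W budget for a hypothetical second card are his stated assumptions, not measurements:

```python
PSU_EFFICIENCY = 0.88  # assumed, per the post above

def dc_draw(wall_watts: float) -> float:
    """Convert a wall-socket reading to estimated component draw."""
    return wall_watts * PSU_EFFICIENCY

single_card_system = dc_draw(400)  # ~352 W, VSync off, whole system
second_card_budget = 300           # assumed draw of a second 780
estimate = single_card_system + second_card_budget
print(round(estimate))  # ~652 W - well below the 900+ W claimed for 770 SLI
```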
    Last edited by a moderator: Jan 13, 2014

  16. chasus

    chasus Guest

Yes. Not only does one need 2-way or 3-way SLI to raise the rendering power high enough to match the detail settings required to utilize the 6GB, one also needs a high resolution. I imagine either 3x1920x1080 or a 4K monitor as the minimum.
  17. Veteran

    Veteran Guest

Very true, it's the quality and content of your post that carry its weight. :)
  18. grndzro7

    grndzro7 Guest

780 Ti = garbage.
1/24-rate FP64, no LoD setting.

6GB of RAM is useless on a card where you can't force -LoD.

Also, when those awesome Mantle-supported engines come out later this year, the double-precision floating-point performance of NV's cards (not including Titan) is really going to hurt vs. AMD on Mantle.
  19. Koniakki

    Koniakki Guest

And that, people, is how you post a comment full of "hate"... :thumbup:
  20. eclap

    eclap Guest

    Yep, that was textbook!
