EVGA employees say 260x2 < 280x1 true or false?

Discussion in 'Videocards - NVIDIA GeForce' started by Treble557, Jan 1, 2009.

  1. Treble557

    Treble557 Master Guru

    Messages:
    424
    Likes Received:
    0
    GPU:
    EVGA GTX295/8800gtx physx
    See, that's the thing.. I did read a few articles on it, but I honestly don't know enough about the specs and their meanings to form a proper opinion and make a decent judgment call based on it.

    This is why I come here, so that you guys can fill me in on the "blanks" so to speak, to help with that opinion on what to buy.

    Right now my idea is that two GTX 260s later, when prices go down, will be my best bet with my power supply (http://www.newegg.com/Product/Produ...e-_-Power+Supplies-_-Cooler+Master-_-17171024) and with cooling, for the kind of power I'll see compared to a single GTX 295. I suppose most of that, even for you guys, is simply speculation at the moment since it isn't even out yet.

    You said there "should be".. So, I mean... With that in mind, do you feel my choice would be good for the next 3 or so years before making another huge upgrade? Or will I run into a roadblock with a game like Crysis sooner than I think?
     
  2. Flopster

    Flopster Banned

    Messages:
    3,025
    Likes Received:
    0
    GPU:
    Club3D 8600GTS 256MB
    I'm not sure about your PSU... I don't think it can handle two GTX 260s, although it might if there are enough amps and the GTX 260 is the 55nm version...
    Though I think it will be enough for a GTX 295.
    Anyway, you should wait a bit for the GTX 295 review.
    For 3 years?
    I think it'll last for 2 years at 1920x1200, but you should definitely OC your CPU to 3GHz, and it'll probably go even further.
     
  3. JumperR

    JumperR Master Guru

    Messages:
    896
    Likes Received:
    0
    GPU:
    GtX 570 @ 900/1950
    People say those who game at 1680x1050 don't need a GTX 295.. but how come?

    The GTX 295 gets like 20 FPS more than the GTX 260 at that res :D

    But anyway, I'm still gonna get a GTX 260 and think about it...

    :) I'll use Step-Up, so I have 90 days to think about getting a GTX 295 and a new PSU.. it'll cost a lot, but tax season is here soon. :D
     
  4. Matt26LFC

    Matt26LFC Ancient Guru

    Messages:
    3,123
    Likes Received:
    67
    GPU:
    RTX 2080Ti
    Who says that? The GTX295 will almost double your FPS at that res in some titles.
     

  5. Treble557

    Treble557 Master Guru

    Messages:
    424
    Likes Received:
    0
    GPU:
    EVGA GTX295/8800gtx physx
    Well, I linked the PSU I have in my post; its amps and such are all listed there, so you could probably judge that better than I could, honestly. :/
    Even if it's just for 2 years, an upgrade of that amount would still be... substantial.

    I could upgrade the PSU if need be. Recommend one that two GTX 260 Core 216 55nm cards would like the most. Link it on Newegg so I can get some price comparisons done. I'm all for upgrading any part if I absolutely must to see this machine improve to the level it needs to be at.


    Now, about gaming at 1920x1200.. Something is off about the resolutions below that one on my machine at the moment.
    It seems to just make the picture worse and the screen crappier looking... in WoW, lol.
    In Crysis it's a definite FPS increase, but in WoW, for some reason, it doesn't go up or down! It's weird as hell!
    WoW is what I spend most of my time on, honestly, so yeah. I need something that'll handle it perfectly, instead of the 18 FPS I get in raids right now with my 8800GTX.

    So yeah, suggest me a power supply, please. I would really appreciate it, so that I can make this a solid upgrade.
     
  6. JumperR

    JumperR Master Guru

    Messages:
    896
    Likes Received:
    0
    GPU:
    GtX 570 @ 900/1950

    Yeah, but some people play old games :p like CS:S etc.

    And I want more FPS and more smoothness.
     
  7. Treble557

    Treble557 Master Guru

    Messages:
    424
    Likes Received:
    0
    GPU:
    EVGA GTX295/8800gtx physx
    So the res I'm playing at is in fact a huge defining factor in what card I get, I'm assuming?
     
  8. Matt26LFC

    Matt26LFC Ancient Guru

    Messages:
    3,123
    Likes Received:
    67
    GPU:
    RTX 2080Ti
    Yeah, generally the lower the res, the more CPU dependent you become, i.e. your CPU can't provide your GPU data fast enough. The higher the res, the more GPU dependent you are, i.e. it's taking your GPU longer to render, so your CPU can keep up.

    If I were you I'd OC that Q6600 to 3GHz+, that'll give you a boost; some games like Crysis just love clock cycles!

    I'd wait till the full GTX 295 review comes out, should be very soon. As for your PSU, I think it has 54A across the 12V rails, so that's plenty for one GTX 260/280; not sure about two of them.

    I think the Thermaltake Toughpower at 750W has 60A across its rails.
     
  9. Treble557

    Treble557 Master Guru

    Messages:
    424
    Likes Received:
    0
    GPU:
    EVGA GTX295/8800gtx physx
    So, with a 680i SLI mobo made by MSI... what do I do to OC it?
     
  10. Passion Fruit

    Passion Fruit Guest

    Messages:
    6,017
    Likes Received:
    6
    GPU:
    Gigabyte RTX 3080

  11. Foes

    Foes Ancient Guru

    Messages:
    1,673
    Likes Received:
    0
    GPU:
    Tri-Sli 260's
    Deciding on what GPU to get also depends on what game or games you are playing. This whole "what GPU should I run" thing is kind of a shot in the dark. Below are some numbers that I have obtained from this website (Guru3D). All FPS scores are at 1920x1200 with stock 280 and 260 GPUs.

    Far Cry 2
    280=51 FPS
    260=50 FPS

    Crysis
    280=47 FPS
    260=43 FPS

    COD4
    280=62 FPS
    260=55FPS

    More how to OC 680i reading material http://www.evga.com/forums/tm.asp?m=61146
     
  12. Treble557

    Treble557 Master Guru

    Messages:
    424
    Likes Received:
    0
    GPU:
    EVGA GTX295/8800gtx physx
    I gotta say honestly, your post here was probably the simplest and most helpful one I've gotten so far.

    That guide even had screenshots of what to do! That's so awesome.

    Now, my new questions, since I now know about the CPU and GPU, would be...

    1) I run at 70-80C at the moment, not overclocked on the CPU. Should I wait till I get the new fan to OC it?

    2) Am I better off dropping the Q6600 and getting a new CPU altogether with the GTX 260, instead of overclocking it, to get the best FPS I can in... well, in WoW honestly (a CPU-heavy game)? If so, please recommend one, and let me know what I'm looking for in it that trumps my Q6600.

    3) If I OC this CPU, what numbers do I put in for the multiplier and the FSB to get to 3.2GHz? I think the guy in the guide set it to 8x and 1600, but I'm not sure if that's what I should be shooting for.
    My math isn't.. well, isn't the best, lol. So I don't trust myself to just calculate it out myself and go from there. So I'm asking if one of you would be OK doing it for me?

    Thanks again guys!
     
  13. slickric21

    slickric21 Guest

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
    Okay, I'll start to offer my 2 cents..

    1) Those temps are a little high for default clock speeds. Get a Thermalright Ultra-120 Extreme and put a decent 120mm fan on it.
    http://www.anandtech.com/casecooling/showdoc.aspx?i=2943

    Note: make sure your case has good airflow, in and out. Tidy your cables so they don't restrict airflow. You may need a new case with good airflow. It's crucial to get your setup right, and judging by your CPU temps at the moment, you may be suffering from poor airflow.

    Then start the overclocking !!!!

    2) Stick with the Q6600 and overclock it; if you get to around 3.2GHz+ then there's no way it will bottleneck a GTX 260.
    P.S. If you're gaming at 1920x1200 then you will be more GPU bound in games, not CPU bound.

    3) Someone with your CPU and mobo needs to help you here... hopefully they will be along soon.
     
    Last edited: Jan 1, 2009
  14. Foes

    Foes Ancient Guru

    Messages:
    1,673
    Likes Received:
    0
    GPU:
    Tri-Sli 260's
    70-80C idle or load? If load, they are semi-high; if idle, they are way high, and you should only read and develop a game plan until your new cooling solution arrives. What CPU cooler did you purchase?

    Your Q6600 is stock 2.4GHz with a 9x multiplier. I see no reason, temperatures permitting, that you shouldn't easily be able to OC that Q6600 to 3.2GHz. Anything beyond that will most likely require adding volts to different parts, monitoring temps, and trial and error. I would start out with an FSB of 1600 and memory at 800 with a multiplier of 8x; that would put your CPU at 3.2GHz. The equation goes like this:
    FSB (QDR) / 4 (because the FSB is quad pumped, thus the QDR) x multi
    1600 / 4 = 400 x 8 = 3200 (3.2GHz)
    1600 / 4 = 400 x 8.5 = 3400 (3.4GHz)
    1600 / 4 = 400 x 9 = 3600 (3.6GHz)

    Don't worry about memory timings; just set them to AUTO or OPTIMAL if you have that option. The most important thing in OC'ing the CPU is temps, temps, temps. As a good rule of thumb, aim for 65-70C UNDER LOAD, and of course anything under 65C is acceptable. The program to use to stress the CPU is called Prime95; you can search for Prime95 on Google and download it. You will also want a good temp monitoring program; I use and recommend HWMonitor, which you can download from the download section found on the main page of guru3d.com. But until your new CPU cooler arrives: read, read, research, and read. Also, run some game and synthetic benches on your computer before you start OC'ing and save the results. This way you can compare those results with the same benches with your CPU at 3.2, 3.4 and 3.6GHz, so that you can actually see what gains you are getting from the OC. You might find that the gains from 3.4 to 3.6 are marginal and may not be worth the extra stress and heat. Read through that entire thread that I posted earlier. Take notes on the issues people were having, and what volt settings they were using to obtain speeds over 3.6GHz.
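    The multiplier math above can be sketched in a couple of lines; this is just an illustration of the arithmetic from the post, with the FSB and multiplier values taken from it:

```python
def core_clock_mhz(fsb_qdr, multiplier):
    """Core clock = (quad-pumped FSB / 4) x multiplier, as described above."""
    return fsb_qdr / 4 * multiplier

# The three targets from the post, all at FSB 1600 (QDR):
for multi in (8, 8.5, 9):
    print(multi, "x ->", core_clock_mhz(1600, multi), "MHz")
# 8 -> 3200.0, 8.5 -> 3400.0, 9 -> 3600.0
```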
     
    Last edited: Jan 1, 2009
  15. Treble557

    Treble557 Master Guru

    Messages:
    424
    Likes Received:
    0
    GPU:
    EVGA GTX295/8800gtx physx
    Wow. Well, you guys are amazing at this. I feel like I should be paying for information of this level, lol.

    I guess the only other question I could ask here is.. for later, once I'm done with all this upgrading and I get my custom fan, the GTX 260 (Core 216, 55nm), my CPU OC'd to 3.2GHz (thank you very much for the details on that again), and all that stuff... What's left is to find out what PSU I'll need to SLI the 260s later, and whether my RAM will be able to handle all this! (OCZ 4GB dual channel 800MHz.)

    Also, I'm gonna go ahead and assume that the 780i SLI from EVGA will be able to handle all this stuff too? The overclocking, the new cards, etc etc... Is that all in line for this new upgrade path I'm taking?
     

  16. slickric21

    slickric21 Guest

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
    I don't think that the 680i > 780i is a very smart upgrade, tbh. Why are you even considering it?
    Wait a while and consider getting an X58 board when the new Core i7 CPUs are more affordable.

    As for your PSU, you need something around 800W to be comfortable with SLI GTX 260s, and at least 60A on the 12V rails.
    Here's an example; always go for a trusted brand on your PSU.
    http://www.newegg.com/Product/Product.aspx?Item=N82E16817256024
     
  17. Foes

    Foes Ancient Guru

    Messages:
    1,673
    Likes Received:
    0
    GPU:
    Tri-Sli 260's
    I would not consider going from the 680 to the 780, or even the 790 chipset. I don't think the gains, if any, would justify the cost. Your RAM can easily handle it; it will not be running above its spec of 800MHz. In regards to the PSU for 260 SLI, obviously output power is a concern, but the biggest concern is the amperage on the +12V rails. Nvidia recommends at least 38A for a 260. People will say you can run a 260 with slightly less, but let's use what Nvidia recommends as a rule of thumb. Here is what Guru3D recommends:
    A GeForce GTX 260 requires you to have a 550 Watt power supply unit at minimum if you use it in a high-end system. That power supply needs to have (in total accumulated) at least 40 Amps available on the 12 volt rails.

    A second GeForce GTX 260 requires you to have a 700 Watt power supply unit at minimum if you use it in a high-end system. That power supply needs to have (in total accumulated) at least 50 Amps available on the 12 volt rails.

    I am getting ready to SLI my 260 here soon, and I will be using the Corsair 850W http://www.newegg.com/Product/Product.aspx?Item=N82E16817139009
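    Those two Guru3D guidelines boil down to a quick sanity check. This is a hypothetical helper, not an official tool; the wattage and amperage figures are just the minimums quoted above:

```python
# Minimum PSU requirements per the Guru3D guidelines quoted above:
# number of GTX 260s -> (watts, amps on the accumulated 12V rails).
REQUIREMENTS = {1: (550, 40), 2: (700, 50)}

def psu_sufficient(watts, amps_12v, num_cards):
    """Return True if the PSU meets the quoted minimums for num_cards GTX 260s."""
    min_watts, min_amps = REQUIREMENTS[num_cards]
    return watts >= min_watts and amps_12v >= min_amps

# e.g. an 850W unit with 70A combined easily clears the SLI bar,
# while a 550W/40A unit only qualifies for a single card.
print(psu_sufficient(850, 70, 2))  # True
print(psu_sufficient(550, 40, 2))  # False
print(psu_sufficient(550, 40, 1))  # True
```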
     
  18. Treble557

    Treble557 Master Guru

    Messages:
    424
    Likes Received:
    0
    GPU:
    EVGA GTX295/8800gtx physx
    That's interesting... The reason I was going from the 680 to the 780 was that I needed the PCI-E 2.0 slot for the 260 card. Is there a better mobo I can get at the moment for 2.0?

    I mean, if I don't have that, am I even getting a real upgrade by using the 260 over the 8800GTX?
     
  19. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
    If you play at 1920x1200, the GTX 280 is the better card. I've used the GTX 260 Core 216, but it was much slower with 8x and 16x AA. If you just want 4x AA, then the Core 216 is fine. But if you really want 1920x1200 like I use, the GTX 280 is the better buy.
     
  20. Treble557

    Treble557 Master Guru

    Messages:
    424
    Likes Received:
    0
    GPU:
    EVGA GTX295/8800gtx physx
    Welp, I never really turn AA above 4x. I've.. never seen the need to, honestly. Does it really make things look that much better? Because I never saw too big a difference.

    *edit*
    Also, I'm still hoping to hear from someone whether the 2.0 slot is worth upgrading for. My wanting to go from the 680i to the 780i was called into question, and that sort of also called into question (in my mind) whether the 2.0 slot is actually worth it or not.
     
