NVIDIA GeForce Titan 780 3DMark score

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 31, 2013.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
  2. SLI-756

    SLI-756 Guest

    Messages:
    7,604
    Likes Received:
    0
    GPU:
    760 SLI 4gb 1215/ 6800
    Yep, but looking at these numbers the first question is what game is going to get any use out of this kind of GPU power now, or even in the near future, apart from Crysis 3 of course.
     
  3. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    With the new consoles this year you'll see graphics get pushed up a little more; just look at Unreal Engine 4, the Luminous engine and whatnot. Plus 4K monitors are going to be all the rage soon.
     
  4. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    So many naysayers, it's funny. It's usually those who just bought a new GPU. Deal with it :p
     

  5. SLI-756

    SLI-756 Guest

    Messages:
    7,604
    Likes Received:
    0
    GPU:
    760 SLI 4gb 1215/ 6800
    Deal with it? How are you getting on with that imaginary mobo you have in your specs, good OCer is it?

    *high five anyone?

    haha :p
     
  6. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I'd say it's more because a jump that massive has only occurred once before, with the 8800. It's just rare, and the only evidence is a single screenshot from a Chinese website.

    The thing people have to remember, though, is that NVIDIA has already built the chip: it's found in the K20X. Stick that thing into a gaming card, clock it 20% faster, and you have 690 performance at 225 W. The only thing I'm skeptical about is the price. I can't see them launching a product like this at the standard $500 price point.
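
    For anyone wanting to sanity-check that scaling claim, here's a minimal back-of-envelope sketch (Python). It assumes throughput scales linearly with cores x clock, which ignores SLI efficiency, memory bandwidth and boost behaviour; the K20X figures (2688 cores at 732 MHz) come from later posts, and the 690 is taken at its published ~915 MHz base clock.

        # Naive throughput proxy: cores * clock (MHz). Ignores SLI scaling
        # losses, memory bandwidth, ROPs and boost -- a rough sanity check only.

        def throughput(cores, clock_mhz):
            return cores * clock_mhz

        k20x     = throughput(2688, 732)         # GK110 as shipped in the Tesla K20X
        gk110_oc = throughput(2688, 732 * 1.20)  # the "clock it 20% faster" scenario
        gtx_690  = throughput(3072, 915)         # 2x GK104 at the 690's base clock

        print(f"GK110 +20% vs GTX 690: {gk110_oc / gtx_690:.0%}")
        # -> roughly 84% by this naive metric; real-world SLI losses on the
        #    690 side would narrow the gap.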
     
    Last edited: Jan 31, 2013
  7. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Well yes, that's what I'm saying: a lot of people can't grasp the fact that it can be this fast, and that it will soon make their shiny new GK104 just a poor mid-range GPU.


    @SLI-756

    It's great, best of both worlds :p
     
    Last edited: Jan 31, 2013
  8. Mraz

    Mraz Master Guru

    Messages:
    664
    Likes Received:
    12
    GPU:
    /
    I would say the picture itself is fake, BUT on the other hand there is, and always will be, a plan already on the table with specs and/or engineering ideas for what future cards should have.

    What I mean is, this card may simply be a K20X modified to ''consumer'' level, but what's around the corner as a new lineup we never know. And seriously, if the next-gen consoles will run Unreal Engine 4, which looks really good, imagine PC-exclusive next-gen titles that will look even better, but will also suck your PC dry.

    There are still games people can't max out in the current gen, even with a 690 or 680s in SLI, such as The Witcher 2 with ubersampling; that game runs at like 30 FPS with those settings at 1920x1080.

    By the way, I don't see Crysis 3 raising the bar or setting any standard in anything really; the game sucks graphics-wise in my opinion. Even on Very High, without V-sync on, it runs butter smooth and still looks crappy.

    The real future that may crush even K20X-type cards will be things such as the VFX Tech Demo (look it up on YouTube). Now imagine that kind of graphics with hundreds of NPCs, thousands of rays, real-time shadowing like in Metro, etc., and tell me that Crysis 3 is better in any way?
     
  9. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    13,138
    Likes Received:
    1,091
    GPU:
    MSI 2070S X-Trio
    I thought they'd already said that Titan wasn't the 780, as their 7 series isn't due till after AMD's 8 series.
     
  10. makaveli316

    makaveli316 Guest

    Messages:
    168
    Likes Received:
    0
    GPU:
    Asus Strix GTX 970
    It's like Apple giving you the iPhone 7 next year with all the stuff that's missing from the previous iPhones. They won't do it, because they'd rather do it progressively and still take most of the profit every year, even without huge changes.

    I said Apple, but that's just an example, because every company does this.

    A 780 being much better than cards currently priced at 1000+ dollars/euros is unrealistic, unless it costs at least the same.

    Looking at the performance of this 780, may I ask how a 790 would perform? C'mon, this is fake. I hope not, but it is.
     

  11. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Nope. With 2688 cores, clocking it up 20% from the current 732 MHz blows your 235 W TDP and thermals out the window, and you still won't be close to a GTX 690, never mind more than double SLI 680s. These are BS performance numbers without even thinking about it.
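
    To put rough numbers on the thermals argument: a minimal sketch, assuming dynamic power scales roughly linearly with clock and with the square of voltage (the usual P ~ f*V^2 approximation, not anything NVIDIA has published). The 235 W starting point is the K20X TDP quoted above; the 5% voltage bump is purely hypothetical.

        # Rough TDP estimate for a 20% clock bump, using P ~ f * V^2.
        # All numbers are illustrative assumptions, not measurements.

        base_tdp_w  = 235    # K20X rated TDP (from the post above)
        clock_scale = 1.20   # the proposed 20% clock increase
        volt_scale  = 1.05   # hypothetical small voltage bump to hold that clock

        same_voltage = base_tdp_w * clock_scale
        with_v_bump  = base_tdp_w * clock_scale * volt_scale ** 2

        print(f"+20% clock, same voltage: ~{same_voltage:.0f} W")  # ~282 W
        print(f"+20% clock, +5% voltage:  ~{with_v_bump:.0f} W")   # ~311 W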
     
  12. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Yeah, you might be right now that I'm actually thinking about it. The 690 and the GK110 are only about a 13% difference in core count, which isn't too bad. Considering the clock speeds some Kepler parts are hitting, it might be possible to make up the difference there, but not without exceeding a 300 W TDP. I guess they could go over, but I don't know, and I definitely don't see them going 15% over. Not with 2688 cores. So yeah, changing my opinion.
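
    For reference, the same naive cores x clock arithmetic as earlier, this time asking what clock a 2688-core GK110 would need to match a GTX 690 (3072 cores at its ~915 MHz base). Illustrative only; it ignores boost, SLI scaling and memory bandwidth.

        # Core-count gap, and the clock a 2688-core GK110 would need to match
        # a GTX 690 on raw cores * clock. Back-of-envelope arithmetic only.

        gk110_cores, gk110_clock = 2688, 732     # K20X configuration
        gtx690_cores, gtx690_clock = 3072, 915   # 2x GK104 at the 690's base clock

        core_gap = gtx690_cores / gk110_cores - 1
        clock_needed = gtx690_cores * gtx690_clock / gk110_cores

        print(f"Core-count gap:     ~{core_gap:.0%}")            # ~14%
        print(f"Clock to match 690: ~{clock_needed:.0f} MHz "
              f"(+{clock_needed / gk110_clock - 1:.0%} over 732 MHz)")
        # -> ~1046 MHz, about 43% over the K20X's 732 MHz; consumer Kepler
        #    parts do clock that high, but not inside a 235 W budget.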
     
  13. Texter

    Texter Guest

    Messages:
    3,275
    Likes Received:
    332
    GPU:
    Club3d GF6800GT 256MB AGP
    Just looking at some GK104 and GK110 specs, I noticed that GK110 has 960 DP cores (64 per SMX) that aren't included in the official CUDA core count, whereas GK104 only has 64 (8 per SMX). That would bring GK104 to 1600 total CUDA cores and GK110 to 3840, if you could use the DP logic for SP calculations. (I've never coded for a GPU, so what the heck... at least I'm trying to make sense of a score that is roughly 2k higher than expected.)

    And if you take X3300, divide it by 1600, multiply by 3840, then multiply by 0.8 for clock speeds and by 1.15 for the GK110's memory advantage, you get 7286, which is close to the 7107 score. Nonsensical perhaps, but if you wanted to make up a fake score, this could be the way lol.
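
    Written out as a small script so the arithmetic is easy to follow; every factor here is an assumption from the paragraph above (the ''expanded'' core counts, the 0.8 clock factor and the 1.15 memory factor), not an official figure.

        # Reproducing the back-of-envelope score estimate from the post above.
        # All inputs are the poster's assumptions, not official specs.

        gk104_score   = 3300    # assumed X-preset score for a GK104 card
        gk104_cores   = 1600    # 1536 SP + 64 DP units counted together
        gk110_cores   = 3840    # 2880 SP + 960 DP units counted together
        clock_factor  = 0.8     # GK110 assumed to clock lower than GK104
        memory_factor = 1.15    # credit for GK110's memory advantage

        estimate = gk104_score / gk104_cores * gk110_cores * clock_factor * memory_factor
        print(f"Estimated score: X{estimate:.0f}")   # -> X7286 vs the leaked X7107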

    As for the TDP, the K20X carries an HPC TDP rating, while GeForce cards are rated for gaming loads. The loads are completely different, with the HPC rating being comparable to running Furmark on a GeForce. A GeForce Titan with a 250 W TDP would be a 350-400 W K20XXX lol

    And yes I'm just trying to make the best out of the rumor season with this.
     
  14. Rugburn

    Rugburn Guest

    Messages:
    133
    Likes Received:
    0
    GPU:
    2x EVGA GTX1080 Ti FTW3
    I do hope this is a genuine test result for the 780. I held out on the 680s (I still have my 3x 580 3 GB cards) because I didn't feel the 680s would give me much of a performance gain.
     
  15. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    @Texter

    I think you pretty much nailed how it could/will be.

    Also, the higher ROP and texture unit counts take care of the rest :)
     
    Last edited: Jan 31, 2013

  16. Gripen90

    Gripen90 Guest

    Messages:
    869
    Likes Received:
    21
    GPU:
    2x RTX 2080Ti SLi
    That's a really high score!!! I doubt it's true, or else I'll really have to get two of them. I get X8200 points with 3 stock-clocked GTX 670s.
     
  17. Solid_State

    Solid_State Guest

    Messages:
    105
    Likes Received:
    0
    GPU:
    5870 VF3000A / GT430
    "insanity now serenity later" that score is something else wow!
     
  18. Illnino

    Illnino Guest

    Messages:
    603
    Likes Received:
    0
    GPU:
    XFX 290x EK H20
    April fools! Oh wait, it's only February 1st...
     
  19. pbvider

    pbvider Guest

    Messages:
    989
    Likes Received:
    0
    GPU:
    GTX
    Don’t believe everything you read!
     
  20. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
    Very nice point of view, and maybe it's almost correct. Maybe the difference you found is a CPU bottleneck, since it's just a 2600K in there for such a beast of a GPU ;)
     
