AMD R9 390X, Nvidia GTX 980 Ti and Titan X Benchmarks Leaked

Discussion in 'Frontpage news' started by vavyn, Mar 14, 2015.

  1. bigfutus

    bigfutus Master Guru

    Messages:
    535
    Likes Received:
    59
    GPU:
    MSI 3080 VENTUS 10G
    And where is the price for that big ass PSU that can handle this nonsense?
     
  2. Megabiv

    Megabiv Guest

    Messages:
    798
    Likes Received:
    2
    GPU:
    GTX980Ti SLI (h²o)
It was on those slides as a "GM200 cut" with 6GB, performing better than a 980. There was also a "GTX9XX" performing better than a 960 and worse than a 970, which would lead you to believe these are the Ti versions. Of course, if this were a legitimate website review I'd agree, but since it's a quick photoshop with no actual benchmarks, settings shown or even the games it's running, I'd take it with a few boatloads of salt.

    The speculation does have a basis, though. If I remember correctly, the GTX560 was pretty meh when it released... then Nvidia released the 560Ti, which was a much better card and probably what they should have released in the first place. Fast forward to the GTX780, which was superseded by the 780Ti variant, so it stands to reason Nvidia would be up to their old tricks again.

    *hypothetically speaking here*
    The 960Ti, if it performs where that chart puts it and launches at the 960's current price, wouldn't be a bad card for mid-range performance, as the chart shows it just under a GTX780. Likewise, if Nvidia do release a 980Ti, it stands to reason the uplift would be like that of the 780Ti over the 780.


    Yeah, it's all speculation until someone has the benchmarks up, and I mean real ones with full disclosure on settings, games and timedemos.
     
  3. snip3r_3

    snip3r_3 Guest

    Messages:
    2,981
    Likes Received:
    0
    GPU:
    1070
I think people are forgetting that the Titan is oriented more toward HPC users than gamers. You can use it for gaming, but the ample RAM is there for rendering; no game will ever utilize that much, even at 4K. I still view Titans as a great value for GPU-assisted rendering.
     
  4. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
If AMD price the 390 right, they'll make a killing.
     

  5. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
Titan will use a different GPU. The GTX970 and 980 both use the GM204 GPU, whereas Titan will use the GM200. While they share the same micro-architecture, the GPUs will likely use different "designs", as has been done in the past. The GM200 should target the professional market as well as "gamers", so it can't suffer from the same "memory issue" as the GTX970 if NVidia wants to sell many of them. Also, only the GTX970 has any "memory issue"....

    The suspected GTX980Ti is listed as "GM200 cut" and the suspected GTX960Ti is listed as "GTX9**"

    These are simply rumored benchmarks with no evidence available to support the results shown in the graphs.

    And that's mostly a lie. Unless you intend to spend the time pushing both configurations to their absolute maximum, you'd actually be spending less on an AMD configuration for similar performance in instances where only 4 threads are used. In either case, you'll still be able to use the same power supply. The power draw difference isn't nearly as extreme as you want people to believe.

    If you're buying a power supply small enough that the power draw of your chosen components may present an issue, you've already made a mistake.

    NVidia still markets it as a 4GB card as well.

    That's why it's called a "rumor"

    Supposedly the GM200 will lack the necessary DP performance for HPC use. Of course, that's only a rumor as well since NVidia doesn't generally confirm anything prior to release.....
     
    Last edited: Mar 15, 2015
  6. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
Yea? Well, get yourself a 530W PSU and try it. Also, don't spend more than $70 on a mobo. Good luck! :)
     
  7. evasiondutch

    evasiondutch Guest

    Messages:
    207
    Likes Received:
    0
    GPU:
    2x-MSI gaming 290x 4gb OC
It would be impressive if it was the other way around, you green fanbois lol
     
  8. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
It's not 3 fps, it's 3%. That may or may not be 3 fps, 33 fps or 0.3 fps.

    This leads me to believe that the whole thing is fake. Had the person actually tested the cards and games, he would have posted fps. The percentage-based approach let him avoid testing anything without being called out for obvious BS; not that it matters, since most of you wouldn't have figured it out, as this thread demonstrates perfectly.
     
    Last edited: Mar 15, 2015
  9. evasiondutch

    evasiondutch Guest

    Messages:
    207
    Likes Received:
    0
    GPU:
    2x-MSI gaming 290x 4gb OC
You can bet they will.
    AMD said they're gonna win back market share no matter what.
    That means they'll come with a 390X that puts all Nvidia cards to shame (I hope).
     
  10. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
Raytracing doesn't demand a lot of memory. The problem is more the processing power: for large, complex scenes you need to multiply the GPUs, with each GPU seen as a thread by the software, so each GPU has its own stack..

    The thing is, most professionals now use cloud render farms, until they can have a "Cray-like" system in the office.

    I'm not saying the Titan won't be good at it (raytracing uses FP32).. put 4 of them in a home system for raytracing and you'll be quite happy with the setup.
    You'll just hit the compute limit on one GPU way before the 12GB.
     
    Last edited: Mar 15, 2015

  11. snip3r_3

    snip3r_3 Guest

    Messages:
    2,981
    Likes Received:
    0
    GPU:
    1070
You need a lot of vRAM once you start getting complex scenes with large textures/render targets. I'm not saying 12GB is going to be used completely, but it's much preferable to 3/4GB (780/980). I personally don't need such cards, but I do know people and organizations that use large farms of Titans purely for rendering. Previously, if you wanted these capabilities you'd have to shell out even more for Quadros and Teslas. Provided you're using a CUDA/OpenCL rendering package, all you really need is Quadros on the workstation, with the farms equipped with much cheaper Titans.

    Cloud services cost money by the hour. Most of the time we make iterative improvements/changes to a particular object/scene, so being able to render fast and cheap (electricity is part of the rent at the office) means a lot. The faster we get a render out, the more time we have to tweak and the more polished the final project.
     
  12. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Why would you intentionally build a system with a 530watt PSU? That makes no sense at all. The absolute smallest I use in a "gaming" system is 600watts. Anything smaller and you're limited when it comes to upgrades.

    Under $70? Ok...... http://www.newegg.com/Product/Product.aspx?Item=N82E16813130679 Only $60 right now. It's regularly $70.

    Find me a Z97 motherboard for under $70.... The cheapest right now http://www.newegg.com/Product/Product.aspx?Item=N82E16813135389 It's $65 right now, regular price is $80.

    Unlike Intel, AMD allows overclocking on all processors and chipsets. Even my Athlon 5350 can be overclocked.

An Intel Core i5 4670K is $245. I can get an AMD FX-8320E ($140) plus motherboard for $200 right now, or $220 next week. I'd be spending $310 for an i5 4670K and the ECS motherboard linked above this week, or $325 next week. That $100+ price difference easily covers the cost of a stronger PSU and still leaves money in your pocket. The i5 4690K is $240 right now, regularly $250, making it $305-330 for board and processor, leaving you anywhere from $100 to $130 left over. That's a big price difference.
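For anyone who wants to check the math, here's a quick sketch of the comparison using the sale prices quoted above (the prices are from this post; the helper function is just for illustration):

```python
# Rough CPU + motherboard budget comparison using the prices quoted above.
# Prices are the 2015 US sale figures from the post, nothing authoritative.

def platform_cost(cpu: float, board: float) -> float:
    """Total cost of a CPU + motherboard pairing, in dollars."""
    return cpu + board

intel = platform_cost(cpu=245.0, board=65.0)  # i5 4670K + ECS Z97 board on sale
amd = platform_cost(cpu=140.0, board=60.0)    # FX-8320E + sub-$70 board on sale

savings = intel - amd
print(f"Intel build: ${intel:.0f}, AMD build: ${amd:.0f}, difference: ${savings:.0f}")
# At the sale prices the gap works out to about $110, in line with the
# "$100+ covers a stronger PSU" argument above.
```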
     
    Last edited: Mar 15, 2015
  13. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
Because when I measure my consumption, it doesn't go over 387 watts. 530W is actually more than enough, for Intel.

    PS. I use a Z68 mobo; it works just like whatever Zxx if you don't want extreme OC. I paid $65 for it 3 years ago. You can OC on it just fine (I don't have a K series atm, but I OC via the turbo multiplier on a non-K). My sound system is external; all I need is an optical or digital out from the mobo and that's it.

    If that mobo you linked can get an AMD FX 8xxx to 4.4GHz then OK, but that's still slower than an Intel i5 at 3.8; you'd need something to get that AMD chip to at least 4.8GHz. That alone would draw over 200 watts from the wall at full load, versus ~120W on Intel? I know electricity is cheap, but a PSU isn't.

    I paid $120 for my i5 and $65 for my mobo. It's ~5-10% slower than a 4th-gen Intel on a $200 mobo and tons cheaper. If you want the newest and greatest, sure, AMD is a bit cheaper, but still slower. That's because Vishera is an improved Bulldozer, 2+ year old tech. If you can live with that, you could just as well live with an Intel i5 2500K for the same money, couldn't you? It's still faster than that Vishera.

    PPS. Yes, AMD allows OC on all their CPUs. So what? They have to allow it, otherwise almost no one would buy them, no matter the price. You need to overclock them to get your value for money, and even then, if a game is optimized for fewer than 4 cores, an i3 will beat most of their overclocked CPUs.
     
    Last edited: Mar 15, 2015
  14. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
My Athlon 5350 based system doesn't go over 100 watts. Would it make sense to run it on a 150watt unit instead of the 300watt unit it uses? The advantage to using a better power supply (aside from increased quality) is that you don't have to consider the power consumption of components prior to upgrading, nor do you risk instability down the road as the unit degrades.

So, you're worried about the power consumption of a processor....but not your graphics card. I'm drawing less than 350 watts under load. An FX-8320E isn't going to draw much more than my current configuration. Unless you go with a configuration that's complete overkill (such as what Anandtech did with the ASRock 990FX Extreme6 motherboard for their review), the FX-8320E draws less power at idle than my i7 2600K does.

    Value for money? My Athlon 5350 system cost half what it would have cost to build a Celeron system, yet performs just as well with lower power consumption. I also have the added benefit of being able to overclock it, unlike the Celeron.
     
    Last edited: Mar 15, 2015
  15. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
There are tests and charts showing consumption. My total consumption under heavy stress is 387W; that's everything at full load, not just the CPU.
    I could add another 270X and still be "fine", but I'd rather get a 650W PSU of course, and I wouldn't go CF anyway. :)

    Enable EIST, please, then check again. With EIST I see idle speeds of 1600MHz and per-core consumption under 10W most of the time, say, while typing this.

    PS. I'm not worried about consumption because of the electricity bill; I just don't want a PC that needs a 1000W PSU that costs a lot of money. Even between a 500W and a 750W there is a huge price gap. I live in a country that makes cheap stuff on low salaries so that first-world countries can enjoy it; I can't afford to be an elitist. A 750W PSU is my whole salary, well, not whole, a third of it.
     

  16. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
According to Hilbert's review, the FX-8320E draws approx 66 watts at idle, whereas Anandtech (using a power-hungry ASRock 990FX Extreme6) claims 89 watts. Either way, my i7 2600K draws over 100 watts at idle. I don't disable power-saving features just to overclock.

    My 620watt unit is perfectly sufficient for any hardware I'd be willing to put in my case. My UPS reports a 380watt total load for my PC, cable box, router, TV and monitor. I'd have to unplug the TV, cable box and router to get an accurate power draw. According to the spec sheet, my monitor draws about 26 watts, which leaves 354 watts to account for. I should be somewhere in the 300-320 watt range at 100% load.
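A back-of-the-envelope version of that accounting (the UPS reading and monitor figure are from this post; the TV + cable box + router estimate is my own illustrative assumption, not a measurement):

```python
# Back-of-the-envelope wall-power accounting from the UPS reading above.
# Only the monitor's draw is known from its spec sheet; the other-devices
# figure is an assumed midpoint, chosen purely for illustration.

ups_total_w = 380.0      # UPS-reported load: PC + cable box + router + TV + monitor
monitor_w = 26.0         # from the monitor's spec sheet
other_devices_w = 45.0   # assumed TV + cable box + router draw (guess)

pc_estimate_w = ups_total_w - monitor_w - other_devices_w
print(f"Estimated PC draw at load: ~{pc_estimate_w:.0f} W")
# With that guess the PC lands in the 300-320 W range the post arrives at,
# well within a 620 W unit's capacity.
```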
     
    Last edited: Mar 15, 2015
  17. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
Well, this slide looks like that early leak back in December, which everyone called fake at the time.. Mostly uber Nvidia fans got upset by it.
    http://forums.guru3d.com/showthread.php?t=395508


    But then again, like I said back then, it's still too slow for 4K, both GM200 and 390X.
     
    Last edited: Mar 15, 2015
  18. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
Like, it's possible to kind of guess and get it right. I mean, remember the 980 rumors about it being slower than a 780Ti? There were benchmarks for that too. For every rumor that's correct, there are like 30 that are wrong. And like I said in that post, when you're comparing a non-existent card to another one, it's just dumb.

    Also, the 390X isn't even slated until a June release now. Do you really think they had a finished model to benchmark back in December, with a driver actually indicative of the card's performance? Because honestly, if they did, and they still haven't released this card, then AMD is worse managed than I originally imagined.
     
  19. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    If the card was ready to be benchmarked in December with a proper driver, then waiting for a June release would indicate that AMD is now in worse shape than they were before from a management perspective.

    By the time R9 390X launches, it will be competing with products NVidia will have had on the market for roughly 9 months.
     
  20. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Yes and no.


Btw, I don't remember any rumor saying the GTX980 would be slower than the 780Ti; most said on par or slightly faster.
     