Radeon R9 295X2 Review : VS TITAN-Z

Discussion in 'Videocards - AMD Radeon' started by DGLee, Apr 8, 2014.

  1. DGLee

    DGLee Guest

    Messages:
    44
    Likes Received:
    2
    GPU:
    AMD Radeon HD 5970 2GB
    A couple of days after NVIDIA announced its newest and finest graphics card, the GeForce GTX TITAN Z, AMD confirmed that a successor to the Radeon HD 7990 was en route to launch. Its code name may sound familiar: "Vesuvius" originally designated the volcano in Italy that destroyed the then-city of Pompeii, but now it denotes AMD's most powerful and outrageous (in every sense) graphics card. Following the current nomenclature, it has been named the R9 295X2.


    The R9 295X2 carries two Hawaii XT GPUs. There were rumors that it would use two Hawaii Pro chips instead, for several reasons that all converge on one criterion: power efficiency. The R9 290X, based on Hawaii XT, draws significantly more power than its Hawaii Pro sibling, the R9 290, while the latter easily delivers 90 to 95% of the former's performance. Simply put, Hawaii Pro is more power efficient, and given the Total Board Power (TBP) limitation, pairing two Hawaii Pro GPUs seemed the sounder choice. But AMD chose otherwise: not only did it equip the new product with the hungrier GPU, it actually overclocked it, producing the world's hungriest graphics card. The official TDP specification of the R9 295X2 is 500W. This is the necessary evil of being the world's fastest gaming gear, especially for a company whose fastest GPU cannot easily beat the competitor's in power efficiency.


    Anyway, NVIDIA has announced that the TITAN Z will deliver 8 TFLOPS of single-precision compute, which implies a considerably lower operating speed than current single-GPU flagships such as the GTX 780 Ti or GTX TITAN Black, so there is no doubt the R9 295X2 will retake the "fastest single card" title. The question to focus on is by how much the R9 295X2 bests the TITAN Z. Today, alongside disclosing the performance and other characteristics (power consumption, thermals, etc.) of the R9 295X2, I will try to answer that question by "simulating" a TITAN Z: lowering the operating speed of a pair of GTX 780 Ti cards to fit within 8 TFLOPS at base clock.


    (I also made a YouTube clip as a quick summary, but I'm not sure whether linking the URL violates forum rules.)


    The test system configuration is as follows.

    [​IMG]


    Note that every Kepler-based SKU in this table has both a "base" and a "boost" operating speed. Since the FLOPS figure is derived from the base clock, fitting a GTX 780 Ti into 4 TFLOPS (8 TFLOPS for the pair) means its base clock drops to 711MHz while the boost behavior remains unchanged. On the GTX 780 Ti, the maximum boost varies with the ASIC quality of the GPU and is roughly +150MHz. So our "simulated" TITAN-Z actually operates between 711 and 861MHz and thus yields 8 to 9.7 TFLOPS rather than a solid 8 TFLOPS. In other words, the actual TITAN-Z will most likely be slower than what we demonstrate here.
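
    For reference, the FLOPS arithmetic behind the simulation can be sketched like this (a rough calculator assuming the GTX 780 Ti's 2880 CUDA cores and two floating-point operations per core per clock; any small deviation from the figures above comes from clock rounding):

```python
def sp_tflops(cuda_cores: int, clock_mhz: float) -> float:
    """Single-precision TFLOPS: cores x 2 ops (FMA) per clock x clock rate."""
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

# GTX 780 Ti (full GK110): 2880 CUDA cores.
base = sp_tflops(2880, 711)   # underclocked base used for the simulation
boost = sp_tflops(2880, 861)  # base + ~150MHz max boost

print(f"per card: {base:.2f}-{boost:.2f} TFLOPS")
print(f"pair:     {2 * base:.2f}-{2 * boost:.2f} TFLOPS")
```

    The pair lands at roughly 8.2 TFLOPS at the 711MHz base, climbing toward 10 TFLOPS when the boost holds, which is why the simulated card is, if anything, optimistic about the real TITAN-Z.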


    Well, let's torture it. Here are the performance summaries for each scenario (resolution × AA).

    [​IMG]

    [​IMG]

    [​IMG]

    [​IMG]

    [​IMG]

    [​IMG]


    Power consumption & thermal characteristics:

    [​IMG]

    [​IMG]


    So far we have seen all the test results, and we may derive the following pros and cons:


    <Pros>

    - The world's fastest single graphics card.
    - Almost as fast as a pair of R9 290X or GTX 780 Ti cards.
    - Significantly less noise than the R9 290 / 290X, thanks to the liquid cooling solution.
    - No throttling during an over-an-hour test.
    - Scalability: CrossFire-capable and only two slots wide.


    <Cons>

    - Very expensive (yet only half the price of a TITAN-Z).
    - Historic TDP.
    - May be incompatible with some PSUs (each 8-pin connector needs to bear ~20A).
    - May be incompatible with some chassis due to the radiator installation.
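
    A quick back-of-envelope on the ~20A figure (a sketch assuming the official 500W board power, up to 75W drawn through the PCIe slot, and the remainder split evenly across the card's two 8-pin connectors on the 12V rail):

```python
BOARD_POWER_W = 500   # official TDP from the review
SLOT_W = 75           # max power the PCIe slot itself supplies
RAIL_V = 12
N_CONNECTORS = 2      # 8-pin PCIe connectors on the card

per_connector_w = (BOARD_POWER_W - SLOT_W) / N_CONNECTORS
amps = per_connector_w / RAIL_V
print(f"{per_connector_w:.0f}W per 8-pin -> {amps:.1f}A")
```

    That works out to roughly 213W (~17.7A) per connector at stock, well above the 150W an 8-pin connector is officially rated for, and with overclocking headroom the draw approaches the ~20A quoted above. Hence the PSU compatibility warning.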


    Thanks for reading. Have a nice day!
     
  2. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Thanks for the numbers:nerd:

    btw, did you benchmark all of those, or just use some sophisticated calculations?
     
  3. AvengerUK

    AvengerUK Guest

    Messages:
    699
    Likes Received:
    4
    GPU:
    Titan X (Pascal) EK-WB
    I would say nice, but you're not using the latest Nvidia beta drivers, which are golden for 780 Ti SLI.

    Without these the review is not really valid... sorry!

    (I'm not a fanboy trolling btw, the drivers do make a big difference when comparing the two)
     
  4. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,552
    Likes Received:
    609
    GPU:
    6800 XT
    the drivers sure helped in a few select games that were really CPU-bound, but the 295X2 is surely quite a bit faster than 780 Ti SLI at the resolutions where the CPU isn't the bottleneck
     

  5. AvengerUK

    AvengerUK Guest

    Messages:
    699
    Likes Received:
    4
    GPU:
    Titan X (Pascal) EK-WB
    From a review using the new nvidia drivers:

    (AVG)

    [​IMG]
     
  6. kd7

    kd7 Guest

    Messages:
    151
    Likes Received:
    4
    GPU:
    7970m
    that table is a bit confusing because of the "Resolution" in the top left cell.

    Nice review, but from what I saw the 295X2 had equal or better min fps in most games/settings.

    I am surprised at how little GPU reviews have evolved since my "glory" days of university (when I actually had the time to care).

    It's about time people started to include the median instead of just the average which is very sensitive to outliers (high and low spikes) and thus not a very good representation of gameplay by itself.

    The detailed plots of all the frames, where one can easily spot the spikes, are a big plus, but these are usually collapsed into average/min/max fps, which in my opinion don't illustrate gameplay fluidity at all (even if the median were included).
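
    The outlier sensitivity is easy to demonstrate (a minimal sketch with made-up frame-rate samples, not data from this review):

```python
from statistics import mean, median

# Hypothetical per-second fps log: steady 60fps gameplay, plus two
# menu/loading spikes and one hard stutter.
fps = [60] * 20 + [240, 240, 5]

print(f"average: {mean(fps):.1f} fps")   # dragged up by the spikes
print(f"median:  {median(fps):.1f} fps")  # matches what was actually played
```

    The average reports ~73 fps for a run that sat at 60 fps almost the whole time; the median reports 60.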

    If any of you heavy duty benchmarkers wants to give me their raw benchmark data I'll be happy to make a few graphs to show you what I think would be a much better presentation of benchmark data.
     
  7. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Nice post, and it certainly does look that way.

    Unless Nvidia has something planned that we don't know about, I'm not sure how the Titan Z can justify $2k, never mind $3k.

    Would the extra 2GB of VRAM make a difference at 4K resolutions, or are Nvidia planning more optimisations to the DX drivers?
     
  8. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Bad timing with the new beta from nvidia coming out yesterday
     
  9. AvengerUK

    AvengerUK Guest

    Messages:
    699
    Likes Received:
    4
    GPU:
    Titan X (Pascal) EK-WB
    It was cutting it too close for most reviews to use them, true :p

    and I don't see how the Titan Z can justify the massive price tag - AMD would be the obvious choice for a dual-GPU single-PCB card, unless there's something we don't know about the Z
     
  10. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    I can't see people buying the titan z for 3 grand. We may get a much cheaper 790.
     

  11. undeadpolice

    undeadpolice Master Guru

    Messages:
    203
    Likes Received:
    0
    GPU:
    EVGA GTX980 K|NGP|N (SLI)
    Fans would buy it without a second thought.
     
  12. main_shoby

    main_shoby Guest

    Messages:
    1,047
    Likes Received:
    5
    GPU:
    RTX 3080
    Like most people in this thread, I also own 780 Tis in SLI, and personally speaking, I don't think there are games out there that justify this kind of investment in gaming hardware :/
    I mean, surely there is a supply of high-end games, but the supply in this segment is not good enough. Two or three top-notch titles per year isn't plenty.
     
    Last edited: Apr 9, 2014
  13. Deathchild

    Deathchild Ancient Guru

    Messages:
    3,969
    Likes Received:
    2
    GPU:
    -
    Yeah, you're right. In the end you realize it's just a waste of money and the gaming market is big.
     
  14. Veteran

    Veteran Ancient Guru

    Messages:
    12,094
    Likes Received:
    21
    GPU:
    2xTitan XM@1590Mhz-CH20
    +1. That's the reason why many people, including myself, are still on X58; even though I have a 4930K, it's not needed tbh.
     
  15. kd7

    kd7 Guest

    Messages:
    151
    Likes Received:
    4
    GPU:
    7970m

  16. Koniakki

    Koniakki Guest

    Messages:
    2,843
    Likes Received:
    452
    GPU:
    ZOTAC GTX 1080Ti FE
    What worries me the most is the fact that we've come to "accept" and somehow "justify" $3000 and $1500 GPUs. Dual or not. :bang:


    Edit: I also wanted to add that most, if not all, reviews compare the 295X2 with stock 780 Tis. I wanna see them compare two custom 780 Tis.

    Now that would give the benches a different outcome.
     
    Last edited: Apr 16, 2014
  17. AcidSnow

    AcidSnow Master Guru

    Messages:
    438
    Likes Received:
    24
    GPU:
    Sapphire 5700XT
    If the 3xx series is indeed 20nm, then I'd be curious to see how the 390X could perform, because honestly I don't know why anyone would buy a 290X over a 290.
     
  18. Gfxfyle

    Gfxfyle Member

    Messages:
    12
    Likes Received:
    0
    GPU:
    Radeon R9 290X
    I had an idea the numbers would be around this range. Time for me to upgrade! :)
     
  19. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    13,146
    Likes Received:
    1,096
    GPU:
    MSI 2070S X-Trio
  20. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
    I totally agree, but I do see the true 4K hardcore folks with money to burn buying 2 each.

    Man I hope they don't...what a waste of money.
     
