3DMark Time Spy released

Discussion in 'Benchmark Mayhem' started by TDurden, Jul 14, 2016.

  1. Han2K

    Han2K Master Guru

    Messages:
    306
    Likes Received:
    6
    GPU:
    MSI GTX1080 GX
    Not bad at all.

    [image: Time Spy result]
     
  2. newls1

    newls1 Master Guru

    Messages:
    260
    Likes Received:
    24
    GPU:
    RTX 4090
    SSE has been with us since at least the PIII days... his Phenom II has SSE.
     
  3. Bardock

    Bardock Guest

    Yes! Just found it myself actually, but thanks for quick reply.

    3DMark's minimum requirements call for a 1.8 GHz dual-core CPU with SSSE3 support, and my Phenom does not have SSSE3 instruction support.

    Thanks again tho.
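    For anyone else hitting this, you can check at runtime whether your CPU reports SSSE3 before buying the benchmark. A minimal sketch using the GCC/Clang x86 builtin (the function name here is just for illustration):

    ```c
    #include <stdio.h>

    /* Returns nonzero if the running x86 CPU reports SSSE3 support,
       the instruction set Time Spy's minimum requirements list. */
    int has_ssse3(void) {
        __builtin_cpu_init();                 /* populate CPU feature info */
        return __builtin_cpu_supports("ssse3");
    }

    int main(void) {
        printf("SSSE3: %s\n", has_ssse3() ? "supported" : "not supported");
        return 0;
    }
    ```

    On a Phenom II this should print "not supported", since that chip stops at SSE4a without SSSE3.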
     
  4. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    His Phenom II is lacking SSE4 instructions. I've seen quite a few games not running at all because of that problem, so I thought there was a connection.
     

  5. Agonist

    Agonist Ancient Guru

    Messages:
    4,284
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
    I have my 280 @ 1120/1400

    What's weird is I have no voltage options in Afterburner even though this is a Black Edition XFX 280.

    Guess I need to make a bios for it.

    I used to have a reference Sapphire 7950 boost with a custom bios @ 1300/1500.

    Yes, it's time to snag a $199 XFX 4GB RX 480 and use it till high-end Vega drops.
     
  6. jonerkinsella

    jonerkinsella Guest

    Messages:
    1,860
    Likes Received:
    0
    GPU:
    rx480
  7. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
  8. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    Maxwell cards really benefit from Async enabled:

    [images: Time Spy results with Async on/off]
     
    Last edited: Jul 16, 2016
  9. Han2K

    Han2K Master Guru

    Messages:
    306
    Likes Received:
    6
    GPU:
    MSI GTX1080 GX
    You can't deny that Maxwell doesn't support Async Compute as stated. If it did, how do you explain the performance drop?
     
  10. GenClaymore

    GenClaymore Ancient Guru

    Messages:
    6,067
    Likes Received:
    52
    GPU:
    3070 TI 8GB

  11. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    I can't remember the exact wording, but I think the Oxide dev explained that while Maxwell cards do have support for async compute, they are unable to use it and gain performance; if anything, there is a risk of performance loss.

    I just tried with and without, and I gained 50 on the graphics score by disabling async compute.
    Nearly a year down the line, I'm sure some of us can give up on this ever being a feature on Maxwell cards.

    Not the end of the world though; performance is still great with this benchmark and competitive considering what I paid.

    Just a pity DX12/Vulkan games are not this well optimised.

    Not much between our two systems, just my CPU starting to show its age.

    3DMark Score: 6512
    Graphics Score: 7335
    CPU Score: 3982

    http://www.3dmark.com/compare/spy/64011/spy/61196
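    As a side note, the posted sub-scores reproduce the overall score: the Time Spy total appears to be a weighted harmonic mean of the graphics and CPU scores. The 0.85/0.15 weights below are inferred from the posted numbers, so check Futuremark's technical guide for the official formula:

    ```c
    #include <stdio.h>

    /* Overall Time Spy score as a weighted harmonic mean of the
       graphics and CPU scores (weights 0.85 / 0.15, inferred). */
    double timespy_score(double graphics, double cpu) {
        return 1.0 / (0.85 / graphics + 0.15 / cpu);
    }

    int main(void) {
        /* Using the sub-scores posted above. */
        printf("%.0f\n", timespy_score(7335.0, 3982.0));
        return 0;
    }
    ```

    Plugging in 7335 and 3982 gives roughly 6512, matching the posted total; the harmonic mean explains why a weak CPU drags the overall score down more than a simple average would.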
     
    Last edited: Jul 16, 2016
  12. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    Indeed. Maxwell cards are good performers overall. It's a shame they won't get any better at this point: they're at their performance peak, and async, as you can see above, isn't helping at all.

    Considering how badly Kepler cards have been performing lately (Vulkan and DX12), Maxwell is still standing.
     
  13. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Or it's just not enabled properly yet. NV didn't say anything about it with the r367 branch, so... you can continue with your conspiracy theory.
     
  14. seaplane pilot

    seaplane pilot Guest

    Messages:
    1,295
    Likes Received:
    2
    GPU:
    2080Ti Strix
  15. cad cam man

    cad cam man Master Guru

    Messages:
    614
    Likes Received:
    13
    GPU:
    MSI RTX 4090
    Last edited: Sep 1, 2016

  16. deathfrag

    deathfrag Guest

    Last edited by a moderator: Jul 18, 2016
  17. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB

    So what happened there?
     
  18. Hellraiser

    Hellraiser Member

    Messages:
    42
    Likes Received:
    2
    GPU:
    2070 Super
    Score: 3796 with AMD Radeon R9 290 (1x) and AMD FX-8350
    Graphics Score: 3924
    CPU Score: 3204
    3dmark.com/3dm/13336973
    Worst motherboard on the planet :( ASRock 970 Extreme4
     
  19. robintson

    robintson Guest

    Messages:
    423
    Likes Received:
    114
    GPU:
    Asus_Strix2080Ti_OC
    Driver: 368.81
    GTX 980M @ 1.2 GHz, i7-4860HQ

    Time Spy Result: 3296
    Graphics score: 3223
    CPU Test: 3790


    http://imgur.com/a/txXPP
     
    Last edited: Jul 17, 2016
  20. BrimStone101

    BrimStone101 Guest

    Messages:
    39
    Likes Received:
    0
    GPU:
    16 gig
    messed up

    Well, nothing changes.
    http://steamcommunity.com/app/223850/discussions/0/366298942110944664/


    What he said was interesting. I'm not sure of any of it:

    If you guys made Time Spy use Async Compute to overlap rendering, you play right into Pascal's hand. It does not have real parallel execution, but it can fast context switch, with preemption and dynamic load balancing to improve its shader utilization when it's underused. At 1080p and 1440p it is very likely not at 100% utilization, hence the very small gains. People find the gains drop to almost zero at 4K because all the shaders are already being utilized.

    If you guys had actually used a real parallel, multi-engine approach, you would see major gains across the board for all GPUs capable of this in hardware, regardless of whether it's a low-shader-count GPU like the 380/X or even the new RX 480, which only has 2304 shaders vs the Fury X's 4096.
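    The utilization argument in that quote can be sketched as a toy model (everything here is illustrative, not how any real GPU scheduler works): async compute only adds throughput when the graphics workload leaves shader units idle, so the gain shrinks to zero as occupancy approaches 100%, which is the pattern people report at 4K.

    ```c
    #include <stdio.h>

    /* Toy model: a GPU with `shaders` units, of which `busy` are
       occupied by graphics. Async compute can only fill the idle
       remainder, so its throughput gain is capped by idle capacity. */
    double async_gain(int shaders, int busy, int compute_work) {
        int idle = shaders - busy;
        int absorbed = compute_work < idle ? compute_work : idle;
        return (double)absorbed / shaders;  /* fractional gain */
    }

    int main(void) {
        /* ~80% busy (lower resolution): async fills idle units. */
        printf("80%% busy:  +%.0f%%\n", 100.0 * async_gain(2304, 1843, 600));
        /* 100% busy (4K): nothing left to fill, gain is zero. */
        printf("100%% busy: +%.0f%%\n", 100.0 * async_gain(2304, 2304, 600));
        return 0;
    }
    ```

    The shader counts match the RX 480 figure from the quote; the busy/compute numbers are made up purely to show the two regimes.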
     
    Last edited: Jul 17, 2016

Share This Page