Review: Nvidia GeForce GTX 1080 Founders Edition graphics card

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 17, 2016.

  1. KFBR392

    KFBR392 Member

    Messages:
    34
    Likes Received:
    0
    GPU:
    RTX 3090
    :/ That's not what async is meant to do. Some computations need/HAVE to be done serially; there, async will make no difference whatsoever (and you're right that Nvidia is extremely efficient there, no argument). Then there are computations that can be completed asynchronously, meaning they don't depend on the computation ahead of them. If, like Maxwell, you don't possess the ability to do async compute, then all those computations that could have run concurrently get done serially instead. No matter how perfectly you program for efficiency, the async-capable chip will always outperform; I mean, just look at programs that efficiently take advantage of hyper-threading (aka async). As developers become more proficient at applying async, there will be larger and larger disparities between chips of differing async capability.
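    To make the serial-vs-independent distinction concrete, here's a minimal Python sketch (purely illustrative, nothing GPU- or NVIDIA-specific; `step`, `run_chain`, and `run_independent` are made-up names): a dependency chain can't be parallelized no matter the hardware, while independent tasks can safely overlap.

```python
from concurrent.futures import ThreadPoolExecutor

def step(x):
    # Stand-in for one unit of work.
    return x + 1

def run_chain(x, n):
    # Dependent chain: each step consumes the previous result,
    # so it is inherently serial; concurrency can't help here.
    for _ in range(n):
        x = step(x)
    return x

def run_independent(values):
    # Independent tasks: no task reads another's output,
    # so they can be scheduled concurrently.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(step, values))

print(run_chain(0, 3))            # 3
print(run_independent([10, 20]))  # [11, 21]
```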
     
  2. Corbus

    Corbus Ancient Guru

    Messages:
    2,469
    Likes Received:
    75
    GPU:
    Moist 6900 XT
    Open bench, 2164MHz core, 1500MHz mem, 83 degrees, enough to run Firestrike Extreme at least. That's what some site showed; believe it or don't. Thing is, with custom air cooling I bet you can easily reach 2200MHz.
     
    Last edited: May 17, 2016
  3. Aelders

    Aelders Guest

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    Hyper-threading isn't the same as async compute; hyper-threading is the same as "async shaders" only in that they're both marketing terms.

    simultaneous multi-threading has little to do with async compute

    Also, funnily enough, in AotS async compute is used for rendering tasks, which means you're explicitly synchronized with the graphics engine, so you're wrong on that too.

    That's ridiculous; GPUs do not work serially. Maxwell's problems with async are not as you describe, and it is capable of async compute. I know because it works in CUDA.
     
    Last edited: May 17, 2016
  4. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Monster of a card, I really like;

    1. Noise level (surprised me that the Fury X is a noisy bitch).
    2. Power consumption.
    3. 1080P performance!
    4. 2K performance! This is its ideal use, I think.
    5. 4K very good, despite not quite being the 60fps all-round we're looking for.
    6. Nice overclocking, but pointless right now.
    7. Really does beat the GTX980Ti and Titan X!

    What I didn't like;

    1. Price: £619 is a lot, especially considering a GTX980Ti is around £490 now.
    2. Heat! Was expecting it to run a fair bit cooler, especially after the JHH presentation.
    3. Overclocking: didn't quite hit 2100MHz.

    Definitely one to watch out for once prices come down and settle. This does make the GTX980Ti look well-priced atm, but I would rather have a little more "future-proof" BS, just not for £130 extra.

    All-in-all, an awesome card and great for bringing in the negative AMD trolls.
     

  5. Aelders

    Aelders Guest

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    The AMD subreddit is bitching about Guru3D being biased because the Fury X is missing from some benchmark graph.
     
  6. Aelders

    Aelders Guest

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
  7. qgshadow

    qgshadow Guest

    Messages:
    109
    Likes Received:
    0
    GPU:
    GTX 1080 SLI 2560x1440
    Yeah, maybe for 1 minute; after that it starts throttling. EK posted a picture on their Facebook of a 1080 with a waterblock, and it shows the card power throttled, unfortunately.

    Have to wait for better 3rd party cards.

    So much for the Founders Edition, which is bad and overpriced compared to other vendors :/
     
  8. davido6

    davido6 Maha Guru

    Messages:
    1,441
    Likes Received:
    19
    GPU:
    Rx5700xt
    lol, it's not turned out as epic as I thought, bit of a letdown really
     
  9. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Most anyone AMD-related likes to bitch. We should be used to it by now.
     
  10. Aelders

    Aelders Guest

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    Watch these videos


     

  11. Undying

    Undying Ancient Guru

    Messages:
    25,478
    Likes Received:
    12,884
    GPU:
    XFX RX6800XT 16GB
  12. Aelders

    Aelders Guest

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    Temperatures, I'm assuming, actually; though high temps will also increase power consumption.
     
  13. KFBR392

    KFBR392 Member

    Messages:
    34
    Likes Received:
    0
    GPU:
    RTX 3090
    Okey doke, so you are telling me that GPUs are only capable of carrying out independent instructions?
     
  14. morbias

    morbias Don TazeMeBro

    Messages:
    13,444
    Likes Received:
    37
    GPU:
    -
    Did anyone notice that there are traces on the PCB for an extra power phase? I wonder what that's about.
     
  15. Aelders

    Aelders Guest

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    Nope, I hadn't noticed that; I did notice space for another power connector.


    No
     

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    It throttled even with water cooling, so it's the power delivery or an internal (software) power limit.
     
  17. Aelders

    Aelders Guest

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    Where did you find the watercooling tests?

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/30.html
     
    Last edited: May 17, 2016
  18. jura11

    jura11 Guest

    Messages:
    2,640
    Likes Received:
    707
    GPU:
    RTX 3090 NvLink
    Hi HH

    Can you try to do some compute benchmarks, like IRAY or the Otoy Octane Benchmark and LuxMark?

    I would very much appreciate it.

    Regarding the performance: it looks OK, just not quite what I expected; I expected a bit more. Mainly, those temps are the same as my Titan X with the stock blower, and that's a bit shocking to me; I expected something in the 70C region, not 82C. And the price: the NVIDIA milking machine is back. £619 for this card is a bit too much, even though it beats the Titan X in every game. I'd wait a bit for the compute tests, though; I personally don't game and rather use my GPU for Octane and IRAY rendering.

    Thanks, Jura
     
  19. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    I guess, if board partners wanted to add a second 8-pin or even a supplemental 6-pin just for overclocking.

    They did the same thing with the 750 (not the Ti); there were pinouts for a 6-pin molex adapter.
     
  20. KFBR392

    KFBR392 Member

    Messages:
    34
    Likes Received:
    0
    GPU:
    RTX 3090
    So then there are instructions that the GPU has to perform in successive order because they are dependent?
     
