GeForce GTX 1070 / 1080 Founders Edition Explained

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 9, 2016.

  1. isidore

    isidore Guest

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
    i can only say this: :funny:
     
  2. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
Short story: good cop, bad cop.

If you don't like our reference card (bad cop), then you can buy a custom card at the same price that looks better (good cop). Either way there's a single winner. :wanker:
     
  3. Xuanzang

    Xuanzang Master Guru

    Messages:
    250
    Likes Received:
    21
    GPU:
    ASUS TUF 7900 XTX
    I seriously thought the FE would have better overclocking capabilities or something. Oh well. :)
     
  4. kinggavin

    kinggavin Guest

    Messages:
    297
    Likes Received:
    0
    GPU:
    gtx 980TI super jetstream
I think Nvidia knows there are customers who must have the GTX 1080 with that GeForce LED on release day. They can't wait, they have to get the card, and they'll pay a premium for it, so change the name from reference to Founders, add $100, and make lots of money, which is the main goal of a company I suppose. But at the end of the day it's a lot of money to pay for a blower cooler. People say it's a 180W card, but I bet if you put three of these Founders Edition 1080s in SLI they'd heat up pretty quick. Nowadays with any more than two cards, even 150W GTX 970s, you've got to watercool them to stop thermal throttling without very loud fan noise.
     

  5. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,752
    Likes Received:
    1,870
    GPU:
    EVGA 1070Ti Black
Still looks like slot cooling, which I will never buy again, ever.
     
  6. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,666
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
If this is true, then why did they run the card at such high clocks (2.1GHz)?

Was this a shady OC by Nvidia to boost performance, or are the base clock and boost clock wrong?

If Founders cards aren't high-end binned chips, then explain why this card was running so high.

Is 2.1GHz an overclock, or can we expect all 1080s to boost that high? If so, why do they only state clocks of 1600-1800?

I'm still confused: either they OC'd the crap out of it, or all 1080s naturally boost over 2GHz. If that's true, then I still don't understand why they would state the lower 1600-1800 clocks.
     
  7. wsgroves

    wsgroves Guest

So let me get this straight. The reference card with the new cooler is the regular card.
The reference card with the new cooler + vapor chamber is the advanced edition.
Wow, that's like one little fin block in the cooler they're charging $100 for lol.
     
  8. Denial

    Denial Ancient Guru

    Messages:
    14,215
    Likes Received:
    4,132
    GPU:
    EVGA RTX 3080
It's really no different than a 980 Ti.

The Ti has an advertised core clock of 1000MHz and a boost of 1075MHz. In some situations, though, it boosts all the way to 1202MHz, which isn't advertised at all. That's a 20% increase.

The 1080 has a stock frequency of 1607MHz and a boost frequency of 1703MHz. A 20% increase over the base would be 1930MHz, but the 1080 also has GPU Boost 3.0, so it's possible it can boost itself even further if it detects the need for it. For example, that particular Paragon scene they were showing was one character model on a black background; most cards would probably max their frequency for a scene like that.
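The percentages above are easy to sanity-check. A quick sketch using only the clock figures quoted in this thread (real cards will vary):

```python
# Rough boost-headroom arithmetic for the clocks quoted in this thread.
# All figures in MHz; these are forum-quoted numbers, not official specs.

def headroom(base, observed):
    """Percent increase of an observed clock over the base clock."""
    return (observed - base) / base * 100

# GTX 980 Ti: advertised base 1000, observed boosting to 1202
print(f"980 Ti real headroom: {headroom(1000, 1202):.1f}%")  # ~20.2%

# GTX 1080: base 1607; applying the same ~20% headroom:
print(f"1080 at 20% headroom: {1607 * 1.202:.0f} MHz")       # ~1932 MHz
```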
     
  9. iTile

    iTile Member

    Messages:
    23
    Likes Received:
    0
    GPU:
    MSI R9 290 4GB OC Gaming
Well, if the TDP of the GPU is 30-50% lower than the 980's, and it delivers the same or 15-20% more performance than the current 980, then Nvidia's claim of "50% to twice as fast as a 980" is not completely false, but not completely true either. Technically they would be rating it by performance per watt, not "twice the current frame rate" as people assume.
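To illustrate how a "2x" marketing claim can really be a performance-per-watt figure, here's a toy calculation. The numbers below are hypothetical, chosen only to show the mechanism; they are not Nvidia's actual specs:

```python
# Hypothetical example: a modest raw speedup plus a big power drop
# multiplies into a ~2x perf-per-watt figure. Numbers are made up.

old_perf, old_tdp = 1.00, 165.0   # baseline card (normalized perf, watts)
new_perf, new_tdp = 1.20, 100.0   # +20% frames at ~40% lower power draw

perf_per_watt_gain = (new_perf / new_tdp) / (old_perf / old_tdp)
print(f"Raw speedup:        {new_perf / old_perf:.2f}x")   # 1.20x
print(f"Perf-per-watt gain: {perf_per_watt_gain:.2f}x")    # ~1.98x
```

So a card only 20% faster outright can honestly be advertised as "nearly twice" something, as long as that something is efficiency.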

    AMD is doing the exact same thing.

The only difference between the two is that AMD ages like wine (red), and Nvidia like fungus (green).

GTX 780 vs R9 290: the 780 used to walk all over the R9 290...

Nobody talks about these GPUs, but looking at the latest benchmarks, the 780 is 10-20% slower relative to the R9 290 than when they were first released. The R9 290 is clawing at the heels of a 980, and it cost half as much as a 980. Correct me if I'm wrong?

At the end of the day, the PR talk, or as we call it in South Africa "spin doctor talk", is complete BS and should be taken with a grain of salt. The way they hyped up these new GPUs has people foaming at the mouth and arguing over assumptions (assumption is the mother of all f00kups), PR talk, "leaked benches", etc. Like heroin addicts, some can't wait for their next "leak/hit", believing and fighting over half truths.

I'll keep my wallet locked up in a safe in a safe in a safe, safeception, before I upgrade from my current GPU (I do want that H.265 hardware encoding though). But Nvidia's marketing trumps (see what I did there) AMD's, while AMD has been producing hardware that actually saves its customers money by not having to upgrade as often as an Nvidia buyer does.

Conversation is good, hype train is bad. Wait for results from both camps, then decide. Do you want a withering Nvidia GPU that performs out of the box, or a sleeping dragon that needs awakening in Mount Doom?
     
  10. maize1951

    maize1951 Guest

    Messages:
    251
    Likes Received:
    8
    GPU:
    EVGA GTX 1080 8GB
Hopefully they have the memory issue fixed on the GTX 1070, so there isn't another memory fiasco like they had on the GTX 970 cards.
     
    Last edited: May 9, 2016

  11. Denial

    Denial Ancient Guru

    Messages:
    14,215
    Likes Received:
    4,132
    GPU:
    EVGA RTX 3080
Kepler, Maxwell and Pascal were all described as being twice the performance per watt, so anyone who is still confused after nearly 4 years is probably helpless.

You say no one talks about those GPUs, but there are literally 10 posts a day about Nvidia downgrading their older stuff, specifically referring to those GPUs. So yeah, that conversation has already happened. The problem was less that Nvidia was downgrading and more that AMD was upgrading, because there was untapped potential in its cards that was being held back by its drivers.
     
  12. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
This is a very detailed comparison of the 780 Ti and the 290X (2013-2016). There doesn't seem to be much in it.

    http://www.babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/view-all/
     
  13. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,015
    Likes Received:
    1,536
    GPU:
    GB 3080Ti Gaming OC
So the MSI/Asus/EVGA etc. custom cards will be somewhat cheaper? Well, I DOUBT it.

For example, from MSI I could see a "reference" PCB 1080 with a standard Twin Frozr cooler costing ~$100 less than the Founders Edition from Nvidia, but a Gaming Edition, or better yet a Lightning Edition? Those I'm 99.99% sure will cost AT LEAST as much as the FE, and likely quite a bit more...
     
  14. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,666
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
GPU Boost 3.0 is responsible for a 400+MHz increase? On a reference card while only reaching 67C? That's pretty mental if you ask me.

I hope people aren't gonna be really pished off if they get a 1080 that only boosts to 1750 or something.

My 980 G1 boosts to 1400 and my friend's 980 only hit 1329, and he was pished about that (jokingly), and that's only about 70MHz. I guess it will be a lotto then. Good luck guys, I hope you get a beast of a booster and are happy.

TBH I don't like surprises. I want to know the exact numbers before I buy anything. I also think each card should have its real MHz displayed, so there's no lotto.

I've seen the same behavior with CPUs and their reported OC numbers. People buy one expecting a high OC and are hopping mad when they get a dud. Hopefully that's not the case with these GPUs. If there's a big difference between boosts, then I don't think that's very fair.

They'll do the advertised speed of 1700MHz, and if you're lucky you'll get about a GTX 460's worth of extra performance for nothing at 2100MHz, but we won't tell you its real speed, so GL.
     
    Last edited: May 9, 2016
  15. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
Well, considering that the 1080 is going to have GDDR5X memory and the 1070 is going to have standard GDDR5, I see no reason for the memory partition BS.

Well, since the Founders Edition is nothing more than a fancy reference-style cooler, I'm sticking with the 3rd party coolers.
     
    Last edited: May 9, 2016

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Let's face it: even after the explanation, there will be thousands of people buying this just because they're used to the "Founders" name from gaming and will expect something extra.

The usual case of clever marketing.
     
  17. vazup

    vazup Guest

    Messages:
    333
    Likes Received:
    26
    GPU:
    r9 280X
Is a few frames' difference really something to be mad about? Those few MHz likely won't get you a stable 60/120fps if the slightly slower card can't manage it.
     
  18. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,666
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
No, but what I'm getting at is that there was a disparity between two Maxwell cards made by the same 3rd party company. Small, maybe, at only 70MHz. With Pascal, is that disparity going to be similar to Maxwell? Because 1700 > 2100 is a huge disparity.

Like I said, that's almost like two different cards, and there would be a big difference in fps between a 1700 card and a 2100 one.
     
  19. Denial

    Denial Ancient Guru

    Messages:
    14,215
    Likes Received:
    4,132
    GPU:
    EVGA RTX 3080
Without a wider range of samples, we don't know. What we do know is that 16nm FinFET is going to behave differently when it comes to clocks than 28nm does. We also know that Nvidia is changing GPU Boost and how it functions. We know we've been able to get 20% boosts with the 980 Ti, so maybe with 16nm/GPU Boost 3.0 we can get 25% boosts.

Until we have a better sample size of cards hitting ~2100 or not, I don't think we can definitively say what's going on here. That, or just wait for the reviews on the 17th.
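For reference, the 2.1GHz demo clock sits well above even a 25% boost over the 1080's 1607MHz base. A one-liner using this thread's numbers:

```python
# How far over the 1080's advertised base clock the demo clock was.
# Figures in MHz, taken from this thread.
base = 1607   # advertised base clock
demo = 2100   # clock shown in the reveal demo
print(f"Demo clock is {(demo - base) / base * 100:.0f}% over base")  # ~31%
```

So if ~20% headroom was the 980 Ti norm, the demo card was either an outlier or an overclocked sample.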
     
  20. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,666
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
GPU Boost 3.0 probably works off temps and maybe even the ASIC quality of the card. If it's temps, then you're basically gonna want the coldest card available to more or less guarantee a high-clocking card.

It will be interesting to see which 3rd party card comes out on top for air cooling this series.

I think they cherry-picked really high ASIC cards for the reveal.
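If boost really is temperature-driven, the "coldest card wins" idea can be sketched with a toy model. To be clear, this is NOT how GPU Boost 3.0 actually works (Nvidia hasn't published the algorithm); the clocks and temperature thresholds below are made up purely to illustrate the idea:

```python
# Toy sketch of a temperature-driven boost controller. Hypothetical:
# hold max boost while cool, ramp linearly down to base clock as the
# card approaches its temperature target. All numbers are invented.

def boost_clock(temp_c, base=1607, max_boost=2100,
                throttle_start=65, throttle_end=83):
    """Return the hypothetical boost clock (MHz) at a given temp (C)."""
    if temp_c <= throttle_start:
        return max_boost                      # cool: full boost
    if temp_c >= throttle_end:
        return base                           # at target: base clock only
    # In between: linear ramp from max_boost down to base
    frac = (throttle_end - temp_c) / (throttle_end - throttle_start)
    return base + (max_boost - base) * frac

for t in (60, 67, 75, 83):
    print(f"{t}C -> {boost_clock(t):.0f} MHz")
```

Under a model like this, a better cooler directly buys you sustained clocks, which would explain why a reveal card sitting at 67C could hold such high frequencies.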
     
