First GeForce GTX 1050 Ti Benchmarks Leak Online

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 30, 2016.

  1. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    Omg, thanks Nvidia for wasting the latest node tech on dropping TDP by 15 watts... now that is stupid.
     
    Last edited: Oct 3, 2016
  2. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Not to be insulting or anything but I'm going to have to agree with everyone else..

    You are not making any sense, like you're lacking cognitive ability or something.

    You're comparing a 750 Ti to a 950. You should be comparing a 750 to a 950; that is the correct hierarchical comparison.

    Secondly, a 950 is just a non-cut-down 750 Ti with faster clocks. It's the same architecture, so spouting out all this crap makes zero sense.
    AMD has re-branded a series several times.
    What, are you expecting a 100% increase from the same chip?

    Thirdly, pulling this "1050 is 10% faster than a 950" figure out of your ass doesn't make sense either. This card isn't even out.

    If you want to go by 'leaked' specs, the Ti is 40% faster than the 950.
    Comparing the 1050 Ti to the 1050, the 1050 has 20% fewer cores but much higher clocks.
    Guesstimate: 12-15% slower. Putting that into perspective, the 1050 is going to be about 25% faster than a 950.
    In an old-ass benchmark.
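
    A quick sketch of that arithmetic, using only the estimates above (the 40% and 12-15% figures are leaked-spec guesses, not measured results):

    Code:
    # Relative-performance guesstimate from the figures quoted above.
    gtx950 = 1.00                  # baseline
    gtx1050ti = gtx950 * 1.40      # "the Ti is 40% faster than the 950" (leaked specs)

    # 1050 guessed at 12-15% slower than the 1050 Ti
    for deficit in (0.12, 0.15):
        gtx1050 = gtx1050ti * (1 - deficit)
        print(f"1050 vs 950 at {deficit:.0%} deficit: {gtx1050 / gtx950 - 1:+.0%}")
    # prints roughly +23% and +19%, i.e. "about 25% faster" is in the right ballpark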

    You're forgetting that newer architectures will be faster in specific workloads tailored for them, i.e. DX12/Vulkan.

    So at this point you're making yourself look like a clueless raging guy on the internet.
     
  3. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    And increasing performance by 30%..

    You're also completely ignoring that AMD basically did the same thing with the RX 460, which means it's pretty obvious that OEM feedback is calling for lower-power devices in that segment. Probably because the vast majority of customers buying those cards want smaller, cooler desktops and don't care as much about gaming performance.

    The problem is you seem to think that the world revolves around what you want and everyone should just cater to that. If you want more performance, spend literally $50 more and get an RX470 or a 1060.
     
  4. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950



    I'm comparing that way because those are/were the budget cards to get, you know, the relevant stuff... this kind of thing should be a given honestly :rolleyes:, as is the (obviously) estimated 10% difference between an (obviously) tuned 950 and the 1050.


    The way I see it, it's more like Nvidia thinks gaming revolves around its hardware... they need to up their game, especially in the entry and mid-range segments, and stop making the slowest cards they can.
     
    Last edited: Oct 3, 2016

  5. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    LMAO, "almost" buying something doesn't count.

    I almost bought the Playboy Mansion, but there wasn't enough money in the bank. True story, bro.

    If they stopped, then you wouldn't have anything to buy, LMAO.
     
    Last edited: Oct 3, 2016
  6. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    I think the performance of these little cards is great and I really want a 1060 but my 960 is holding up very well for my needs....

    Back in the day you needed at least an xx8+ or xx8x+ card to even play well, although the old 5770 was a good card. Nowadays an xx6 can do wonders as an upgrade, without even needing a PSU upgrade.
     
  7. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    The 970 I technically did buy; Amazon gave me my money back and I'm grateful, it saved me from what would have been a mistake worth 28 days of salary around here, which is now exactly the cost of the 1060. Now don't take me wrong, I can deal with expensive stuff, but it really has to earn my money. I'm not exactly living in a throw-away society, and this kind of "improvement" is lame.
     
    Last edited: Oct 4, 2016
  8. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    Comparisons are always interesting. Let's say you are a Far Cry fan: the $160 5770 would have given you 42 fps at close to twice the console resolution in Far Cry 2 (921,600 pixels vs 1,764,000 pixels), with 4x MSAA to boot:

    [Image: Far Cry 2 benchmark chart]


    But what is the successor, the R7 370, doing in Far Cry 4 under the same conditions, max settings (close to twice the console resolution, 2,073,600 pixels vs 3,686,400 pixels)? 21 fps with no MSAA at all. Of course the 370 might have been a bit cheaper, but even more expensive cards like the 960/380 only get about halfway to what the 5770 did:

    [Image: Far Cry 4 benchmark chart]


    See, that's the kind of thing I'm talking about when I say bang-for-buck versus consoles is at its lowest ever, and things like the 1050 are dragging us even deeper. Even worse, some people will still defend them :eek:
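
    For the curious, the pixel counts above work out like this (the specific resolutions are my inference from the numbers; the post only gives raw pixel counts):

    Code:
    # Pixel counts quoted above and the resolutions they appear to correspond to
    # (resolutions inferred; only the raw pixel counts are given in the post).
    resolutions = {
        "console Far Cry 2 (1280x720)":  1280 * 720,    # 921,600
        "PC chart (1680x1050)":          1680 * 1050,   # 1,764,000
        "console Far Cry 4 (1920x1080)": 1920 * 1080,   # 2,073,600
        "PC chart (2560x1440)":          2560 * 1440,   # 3,686,400
    }
    for name, pixels in resolutions.items():
        print(f"{name}: {pixels:,} pixels")

    # 1,764,000 / 921,600 ~ 1.91x and 3,686,400 / 2,073,600 ~ 1.78x,
    # so both comparisons sit at "close to twice" the console pixel count.
    print(round(1_764_000 / 921_600, 2), round(3_686_400 / 2_073_600, 2))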
     
  9. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    You are comparing apples to mangoes (because oranges are too close).
     
  10. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Are we still feeding this troll? He keeps going over and over about nonsense, comparisons that have absolutely nothing comparable, and no matter what is said to him he can't seem to get rid of his delusion. I'd suggest we stop feeding the troll.

    BTW, just a tip to the troll, though you didn't get it last time: if you're going to compare a console to a graphics card, you might want to compare similarly aged items. For instance, you just compared two cards to two different consoles, except you decided to make it very easy for your "delusion" to make sense by comparing a card that was 4 years newer than the Xbox 360 against a card that was 2 years newer than the Xbox One/PS4. Not comparable. If you wait until next year's cards, sure, then you can compare them. So next time you try shoving your delusion down others' throats, try to actually compare logically. But considering this is the 2nd time you've done this exact thing, I doubt you'll listen.
     
    Last edited: Oct 4, 2016

  11. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    We are just feeding him rope.
     
  12. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Nerds Rope? I want Nerds Rope...
     
  13. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    Yeah, lol, how dare I compare a budget card, in a contemporary game, to its successor, also in a contemporary game. Stupid jokes from you and your friend aside, this is valid and real.


    P.S. Oh wait, by those charts the 980 isn't really that much better than the 5770 (relatively speaking, because one has to explain everything before someone's head explodes). Now you know what people are talking about when they say Nvidia is now passing off mid-range chips as "high end". If this isn't going backwards, then I don't know what is :heh:
     
    Last edited: Oct 4, 2016
  14. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Yeah, except for the fact that one chart has a 980 Ti, a $650 card, running at 53 fps, and the other chart has a $380 card running at 70 fps. So either Nvidia/AMD are gimping every single card at every price point, or Far Cry Primal is relatively more demanding on current cards than Far Cry 2 was on those cards.

    If you really wanted to show the disparity between the previous and current generation, you'd use a TFLOP-per-dollar comparison between a mid-tier/high-end card of yesteryear and the same pairing today.

    I did the comparison between a 460/480 and a 1060/1080 - the 460 is ~20% more valuable relative to a 480 than the 1060 is relative to a 1080. So technically you're correct, there has been a ~20% reduction in the perf/$ value of the x60 lineup compared to the x80 lineup. That being said, there are a ton of factors that lead into that, other than Nvidia just wanting to create space. The OEM market is a big one, and I already talked about that - OEMs definitely weigh the pros/cons of needing to add a 6-pin connector, or TDP values, when designing cases/machines. I guarantee you that a lot of the reason for Nvidia wanting to hit that 75W TDP value is OEM feedback, which is basically just feedback from the OEMs' customers. Basically, there are a ton more people wanting smaller machines/lower TDP than people who want more performance but won't spend the extra $50 for a 470/1060 - which not only makes sense from an OEM perspective, but makes sense to me in general. If you want more performance, just spend another $50.

    The other factor is the increasing number of competitive video game scenes that generally have lower graphics requirements. League/Dota/CS:GO/Overwatch all run fine on current-gen mid-range hardware. There is a reason why AMD specifically markets the RX 460 to esports. It's basically become a sub-market within the overall gaming one, where Nvidia/AMD are attempting to deliver products capable enough to play those titles while keeping TDP/die size down (cost savings). Probably because they figure anyone wanting to play more demanding titles is, again, capable of shelling out an extra $50 for a performance-oriented card.

    You also have the issue of diminishing returns in games. Nowadays, simply adding more polygons to a scene isn't going to dramatically increase the visual fidelity of the game. You need to target higher resolutions/more complex lighting/etc. Due to this, more modern games take far bigger frame-rate hits when operating at "Ultra" settings, which is generally where most benchmarks measure. Simply moving two or three settings from Ultra to High generally doesn't impact the quality of the game, but significantly increases the performance of cards, especially mid-range ones.

    Regardless, there are a ton of factors at play and there really is no easy way to compare it. The best way I can think of is, like I said, a TFLOP comparison - which only shows a 20% reduction in value. I don't consider that number very meaningful and I'm sure it fluctuates up and down with every release.
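
    Roughly, the TFLOP-per-dollar comparison I mean looks like this (the TFLOP and launch-price figures are approximate public numbers, used only to illustrate the method; the exact result shifts a few percent with different assumptions):

    Code:
    # Rough TFLOP-per-dollar comparison of the x60 vs x80 of two generations.
    # Specs and launch prices are approximate figures, for illustration only.
    cards = {
        "GTX 460":  {"tflops": 0.91, "price": 229},
        "GTX 480":  {"tflops": 1.35, "price": 499},
        "GTX 1060": {"tflops": 4.4,  "price": 249},
        "GTX 1080": {"tflops": 8.9,  "price": 599},
    }

    def gflops_per_dollar(name):
        card = cards[name]
        return card["tflops"] * 1000 / card["price"]

    # How much better value the x60 is than the x80, per generation
    old_ratio = gflops_per_dollar("GTX 460") / gflops_per_dollar("GTX 480")
    new_ratio = gflops_per_dollar("GTX 1060") / gflops_per_dollar("GTX 1080")
    print(f"460 vs 480: {old_ratio:.2f}x, 1060 vs 1080: {new_ratio:.2f}x")
    print(f"relative drop in x60 value: {1 - new_ratio / old_ratio:.0%}")  # ~20%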
     
  15. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Just goes to show that even back then you got what you paid for, and that games are getting more and more demanding, especially when you move to 2K resolution and above.

    Not much has changed. If anything, consumers like yourself are even more demanding than the latest games. What's clear, though, is that you aren't going to be playing any of the latest games. It's OK, you can always watch real gamers play on Twitch and YouTube.
     

  16. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    Well, it is nice to see all those arguments, but what matters most in the end is the experience in the latest games, and these are still designed around consoles. Consoles didn't launch yesterday, didn't go high-end hardware-wise, and didn't get pricier, and yet these last two generations of budget cards are turning PC gamers into experts at dialing settings down, no joke.

    I have another comparison; this time the dark blue bars in the DE:MD table are for high settings. Double the console resolution, latest budget cards, latest Deus Ex game - the 560 blows the 1060 out of the water. You can make lots of fair arguments about why this comparison isn't scientifically accurate, but what really matters to a gamer at the end of the day is the performance in the games he wants to play, and these latest budget cards seem to be all about turning settings down, and paying more, and more, and more. This can't be good.

    [Images: Deus Ex: Human Revolution and Deus Ex: Mankind Divided benchmark charts]
     
  17. EspHack

    EspHack Ancient Guru

    Messages:
    2,799
    Likes Received:
    188
    GPU:
    ATI/HD5770/1GB

    I see a similar drop in performance for high-end cards versus their predecessors; you're just demonstrating that games have become harder to run OVERALL.

    If anything, this Pascal generation has brought more improvement to mid-range cards than ever; the 1060 is badass and the 1050 is supposedly beating a 960. The 960 and 950 were meh.

    Release dates for games and for AMD/Nvidia cards aren't aligned at all, so it makes little sense trying to compare that way.
     
  18. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Monchis: A crucial factor that you're not accounting for is the convergence of console hardware, from the X360 onwards, with PC hardware. This will lessen the performance difference until there are a few generational leaps in graphics advancement on the PC side.

    Also, in relation to your console vs PC comparison: Human Revolution is a 30 fps game on consoles. The lowest card in that list, the HD 6670, already surpasses what the consoles could do with HR. Moving to Mankind Divided, the X1 and PS4 are now really PC hardware, so the differences are going to be smaller than in the past. DE:MD is again only a 30 fps game on consoles (drops to 24 fps, I believe). Again, the lowest cards on the list (780 Ti and 380X) already surpass what consoles can do, at much higher fidelity.

    Also, at Ultra settings the PC version far surpasses the console versions. What we are seeing is in fact higher-quality graphics settings and much higher resolution, as well as higher frame rates than consoles are capable of.

    Another thing to bear in mind in your cherry-picked comparisons: the GTX 1060 is not a 2K gaming card. The difference in resolution in this comparison is 2x.

    If you really want to see how far we've come in five years, then you need to pick one game at the same resolution and measure fps on the old card versus the new card.

    Choosing a game series like Far Cry 2 vs Far Cry Primal, or DE:HR vs DE:MD, means nothing. If devs wanted to, they could make a game that would only run at 30 fps on a GTX 1080. Your mistake is that you're blaming the graphics card vendors for this low performance when in actual fact there are a number of factors. DE:MD is not a good example of a well-optimised PC game either.

    The technology our CPUs and GPUs are made with is also close to being maxed out. This is the reason the push has been for more cores, but, as already stated, this gives diminishing returns.

    You will see fewer and fewer leaps forward over the next decade compared to the past. If I said there's only about 10x more power left beyond the Pascal Titan before a producible GPU core is maxed out in terms of processing power for a given size and power budget, I think it'd be pretty close to the truth. Within ten years we will have pretty much fully maxed out this technology and will need another way to move forward.

    Of course, once the tech has finally reached its full peak, then prices will start dropping. However, your comparison methods in this thread probably won't be workable five years from now, especially in terms of jumps in resolution; it just takes too much processing power and electrical power. I would be very surprised if we're talking about mainstream 8K gaming in 2021. I think the situation will be worse than 4K today, with cards really struggling to render 60 fps at 8K. Perhaps people will settle for 30 fps at 8K, which is much less demanding.
     
  19. chronek

    chronek Guest

    Messages:
    184
    Likes Received:
    3
    GPU:
    Geforce 980 GTX 4GB gddr5
    No, it will not drop, because there is no real competition... Prices will go up with new cards; it doesn't matter if they end up costing as much as a new car, they have the excuse that it's some percentage better than last gen.
     
  20. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    The GTX 280 launched at $650 in 2008; that's about $730 now - yet the GTX 1080 is only $600 and it has basically zero competition.

    I get the general idea that lack of competition will increase prices, but there is definitely more to it than just competition. Pricing fluctuates all the time due to a wide variety of factors that people tend to ignore. I personally don't think Nvidia's pricing is that unfair at the moment and I'm pretty happy with the money I spent on my 1080.
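
    For reference, that inflation adjustment is roughly this (the cumulative CPI figure is an approximation, not an official number):

    Code:
    # $650 in 2008 dollars, adjusted with an approximate ~12% cumulative
    # US CPI change between 2008 and 2016.
    print(round(650 * 1.12))   # ~728, i.e. roughly the $730 quoted above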
     
