First GeForce GTX 1050 Ti Benchmarks Leak Online

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 30, 2016.

  1. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    wtf are you on?

    Nothing has changed but your perception and, likely, your desire for the highest quality graphics.

    Wait, you're complaining about there being a 50% increase in performance between the 750 Ti and 950, even though the expected improvement would likely have been 10-30%?

    Seriously, wtf are you on?

    Do you notice here how everyone is basically looking at your posts and going "...what? ..how do you even...wtf are you saying? Where is your logic coming from?"

    I ask because, typically, when this happens you should probably rethink the way you think, and likely acknowledge that whatever you're thinking is wrong, as you are seriously not making a bit of sense.
     
    Last edited: Oct 3, 2016
  2. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Probably because you're not making a point. Relatively speaking, performance has far outpaced price. Even without taking inflation into account, what was the price of the 8800 Ultra? $800? What is the performance increase from that to the GTX 1080? Literally, you're making no sense. Who cares if you can buy a used rust bucket for $700? You can do that in the US. 4K isn't the standard yet.
     
    Last edited: Oct 3, 2016
  3. chronek

    chronek Guest

    Messages:
    184
    Likes Received:
    3
    GPU:
    Geforce 980 GTX 4GB gddr5
    You expect to pay more if you get more performance, but my point is a different way of thinking: I expect more from a card for the same price, because the technology has changed and matured and they have had time to develop it. Better performance is now nothing more than the standard we expect from a new card, so why do we have to pay more?
     
  4. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    You're not paying more. You're paying basically the same for a huge performance increase. You're not making sense.
    And seriously DO NOT selectively quote a post and remove the context of the rest of the comment.
     

  5. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    It's ok, let me explain, because obviously you don't have a ****ing clue what entry-level cards were capable of up to the 660/750 Ti (the last good ones). When the Xbox 360 was two or three years old, you could get a card like the HD 4670, with the same total amount of memory as the console, and easily play Xbox 540p games at 1280x1024 (50% more resolution) at 40-60 fps, with no need to turn effects off or turn down textures. Today, to play games from a three-year-old Xbox One at 1440p with full graphics at around 60 fps, you need a $450 GTX 1070, right? Now imagine if the Xbox One had cutting-edge hardware like the 360 did, hahaha; stuff like the 780 Ti and 1070 would have been useless. Yeah, but it's ok to have something like a 10% improvement between a 950 and a 1050, LOL.
     
    Last edited: Oct 3, 2016
  6. chronek

    chronek Guest

    Messages:
    184
    Likes Received:
    3
    GPU:
    Geforce 980 GTX 4GB gddr5

    I used a selective quote because that one sentence shows the difference in thinking.
    Four years ago I could buy a graphics card for $400 and play at high settings; today you need $1200 (SLI). And I do not care if I need 16x more computing power to play new games; I expect a new card to be able to play new games, like it always could.
     
  7. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    You mean like Crysis, almost a decade ago? Where even a $700 8800 Ultra couldn't play it except at reduced resolution and quality settings? You have a distorted view of history. You're still not making a point. A single 1080 can max out just about any game at 1440p, which is a bit above the current standard gaming resolution. Just because we have displays that can run 4K doesn't mean GPU power has kept up. Your complaint that 'you don't care how much extra rendering power it takes and it should cost the same' is bizarre.
     
  8. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    You should quote who you are addressing. Otherwise you seem like you're just raging at nobody in particular.
     
  9. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Again, wtf are you on?

    The Radeon HD 4860 was not $80 when it was released; it was $130 at minimum. And why mention the 4860? I don't understand this; it was such an obscure card that it basically never had an official launch, just a soft launch that most people didn't even notice.

    As well, you're comparing a card that came out 4 YEARS after the Xbox 360 to one that came out 2 and a half years after the PS4.

    And when it comes to actual performance, which you define by what a card "can barely do":

    The Xbox 360 had 240 GFLOPS, while the HD 4860, 4 years newer, had 896 GFLOPS, roughly 3.7 times the throughput.

    The PS4 has 1.84 teraflops, while the GTX 1070, 2.5 years newer, has 6.46 teraflops, roughly 3.5 times the throughput.
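
    For anyone who wants to double-check those ratios, here's a quick back-of-the-envelope sketch (it uses the commonly quoted peak FP32 figures above, which are not the same thing as measured game performance):

    # Rough check of the throughput ratios quoted above (peak FP32 GFLOPS only,
    # not real-world game performance).
    xbox360, hd4860 = 240, 896       # GFLOPS
    ps4, gtx1070 = 1840, 6460        # GFLOPS

    print(round(hd4860 / xbox360, 2))   # ~3.73x -- HD 4860 vs Xbox 360, ~4 years apart
    print(round(gtx1070 / ps4, 2))      # ~3.51x -- GTX 1070 vs PS4, ~2.5 years apart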

    So, if anything, with regard to what you are spewing out, the only thing you actually got right was picking two graphics cards with a similar performance increase over their respective consoles; except you got the price wrong, and you're way off on the gap between console and graphics card. Come back at the 4-year mark and see what happens.

    Now, if you're disappointed because a new card can't "do better" than the console, really? Do you even understand what current console games actually run at? By that I mean the graphics quality difference between the console and PC versions, unless screwed up by the developers/publishers (which is not the fault of the GPU manufacturers), is pretty vast. Many games on the new consoles can barely do 1080p at 60 fps, and often even when a game claims 1080p at 60 fps, it's really a lower resolution upscaled to 1080p at 60 fps. Typically they don't have any AA either, or at most 2x. And the actual graphics quality difference? Most games would put the consoles on their knees if they were run at the highest AA, highest quality, PC-specific settings, and 1440p, which is what you're trying to compare.

    Realistically, it sounds more like you're upset that game developers/publishers often don't treat the PC market the way they should, but what exactly does that have to do with the topic at hand?

    This post very clearly shows me you have absolutely no idea how this stuff works.

    For one, something that would actually work FOR YOU is that 1280x1024 vs 960x540 is not twice the resolution; it's about 2.5 times the pixel count, or roughly 150% more pixels.

    Two, 1440p is nearly twice the pixel count of 1080p (about 1.8x), and despite what you apparently think, just because a card (the 4670) with almost twice the raw performance of the Xbox 360 got you more than twice the resolution, that doesn't mean it would work out the same way here.

    The performance cost does not scale linearly as you go up in resolution.
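
    Here's a quick sanity check of those pixel counts (a sketch only; as noted above, frame cost doesn't scale neatly with resolution, so raw pixel ratios are just a rough proxy for the extra GPU work):

    # Pixel-count ratios behind the resolution argument.
    def pixels(width, height):
        return width * height

    print(round(pixels(1280, 1024) / pixels(960, 540), 2))    # ~2.53x -- 1280x1024 vs 960x540
    print(round(pixels(2560, 1440) / pixels(1920, 1080), 2))  # ~1.78x -- 1440p vs 1080p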

    And you say the same amount of memory? Do you not understand that an Xbox 360 had 512 MB of RAM TOTAL? I mean, for the entire freakin' system? Most computers when the HD 4670 came out had 2-4 GB of RAM, PLUS the 512 MB of RAM on the graphics card, which comes to 2.5-4.5 GB of RAM vs the Xbox 360's 0.5 GB <--- this is very important to your way of thinking.

    And three, it still seems very apparent you are mad at game developers/publishers for their lack of effort on PC, but what does THAT have to do with THIS?

    And lastly, where did 540p come from? Most games ran at 720p, some at 1080p, and very few at lower than 720p... You're grasping at straws here.

    Oh, one more thing, I just gotta say... LOL @ the statement about the Xbox 360 having cutting-edge hardware at its release date. That was a good one... That was a joke, right?
     
    Last edited: Oct 3, 2016
  10. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    You have not been gaming on this type of card either, huh? The 950 was poor compared to the likes of the 660 and 750 Ti, because those cards were far, far more capable of handling the games of their generation. The 950 and reference 960 just weren't as good; it was turn-down-the-settings galore right out of the box.
     

  11. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Seriously dude, wtf are you on? As I said before, if anything has changed it's your perception and your desire for the highest end; nothing else has changed. I'm ending this pointless conversation, as I'm now convinced you are 100% a troll, with statements that have absolutely no depth and claims of quite literally nothing.
     
  12. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    Who said 4860? The HD 4670 is a different card, and it had the same amount of total memory, but let's just say it's a number Nvidia still cannot put on a budget card. And well, I said 50% more resolution thinking of 720p, but that barely changes anything; the 1070 and 980 Ti already struggle at 1080p with full settings in some games. And the 360 GPU didn't have high-end hardware? Because I remember stuff like unified shaders, and games like CoD2, NFS Carbon, and Test Drive Unlimited running like dog **** on the 7800 GTX of the time.
     
  13. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    Spare me the name-calling; otherwise, say what you wish, but it won't change the fact that the 750 Ti and 660 were much more solid gaming cards in their time... try running the latest Tomb Raider with ultra textures and 4x SSAA on the 950; it would choke to death. Nothing has changed, LMAO.

     
    Last edited: Oct 3, 2016
  14. chronek

    chronek Guest

    Messages:
    184
    Likes Received:
    3
    GPU:
    Geforce 980 GTX 4GB gddr5
    I said standards are changing. Maybe 4 years ago playing at Full HD was the "best", but today playing at 4K is normal, and that is not an excuse to increase the price of a graphics card. Look at the TV market: today's 4K TVs cost the same as Full HD TVs did 4 years ago. They do not complain about the increased resolution at all; they keep up with the new technology.

    Btw, Crysis is old; there are more demanding games now.
     
    Last edited: Oct 3, 2016
  15. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    I've gamed on a Riva TNT2, and an ATI card before that. Further back it was 2D. What's your experience going back that far?

    Also, referring to this post of yours: http://www.anandtech.com/show/3909/nvidias-geforce-gts-450-pushing-fermi-in-to-the-mainstream/11

    You said you would accept GTS 450 performance vs a GTX 480. Wake-up call: that performance is crap. Best case: 45 fps @ 1680x1050, and 24 fps @ 1080p, back in 2010.

    Basically, you're a 30-45 fps gamer unless you drop resolution, and even then the latest games will run like crap on your systems. So cards like the GTX 1050 Ti would suit you fine. You should be used to dropping resolution and effect settings by now to make games look like crap so that they're at least playable. It's the market you've always been in.
     

  16. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    4K is not "standard" at all. We are in the early adopter stage at best.

    However, 8k is on the way.

    People like you need to understand resolution and its effect on graphics cards.

    http://www.extremetech.com/extreme/215584-j-display-demos-first-17-3-inch-8k-panel

    Then, there's 11k as well;

    http://www.dailymail.co.uk/sciencet...-super-resolution-display-pixel-overload.html

    More 8k/4k information from February this year;

    http://hometheater.about.com/od/hometheaterglossary/g/8k-Resolution-Definition-And-Explanation.htm

    Quote "4K is just know beginning to make it in the mainstream consumer maket."

    Gaming at 4K is not normal; it's a relatively new thing and is very demanding.

    You budget guys will not be playing at 4K anytime soon. If we're talking 8K, then you're going to be waiting at least half a decade before your budget cards can handle it. On the other hand, if you do decide to buy a 4K screen... enjoy the latency and crap fps, but that kind of gaming experience suits you guys perfectly.

    Finally, show me a 400 mph car for 10 grand...
    Heck, show me a 1600 mph car for 20 grand...
     
    Last edited: Oct 3, 2016
  17. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    1920x1080 is the standard. 4K is niche. Please stop.
     
  18. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    I think he means something more like "if it was the standard".

    And Stormyandcold, your last budget card was the TNT2? No wonder. Also, Stalker was a PC game.

    And Stormyandcold, it's not the market I've always been in; I always used to jump between the budget and mid-high segments. As a matter of fact, 9 months ago I cancelled my 970 order after Amazon refused to ship to a different address than that of the payment card... and you know what? I don't regret it a bit, because my favorite genre is racing games and it runs Forza Horizon 3 like dog****... which brings us back to the point: budget and mid-range Maxwell/Pascal cards are not even catching up with the software. Weak sauce. Obviously budget gamers are gonna be pissed to see a 10% improvement between a 950 and a full-new-node 1050, because it's like Nvidia isn't even trying (to catch up with the software); 10% is more of a **** you to budget gamers, honestly. On a positive note, at least now they are kind of warning people by promoting these cards as hardware meant for ancient games like MOBAs and Counter-Strike, but hey, nothing has changed.
     
    Last edited: Oct 3, 2016
  19. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Where are you getting 10% from? The 3DMark 11 score for the 1050 Ti is 30% higher than the 950's. Most of the node shrink went into power consumption, which dropped to 75 W, which is obviously Nvidia catering to the OEM market, where the vast majority of these budget cards are sold. That's not to mention that you make it sound like it's getting worse. Go look at old benchmarks of the GTX 650: it gets like 30 fps at 1080p in Crysis 2 and Battlefield 3, which is basically what the 950/1050 Ti are aiming for, 30 fps in most modern titles at 1080p.

    You keep bringing this up, and the more I look at it the less I see a problem. In fact, if anything, the pricing makes more sense now than it did with Kepler. A 680 was $100 more for only 8% more performance. It's no wonder most people bought a 670 instead, which is probably why Nvidia changed the pricing in the first place.
     
    Last edited: Oct 3, 2016
  20. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    This whole thread has become stupid; my brain hurts just reading some of this stuff from people.

    Reminds me why I truly believe a lot of people are stupid.
     
