Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 5, 2019.
Good post, that seems pretty clear to me.
You probably didn't read what I wrote either, which is fine, I don't blame you:
I believe I've explained it well. Stating that something is BS doesn't make it BS. But you obviously have many more years of experience in development with these cards or consoles.
Every architecture does, and that was my point about R* gimping Pascal. So you obviously didn't even bother reading it. Good for you.
Stating what you're saying is bullcrap makes it bullcrap.
1- There are two ways to write error-free programs; only the third one works.
2- One man’s crappy software is another man’s full-time job.
3- A good programmer is someone who always looks both ways before crossing a one-way street.
4- Software undergoes beta testing shortly before it's released. Beta is Latin for "still doesn't work."
5- If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization.
6- Don't worry if it doesn't work right. If everything did, you'd be out of a job.
sorry if OT....too much coffee this morning I tell ya!.....
+ rep to you if you created those sayings rather than copy/pasting them, but funny either way!
Pascal performance is sad, especially the 1080 and 1080 Ti.
Don't leave, just use the ignore list. There are many people around here who are very vocal about expressing how little they know... just put them on ignore; they're not bringing valuable intel to the discussion anyway.
Obviously they are not gimping their cards. For one, it would be marketing suicide as soon as someone could prove it, and the legality of such an action would even be questionable. But you have to wonder how hard they work to improve Pascal, though.
I mean, at launch the 1070 and Vega 56 were pretty much neck and neck (it was a 1-2% difference). I went with the 1070 because I trusted nVidia more than AMD, and nVidia served me well for the past 10-15 years or so. But when you look at the 1070 versus the 56 today, man, it hurts. The difference is around 10% on average in favor of the 56 and can be up to 30% in some games like Dirt 4 or RE 2.
If I had a Vega 56 (or better, a 1080, but that card was out of my price range) I could probably wait for next gen to upgrade, but 1440p and a 1070 don't make a happy couple.
Well, my GTX 1070 has lasted me nearly 3 1/2 yrs, no regrets. The Vega 56 came out 15 months later than Pascal.
What you're suffering from is GPU envy (most of us do). However, don't be fooled. At 1440p you wouldn't be that much better off with either a Vega 56 or GTX 1080 in RDR2, and certainly not enough to "wait next gen". Just turn some settings down for now until you're ready to upgrade.
It's a bit odd really, and difficult to know what's causing these performance issues. I very nearly skipped buying it because of what I'd read on various forums; I'm glad I took the risk, because I've had no issues whatsoever with the game. I've set everything to medium, with TAA and 4x anisotropic filtering. The game never dips below 60 frames at 3440 ultrawide using these settings, and it's probably the best-looking game I've played. I recently installed a new motherboard with a fresh OS install alongside it, so I wonder whether some of the issues people are experiencing have anything to do with old drivers and possible conflicts in their OS with this game. One thing's for certain: I'm definitely not buying the gimping of older-gen GPUs (Pascal, Maxwell, etc.), because I have seen first hand this is not the case.
I would have thought it likely this game is optimised by Rockstar for Pascal, but they'll have picked the most popular cards (e.g. the 1060). Hence it probably runs quite nicely at medium, which is what you'd expect a 1060 to manage. If you turn everything up to max, they probably assumed you're using Turing and optimised more for that. As Turing has a significantly different architecture to Pascal, outlier cards like the 1080 Ti, which you'd expect to run very high settings, suffer.
I bet Rockstar could optimise it better for 1080 Tis, but will they bother? Is it worth their time? How much effort will Nvidia put into badgering them to do it, seeing as the 1080 Ti is a last-gen card?
I've made peace with the thought that when I'll get the game, I'll just turn down settings. It would be really stupid of me to buy a "current gen" card now over my 1080Ti, when in 6-12 months the new stuff will be out. Besides, from what I've seen even a 2080Ti can be crippled in this game, if one so wishes.
You knew Kepler and Maxwell had the same fate and still went with it. Seems to me you were ignoring the obvious and now have some regrets. The Vega 56 these days is compared to the 1080.
Vega 56 performed similarly to the 1080 in various low-level API games at launch (look at the Doom/BF1 results). We've known AMD performs better in those games for a while now, as they can plug stalls. People actively talked about that when Vega launched, so I'm not sure why anyone would be surprised by that (although there was argument about how fast DX12/Vulkan would take off).
I also don't think Maxwell has aged poorly. The last time I checked, the 980 Ti wasn't that far behind where it was when the 1080 launched, but I haven't looked recently.
If the overall argument is "AMD cards have aged better", then I definitely agree, and I think that should have been obvious once AMD won all the console contracts. That being said, there is a question of how much that is worth to someone (typically based on how often they upgrade) and whether that will change with Turing, which is far more geared towards compute (where GCN excels, so Turing should too) and has a number of features like mesh shaders, variable rate shading, DXR, etc. that are all being implemented in various libraries... so future games that take advantage of these features will only pull Turing ahead of architectures without them.
Usually not much to me, as I upgrade every generation. But usually xx70 cards are not sold for $800-850 in Canada (taxes included). I mean, I paid $604 CAD taxes included for my 1070 two years ago. That's a hell of a price rise in two years if you ask me. I wish I could just skip this generation, but the 1070 just is not good enough for my taste at 2K. I just ordered my new card; it will arrive today. Anyway, it's off topic now. RDR 2 looks very good, btw, so it's not really a surprise it runs poorly on many computers.
Honestly, at this point I think this game looks a bit unoptimized.
Man, I can see you being a bit of a sad puppy once Nvidia releases their 3000 series next year then! Well, unless you're OK with upgrading next year as well as this year. I think it's best to upgrade on the crest of the wave, if you like, pretty much right when a new architecture/shrink is released; that way you're enjoying the 'best' performance for longer. Upgrading later in the cycle, you're behind the curve and missing out on gaming value.
My next big upgrade will be when ray tracing is more mature, and I honestly don't think that will happen with the next gen of cards. Even if performance is better with the 3xxx cards, we will still see devs overdoing reflections just because they can, and I hate it with a very strong passion. I hate when things are too shiny; it's not realistic.
My 5700 XT is just a temporary card. It's a Christmas gift from me to me; I wanted better performance at 2K without having to spend close to a grand with taxes and shipping. It really was the only option in Canada. In two years, when ray tracing starts to be more mature and/or devs start to develop for next-gen consoles, I'll probably upgrade my 1800X CPU to a 3800X on sale, and maybe by then I'll be ready to spend a little bit more on a proper ray tracing card if the tech is more mature.
For now, I just did not feel like the 2070 was worth it in Canada, since its ray tracing performance is already kind of subpar, and at the end of the day it really performs like a $550 CAD card if you ignore ray tracing. I felt like anything I bought now would be terribly outdated once ray tracing matures and next-gen consoles are released, so I went for the cheaper option.
Good buy, tbh. It's hard to accept the huge price difference between the 5700 XT and the RTX 2080.
Wow, you need some serious HORSEpower to run this game!