Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 17, 2015.
Lol, this thread has turned into a bitching fest.
Yeah, there are members here who are upset because people dare to be critical of this card. Oh well.
I don't know what's worse: the actual bitching, or the bitching of members saying they're tired of the bitching. Either way, a lot of bitching.
Yeah, mine does 120Hz all on auto, but I run it at 100Hz 24/7.
I can understand the disappointment to an extent, but at the same time, DVI is being replaced by DP, so it's understandable that this card doesn't have it.
It just so happens to mean that those of us with Korean monitors have to get the adapter in order to push that res. That's technology for you.
People have a right to be let down by that, and it definitely makes owning one of these GPUs less "affordable" once you tack on the adapter purchase. Not sure why people are trying to downplay this; it does suck.
Almost a hundred bucks on top, it's not a small amount.
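For what it's worth, the reason those Korean 1440p panels are dual-link-DVI-only comes down to bandwidth, and the same math shows why pushing them past 60Hz is out of spec. A rough back-of-envelope sketch (assuming ~5% blanking overhead in the spirit of CVT reduced blanking; exact monitor timings vary):

```python
# Rough pixel-clock check for 2560x1440 over DVI.
# Assumes ~5% blanking overhead (CVT reduced blanking); real timings vary.

SINGLE_LINK_DVI_MHZ = 165.0   # TMDS spec limit per link
DUAL_LINK_DVI_MHZ = 330.0     # two links, nominal

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.05):
    """Approximate required pixel clock in MHz."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for hz in (60, 100, 120):
    clk = pixel_clock_mhz(2560, 1440, hz)
    print(f"{hz} Hz: ~{clk:.0f} MHz "
          f"(single-link ok: {clk <= SINGLE_LINK_DVI_MHZ}, "
          f"dual-link ok: {clk <= DUAL_LINK_DVI_MHZ})")
```

1440p at 60Hz already needs more than single-link DVI's 165MHz, which is why these monitors are dual-link-only; at 100Hz+ the required clock sits above even dual-link's nominal 330MHz, which is exactly why running these panels that fast counts as overclocking in the first place.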
Meanwhile, HDMI 1.4a. It's like they couldn't quite decide and figured they'd go somewhere in the middle.
AMD's reviewer's guide is up on VideoCardz.
Again, that reflects less on the card and more on the fact that certain members here chose a monitor with only a legacy connector, one that most major players said back in 2010 would be phased out by 2015. Remember, Maxwell 2 is still a 2014 architecture; let's see if Pascal (the next architecture) cards also drop DVI.
So I'm sorry, but that reflects more on a poor monitor product (a planned-obsolescence, legacy-only connection) than on the video card itself, and on poor choices by those members in buying a monitor with only a soon-to-be-obsolete connector. DVI is sixteen years old and hasn't seen updates; let it die.
Now HDMI 2.0, that's a legitimate criticism, since it's the new standard, one even the competition adopted last year.
Aren't you cute when you stand up to all of us, telling us we bought a poor monitor.
You might want to do a little research into why these monitors have only one connection. It is not a design flaw; it's the reason they're overclockable. No OSD is another casualty of being overclockable.
Chillin, no one is arguing about the tech here; it's irrelevant.
It's as simple as this: if AMD doesn't cover the adapter, they will not buy the card.
You bought a monitor with only a legacy connection slated to die in 2015, no one forced you to. I'm sorry, but the bitching over the DVI does fall into the SAME category as VGA bitching, lack of Win XP support bitching, game port bitching, or any other legacy bitching.
Get an adapter or get a 2014-architecture card (Maxwell 2), period. Maybe you'll get lucky and an AIB will throw in DVI somehow just for you. This crying is just plain absurd.
I don't think any of us are even all that upset that AMD didn't go with DVI. I'm pretty sure most of us are OK with buying the 980 Ti.
The only one bitching is you. I'm not worried about my dead legacy DVI port; in fact, it's not dead at all. I have a 970 plugged into it and it still works just fine. The 970 cost me less money than the 390X and it performs just as well, and whenever I decide to upgrade I'll just go with a 980 Ti. Nothing is dead about DVI, far from it.
LOL, a 100MHz GPU overclock and ZERO on the VRAM, for a grand total of a 5% FPS gain.
AMD said this card would be an overclocker's dream. Who could have guessed they were really talking about A NIGHTMARE?
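To put that complaint in numbers: assuming the Fury X's 1050MHz reference core clock (the FPS and overclock figures are the poster's; the arithmetic is just illustrative), a quick sketch of how much of the clock bump actually shows up as frames:

```python
# Rough scaling check: how much of a core overclock turns into FPS?
# Assumes the Fury X reference core clock of 1050 MHz.

BASE_CLOCK_MHZ = 1050.0

def scaling_efficiency(oc_mhz, fps_gain_pct):
    """Fraction of the relative clock increase that showed up as FPS."""
    clock_gain_pct = oc_mhz / BASE_CLOCK_MHZ * 100.0
    return fps_gain_pct / clock_gain_pct

eff = scaling_efficiency(100, 5.0)
print(f"+100 MHz is a ~9.5% core bump; "
      f"a 5% FPS gain means only ~{eff:.0%} of it translated to frames.")
```

That roughly half-linear scaling is consistent with the card being bottlenecked somewhere other than the core clock, which would fit the complaint that the memory couldn't be pushed at all.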
For all those complaining about the DVI port... please let me know what games you are running at 1440p where you actually get over, I'll be generous here, 80fps without turning down a bunch of settings, which ruins the whole point of playing at higher resolutions anyway. You're all complaining about how it won't support your cheap Korean monitor that does 100Hz, but I highly doubt your FPS ever actually touches 100. Yes, the gameplay is probably smoother than on a 60Hz monitor, but the "necessity" for 100+Hz is a bit ridiculous given current GPU power and the fact that nothing out there can actually push those kinds of numbers.
And I am not an AMD or NV fanboy; I've used both and both have served me well. Both have had downsides depending on the year: Nvidia has had heating problems and driver issues, and of course AMD has had the stock-cooler problems (easily fixed with aftermarket coolers and such) as well as driver issues here and there.
Conclusion: stop bitching about DVI, since you made the decision to cheap out on a piece of hardware that is now limiting you. You would be the first people to bash someone who put a 290X into an APU rig (this probably wouldn't actually happen, but just as an example) and then complained about needing to upgrade their CPU. Companies can't cater to every single person out there.
And since I know your response is going to be "but DVI would have been so simple to include": please give me a call when you start designing graphics cards. You have no idea what would have needed to be left out to include that DVI port, and you even said you would need two of them, so AMD wasn't willing to sacrifice two DPs, or a DP and another slot, just to cater to the 0.05%, never mind what else would have needed to be shifted around to include dated hardware.
Damn, new cards always bring out some whacko people on this website.
When a guy with one post gives his retrospective of AMD vs NV and concludes with "stop bitching," I take that very seriously.
I agree that they should stop bitching about the DVI, but saying QHD at 80+ fps is impossible is the worst argument I've ever seen. Dota 2, CS:GO, HOTS, Life is Strange, Civ 5, and Warframe are all games I've been playing recently, all maxed out at QHD, all at 100+fps on a 980, let alone a Fury X.
There are more games than The Witcher 3 and GTA V.
Well, it is better than mine; I can get 50MHz more stable.
Just installed LOTRO, 1440p all maxed, easily capped at a solid 110fps. Hope that answers it.