So I went there and to TechPowerUp's BIOS database and saw nothing new. Is this a modded BIOS? I couldn't find anything in the support section of the EVGA website.
You have a 1080 with Samsung GDDR5X; this update is for the 1070s that got Micron GDDR5. You're already at +520 on your memory, which is impressive. Have you made sure it's not too high? Sometimes you get worse performance from too high an overclock on VRAM (it gets into error correction but doesn't cause a driver crash or lockup). BTW, there is nothing inaccurate in this graph. GPUs have been advancing at a faster pace than CPUs for quite some time. Your issue is mainly with software developers, not Nvidia. There is no reason the latest titles should be so heavy on the GPU: graphics have not visually advanced that much since 2012, yet GPUs are 5x as powerful. There needs to be some real pressure put on these lazy developers and their brute-force game engines.
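That silent error-correction penalty is easy to miss because nothing crashes; the score just stops climbing. Here is a minimal sketch of how you could automate finding the sweet spot. `run_benchmark` is a hypothetical stand-in for any repeatable test you score yourself (e.g. averaging FPS over a Heaven loop at each offset); the numbers below are made up to illustrate the shape of the curve, not real measurements.

```python
# Hypothetical sketch: step the VRAM offset up and find the point where
# the benchmark score peaks. Past that point, error correction quietly
# eats the gains even though nothing crashes.
def best_memory_offset(offsets, run_benchmark):
    """Return the offset with the highest benchmark score."""
    best_offset, best_score = None, float("-inf")
    for offset in offsets:
        score = run_benchmark(offset)
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset

# Toy stand-in data: score climbs until error correction kicks in past +520.
fake_scores = {0: 100, 200: 104, 400: 107, 520: 109, 600: 105}
print(best_memory_offset(fake_scores, fake_scores.get))  # 520
```

The point is simply that "highest stable offset" and "fastest offset" are not the same thing on GDDR5X, so it is worth scoring each step rather than just checking for crashes.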
I can't see those speeds when everything is being gimped. More specifically, the way it overclocks, as well as the shoddy software. God smite Boost 3.0!
Sure you can. Fire up Portal and watch your FPS go through the roof (until you hit the inevitable limit of the engine and driver). The thing is, hardware developers have made such powerful GPUs that software developers don't even care about optimizing game engines anymore.
I can tell you personally: I used to work for NCR (National Cash Register). They made the very first cash register. Now they make a very large majority of ATMs, point-of-sale systems, and machine parts. Think about how those machines just WORK, constantly, all across the world, reporting back centrally to a headquarters location with minimal data loss and maximum uptime. Equipment trusted to count an extremely large amount of money has us monkeys keeping it going. I know my hardware and the conditions/abuse this electronic equipment can run under. I feel Nvidia is gimping hardcore, for no real apparent reason. These cards can run at the same standards everyone else is meeting. Products are doing it out in the weather, for crying out loud, with better results. The Gimp IS REAL.
Sounds like a dick rider to me. Lazy developers and greedy publishers are why game engines are so poorly optimized. When new games that look marginally better than the previous entry require twice the GPU power, you are doing something wrong.
Sounds more like diminishing returns to me: effects that aren't noticeable at a glance visually but cost a lot in performance.
I think something you and others may not realize is that increases in graphical fidelity require exponentially more GPU power, because accurate effects are computationally expensive. The reason a game like Portal runs as well as it does is down to the techniques used. For example, if Portal used ray tracing for lighting it would be far more accurate, but it would run at 3 fps. So instead, like every other game on earth at the moment, it uses an approximation that is 50-100x faster; it doesn't look as good, but it looks mostly as good.

As game engines become more realistic in simulating reflections, lighting, etc., the techniques used become less and less approximate or "cheaty". Moving away from things like POM to more geometry using tessellation is a good example. Approximations are good and all, but they have limitations; shadow mapping is a particularly common one to have issues, such as shadows warping incorrectly or flickering in certain situations (Skyrim).

As has been true over the past 10-15 years, the graphical improvement from generation to generation slows down while the amount of power required increases exponentially. A game that comes to mind is Black by EA for PS2 and Xbox: https://youtu.be/Pn9dAbNJdVo?t=1m42s Does a game like BF4 really look 10-20x better than Black? Because you're looking at 64MB of total system memory on a GeForce 3 variant.

If anything, the hardware is lagging behind the software these days, as we are reaching the limits of what current silicon-based processing is capable of. That's a big part of the reason APIs like DX12/Vulkan were developed.
^ That has always been true since the creation of computers. The ways of rendering "realistic" graphics have been known for a while; it's just that they're slow and hardware can't do them in real time. Optimized or "cheating" methods that get good graphics without needing a ton of hardware are what keeps being improved on. Hardware will always move more slowly since it's a battle with physics. Software is more flexible due to our ingenuity and the various shortcuts that are available, or perhaps waiting to be discovered.
Eh, I research the crap out of every single part I get for a new PC build, and the performance I get out of my builds is normally within 5% of what my research told me it would be. Predicting how games and other things released AFTER my research will run is always a guessing game, but generally, if you research every part, you know what you are going to get.

I'm not sure how you bought your current system expecting a GTX 1080 to do 60FPS at 4K in every game, when even at launch it was struggling to hit 60FPS at 4K in a number of titles released around that time. That's 4K 60FPS at full settings; you could always turn a few things down and get 60FPS, and it will still look worlds better than upscaled 4K on consoles even if you aren't running full settings on PC.

Also, your thing about PCIe lanes seems to be a non-issue: a single GTX 1080 running at x8 or x16 makes virtually no difference; all tests I have seen show less than 1% difference. Even some older tests with Crysis 3 at 4K with a single GTX 980 (16 vs 8), 2-way SLI (16x16 vs 8x8), and 3-way SLI (16x16x8 vs 8x8x8) didn't seem to be impacted by running at x8 or x16 for one or all GPUs. Linus did a YouTube clip on that one a while back, but there are newer tests with single and SLI GTX 1080s that point to the same thing being true for them as well, tested with newer titles (at least for 2-way SLI; I haven't seen a 3-way SLI test for the new cards yet).
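The x8-vs-x16 result makes sense once you put numbers on the link itself. This is just the published PCIe 3.0 arithmetic (8 GT/s per lane with 128b/130b encoding), not a benchmark:

```python
# PCIe 3.0 per-direction bandwidth: 8 GT/s per lane, 128b/130b encoding.
GT_PER_S = 8e9            # transfers (bits on the wire) per second per lane
EFFICIENCY = 128 / 130    # 128b/130b line-code overhead
per_lane_GBps = GT_PER_S * EFFICIENCY / 8 / 1e9  # bits -> GB

for lanes in (8, 16):
    print(f"x{lanes}: {lanes * per_lane_GBps:.2f} GB/s")
# x8:  ~7.88 GB/s
# x16: ~15.75 GB/s
```

Even the x8 figure (~7.9 GB/s each way) dwarfs what a single GPU actually streams over the bus per frame once textures are resident in VRAM, which is why the benchmarks land within ~1% of each other.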
I agree. My 980 Ti cannot run every game at max settings at 1080p 60Hz and maintain 60fps, and a single 1080 is only around 25% faster. Rendering 4x the pixels, you can't expect 60fps. This situation won't improve; future AAA games will get more demanding. The OP didn't research well: he bought what he could afford and expects too much. Being at the cutting edge of gaming resolution, you should expect performance problems. We have only recently got enough power from two cards for 4K 60fps in the majority of games; expecting that from one card is naive. At a minimum he should have bought the new Titan X, and remained tight-lipped about performance complaints even then. I fear he is a couple of years ahead of the technology.

This is why I am still at 1080p on a projector, to stay at or very close to 60fps in all games. I'm happy with this res and it's not breaking my bank to have an excellent experience. I can wait for 4K. I'm not knocking 4K; that in itself is a great experience. But there is rough with the smooth that must be accepted unless you have the cash. If you can't accept the rough in its current state, it's not for you.
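The "4x the pixels" figure is exact, not a rough estimate; it falls straight out of the resolutions:

```python
# 4K UHD really is exactly 4x the pixels of 1080p.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels
print(res_4k / res_1080p)  # 4.0
```

So a card that is ~25% faster than a 980 Ti is being asked to shade four times the fragments per frame, and the arithmetic simply doesn't close at 60fps on max settings.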
I can agree with this as well. Skyrim SE at 4K on a 1070 runs at around 45-60fps depending on the area, so with fewer shadows it's almost always 60fps. That's with the old drivers, though. Do the new "Skyrim improved" drivers help FPS or worsen it? Could anyone confirm, please?
People who complain about a $1000 GPU not being able to maintain 60fps in games, without doing any research, should be banned from PC gaming and given a console. In Skyrim's case, it's like putting 20 attachments on an AR-15 and complaining that you can't hit a target within 10 meters. Just because the hardware is solid doesn't mean it can catch up with all the mess and unoptimized code in the software. The benchmarks are there; they don't bull**** you.
My complaint is this: have SLI and NVMe AND an additional sound card, all using your PCIe lanes, then see what you have for performance. It's ridiculous; Intel has gimped aftermarket manufacturers by limiting these lanes. Also, the benchmarks at the time of release showed the GTX 1080 running every great game at 4K: BF4, PCars, Star Wars, the list is huge. Then DX12 games came out, and the hype behind DX12 was that it gets you more graphics while offering better performance as well as multi-GPU support. DX12 titles for the most part SUCK on Nvidia's $700 graphics solution (which, BTW, does most DX11 games at 4K Ultra just fine). That's solid research; any more research and I should make my own damn graphics card.
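For what it's worth, the lane squeeze is real on mainstream Intel platforms, even if benchmarks show x8 costs little. A quick sketch of the budget, assuming a Skylake-era mainstream CPU (the 16-lane figure is the published spec; the device list is an example configuration, not anyone's actual build):

```python
# Lane budget on a mainstream Skylake platform: the CPU exposes
# 16 PCIe 3.0 lanes for graphics, full stop.
CPU_LANES = 16

# With two-way SLI, the CPU lanes split x8/x8 and are fully consumed:
devices = {
    "GPU 1 (SLI)": 8,
    "GPU 2 (SLI)": 8,
}
used = sum(devices.values())
print(f"CPU lanes used: {used}/{CPU_LANES}")  # 16/16
```

The NVMe drive (x4) and the sound card then hang off the chipset instead, and everything on the chipset shares one DMI 3.0 uplink back to the CPU (roughly x4-equivalent bandwidth), which is exactly the squeeze being complained about; the HEDT platforms with 28-40 CPU lanes are Intel's answer, at a price.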
Coolio, cheers. I had to come back in the end; this is the nicest tech forum (a bit slow to extinguish trolls, though!). Your post is a case in point of its jollyness. Glad to see your 2500K still rocking; killer value CPU. Mine was doing great apart from wrecking its overclock with an accidental over-volt. Then I got bored, so I bought Skylake. Being an early adopter both times caused me some headaches; I'm on my 3rd mobo and ended up getting 3 for Sandy Bridge as well, lol.
Dburgo, I think you overlooked diminishing returns. You could have gotten a 4K rig for a lot less if you'd done your homework, no offense. The gurus here have already laid it out for you several times over. Not trying to **** on you; it's just that if you'd read up a bit more, you'd realize you don't really need 6 cores and DDR4 to achieve solid gaming performance. You will need more than a 1080, however, if you want everything at 4K; it's just not that strong. Much stronger than AMD's flagship offering right now, but still, 4K has a long way to go. A Titan X Pascal is far better suited as it stands now, according to reviews. You could have shaved a bit here to gain some there; price-to-performance ratio is key in PC gaming.