Discussion in 'Videocards - NVIDIA GeForce' started by pharma, Sep 17, 2018.
I wish reviewers would turn off that damn bloom in the game that gives it the washed out effect.
Resizeable BAR - Ryzen R9 5900X vs. Ryzen 9 3900XT with AMD Radeon and NVIDIA GeForce in the benchmark | igor'sLAB
Reposted from beyond3d:
Nioh 2 image quality analysis:
On the left is a fragment of the 9x SSAA "ground truth" image, downsampled from 3840x2160 to 1280x720 with bicubic filtering, since it doesn't add moiré, aliasing and ringing the way Lanczos does.
On the right is a fragment of the 720p DLSS Quality image, reconstructed from 853x480 to 1280x720 by accumulating and verifying samples from a long sequence of 853x480 frames.
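For anyone who wants to reproduce the downsampling step, here's a minimal sketch with Pillow. It uses a synthetic gradient as a stand-in for a real 3840x2160 supersampled capture (the actual screenshots aren't reproduced here), and shows both the bicubic filter used for the "ground truth" and the Lanczos filter the post avoids because of its ringing:

```python
import numpy as np
from PIL import Image

# Synthetic stand-in for a 3840x2160 SSAA capture (a simple horizontal gradient).
w, h = 3840, 2160
x = np.linspace(0, 255, w, dtype=np.float32)
hi_res = Image.fromarray(np.tile(x, (h, 1)).astype(np.uint8))

# Pillow >= 9.1 moved the filter constants into Image.Resampling.
try:
    from PIL.Image import Resampling
    BICUBIC, LANCZOS = Resampling.BICUBIC, Resampling.LANCZOS
except ImportError:  # older Pillow
    from PIL.Image import BICUBIC, LANCZOS

# 3x downsample to 1280x720: bicubic is softer, Lanczos can add
# ringing around high-contrast edges.
ground_truth = hi_res.resize((1280, 720), BICUBIC)
ringing_prone = hi_res.resize((1280, 720), LANCZOS)
print(ground_truth.size)  # (1280, 720)
```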
Here is another comparison:
The left one is the ground truth 9x SSAA image again
The right one is native 720p without any kind of AA
And here is the third one just for fun:
The left image is 70% rendering resolution of 720p (896x504) + a single-frame spatial upscale to 720p with a Mitchell–Netravali filter + sharpening applied on top of the scaled image, aka AMD's FidelityFX + CAS or whatever it's called
The right image is DLSS Quality - a sequence of 853x480 frames verified by the NN
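The spatial-upscale-plus-sharpen pipeline on the left can be approximated like this. This is a crude illustration only, not AMD's actual CAS code: Pillow has no Mitchell–Netravali filter, so bicubic stands in for it, and an unsharp mask stands in for the sharpening pass; the checkerboard input is a synthetic stand-in for a real 896x504 render:

```python
import numpy as np
from PIL import Image, ImageFilter

# Synthetic stand-in for a 896x504 render (checker pattern with hard edges).
yy, xx = np.mgrid[0:504, 0:896]
low_res = Image.fromarray((((xx // 16 + yy // 16) % 2) * 255).astype(np.uint8))

try:
    from PIL.Image import Resampling
    BICUBIC = Resampling.BICUBIC
except ImportError:  # older Pillow
    from PIL.Image import BICUBIC

# Single-frame spatial upscale to 720p (bicubic standing in for
# Mitchell-Netravali), then a sharpening pass on the scaled image.
upscaled = low_res.resize((1280, 720), BICUBIC)
sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=1, percent=80))
print(sharpened.size)  # (1280, 720)
```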
It's kind of obvious which one is closer to the ground truth, but I also aligned these fragments so that I could compute some image quality metrics and verify it in a more mathematical way.
BTW, it was a huge PITA because the character in the screenshots above was animated, and I wanted to catch DLSS in motion, because on a static image it would have converged to SSAA within a few frames.
Image fragments with original res and without any additional scaling:
DLSS Quality vs 9x SSAA:
PSNR (more is better): 35.7854
RMSE (less is better): 1064.64
MSE (less is better): 17.2956
MAE (less is better): 651.178
Native 720p No AA vs 9x SSAA
PSNR (more is better): 32.671
RMSE (less is better): 1523.79
MSE (less is better): 35.4302
MAE (less is better): 905.421
The same image fragments, but magnified 9x:
DLSS Quality vs 9x SSAA
PSNR (more is better): 37.2114
RMSE (less is better): 903.449
MSE (less is better): 12.4547
MAE (less is better): 561.239
(Native) No AA vs 9x SSAA
PSNR (more is better): 35.32
RMSE (less is better): 1123.25
MSE (less is better): 19.252
MAE (less is better): 689.93
So DLSS not only looks far more like the 9x SSAA image, it's also closer to it in terms of image metrics (a smaller mathematical difference between pixels).
Though there are no real surprises here, since this is exactly how DLSS is supposed to work when everything is set up correctly: converge toward the ground truth over time.
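For reference, the metrics above can be computed like this. This is a minimal sketch using the standard 8-bit definitions; the absolute values in the post suggest a tool that reports differences in 16-bit quantum units (ImageMagick's compare behaves this way), so the scales won't match, but the relative comparison works the same:

```python
import numpy as np

def image_metrics(a, b, peak=255.0):
    """PSNR/RMSE/MSE/MAE between two equally sized uint8 images."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    err = a - b
    mse = np.mean(err ** 2)                      # mean squared error
    return {
        "MSE": mse,
        "RMSE": np.sqrt(mse),                    # root of the MSE
        "MAE": np.mean(np.abs(err)),             # mean absolute error
        "PSNR": float("inf") if mse == 0 else 10.0 * np.log10(peak * peak / mse),
    }

# Toy example: two flat images that differ by 10 levels everywhere.
m = image_metrics(np.zeros((4, 4), np.uint8), np.full((4, 4), 10, np.uint8))
print(m["MAE"], m["RMSE"])  # 10.0 10.0
```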
I took quite a few screenshots, but unfortunately they were all captured at slightly different camera angles, so I wasn't able to align them perfectly. In case you're wondering, they are all here (including the screenshots above):
Blender 2.92 Linux & Windows Performance: Best CPUs & Graphics Cards – Techgage
March 3, 2021
The graphics chip and graphics card market shares in the fourth quarter of 2020 | 3DCenter.org
March 9, 2021
Looks like allocating production to Samsung was the right move for Ampere.
System Shock Demo and The Fabled Woods Add NVIDIA DLSS At Lightning Speed Using Unreal Engine 4 DLSS Plugin
This is awesome, I am excited for more devs to leverage this plugin!
This version is working much better for me than the first beta version.
GPU TWEAK III (asus.com)
Fixed a number of incompatibility issues due to security changes in Windows 10 KB4601319. This helps to address a number of game and application crashes that have been found since that update.
GPU Tweak III OSD blacklist has been expanded dramatically. This should greatly improve the stability of the OSD when interacting with many programs. Please test GPU Tweak III with your favorite games and programs to help us further improve in this effort.
Temporarily removed the "Apply previous settings on program start" function due to instability. This function will be reworked and improved in a later version.
Fixed a performance issue that increased core loading on AMD CPUs that caused significantly more resource usage than expected. Along with the changes to bring that CPU usage in line, a number of minor resource usage optimizations have occurred.
The OC and Silent Modes no longer adjust memory clock speeds for non-ASUS cards, as it was causing instability.
Adjusted user defined curves for both on-board and external fans in a number of ways to make saving and applying them more consistent.
Installation and uninstallation changes:
GPU Tweak III now checks for DirectX upon installation, and if DirectX 10 is not on the system, a supplementary DirectX installer will begin. Note that this DirectX installer may prompt the user to optionally install other Microsoft software.
At the end of installation and upgrade, GPU Tweak III now asks for a restart before using the software.
Improved the uninstaller so that it functions in Windows Safe Mode and deletes more files on usage.
Fixed one instance that can cause the GPU Tweak III main screen to crash upon opening, leaving only the Monitor.
Note: This type of issue has a number of different causes, so despite this fix, other instances of this issue may occur.
CPU information for the OSD is now retrieved using CPU-Z. This improves accuracy and fixes some previous bugs.
Custom wallpaper transparency has now been implemented.
Corrected the dashboard fan speed metrics for a number of graphics card models.
Save and Undo buttons in the UI now show a grey disabled state when no changes have been made.
Fixed an issue in the Home page where the Save As and Save buttons would appear after clicking Cancel.
Fixed behavioral issues with the "Start minimized" function.
Fixed other behavioral issues with the VF Tuner buttons.
Fixed two issues in the OSD settings page involving the Cancel button: clicking that button could cause incorrect settings in the Information box to load, or clicking that button after switching between GPU Tweak III and Classic Modes would not revert that change.
Fixed another issue that could reset settings to default if the user closes and reopens GPU Tweak III after changing an option.
Smoothed out the movement speed of the scroll bar.
Adjusted the default OSD color scheme.
GPU-Z now shows N/A while it is loading, instead of erroneously showing Nvidia RTX 2080Ti information.
GPU-Z libraries have been updated. This fixes some inconsistencies in the computing technologies.
Updated content in GPU-Z and made formatting changes.
Known Issues v.188.8.131.52
Core function issues:
The OSD may not load in certain games. If you find a game where the OSD does not load, but the software still runs properly, please notify us.
We've received lots of feedback about the VF Tuner, and we are developing improvements.
Custom curves using the VF Tuner may not apply properly to the system.
When running two custom fan curves with the same settings, one fan may get locked to a set value.
AMD Radeon Software fan curves override GPU Tweak III curves.
After restarting the system and GPU Tweak III runs, custom fan curves may not be visible. Closing the software and reopening can fix that issue in some instances.
The 0dB fan function sometimes causes fans to repeatedly start and then shut off again at regular intervals.
When the user's display is set to automatically sleep, the GPU Tweak III main window may close, leaving only the Monitor.
When "Start with Windows" is not selected, the Monitor still appears on system start.
When multiple program windows overlap, the layering of the different GPU Tweak III windows (main screen, Monitor, Settings, and OC Scanner) may be inconsistent, so some elements may fall behind other programs while others sit in front.
Fahrenheit temperature settings only affect the Monitor, not the other parts of the interface.
The lower limit for underclocking boost frequencies shown in the interface is too high.
Frame Rate Limit requires any running games or programs to be restarted before taking effect. Currently this function only extends to 255 FPS.
OSD fonts in GPU Tweak III style only allow for Calibri and Calibri Bold adjustment. Full font adjustment will be added in a later release.
GPU-Z clock values do not update in real time. To update the clock values, close and reopen GPU-Z.
Some interactions with the OSD may cause an error to appear referring to "mFinitialize Fail". This error does not affect operation.
Mouse step size controls have not been implemented yet.
Hotkeys have a number of issues. Notably, the OSD Toggle hotkey is not working properly, and the Timer is visible but not implemented yet.
The Default display wallpaper is just a blank black background.
Screenshots will always be saved to the default folder, not personally selected folders.
Game and application file names are not sorted properly in the Profile Connect window.
GPU-Z only shows settings for one graphics card, and there is no way to select between installed cards.
Programs that have multiple different .exe files with the same names may not swap settings correctly and may use the wrong profiles. For now, set all related .exe files to the same settings for a smooth experience.
Hut 8 Buys $30M Worth of Nvidia’s New Crypto Mining GPUs
...In the company’s Q4 earnings call, Nvidia CFO Colette Kress said she expected about $50 million in total CMP sales during the new product’s first quarter this year, per CoinDesk’s prior reporting. Hut 8 filled 60% of that target in one order....
I had a theory about DLSS.
Anyone else think it would be more lucrative, and more beneficial for the industry, if Nvidia made a new G-Sync module that includes a tensor-core chip for processing DLSS in the monitor itself? That would give monitors this upscaling technology: the PC renders and sends, for instance, a 1080p image to the screen, and the monitor then uses DLSS to upscale it to 4K or whatever resolution is specified in the game. With more plugins coming to game engines, if the game supports it you could enable DLSS even if you only have, say, a GTX 1060 or an even weaker GPU without tensor cores. Like I've said before, I currently don't like the look DLSS gives, but I can see its use case for lower-end hardware.
This could even be used in TVs as well to drive higher-than-4K resolutions, and could help the industry get closer to cheaper 8K TVs or even higher.
I am sure this would be more financially viable for Nvidia; they would be able to sell far more monitors and TVs than $1500 GPUs. This could become an industry standard. Not only that, it would also free up a ton of GPU die space for more shaders or even more RT cores.
Wouldn't that add tons of latency, having part of a GPU in your monitor?
Yeah, I did think about that; like I said, it was just a theory.
Keep proprietary stuff to GPUs.
The new Switch is supposed to have it, and it's gonna be a breakthrough for handheld performance, I'm sure. So far you're only getting 720p on the old Maxwell GPU; this one will get a next-gen Lovelace GPU with DLSS support. It's gonna be a night-and-day difference in quality.
I just hope we're getting 7"+ screen options. 5.5-6.2" is too small for gaming.
Well, for AIBs, absolutely. For you or me - same difference.
I feel for the AMD-only partners like Sapphire, PowerColor and XFX; AMD has been giving them scraps compared to Nvidia's AIBs.
It's nothing short of amazing that Nvidia can still widen the gap when they're having to use 8nm as an emergency measure against 7nm Big Navi. The 6800 XT/6900 XT are killer cards, at the limit of what TSMC's 7nm and AMD's brilliant GPU engineering team can do at the moment, and yet the 3080/3090 still sell better.
Proprietary stuff always ends up causing more issues, segments the market, and eventually either dies off or becomes open source. For years Nvidia blocked PhysX on AMD; now that is dead, as CPUs and game engines have either found a better option or incorporated it as an open feature. HairWorks is also basically dead (same with TressFX, though at least that was open source). All the GameWorks stuff is more or less long gone now and was widely criticised for bringing terrible performance, as most games just tacked it on with glue and duct tape. G-Sync was also locked away unless you had an Nvidia GPU, and now a lot of monitors have become "FreeSync compatible", allowing AMD GPU owners to use VRR on a G-Sync display.
The word "proprietary" to me means only one inevitable thing: a monopoly. And we seriously don't want that, especially on PC. I predict DLSS will either become an open standard or merge into/with something else that everyone can use - that is, unless including a DLSS module inside monitors proves to bring too much latency, something I think could be solved with newer hardware.
As for the Switch Pro, rumours are pointing towards a 720p 6-7" OLED display. I also read that DLSS will not be used when playing in handheld mode, as the newer SoC will be more than capable of driving 720p at 60 Hz. That would mainly be to save battery; once the Switch Pro is docked and able to draw more consistent power from the wall, DLSS would be enabled to drive games up towards 1440p/4K for newer TVs.
I don't think this is a true successor to the Switch - more of a hardware revision, like the PS4 Pro. Games will still have to run on the current Switch hardware as well. The Switch is only 4 years old (2017) and still selling by the truckload. I think the older model will be discontinued and we'll just have the Switch Lite and the Switch Pro, and then eventually a Switch Pro Lite down the road, with the older Lite model also phased out. I doubt we see a true Switch successor until at least 2023/24.
Nvidia announces new wave of founder edition GPUs
Got the notification by email, checked it, clicked the link... lol, I got Rickrolled. Happy April Fools' Day everyone - this one really caught me off guard. Excellent, thanks for sharing.
I meant that if you're gonna introduce a feature like DLSS, it had better come with a GPU, not a monitor, since you keep the latter for longer. Don't like it? Change brands.
DLSS will never become an open standard. Some other type of image reconstruction might, but not this one.
G-Sync Compatible is for FreeSync monitors on Nvidia cards, not the other way round. You will never enable FreeSync on a display with a G-Sync module.
See what I meant about putting proprietary features on monitors now?