We need adaptive resolution like AMD

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by X7007, Dec 22, 2019.

  1. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    What nonsense? Why are you talking like an aristocrat? AMD's adaptive resolution needs to be implemented by the dev as well. It's a simple dynamic resolution feature, not some big new innovation (like RTX).
    What better IQ? I've seen plenty of comparisons; they are identical.

    Looks like the AMD fanboys noticed the mention of AMD in the wrong forum section.
     
    Last edited: Dec 25, 2019
  2. S3r1ous

    S3r1ous Member Guru

    Messages:
    152
    Likes Received:
    25
    GPU:
    Sapphire RX 6700
    You're joking, right?
    Dynamic resolution is a game changer. Consoles have had it for years; it's one of the low-level hardware/software tricks that lets them hit resolutions like 1080p and 4K while underpowered (though not really, most of the time).
    Nvidia's regular drivers have dithering disabled, washed-out colors (heavy compression), and lower IQ (hard to notice, since it's optimized just enough) because they prioritize performance over these things. The Quadro drivers, with their higher IQ, are proof of this.
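    Back to dynamic resolution: the engine-side loop is simple enough to sketch in toy form (the target, step size, and bounds here are made-up numbers for illustration, not from any real engine):

    Code:
        # Toy dynamic-resolution controller: nudge the render scale down when
        # the GPU misses its frame budget, and back up when there's headroom.
        TARGET_MS = 16.7                    # ~60 fps frame budget
        STEP = 0.05                         # how aggressively scale changes
        MIN_SCALE, MAX_SCALE = 0.50, 1.00

        def update_render_scale(scale, last_frame_ms):
            if last_frame_ms > TARGET_MS:           # struggling: drop scale
                scale -= STEP
            elif last_frame_ms < TARGET_MS * 0.9:   # headroom: recover scale
                scale += STEP
            return max(MIN_SCALE, min(MAX_SCALE, scale))

        scale = 1.0
        for frame_ms in [15.0, 18.5, 21.0, 19.0, 16.0, 14.0]:
            scale = update_render_scale(scale, frame_ms)
            print(f"{frame_ms:4.1f} ms -> {int(3840*scale)}x{int(2160*scale)}")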
     
  3. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,381
    GPU:
    GTX 1080ti
    It doesn't need to be matched.

    No, they didn't.

    900p-1080p adaptive rendering was always engine-side; the only thing the hardware had to do with it was that the consoles were too weak to hold native resolution.
     
  4. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    If you need dynamic resolution to reach 4K, then it's not really 4K...

    So lowering IQ in the interest of performance is bad? You mean like how adaptive resolution lowers IQ in the interest of performance?
     

  5. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    [IMG]
     
    SyntaX, artina90 and pharma like this.
  6. otimus

    otimus Member Guru

    Messages:
    171
    Likes Received:
    1
    GPU:
    GTX 1080
    I'd rather have checkerboard rendering as a system-level option for any hardware than that, but there's nothing wrong with more options. Checkerboard rendering probably gives the most visually pleasing results as a way to hit higher resolutions with less GPU cost, though. I wonder why that hasn't really been attempted on PC. Isn't Nvidia doing something with it and SLI or something?
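    For anyone unfamiliar, the basic trick is easy to sketch: each frame shades only half the pixels in a checkerboard pattern and fills in the rest from the previous frame. A toy version (naive fill-from-last-frame; real implementations use motion vectors and much smarter reconstruction):

    Code:
        import numpy as np

        # Toy checkerboard rendering: shade half the pixels per frame in a
        # checkerboard pattern, fill the other half from the previous frame.
        H, W = 4, 8
        yy, xx = np.mgrid[0:H, 0:W]

        def checker_mask(frame_index):
            # Alternate which half of the checkerboard gets shaded each frame.
            return (yy + xx + frame_index) % 2 == 0

        prev = np.zeros((H, W))                     # last reconstructed frame
        for f in range(2):
            shaded = np.full((H, W), float(f + 1))  # stand-in for new shading
            mask = checker_mask(f)
            prev = np.where(mask, shaded, prev)     # naive reconstruction
            print(f"frame {f}: shaded {mask.sum()} of {H * W} pixels")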
     
    yobooh likes this.
  7. yobooh

    yobooh Guest

    Messages:
    260
    Likes Received:
    15
    GPU:
    Gigabyte 970 G1
    This too would be amazing!
     
  8. S3r1ous

    S3r1ous Member Guru

    Messages:
    152
    Likes Received:
    25
    GPU:
    Sapphire RX 6700
    I don't know who you're arguing with, or about what, because I didn't say every game was using a universal solution. Then again, neither of us has personal access to devkits or source code, so... ???
    The point is they have it now, in a very limited, barely supported way, in their drivers.
    I agree devs had to come up with tons of creative solutions to squeeze everything out of very weak hardware, but they had direct access to that hardware, and that's pretty much how they got here.

    It's strange: we now have "better" APIs like DX12/Vulkan that get pretty close to console devkit-level access, so we should be able to do this without much hassle.
    Then again, those APIs take so much boilerplate code to get anything going that only recently have developers even started offering DX12/Vulkan as at least one of the standard options.
    Consoles are becoming more like PCs, so why not get the features on PC too...
     
    Last edited: Dec 26, 2019
    yobooh likes this.
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,381
    GPU:
    GTX 1080ti
    :rolleyes: speak for yourself.
     
  10. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    It's always engine/game side, nothing to do with the console or API support. What you're saying makes no sense in the slightest.
     
    pharma likes this.

  11. Chastity

    Chastity Ancient Guru

    Messages:
    3,745
    Likes Received:
    1,668
    GPU:
    Nitro 5700XT/6800M
    I really don't understand what the big commotion is about. Just do the following:
    1) Create your desired resolution as a custom resolution, doing the ratio math (I have two, at 75% and 70% of 4K; the math is sketched below).
    2) Set scaling to be performed on the GPU in NVCP.
    3) Apply the Sharpen filter through Freestyle, as desired.
    4) ???
    5) Profit!
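    The "ratio math" in step 1 is just multiplying the native dimensions by the scale factor, rounded to even numbers since most scalers want even dimensions:

    Code:
        # Step 1's "ratio math": scale each native dimension by the chosen
        # percentage, rounding down to even numbers for the scaler's sake.
        def custom_res(native_w, native_h, pct):
            w = int(native_w * pct / 100) // 2 * 2
            h = int(native_h * pct / 100) // 2 * 2
            return w, h

        for pct in (75, 70):                 # the two factors mentioned above
            w, h = custom_res(3840, 2160, pct)
            print(f"{pct}% of 4K -> {w}x{h}")
        # 75% of 4K -> 2880x1620
        # 70% of 4K -> 2688x1512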
     
    Sajittarius likes this.
  12. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Dynamic/adaptive resolution is a crude way of balancing quality and performance with some minimum framerate target in mind. It allows for a sharper, more detailed image (high render resolution) when the scene is not too demanding on the hardware, and provides high enough frame rates when the hardware is struggling (by dropping to a lower render resolution). I don't think this is the best way, but it can work and might be deemed better than nothing. AMD's latest implementation just makes the subjective comparison more difficult, because they add user-input metrics to the mix (e.g. how fast you move your mouse), which is arguable.
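    In toy form, the input-metric part works something like this (the threshold and scale values are invented purely for illustration; the premise is that you won't notice the lower detail while the camera is whipping around):

    Code:
        # Toy input-driven scaling (Radeon Boost-style idea; threshold and
        # scale values here are made up, not AMD's actual tuning).
        BOOST_SCALE = 0.70      # render scale during fast camera motion
        MOUSE_THRESH = 800.0    # "fast motion" cutoff, counts per second

        def pick_render_scale(mouse_speed):
            # Full resolution when still, reduced while motion hides it.
            return BOOST_SCALE if mouse_speed > MOUSE_THRESH else 1.0

        for speed in (0.0, 200.0, 1500.0):
            s = pick_render_scale(speed)
            print(f"mouse {speed:6.1f} cps -> {int(2560*s)}x{int(1440*s)}")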
     
  13. Chastity

    Chastity Ancient Guru

    Messages:
    3,745
    Likes Received:
    1,668
    GPU:
    Nitro 5700XT/6800M
    I'm sure NVIDIA will make a comparable technology available. Each side steals from the other, including their pricing scheme.
     
  14. X7007

    X7007 Ancient Guru

    Messages:
    1,879
    Likes Received:
    74
    GPU:
    ZOTAC 4090 EXT AMP
    DSR has other issues, and I'll explain better now. Why, you ask? Because TVs have native scaling. 4K 3D TVs have passive 3D that works at exactly 3840x2160 without using Top & Bottom or SBS, which means better latency and better resolution overall, because you don't squash the resolution in 3D mode. That means better IQ and better fonts in 3D; if you want numbers, at least 60% better quality overall. And many games now have separate display and render resolution settings, like COD MW and Far Cry 5.
    So I don't think this should be a problem for Nvidia or any other brand. Money is not an issue, they have plenty, and we know this tech works fine for everything.

    If you want to know where G-Sync and AMD's variable refresh came from, it's LucidLogix. They had Virtual Vsync, which worked perfectly in every game. I could play Skyrim without the limitations of ugly vsync, and it worked perfectly in many other games too. It was the best tech of its year, and it's gone because Nvidia saw how they could make money from this, and people buy it even though it often works worse. So Nvidia is a greedy company, plain and simple. LucidLogix charged for the pro version, but that was a one-time purchase; G-Sync costs a fortune and half the time doesn't work properly, or works with many issues.
    [IMG]
     
    Last edited: Dec 28, 2019
  15. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    You have no idea what you're talking about. There is nothing similar between Virtual VSync and any variable refresh rate technology. I don't think I have found a single game where G-Sync didn't work. Stop talking out of your ass just because you have an axe to grind. Don't lie to other people with misinformation.
     
    SyntaX likes this.

  16. CrazyGenio

    CrazyGenio Master Guru

    Messages:
    455
    Likes Received:
    39
    GPU:
    rtx 3090
    The only thing we have similar to LucidLogix Virtu is Fast Sync, which is way better. Apparently you have never in your life used G-Sync, or at least FreeSync. You probably think it's just tearless high framerates, but it's more than that: the monitor itself adapts its refresh rate to the game's fps. For example, if a game runs at an unstable 100-130 fps, the monitor's refresh rate tracks that unstable framerate, so the frame drops and spikes are much less noticeable and latency stays as low as possible.
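    A toy comparison makes the difference obvious (illustrative only; real G-Sync/FreeSync also handle refresh ranges, LFC, and so on):

    Code:
        import math

        # Toy fixed-refresh vs variable-refresh presentation comparison.
        REFRESH_HZ = 144
        VSYNC_MS = 1000.0 / REFRESH_HZ

        def present_fixed(frame_ms):
            # Fixed refresh + vsync: a finished frame waits for the next
            # refresh tick, so display time rounds up to a whole interval.
            return math.ceil(frame_ms / VSYNC_MS) * VSYNC_MS

        def present_vrr(frame_ms):
            # VRR: the monitor refreshes when the frame is ready (within
            # its range), so display time tracks frame time directly.
            return max(frame_ms, VSYNC_MS)

        for frame_ms in (7.5, 9.1, 10.0, 8.3):      # unstable 100-130 fps
            print(f"{frame_ms:4.1f} ms frame -> "
                  f"fixed {present_fixed(frame_ms):5.2f} ms, "
                  f"vrr {present_vrr(frame_ms):5.2f} ms")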

    But of course this is Guru3D, the forum where only a few actually know about 3D. I still remember a discussion here where a guy was telling people that 1440p and 1080p were the same, and that it's better to have a 20-inch 1080p monitor than a 27-inch 1440p one because the 20-inch 1080p monitor has higher PPI. If people here can't tell resolutions apart, I can't expect them to know about G-Sync or FreeSync.
     
    SyntaX and yasamoka like this.
  17. X7007

    X7007 Ancient Guru

    Messages:
    1,879
    Likes Received:
    74
    GPU:
    ZOTAC 4090 EXT AMP
    You've clearly forgotten how many bugs there were with G-Sync. Don't spread wrong information about people if you don't remember what was and what is.

    I have G-Sync on my laptop, an Asus G51JT. And again, G-Sync isn't worth the premium price; you could buy a TV with variable refresh for the same money.
     
  18. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    You're a joker. I've been using G-Sync for 2 years now. A friend of mine has been using G-Sync for 3.5 years. There are no issues, let alone "many bugs". Also, it doesn't really matter "what was"; "what is" is that G-Sync does not have issues, let alone serious ones.

    Good luck buying one of the handful of TVs with variable refresh: some with FreeSync support, and one with Adaptive Sync over HDMI 2.1, which no GPU uses yet.
     
    SyntaX likes this.
  19. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    DSR renders at a higher resolution and downsamples, as a form of anti-aliasing; it hurts performance. That's not what the person you're responding to is talking about.
    Your TV doesn't have anything like DSR; I have no idea where you're getting this from. If you made custom resolutions for your TV using an Nvidia GPU, it would still use your TV's scaling.
    And none of this has anything to do with dynamic resolution, which works by automatically picking from different resolutions on the fly. Granted, the person you're replying to didn't seem to understand this either.
    You also seem to be confusing dynamic refresh rates with dynamic resolution, neither of which has anything to do with LucidLogix's Virtual Vsync snake oil.
    The layers of misunderstood information run deep with you.
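    To spell out the DSR part: render more pixels than the display has, then filter them down. A toy sketch using a plain 2x2 box average (DSR's real filter is a tunable smoothing filter, but the principle is the same):

    Code:
        import numpy as np

        # Toy "render high, filter down": the core of DSR-style downsampling.
        # A 2x2 box average stands in for DSR's actual smoothing filter.
        def downsample_2x(img):
            h, w = img.shape
            return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

        # A hard aliased edge rendered at 2x the display resolution...
        hi = np.zeros((4, 8))
        hi[:, 3:] = 1.0
        # ...comes out as a softened edge at display resolution.
        print(downsample_2x(hi))
        # [[0.  0.5 1.  1. ]
        #  [0.  0.5 1.  1. ]]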
     
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,381
    GPU:
    GTX 1080ti
    There are issues, you just haven't encountered them.
     
