When upscaling to native resolution, do some values scale better to native than others? Why?

Discussion in 'General Hardware' started by BlindBison, Dec 26, 2019.

  1. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Hi there guys,

    I've been reading recently about upscaling to native resolution from a base resolution that is less than native, and have seen people state that certain values upscale better than others.

    1) For example, if one has a 2560 x 1440 monitor, would 1280 x 720 scale "better" than, say, 1920 x 1080, since 1280 x 720 divides evenly into 2560 x 1440? Or should a user simply get the resolution as high as they can and upscale from there? (For example, many Switch games run at 900p when docked, then upscale to 1080p for TVs.)

    2) Regarding Nvidia's "GPU scaling" toggle (found in the sharpening setting in the control panel), I understand that this will do the scaling on the GPU, but -- as opposed to what? Why would you use this toggle instead of the default scaling? Is it better? If so, why not enable it by default? I've had a tough time tracking down info regarding GPU vs default scaling so I wanted to toss that in here to see if anybody knows more on that.

    3) Integer Scaling vs traditional scaling -- integer scaling seems to be a new option on Nvidia/AMD GPUs now, but I keep seeing that it's only best for some types of games or resolutions. Is integer scaling always the better choice? Or is it only better if you're upscaling from a perfect divisor of your native resolution (720p upscaled to 1440p w/ integer scaling, for example)?

    Anyway, thank you for your help and time, have a good one.
     
    Last edited: Dec 26, 2019
  2. umeng2002

    umeng2002 Maha Guru

    Messages:
    1,432
    Likes Received:
    335
    GPU:
    4080 Super
    1. Yes. Scaling to a resolution that's an integer multiple of the original resolution is best, since it minimizes scaling artifacts. The algorithms used to upscale aren't as simple as you'd think, and from a simple mathematical perspective, scaling by a non-integer factor simply creates more noise, or blur, or both.
    2. When not using GPU scaling, the GPU will format the picture output so the monitor's "dumber" scaler handles the scaling, however it's set in the monitor. So if you run a game at 800*600, the monitor sees an 800*600 signal. With GPU scaling, the GPU scales that 800*600 image to whatever the monitor's native resolution is... maintaining the aspect ratio or stretching it, depending on your settings.
    3. Integer scaling is the "dumb"/"simple" scaling algorithm that ISN'T the default method the GPU uses to scale images. Integer scaling simply multiplies the resolution by an integer number and that's it. No blending or sampling of other, nearby pixels in an attempt to make the image look natural. It produces razor-sharp scaling... which looks ugly for anything other than pixel art.
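
    To make point 1 concrete, here's a minimal sketch using Pillow (nothing like any vendor's actual scaler, and the filenames are hypothetical) of why a 2x source has less to interpolate than a 1.333x source when targeting 2560 x 1440:

    ```python
    # Toy comparison: integer (2x) vs fractional (1.333x) upscaling to 1440p.
    from PIL import Image

    lo = Image.open("frame_720p.png")   # hypothetical 1280x720 render
    hi = Image.open("frame_1080p.png")  # hypothetical 1920x1080 render

    # 2x (720p -> 1440p): every source pixel maps onto a whole 2x2 output
    # block, so even an interpolating filter has little to blend.
    exact = lo.resize((2560, 1440), resample=Image.BILINEAR)

    # 1.333x (1080p -> 1440p): output pixels straddle source pixels, so the
    # filter must interpolate everywhere -- that's the extra blur/noise.
    fractional = hi.resize((2560, 1440), resample=Image.BILINEAR)
    ```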
     
    386SX and BlindBison like this.
  3. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    @umeng2002 Thank you very much! That's helpful -- for #2 then, would you recommend the default display scaling or GPU scaling?

    If GPU scaling is the "smarter" scaling of the two, I'd expect it to be preferred there, but I assume I'm missing something since it's not the default.
     
    Last edited: Dec 27, 2019
  4. umeng2002

    umeng2002 Maha Guru

    Messages:
    1,432
    Likes Received:
    335
    GPU:
    4080 Super
    Some applications will scale themselves to the monitor's or Windows desktop's resolution. With GPU scaling, the GPU's built-in scaler does the work. Why isn't it the default? Who knows. Why does nVidia still not run their graphics cards in message-signaled interrupt mode by default?

    So it comes down to who do you want to do the scaling?

    The program (if it has that feature)
    The GPU (using its built-in hardware)
    The monitor

    Here is a decent guide by the masters of ambiguity themselves:

    https://nvidia.custhelp.com/app/ans...-image-sharpening-in-the-nvidia-control-panel

    To clarify a bit:

    The "GPU Scaling" in the image sharpening and in the "adjust desktop size and position" menu are the same option. When you engage GPU scaling in the image sharpening menu, the GPU scaling option is greyed out in the "adjust desktop size and position" menu because the other image sharpening menu turned on the GPU scaler.

    The GPU really only has one hardware scaler built in. So with image sharpening, you obviously want to sharpen the scaled-up image, not sharpen the small image and then blow it up.
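
    A rough sketch of that ordering, again with Pillow rather than NVIDIA's actual filter (the filename is hypothetical):

    ```python
    # Scale-then-sharpen vs sharpen-then-scale, on a 1080p -> 1440p upscale.
    from PIL import Image, ImageFilter

    src = Image.open("frame_1080p.png")  # hypothetical 1920x1080 frame

    # What GPU scaling + image sharpening effectively does: upscale first,
    # then sharpen, so the filter counteracts the blur the upscale added.
    good = src.resize((2560, 1440), resample=Image.BILINEAR)
    good = good.filter(ImageFilter.UnsharpMask(radius=2, percent=50))

    # The wrong order: sharpening halos get baked in at 1080p and are then
    # enlarged (and softened) along with everything else by the upscale.
    bad = src.filter(ImageFilter.UnsharpMask(radius=2, percent=50))
    bad = bad.resize((2560, 1440), resample=Image.BILINEAR)
    ```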

    This is a fairly new software feature for nVidia, so the driver implementation is a bit janky - read the caveats in that link I posted.
     
    Last edited: Dec 28, 2019
    BlindBison and 386SX like this.

  5. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    @umeng2002 Thanks! That's really helpful and a great point regarding GPU scaling in conjunction with Sharpening if I'm understanding you correctly.

    So, if you're going to use the Nvidia Image Sharpening feature, then you probably also want to enable that checkbox for GPU scaling too, because then it will sharpen the image after the GPU upscales it rather than before (correct me if I'm wrong; that's how I understood your meaning there).

    I expect that if you check GPU scaling, enable image sharpening, and then run at native res, no upscaling will take place, but I do use the image sharpening at about half the default strength (0.25 sharpening w/ the default ignore grain setting of 1/6th in the control panel), so I will enable that checkbox. Thanks a lot for explaining all of that, it's very helpful!

    Will read over that link you sent now, I appreciate your including it.

    I do have one follow-up question -- suppose one does the following:

    1) Enables Image Sharpening and checks the GPU scaling box in the control panel (so that it sharpens the image AFTER the upscale, not before, as you described above).
    2) Enables Integer Scaling, since we'll be running the game at a resolution that divides evenly into native res.
    3) Runs the game at half native (so 720p on a 1440p monitor, or 1080p on 4K, for example).
    4) Probably does NOT use integer scaling when the resolution doesn't divide evenly into native res (for example, 80% of 1440p in something like Hunt: Showdown, which has built-in resolution scaling; though if you're sharpening, you'd still want GPU scaling checked so that sharpening takes place after the upscale rather than before).

    My expectation based on my current understanding is that this would get you 'optimal' results, no? Thank you for all your time, I really appreciate it.
     
    Last edited: Dec 30, 2019
  6. umeng2002

    umeng2002 Maha Guru

    Messages:
    1,432
    Likes Received:
    335
    GPU:
    4080 Super
    First, don't confuse "integer scaling" with using the GPU's, the software's, or even the monitor's scaler at an integer multiple... yes, both can be integer multiples, but the "integer scaler" simply turns a pixel into a 2x2 or 3x3 or 4x4, etc. block of pixels (hard edges and all). With the GPU's native scaler, a program's scaler, or the monitor's scaler, even at an integer multiple, there is still some blurring and smoothing going on. That's why "integer scaling" is only going to look good with pixel art games... a lot of pixel art games also do integer scaling in software too.
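
    In NumPy terms, a sketch of what that "integer scaler" amounts to (my own illustration, not driver code):

    ```python
    import numpy as np

    def integer_scale(img: np.ndarray, n: int) -> np.ndarray:
        """Repeat every pixel of an (H, W, C) image into an n x n block,
        with no blending of neighbouring pixels at all."""
        return np.repeat(np.repeat(img, n, axis=0), n, axis=1)

    # e.g. a 1280x720 frame with n=2 becomes exactly 2560x1440, every hard
    # pixel edge preserved -- great for pixel art, harsh for 3D rendering.
    ```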

    You don't want to use image sharpening with the "integer scaler", as the "integer scaler" is already razor sharp. If a game does "integer scaling" itself, I'm sure the GPU can't notice it and would apply image sharpening anyway. But if the GPU is set to do the integer scaling, I doubt nVidia would give you the option to apply the sharpening filter.

    Although, when using the normal scaling techniques, whether in the GPU or in software, scaling with integer multiples looks better. It still won't be razor sharp like "integer scaling", but there will be fewer of the other bad artifacts... with a decent amount of blur... hence the sharpening filters.

    Second, without checking the GPU scaler in the global section of the image sharpening menu, nor having it on in the "adjust desktop size and position" menu, I would assume the image sharpening will be applied after a game scales itself to the monitor's native resolution. I'm not sure if the image sharpening can be applied if the game fails to upscale or if the GPU scaler option isn't ticked.

    I just always use the GPU scaler and select the override mode in the "adjust desktop size and position" menu.

    https://www.nvidia.com/content/dam/...rce-game-ready-driver-integer-scaling-ftl.png
     
    BlindBison likes this.
  7. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    @umeng2002 Thanks for explaining all of that, that’s very helpful to know.

    I do know that the sharpening filter still works when the GPU scaling checkbox is disabled, but I'll likely check the GPU scaling box anyway, in line with what you've said here, though most times I expect to run games at native resolution.

    Alright then: only use integer scaling for pixel art games, and resolutions that are half native tend to upscale better (720p upscaled to 1440p, for example).

    Thanks!
     
  8. darkforcegg

    darkforcegg Guest

    Messages:
    3
    Likes Received:
    0
    GPU:
    RX 570 4GB
    I know it's a bit late to ask, but will a resolution that's divisible by 8 upscale to 1080p better than one that isn't?
     
  9. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    State your question fully, it's ambiguous as is.
     
  10. darkforcegg

    darkforcegg Guest

    Messages:
    3
    Likes Received:
    0
    GPU:
    RX 570 4GB
    I mean a resolution where both the width and height are divisible by 8, e.g. 1664 x 936.
     

  11. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    Those figures are not related to 1080p in any way.
    You need to state your question clearly.
     
  12. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    Ah, I think I understand what you are asking...
    It's not any res being divisible by 8 that makes it work better.
    It's taking the res you want to upscale to and dividing each axis by an integer to find a res you can use.
    i.e. 1920 x 1080 could be upscaled with integer scaling from 960 x 540...
    Or even 1920 x 540 or 960 x 1080.
    And higher divisors too; I used 1 and 2 there.
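
    A quick sketch of that rule in Python (using the same divisor on both axes, to keep the aspect ratio):

    ```python
    # Candidate source resolutions for integer scaling up to 1920x1080.
    target_w, target_h = 1920, 1080

    for n in (1, 2, 3, 4):
        if target_w % n == 0 and target_h % n == 0:
            print(f"{n}x integer scale from {target_w // n} x {target_h // n}")

    # Prints lines for 1920x1080, 960x540, 640x360 and 480x270.
    ```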
     
    BlindBison likes this.
  13. darkforcegg

    darkforcegg Guest

    Messages:
    3
    Likes Received:
    0
    GPU:
    RX 570 4GB
    Thanks mate, but is there any way I can upscale games with fewer artifacts? Like any resolution values that give the best result?
     
  14. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    If you had a recent NVidia gfx card, I would say use DLSS.
    But aside from that, you are at the mercy of the best upscaler in your setup, whether that be the game's upscaler, your gfx card, or your monitor.
    You need to try them all with different resolutions to decide.
     
    BlindBison likes this.
  15. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,451
    Likes Received:
    3,129
    GPU:
    PNY RTX4090
    OMFG I literally didn't even know this existed in the drivers, with the "GPU Scaling" option basically being hidden within the Image Sharpening setting. So is this basically Nvidia's version of FidelityFX?

    I just tried this out in cyberpunk, enabled GPU scaling in the global driver settings and then enabled image sharpening for cyberpunk in its own profile.

    Loaded up the game, set it to fullscreen, and dropped to 1080p (so in theory the GPU scaler is now upscaling to 1440p). The image, whilst obviously grainy, looks a hell of a lot better than it did before, and performance/smoothness has skyrocketed.

    It's strange that they would hide this in the sharpening setting; this "GPU scaling" option should be its own setting in the global settings tab.

    Also, the article that @umeng2002 posted says that some GSYNC monitors have better scalers than what some Turing GPUs have. So with me being on a 1080 Ti and having a GSYNC compatible monitor (LG 27GL850), how can I find out which has the better scaler, the GPU or the display?

    Cheers.
     

  16. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    Try it.
    Feed the display a lower-than-native res image and the display's scaler will be used.
    If you feed native res to the display (whether already scaled up or otherwise), its scaler won't be used.

    ps
    The reason sharpening is placed with the scaling option is because they go hand in hand.
    A lower res image scaled up does not look as sharp.
    This can be improved with the sharpening tool.
     
    Last edited: Jan 1, 2021
    AsiJu likes this.
  17. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,451
    Likes Received:
    3,129
    GPU:
    PNY RTX4090
    I have tried it, but using the display scaler doesn't so much upscale the image as just blow up a smaller rendered image, which looks terrible, and for some reason this is how I thought it was supposed to be, unless a game had built-in resolution sliders for upscaling within the game's engine. Now, using this "hidden" GPU scaling along with sharpening, I am getting a much cleaner image than before. So it tells me my display's built-in scaler is either terrible, or the image sharpening in the driver does a much better job of cleaning things up.

    I switched res from 1440p to 1080p in Cyberpunk, and in the menu screen for a brief moment I thought I hadn't switched resolution; I only noticed the change had taken effect because my RT OSD changed size. Sure, it looks muddy and soft, but it's surprisingly good for such a drastic change in resolution, and the fact I'm only on a 27" monitor probably helps as well. Obviously you would choose a resolution a little closer to native.

    I'm just blown away that I never noticed this setting before now. I always left the scaler on Display in "adjust desktop size and position" and never really bothered to mess with GPU scaling in the 3D settings, as I never touched image sharpening.
     
    AsiJu likes this.
  18. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    The highlighted part is EXACTLY what upscaling is.
    How bad it looks depends on how bad the upscaling method is, and this can vary depending on the starting res.
    Many displays' upscalers are chronic, but some are better than gfx cards'.
    Although the DLSS 2.0 software/hardware upscaler is better imo, for those games that have it.
     
    Last edited: Jan 13, 2021
    BlindBison and CPC_RedDawn like this.
