Is Nvidia Integer Scaling able to render 1080p/1440p games with excellent sharpness on 4k monitors?

Discussion in 'Videocards - NVIDIA GeForce' started by devetimon, Sep 11, 2020.

  1. devetimon

    devetimon Member

    Messages:
    12
    Likes Received:
    1
    GPU:
    rtx2060super
    Hi, I'm asking this question to find out whether anyone here has tried the Nvidia Integer Scaling method explained in this article https://www.anandtech.com/show/14765/nvidia-releases-geforce-436-02-driver-integer-scaling-and-more (scroll down the article until you find "Integer Image Upscaling At Last").

    In a nutshell, this method should make it possible to play 1080p/1440p games on a 4K monitor with the same image quality as if you were playing on a native 1080p/1440p monitor. You should not experience blurred images, aliasing, or other issues. The only thing you need to do is open Nvidia Control Panel > Adjust desktop size and position > Scaling > Integer scaling > Perform scaling on GPU, and then select the scaling resolution.

    Please note: according to the article above and other information on the internet, this method only works with Turing cards (not Pascal or older cards). Also note that this is a different method from the Image Sharpening GPU scaling found in Nvidia Control Panel > Manage 3D settings. Nvidia Integer Scaling is a completely different thing and should be used on its own.

    I would like to know if anyone here has tried it and can really confirm the native sharpness of 1080p/1440p games on 4K monitors, or whether you experienced any issues.
     
  2. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    An integer is a whole value rather than a floating-point one with rounding (i.e. 1.000 is just 1), so it can scale 1920x1080 up to 3840x2160, since that's exactly 2x2 that resolution, but I don't think it works for 2560x1440, which would need a non-integer 1.5x ratio.
    For AMD, the GPU upscaling option needs to be used to override the display's upscaling algorithm, but sharpening shouldn't have anything to do with it.

    Just a different method, though I'd imagine it's still bilinear-based but without rounding, so there's no smoothing and it's pixel perfect. I don't know if there's any significant advantage over setting the display scaling to 1:1 so it matches the pixels, which should also be sharp; but instead of limiting scaling to evenly divisible resolutions, you'd get black bars going at it that way.
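    For what it's worth, "pixel perfect" here just means plain pixel duplication with no filtering. A toy sketch in Python (purely illustrative; this is not how the driver actually implements it):

    ```python
    def nn_integer_upscale(image, factor):
        """Duplicate each source pixel into a factor x factor block.
        `image` is a list of rows; no filtering, so no smoothing is introduced."""
        out = []
        for row in image:
            # Repeat each pixel horizontally, then repeat the row vertically.
            scaled_row = [px for px in row for _ in range(factor)]
            out.extend([scaled_row[:] for _ in range(factor)])
        return out

    # A 2x2 "image" scaled 2x becomes 4x4, each source pixel an exact 2x2 block
    print(nn_integer_upscale([[1, 2], [3, 4]], 2))
    # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
    ```

    Bilinear scaling would instead interpolate between neighbouring pixels, which is where the smoothing/blur comes from.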


    It's an older scaling method too, I believe, but it's good for older titles and lower resolutions since it avoids the blur issue, which is probably even more pronounced on LCD-type displays compared to the older CRT models.

    EDIT: I don't see what aliasing has to do with it, though. That comes down to how the display renders straight lines, as a hardware limit of how the display tech works, plus a ton of different aliasing types with modern shading and lighting techniques (specular and shader aliasing, not just mesh/edge aliasing). Not that these matter much for older titles; it's just potentially sharper than floating-point upscaling and without the blurring side effect, at least when done through the GPU instead of setting the display to match or to 1:1 scaling mode.

    There's also the possibility of bicubic or other scaler modes, but as far as I know only GeDoSaTo used those, for D3D9 downsampling; otherwise it's standard bilinear. This ultimately comes down to image sharpness, really. (Plus those other modes, like Lanczos, are much costlier.)

    I'm a bit curious about some of this myself, so this isn't a definitive answer or anything; I'm not too well versed in it either. I know a bit here and there from what I've read after Intel renewed interest in integer scaling by adding support for it at user request, with NVIDIA and eventually AMD supporting this mode of upscaling as well. :)


    EDIT: Ah, I think I made a little mistake with that comparison between GPU integer scaling and 1:1 on the display hardware.

    1:1 display scaling would give black bars around everything. Integer scaling would actually scale it up, assuming the resolution matches. :D
     
    Last edited: Sep 12, 2020
  3. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    For 1080p, yes (one pixel will then be exactly four pixels in size). You can't scale 1440p to 4K without rounding.
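    The divisibility rule behind this is easy to state in code; a hypothetical helper in Python (the function name is mine, not anything from the driver):

    ```python
    def integer_scale_factor(src_w, src_h, dst_w, dst_h):
        """Return the whole-number scale factor if the source resolution
        divides the target evenly in both dimensions, else None."""
        if (dst_w % src_w == 0 and dst_h % src_h == 0
                and dst_w // src_w == dst_h // src_h):
            return dst_w // src_w
        return None

    # 1080p -> 4K: each source pixel becomes an exact 2x2 block
    print(integer_scale_factor(1920, 1080, 3840, 2160))  # 2
    # 720p -> 4K: exact 3x3 blocks
    print(integer_scale_factor(1280, 720, 3840, 2160))   # 3
    # 1440p -> 4K: 1.5x per axis, no integer factor exists
    print(integer_scale_factor(2560, 1440, 3840, 2160))  # None
    ```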
     
    JonasBeckman likes this.
  4. The Goose

    The Goose Ancient Guru

    Messages:
    3,057
    Likes Received:
    375
    GPU:
    MSI Rtx3080 SuprimX
    I tried this in The Division 2 but had issues getting full screen with HDR10 enabled and 1440p on a 4K screen. It's the first time I've used it, though, so I might have had the settings wrong in the NV panel.
     

  5. devetimon

    devetimon Member

    Messages:
    12
    Likes Received:
    1
    GPU:
    rtx2060super
    Thanks for your answers.
    Yeah, sorry. I actually meant 720p integer scaling for a 1440p monitor, and 1080p integer scaling for a 2160p monitor. But apart from that theoretical remark, has anyone here tried enabling Integer Scaling from the Nvidia control panel, and can you tell me whether you really noticed sharper and crisper images in games at 1080p resolution on your 4K monitor? That is exactly what I wanted to know with my question.
    (Please note: there are some limitations, as explained in this article: https://tanalin.com/en/articles/integer-scaling/ Nvidia Integer Scaling doesn't work with DSR enabled; with custom resolutions; in Surround mode; or when Image Sharpening is also enabled; and HDR should be disabled when testing Integer Scaling.)
     
  6. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
  7. sirDaniel

    sirDaniel Guest

    Messages:
    105
    Likes Received:
    2
    GPU:
    940MX
    This article is misleading. I have never seen, and probably never will see, somebody watching videos with integer scaling/nearest neighbour, or playing emulated console games with NN. That pixelation would burn your eyes.
    The Anandtech article also reads like a marketing presentation to me. The key phrase is "What remains to be seen then is how well this works in practice". So even after reading it twice, you can't tell whether it's good or bad.

    Integer scaling is nice to have, but on the other hand it will not make you fly high. I have never used it in PC games (I have no Turing card), but I see some potential. Surprisingly, though, it's advertised on the wrong front; look here https://www.techpowerup.com/256801/intel-adds-integer-scaling-support-to-their-graphics-lineup again, emulation. So why don't they show off with PC game screenshots? Maybe because you will get a sharp image, but not a crisp one.

    This is a kind of marketing now. You will not get a cheaper card; you will get an "enhancement" instead that will "reduce" the quality of your image. IS, DLSS, VRS, what else?
     
  8. MT_

    MT_ Guest

    Messages:
    4
    Likes Received:
    0
    GPU:
    GTX 650 Ti Boost
    I am the author of the article about integer scaling, and I play games (with my IntegerScaler app for Windows 7+) and watch movies at FHD (with MPC-HC) on my 24-inch 4K monitor (Dell P2415Q) every day. Logical pixels are small enough to be almost indistinguishable in this case, while sharpness loss caused by blur added by regular non-integer scaling is really noticeable under the same conditions. Now imagine scaling 4K to 8K, or 8K to 16K. The pixel density corresponding to 8K at 27″ is already real in 13.3″ 4K laptops.

    DLSS must be supported by the specific game, and even if supported, might introduce unpredictable image artifacts. Integer scaling works with any game and does not affect original image data in any way.
     
  9. sirDaniel

    sirDaniel Guest

    Messages:
    105
    Likes Received:
    2
    GPU:
    940MX
    Hello, I am not against integer scaling; I just think this technology is advertised in the wrong way.

    I have downloaded the IntegerScaler application (I am on Win10, with an HD screen). I opened a 640x272 movie in MPC-HC, then pressed Alt+F11. The MPC-HC window took up only part of the screen; presumably it only doubled the pixels. The movie was pixelated, just like when you use the internal nearest-neighbour scaler. It simply had aliasing. Where is the gain here? Did I do something wrong, or do you resize the MPC-HC window manually, triggering internal scaling? (That way the picture goes back to looking OK, antialiased.)
    How is it possible that you see blur everywhere? Maybe there is an incompatibility between Win7 and your monitor somewhere? If you use just about any better video renderer, like madVR or MPC-VR, rather than the Windows default, you can set a scaling algorithm that does wonders for video picture quality.

    Game engines are smart. They can use different sets of textures, and hence different levels of detail, depending on resolution (e.g. LOD). If you intentionally play a PC game at a lower resolution and then apply IS, you lose any possible enhancement from the higher resolution.

    So why not play PC games at native resolution? You don't have blur on your monitor at native, do you? Why not use madVR and its heavy algorithms to scale to native, or even enhance the movie picture? It looks like integer scaling and the application are advertised for people with low-end machines; am I right?

    I see potential in really old games, like those advertised on the main page of your project. E.g. SimCity has no resolution higher than 640x480. There, indeed, TV or monitor scaling may give bad results. Also, when somebody is a geek and likes the old-style look of 2D game art, then integer scaling is of course for them! Frankly speaking, the IS application is also a good workaround for games that do not work well at higher resolutions, as you mentioned on the site. It is a nice hack to have nearby. Personally, I see no reason to enable it on a daily basis, though, when native resolution is a cure for any blur and GPUs are not such crap nowadays.
     
  10. MT_

    MT_ Guest

    Messages:
    4
    Likes Received:
    0
    GPU:
    GTX 650 Ti Boost
    It’s easy to call an article misleading if you didn’t even read it entirely and carefully.

    As for the specific video case you mentioned: 640×272 is too low a resolution unless your display size is at most 10″ or so and the viewing distance is at least 50 cm. With my 24″ 4K monitor, Full HD is the minimum resolution at which logical pixels are small enough to be almost indistinguishable when using integer scaling. At HD (1280×720), pixels get quite noticeable (though not critical for movies, unlike games). Needless to say, I have no idea why someone with a GPU capable of 4K@60Hz in modern games would watch a video as low-resolution as 640×272. For FHD and HD, which correspond to perfect 2.0 and 3.0 scaling ratios, IntegerScaler is unneeded, and MPC-HC's built-in nearest-neighbour capability is enough; that's how I watch videos with MPC-HC.
     

  11. sirDaniel

    sirDaniel Guest

    Messages:
    105
    Likes Received:
    2
    GPU:
    940MX
    Okay, so you compensate for aliasing in videos by viewing the monitor from a longer distance. And the minimum video resolution is 1280x720 to watch without noticing individual pixels on your monitor.

    It's true that a 1280x720 video on an FHD TV will look blurrier than a 1920x1080 (native) video. To avoid that problem, some advanced scaling algorithms were invented and are already freely available in computer renderers, for use in MPC-HC for example. I can watch video at any resolution, including with good-quality downsizing, postprocessing (sharpening), correction, etc. Some of the algorithms need a beefy GPU, though.

    Did you compare NN/IS with other algorithms like cubic, Lanczos, etc. on videos containing scenes with straight lines, like power cables, fences, and so on?

    I think it is the 24" 4K monitor, with its roughly 185 pixel density (remember the Retina brand?), that simply masks the aliasing introduced by NN/IS. Such a monitor's pixel density makes anti-aliasing less necessary than usual. So that is the trick: it doesn't make your video any better, it just limits the use cases. You still end up with an artificial video image.

    I suppose PC games would need a larger viewing distance or a display with higher pixel density, because a PC game image is usually much sharper than a movie image.
     
  12. MT_

    MT_ Guest

    Messages:
    4
    Likes Received:
    0
    GPU:
    GTX 650 Ti Boost
    Blur has nothing to do with antialiasing. At integer scaling ratios, with logical pixels small enough, blur just unreasonably decreases sharpness.

    Integer scaling is not about masking aliasing, it’s about having no quality loss compared with a display with the corresponding native resolution. So Full HD on a 4K monitor with integer scaling looks as good as (and to me, even better than) on a Full HD monitor.

    Imagine a Full HD monitor with an untouched Full HD image, and another Full HD monitor with the same image artificially blurred, and you’ll get an idea how Full HD looks on a 4K monitor of the same size when using regular blurry scaling.
     
  13. devetimon

    devetimon Member

    Messages:
    12
    Likes Received:
    1
    GPU:
    rtx2060super
    I don't have a way to try integer scaling on a 4K monitor at the moment, so I tried it on my current FHD 27-inch monitor from a resolution of 960x540. Unfortunately the result looked bad to my eyes: games didn't look nice, I noticed too many pixels, and text inside games was unreadable (I don't know if it is correct to call them pixels). I tested Control, DMC 5, and Doom Eternal, but the result was the same. GPU used: RTX 2060 Super. Maybe to appreciate integer scaling you need at least a 1440p monitor. That's just my feeling; maybe I am wrong.
     
    MotherSoraka likes this.
  14. MT_

    MT_ Guest

    Messages:
    4
    Likes Received:
    0
    GPU:
    GTX 650 Ti Boost
    Absolutely. With antialiased 3D content, logical pixels should be small enough to be indistinguishable. This should be obvious as long as the user understands how integer scaling works.

    Pixels are almost indistinguishable at FHD on a 24-inch 4K monitor. With a larger monitor of the same native resolution at the same logical resolution, or at a lower logical resolution at the same monitor size and native resolution, pixels get noticeable. So in such low-pixel-density cases, the only integer-scaling use case is in fact maintaining pixelation in pixel-art or old games, rather than preventing sharpness loss in modern 3D games or videos.

    Both use cases are important; just not both are applicable in every situation, depending on the effective pixel density defined by the combination of the logical resolution, the native resolution, and the viewing distance.
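    The "effective pixel density" trade-off can be put in rough numbers; a quick sketch in Python, assuming square pixels and the 24-inch 4K monitor mentioned above:

    ```python
    import math

    def ppi(width_px, height_px, diagonal_in):
        """Physical pixels per inch of a display (diagonal resolution / diagonal size)."""
        return math.hypot(width_px, height_px) / diagonal_in

    # 24" 4K monitor: roughly 184 physical PPI
    native = ppi(3840, 2160, 24)
    # FHD integer-scaled 2x on it: logical pixels are half as dense, roughly 92 PPI
    logical = native / 2
    print(round(native), round(logical))  # 184 92
    ```

    At ~92 logical PPI the scaled pixels are about the size of native pixels on a typical 24" FHD monitor, which is why FHD-on-4K can look as good as a native FHD panel, while lower logical resolutions make individual pixels obvious.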
     
    Last edited: Oct 3, 2020
  15. Digika

    Digika Member Guru

    Messages:
    149
    Likes Received:
    1
    GPU:
    GTX 670
    I'm bewildered by this thread. A guy asked a simple question: is thing X possible under conditions Z, yes or no? Instead, he got a bunch of random answers, a bunch of condescending, long unrelated lectures, and a bunch of unrelated stuff.
    No, your thinly veiled wall of text where you think an answer is implied isn't one.
     
    JonasBeckman likes this.

  16. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    JonasBeckman likes this.
  17. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    7,270
    Likes Received:
    600
    GPU:
    Geforce RTX 3090 FE
    Haven't used it myself yet, but if I recall correctly, for it to work the way you want, the game resolution has to align perfectly with an integer fraction of the monitor resolution. In theory, 1 pixel of 720p should be able to cover 4 pixels (2 vertical, 2 horizontal) on a 1440p monitor. It won't be able to work like that going from 720p to 1080p, for instance; 540p is the equivalent for 1080p.

    You could also try a 640x480 game at 1280x960 and accept some black borders on the monitor so it won't stretch the 1280x960 resolution out.

    But it all depends on what the monitor does with it as well, so not 100% sure.
     
  18. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    It works exactly as it should: 720p looks pixel perfect on a 1440p display. However, on a 27" 1440p display, 720p is just such a low resolution that it looks like crap, since one pixel gets close to 1 mm x 1 mm in size. For a 27" 4K screen it would still be acceptable, though I find 1080p to be too low for 27" screens.
     

Share This Page