
1080p gaming on 4k TV?

Discussion in 'The HTPC, HDTV & Ultra High Definition section' started by BuildeR2, Sep 13, 2018.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    6,967
    Likes Received:
    112
    GPU:
    Sapphire 7970 Quadrobake
Check the TV manufacturer's website about HDR support in the TV's own apps. What's the issue with networking?
     
  2. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    2,682
    Likes Received:
    65
    GPU:
    EVGA 1080 Ti FTW3
The TV supports all kinds of HDR: HDR10, HLG and whatnot. The more research I do, the more it looks like it's just Windows getting in the way.

As for networking, whatever MS changed regarding SMB/HomeGroup/LAN in the Creators Update completely breaks my access to any other computer in my house. I spent literal days troubleshooting it and digging through forum threads for answers, but the only thing that works is staying on 1709. I even bought two dedicated NICs from non-Intel brands because some people had luck outside of Intel NICs, but anything above 1709 still left me with zero network capability.

    Once RS5 is officially released I'll clean install it on a spare SSD and see if I can get all of the HDR stuff working, then decide what to do.
     
  3. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    2,682
    Likes Received:
    65
    GPU:
    EVGA 1080 Ti FTW3
Alright, so after another 6 hours of testing and tinkering I think I've got it all figured out and working! I've got HDR in games, in MPC-HC and in streaming services. The first breakthrough came when I told the TV that my PC was a Blu-ray player. That got real HDR working in games, and the TV's little indicators in the menus showed HDR +WCG and some other things all active. The input lag was pretty high, though, so I'm working out a way to keep it in Blu-ray mode for the HDR but bring in game mode as well.

The trick to getting HDR to work for movies and TV shows was to make sure the players had exclusive fullscreen enabled in their options AND to turn off Windows 10 fullscreen optimizations for the player .exe. After doing both of those for MPC-HC, I loaded up Wonder Woman and it was super rich and bright, unlike the dull and washed-out way it was before. Sadly, it had a ton of judder/stutter. I tried a bunch of TV settings with no luck, then ended up matching the desktop refresh rate to the content frame rate. Smooth as butter after that!
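For anyone curious why matching the refresh rate fixed it, here's a rough Python sketch (purely illustrative, my own numbers rather than anything from a spec). Each content frame stays on screen for a whole number of display refreshes, so the frame times come out uneven whenever the rates don't divide evenly:

```python
import math

def refresh_cadence(content_fps, display_hz, frames=8):
    """How many refreshes each of the first `frames` content frames gets."""
    cadence = []
    shown = 0  # display refreshes consumed so far
    for f in range(1, frames + 1):
        # frame f is replaced at the first refresh at/after time f / content_fps
        until = math.ceil(f * display_hz / content_fps)
        cadence.append(until - shown)
        shown = until
    return cadence

print(refresh_cadence(24, 60))  # [3, 2, 3, 2, ...] uneven 3:2 cadence -> judder
print(refresh_cadence(24, 24))  # [1, 1, 1, 1, ...] one refresh per frame -> smooth
```

That alternating 3:2 cadence is exactly the stutter you see on 24 fps movie content when the desktop is left at 60 Hz.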

    It has been arduous and tedious, but now I have figured out nearly all of the peculiarities of SDR web browsing/work versus SDR gaming versus HDR gaming versus HDR movies and so on. Yes, I'm still on 1709 but it sure seems like it is working now. Even the TV shows all of the HDR stuff registered as being active now, whereas I couldn't get any of those checked before.
     
  4. lucidus

    lucidus Ancient Guru

    Messages:
    11,003
    Likes Received:
    520
    GPU:
    GTX 1070
    @BuildeR2 are you using madvr? It should handle all your playback needs elegantly.
     

  5. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    2,682
    Likes Received:
    65
    GPU:
    EVGA 1080 Ti FTW3
Um, I don't think so? I just installed the latest version of MPC-HC and went from there. It isn't really the media player giving me issues, but the interaction between Windows 10, my new TV and me. What would madVR do that I don't have right now?

    Also, just for anybody who may know, since my TV can do 4k/60 or 1080p/120 does that mean I can play games @ 1080p/120 and use DSR to get downsampling at the same time? Is DSR limited by HDMI cable bandwidth, thus limiting a 1080p DSR'd to 4k to 60FPS?
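On the bandwidth question, a back-of-the-envelope sketch (Python, illustrative only; real HDMI links also carry blanking intervals and encoding overhead, so treat these as rough floors, not a cable spec). As far as I know, DSR downsamples on the GPU before scanout, so the cable only ever carries the final output mode, not the internal render resolution:

```python
# Rough data rate of the *active* pixels only (no blanking, no 8b/10b
# overhead) for a given output mode at 8-bit RGB (24 bits per pixel).
def active_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K/60:     {active_gbps(3840, 2160, 60):.1f} Gbit/s")
print(f"1080p/120: {active_gbps(1920, 1080, 120):.1f} Gbit/s")
```

So a 1080p/120 signal actually needs less link bandwidth than 4K/60, and 1080p/120 with DSR rendering internally at 4K should still only send 1080p/120 over the cable.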
     
  6. lucidus

    lucidus Ancient Guru

    Messages:
    11,003
    Likes Received:
    520
    GPU:
    GTX 1070
You're having trouble with refresh rates and judder, that's why. madVR handles all of that better. I don't have an HDR screen to test with, but I've read plenty of positive reports about using madVR for HDR playback. Give it a try if you want; it works with MPC-HC. You just have to change the output renderer from EVR (Custom) to madVR after it's installed. http://madvr.com/#
     
  7. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    2,682
    Likes Received:
    65
    GPU:
    EVGA 1080 Ti FTW3
    That looks pretty cool just from reading the homepage. I'll have to give that a try in the next few days and see how it goes. Thanks.
     
  8. lucidus

    lucidus Ancient Guru

    Messages:
    11,003
    Likes Received:
    520
    GPU:
    GTX 1070
  9. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    2,682
    Likes Received:
    65
    GPU:
    EVGA 1080 Ti FTW3
What resolutions do you guys have? I'm having a hard time deciding what to create. Are you just scaling down both axes in 5%/10% increments, or going by overall pixel count?

I think I might finally have time to mess around with this in the next few days. It looks good from the few pages I've read about it.
     
  10. tensai28

    tensai28 Maha Guru

    Messages:
    1,052
    Likes Received:
    232
    GPU:
    2080ti MSI X TRIO
Your answer is right inside the comments you just quoted: 3200x1800, 3456x1944 and 3584x2016. Go to the Nvidia control panel and create a custom resolution under the list of resolutions. You just have to enter the resolutions we mentioned and click OK. It's very easy.
     

  11. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    2,682
    Likes Received:
    65
    GPU:
    EVGA 1080 Ti FTW3
Indeed. I know how to create them, but I'm trying to figure out whether scaling both axes by a percentage (3840 minus 384, etc.) looks better, or whether decreasing the overall pixel count by a percentage (8,294,400 pixels minus 829,440) and converting that back to an "x" by "y" resolution scales better when under native 4K. Has anybody tried the console thing of scaling only one axis? 2560x2160, 1920x2160, that kind of stuff. Can that even be done on PC?

    For example, it looks like 3456x1944 is minus 10% on each axis, but 3648x2052 is minus 10% pixel count versus 4k.
     
  12. anxious_f0x

    anxious_f0x Maha Guru

    Messages:
    1,286
    Likes Received:
    56
    GPU:
    Titan X (Pascal) SLi
I haven't really looked at resolutions other than 3200x1800; there are very few games that struggle at that resolution on a 1080 Ti, it seems.

I'd say just create a few resolutions between 1080p and 4K and experiment, see what looks best and still offers decent FPS.
     
  13. 0blivious

    0blivious Ancient Guru

    Messages:
    2,335
    Likes Received:
    85
    GPU:
    MSi 1070 X / 970 / 780Ti
As I recall, going from 4K down to 1080p is a direct 4-to-1 pixel conversion: every 2x2 block of pixels becomes one pixel, so it maps perfectly. It's really not "blurry", just lower resolution than 4K would be. Anything that isn't a perfect division or multiple of native is going to be a bit blurry on any flat panel.
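A quick sketch of that point (Python, illustrative; function names are mine):

```python
# 4K -> 1080p is a "clean" downscale because the ratio is exactly 2 on
# each axis, so every 2x2 source block maps to exactly one destination
# pixel. A non-integer ratio leaves fractional pixel coverage, which is
# what shows up as blur.

def scale_ratio(src, dst):
    return src[0] / dst[0], src[1] / dst[1]

def is_integer_scale(src, dst):
    rx, ry = scale_ratio(src, dst)
    return rx == int(rx) and ry == int(ry)

print(is_integer_scale((3840, 2160), (1920, 1080)))  # True  (exact 2x2 -> 1)
print(is_integer_scale((3840, 2160), (2560, 1440)))  # False (1.5x ratio)
```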

I'm running a spare PC on my 43" 4K TV right now at 1080p. It sure isn't 4K, but it looks quite crisp, no worse than a same-sized 1080p TV looks.
     
  14. RealNC

    RealNC Ancient Guru

    Messages:
    2,379
    Likes Received:
    672
    GPU:
    EVGA GTX 980 Ti FTW
That depends on the display's scaling method. Just because there's a perfect 4:1 pixel ratio doesn't mean the display will use integer scaling; most displays actually don't. TVs usually have good upscalers, though, while PC monitors are not so good at it. The downside is that the TV's upscaler usually adds latency.

If you use GPU scaling, you will always get bilinear upscaling regardless of pixel ratio, which is the worst upscaling method you can have, but there's no added latency. If you want to make 1080p look less blurry on 4K with GPU scaling, you can use ReShade and activate the LumaSharpen filter. The default of 1.0 works well, unless the game uses TAA, which blurs more at 1080p; in that case you can raise the sharpening value.
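To illustrate the integer-vs-bilinear difference on a tiny example (pure-Python sketch on a 1-D "scanline"; this isn't exactly how GPU bilinear sampling is phased, just the gist):

```python
# Nearest-neighbour 2x upscaling duplicates pixels, so edges stay hard.
# Bilinear interpolation averages neighbours, so edges get soft.

def nearest_2x(line):
    return [v for v in line for _ in range(2)]

def bilinear_2x(line):
    out = []
    for i, v in enumerate(line):
        out.append(v)
        nxt = line[min(i + 1, len(line) - 1)]  # clamp at the edge
        out.append((v + nxt) / 2)  # sample halfway between neighbours
    return out

edge = [0, 0, 255, 255]  # a hard black-to-white edge
print(nearest_2x(edge))   # edge stays a hard step
print(bilinear_2x(edge))  # a 127.5 half-grey appears at the transition
```

That in-between grey value at the edge is the "blur" bilinear adds, and it's roughly what a sharpening filter like LumaSharpen tries to counteract.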
     
  15. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    2,682
    Likes Received:
    65
    GPU:
    EVGA 1080 Ti FTW3
I haven't had much time, but I made a ton of resolution steps in the custom list so that I can use whatever best fits each game. For example, Wildlands won't even run at a solid 60 FPS at 1080p, so I decided to settle for 4K/30 with DWM vsync in a borderless window. The game isn't super serious, so this works for now. Witcher 3 is maybe more like 1620p or 1800p without HairWorks. Until faster GPUs are out there, this will surely be a constant dance.
Yeah, after experimenting with DPI scaling and changing some media player options it isn't too bad. Watching older TV shows from the '80s or '90s is pretty rough, but that is kind of expected.
It is a 65-inch Samsung NU8000 4K TV, which seemed to have some of the lowest latency numbers for gaming out of all the TVs I've seen. I was tired of waiting for the super-expensive BFGDs, and I can't go back to a 20-something-inch monitor at this point. It feels WAY more responsive than my old screen, so I'm going to leave scaling to the TV for now and experiment on a game-by-game basis, depending on engine limitations and/or what AA options are available.
     

  16. tsunami231

    tsunami231 Ancient Guru

    Messages:
    8,692
    Likes Received:
    175
    GPU:
    EVGA 1070Ti Black
Guess it depends on the TV? I mostly game at 1080p on my TCL S517 4K TV, and at normal viewing distance it doesn't look any different from a 1080p TV at that res, but two feet in front of the 55" everything looks like crap. What's kind of amazing to "me" is that I tried playing Skyrim SE at 4K vs 1080p and at normal viewing distance saw no real difference other than the framerate hit, while in Deus Ex: MD I saw a huge difference along with a huge framerate hit. So maybe 4K's amazingness depends on the game.

Game mode is a must on these TVs, though. I had game mode on my old LN32A550 and never used it because it made no difference to input lag; it was exactly the same on and off, except that turning it on killed the colors and the ability to change any settings. On the S517, though, game mode makes a huge difference: with it off, input lag was way worse than on my old TV, but with it on it's significantly better.

The NU8000 was the TV I wanted, but the price just wasn't going to happen.

The TCL S517 is great for its price. You just don't want to be watching sub-1080p/720p images with horrible compression, because things start to look like they were watercolored. Dish Network looks like that on a lot of its channels, since their 1080i/p feeds are really half that resolution, same with their 720i/p, on top of horrid compression. Everything else I've thrown at it looks amazing compared to my old LN32A550.
     
