Confused over recent ads: 1080p 144Hz

Discussion in 'Computer Monitor Forum' started by djmorgan, Jun 16, 2017.

  1. djmorgan

    djmorgan Guest

    Messages:
    109
    Likes Received:
    0
    GPU:
    1 X MSI GTX 1080ti X
    I am confused: if using an Nvidia card, the control panel will show 1080p as 60Hz, so how can it be 144Hz at the monitor?

    And doesn't 60Hz mean roughly 60 fps at the monitor? :bang:
     
  2. RealNC

    RealNC Ancient Guru

    Messages:
    5,100
    Likes Received:
    3,377
    GPU:
    4070 Ti Super
    Eh? If you select 60Hz in the control panel, the monitor will run at 60Hz. If you select 144Hz, it will run at 144Hz.

    Not sure I understand the question :-/
     
  3. djmorgan

    djmorgan Guest

    Messages:
    109
    Likes Received:
    0
    GPU:
    1 X MSI GTX 1080ti X
    Yesterday Guru3D ran this ad: http://www.guru3d.com/news-story/iiyama-introduces-new-1080p-screens-with-free-sync.html


    It seemed to me it was saying that a 1080p monitor/resolution could be 144Hz! Now with my GTX 1080 Ti I know that if I have a native resolution of 1920 x 1080 then I can set 120Hz (HDMI), but if I set 1080p (and I don't know why anybody would do that to a monitor) then I'm limited to 60Hz.


    I believe that setting the display frequency to 60Hz or 120Hz has an impact on the fps the monitor can show.

    Does that make it any clearer what I was asking?:nerd:
     
    Last edited: Jun 16, 2017
  4. RealNC

    RealNC Ancient Guru

    Messages:
    5,100
    Likes Received:
    3,377
    GPU:
    4070 Ti Super
    1080p is 1920x1080.

    What you're seeing in your nvidia control panel are TV resolutions. Those are limited to 60Hz.

    But every resolution that is 1920x1080 pixels and not interlaced is "1080p". Refresh rate has nothing to do with it.
     
    Last edited: Jun 16, 2017

  5. djmorgan

    djmorgan Guest

    Messages:
    109
    Likes Received:
    0
    GPU:
    1 X MSI GTX 1080ti X
    Thanks and my bad about refresh rate.

    So would you say that a native PC resolution of 1920 x 1080 is 1080p even though it's not stated as such? And would you also say that there would be no logical reason for somebody to select 1080p (1920 x 1080 @ 60Hz) when sending the signal to an FHD PC monitor?

    :nerd:
     
  6. Exascale

    Exascale Guest

    Messages:
    390
    Likes Received:
    8
    GPU:
    Gigabyte G1 1070
    You have to make sure the HDMI level is set to 0-255 in both the monitor and the driver. By default, any TV-resolution modes will be using the incorrect limited 16-235 HDMI levels.
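    As an aside, expanding limited-range (16-235) video levels to full range (0-255) is a standard linear scaling, which shows why leaving it mismatched crushes blacks and whites. A minimal sketch in Python:

    ```python
    def limited_to_full(y):
        """Expand a limited-range (16-235) 8-bit video level to full range (0-255).

        Standard linear scaling: 16 maps to 0, 235 maps to 255;
        out-of-range values are clipped.
        """
        scaled = round((y - 16) * 255 / 219)
        return max(0, min(255, scaled))

    print(limited_to_full(16))   # black: 16 -> 0
    print(limited_to_full(235))  # white: 235 -> 255
    ```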
     
  7. RealNC

    RealNC Ancient Guru

    Messages:
    5,100
    Likes Received:
    3,377
    GPU:
    4070 Ti Super
    Yeah, there's no reason for that. It's the same as "PC" 1920x1080 @ 60Hz.

    The only difference might be a very slight offset in the refresh rate. You can test them here:

    https://www.vsynctester.com

    One might use 60Hz (PC), the other might be 59.94Hz (TV). Or they might be both the same.
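    For reference, the 59.94Hz figure isn't arbitrary: TV rates are the nominal rate divided by 1.001, a convention inherited from NTSC, so a "60Hz" TV mode actually runs at 60000/1001 Hz. A quick check:

    ```python
    # TV refresh rates are the nominal rate divided by 1.001 (NTSC convention),
    # so "60Hz" TV modes actually run at 60000/1001 Hz.
    tv_rate = 60000 / 1001
    print(round(tv_rate, 2))  # 59.94
    ```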
     
  8. djmorgan

    djmorgan Guest

    Messages:
    109
    Likes Received:
    0
    GPU:
    1 X MSI GTX 1080ti X
    With the exception that 'PC' 1920 x 1080 can be 120Hz, whereas 1080p 1920 x 1080 can only be 59.94/60Hz.

    Hence my question, relating to the advertisement in post 1.
     
  9. RealNC

    RealNC Ancient Guru

    Messages:
    5,100
    Likes Received:
    3,377
    GPU:
    4070 Ti Super
    Again: 1080p means 1920x1080. No refresh rate. It describes the resolution. So both are the same resolution.

    Some monitors list special TV modes in their EDID, which you can then see in the nvidia panel. This is for better compatibility with things like DVD or Blu-Ray players, or consoles. These TV modes are usually only listed over HDMI, although on some monitors you also see them over DVI or DP.

    But the bottom line is: it's the exact same resolution. 1920 pixels wide, 1080 pixels tall. And you can safely ignore them.
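    For what it's worth, the refresh rate a mode actually runs at comes from the timing numbers in the EDID: pixel clock divided by the total (active + blanking) pixels per frame. A sketch using the standard CEA-861 timing for 1080p at 60Hz:

    ```python
    # Refresh rate = pixel clock / (horizontal total * vertical total).
    # Numbers below are the standard CEA-861 timing for 1080p at 60Hz.
    pixel_clock_hz = 148_500_000  # 148.5 MHz
    h_total = 2200                # 1920 active + horizontal blanking
    v_total = 1125                # 1080 active + vertical blanking

    refresh_hz = pixel_clock_hz / (h_total * v_total)
    print(refresh_hz)  # 60.0
    ```

    The matching TV mode uses the same totals with the pixel clock divided by 1.001, which is where 59.94Hz comes from.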
     
