    greengiant71 (Member)
    Hello, I am in dire need of some assistance. I'm looking to invest in a 32-inch HDTV, but I'm not sure whether the Hz rating makes a difference, nor do I know of any good models.

    I run a custom built rig:

    AMD Phenom II X4 955
    EVGA GTX 570 HD
    Win 7 x64

    I read a post on a different forum explaining the Hz difference, but I'm a bit lost. Does Hz not matter? I'd like to run the TV off of my PC, not a gaming console.

    An Explanation of the Workings of All Current 60Hz and 120Hz HDTVs, Computer Monitors, and Other Displays


    :: Original post updated to clarify the difference between 120Hz HDTVs and TRUE 120Hz monitors. Edited remarks are in brackets and bold to show how the post has evolved, and to give context for members' responses written before the update. This post was updated on 2/10 and 2/17. ::

    Key Terms and Abbreviations:

    Hz = hertz (cycles per second). In this context, how many times a screen refreshes per second.

    FPS = frames per second. In this context, how many individual static images are shown per second to simulate motion. More frames per second increases the perceived fluidity of motion.

    LCD = liquid crystal display.

    HDTV = high-definition television.

    Vsync = vertical synchronization. A setting that forces an application to present frames at the same rate the screen refreshes. This ensures that only whole frames appear on screen, eliminating screen "tearing."


    Purpose


    I have seen a lot of members here on OCN asking whether they should get a 120Hz set over a 60Hz set. That I cannot answer for you; I will, however, attempt to explain the differences. Please note that this is a rather condensed summary of the differences between the display options. If you have any further questions on the topic, I will do my best to help clarify; PM me, or let me know in a reply.


    Regarding All 60Hz Displays and Current 120Hz LCD HDTV Displays


    If the primary application of your display is gaming, I suggest against purchasing a 120Hz [HDTV]. The benefit of 120Hz technology in 120Hz LCD HDTVs will not be utilized in gaming applications. The reason is that a 120Hz LCD HDTV refreshes at 120Hz but will only accept a 60Hz or lower input [and even if these sets did accept 120Hz inputs, they would simply render at 60Hz and double the frames; 60Hz x 2 = 120Hz instead of TRUE 120Hz. In other words, 120Hz LCD HDTVs are not TRUE 120Hz displays]. Consequently, even if you are gaming at above 60 frames per second, you will not gain the benefit of seeing those additional frames.

    For watching movies and other forms of video media, it does actually serve a purpose. Most movies are filmed at 23.976 frames per second (effectively 24fps); watching them on a 60Hz TV requires a 3:2 pulldown. In other words, 24fps (rounding) does not divide evenly into 60Hz, which results in some stuttering with 24fps movies. Most cable television in the States comes in at 30 frames per second (25 elsewhere, on 50Hz systems), and 30 does divide evenly into 60, so there is no stuttering effect. 120Hz television sets account for the fact that movies are displayed at 24fps and cable at 30fps: 24 x 5 = 120 and 30 x 4 = 120, resulting in a smooth experience for both mediums of viewing.
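
    To make the arithmetic concrete, here is a minimal Python sketch (my own illustration, not from the original post) showing how each common source frame rate divides into a 60Hz and a 120Hz refresh rate; an uneven division is what forces the 3:2 pulldown:

    Code:
    # How many refreshes does each source frame get on each display?
    # An uneven split (2.5 on average, realized as a 3:2 pattern)
    # is what causes pulldown judder.
    for refresh_hz in (60, 120):
        for source_fps in (24, 30, 60):
            repeats = refresh_hz / source_fps
            verdict = "smooth" if repeats == int(repeats) else "uneven -> judder"
            print(f"{source_fps} fps on a {refresh_hz} Hz set: "
                  f"{repeats:g} refreshes per frame ({verdict})")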

    [Knowing this, 120Hz LCD HDTV displays are in fact refreshing at 120Hz. Unfortunately, they achieve 120Hz in a manner that is not consistent with our long-held ideas of how screen refreshing "should" work. This renders their "120Hz" moniker a bit misleading, and it forces us to differentiate between 120Hz and TRUE 120Hz displays. Perhaps it would be best to dub current 120Hz LCD HDTV displays as having "60x2 Hz" technology.]

    Regarding the Interpolation Effect on 120Hz LCD HDTV Displays


    The "cartoonish" appearance that occurs with some 120Hz sets is due to a technology called interpolation. Interpolation takes the 24fps video and 30fps cable and smooths it out artificially by adding interpolated frames instead of repeating frames. In other words, if you input a 24fps signal into a 120Hz TV, the TV shows each frame 5 times to reach 120Hz; with interpolation on, those 4 "extra" frames are interpolated instead. [From Wikipedia: "In the mathematical subfield of numerical analysis, interpolation is a method of constructing new data points within the range of a discrete set of known data points." Meaning that the set is using some kind of algorithm to estimate where these artificial, nonexistent frames should fall between existing ones.]
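
    As a loose illustration of the idea (my own sketch; real sets use far more sophisticated motion-estimation algorithms, not plain blending), here is what linear interpolation between two known frames looks like in Python:

    Code:
    # Toy linear interpolation between two frames, each represented as a
    # flat list of pixel brightness values. Real TVs estimate motion;
    # this just blends, which is the simplest form of interpolation.
    def interpolate_frames(frame_a, frame_b, steps):
        """Construct `steps` artificial frames between frame_a and frame_b."""
        frames = []
        for i in range(1, steps + 1):
            t = i / (steps + 1)  # fraction of the way from frame_a to frame_b
            frames.append([a + (b - a) * t for a, b in zip(frame_a, frame_b)])
        return frames

    # 24fps -> 120Hz: four interpolated frames between each real pair.
    print(interpolate_frames([0, 100], [100, 0], 4))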

    I'm nearly certain that all sets have the ability to turn the interpolation effect off if you don't find it to your liking. [Also, it would behoove you greatly to turn off interpolation while gaming. Since the fps is not static like that of a movie, the interpolation technology "goes out the window" and will cause glitching on the screen. I say this with some hesitation, given my uncertainty about how the interpolation effect works with Vsync enabled. With the Vsync option on, and assuming your hardware is capable of running a game at average frame rates above 60 per second, your FPS will be rendered at a static 60. My hypothesis is that interpolation would then occur correctly, albeit with massive input lag. My reasoning stems from the mathematical definition of interpolation (stated above): for interpolation to occur, the display must have at least two separate, sequential frames available in order to approximate the location of an interpolated frame. This leads to input lag, because the display must hold at least the current frame and a past frame in order to display an interpolated one. The interpolated frame would be the frame actually shown on the set in that moment, leaving you at minimum a full frame behind (perhaps more, depending on how much buffered information the set needs to process and interpolate).]
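
    That input-lag reasoning is easy to put in numbers. A back-of-the-envelope Python sketch (my own, under the assumptions above):

    Code:
    # At 60 fps, each frame lasts 1000/60 ms. If the set must hold the
    # current frame plus earlier frames to interpolate between them,
    # you are viewing at least that many frames behind the game.
    frame_time_ms = 1000 / 60
    for frames_buffered in (1, 2, 3):
        print(f"{frames_buffered} frame(s) buffered -> at least "
              f"{frames_buffered * frame_time_ms:.1f} ms of input lag")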

    Regarding TRUE 120Hz Computer Monitors and Future TRUE 120Hz HDTV Displays


    [Regarding TRUE 120Hz monitors (and future TRUE 120Hz LCD HDTVs): these displays will indeed improve your gaming experience (with few exceptions). Because the screen refreshes 120 times a second, the projected image will seem smoother and tearing will decrease, even when gaming below 60fps. Note, though, that when gaming at a frame rate of exactly 60fps (Vsync on at 60), 120Hz LCD HDTVs and TRUE 120Hz monitors should theoretically perform identically (both displays refreshing 120 times a second and showing each frame twice at 60fps). More information is being output to your eyes at 120Hz, even if it's just repeated frames, making the experience seem smoother. Don't forget, though: to truly see the benefits of a TRUE 120Hz monitor, you must be gaming at an average FPS greater than 60 (ideally over 120fps). When your average frame rate is above 60 (for this example, let's say you are averaging 120fps), you WILL see the in-between frames that a 60Hz monitor could not display. For a list of TRUE 120Hz monitors, see NVIDIA's page on monitors that are compatible with their 3D technology.]
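
    As a rough way to see the difference (my own back-of-the-envelope sketch in Python; it ignores tearing and frame-timing details):

    Code:
    # A display can never show more distinct frames per second than it
    # has refreshes, so rendered frames beyond the refresh rate are
    # wasted. Refreshes beyond the frame rate just repeat frames.
    for refresh_hz in (60, 120):
        for game_fps in (30, 60, 120):
            shown = min(refresh_hz, game_fps)
            print(f"{game_fps} fps game on a {refresh_hz} Hz display: "
                  f"up to {shown} distinct frames/s reach your eyes")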

    [Note, though, that video card solutions recommended for 2560x1600 @ 60Hz will not perform as well at 1920x1080 @ 120Hz, due to the amount of information that needs to be processed. In other words, it's more taxing on a GPU to display 1920x1080 @ 120Hz than 2560x1600 @ 60Hz (slightly more pixels per second, and twice as many frames to render). You will need one heck of a powerful GPU solution to see all the benefits of 1920x1080 @ 120Hz. Beyond this resolution, at this high a refresh rate, dual-link DVI will no longer have enough bandwidth to carry a signal to a monitor. At that point we'll have to move to a more advanced connection like DisplayPort (seen on a lot of the new HD 5xxx series cards).]
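
    The bandwidth point can be checked with quick arithmetic. A rough Python sketch (my own; it ignores blanking intervals, which add real overhead, and assumes dual-link DVI's commonly cited 330 MHz pixel clock):

    Code:
    # Raw pixel throughput of each mode versus dual-link DVI's limit.
    # 330 MHz pixel clock = 330 million pixels per second, ignoring
    # blanking overhead (real-world headroom is smaller).
    DUAL_LINK_DVI_LIMIT = 330e6

    modes = {
        "2560x1600 @ 60 Hz":  2560 * 1600 * 60,
        "1920x1080 @ 120 Hz": 1920 * 1080 * 120,
    }
    for name, pixels_per_sec in modes.items():
        print(f"{name}: {pixels_per_sec / 1e6:.0f} Mpixels/s "
              f"({pixels_per_sec / DUAL_LINK_DVI_LIMIT:.0%} of the limit)")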

    Any suggestions / comments?
     
