Microsoft's definition of 1080i vs. 1080p

Discussion in 'The HTPC, HDTV & Ultra High Definition section' started by gulbane, Oct 24, 2006.

  1. gulbane

    gulbane Master Guru

    Messages:
    293
    Likes Received:
    0
    GPU:
    eVGA 6600GT 128MB
    Do the gurus on this board have any thoughts on this? Anyone think that 1080p is the "future" for console gaming? Should I bust my ass and spend $1000 more on a 1080p HDTV, or pick up a more feature-filled 1080i rig and be happy?


    --------------------------------------------------------------------------
    Clarifying Thoughts on High Definition Game Rendering

    I was talking to Bruce Dawson, one of our senior software design engineers here, about some questions I had around 1080i and 1080p. Frankly, I was particularly curious about why Sony has continued harping on 1080p as being "TrueHD", especially since the 360 has enabled 1080p output as well (coming soon to homes near you!). I was trying to figure out if I was just missing something, and his emailed answer was particularly clear and helpful to me, and since there's nothing confidential here I thought I'd share it with you.

    The really interesting statistic that popped for me is how much less time a game console has to render a 1920x1080 scene versus a 1280x720 scene. (Remember this is on the same console, whichever one you like. This is not a comparison of different consoles' rendering capabilities.) Simply put, for a 1080i/p game the console has about 55% less time per pixel to render any special effects, anti-aliasing, illumination, etc. than for a 720p game. Yes, even the Resistance team has fallen off the bandwagon and admitted they can't hit 1080i/p as previously claimed. (It also helps explain why Gran Turismo HD is so underwhelming.)

    Anyway, Bruce's text is below. Hope it helps clarify a few things for you!

    Many developers, gamers, and journalists are confused by 1080p. They think that 1080p is somehow more challenging for game developers than 1080i, and they forget that 1080 (i or p) requires significant tradeoffs compared to 720p. Some facts to remember:

    - 2.25x: that’s how many more pixels there are in 1920x1080 compared to 1280x720
    - 55.5%: that’s how much less time you have to spend on each pixel when rendering 1920x1080 compared to 1280x720—the point being that at higher resolutions you have more pixels, but they necessarily can’t look as good (a quick check of these figures follows this list)
    - 1.0x: that’s how much harder it is for a game engine to render a game in 1080p as compared to 1080i—the number of pixels is identical so the cost is identical
    - There is no such thing as a 1080p frame buffer. The frame buffer is 1080 pixels tall (and presumably 1920 wide) regardless of whether it is ultimately sent to the TV as an interlaced or as a progressive signal.
    - 1280x720 with 4x AA will generally look better than 1920x1080 with no anti-aliasing (there are more total samples).
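
    If you want to sanity-check those figures, a minimal C++ sketch (the resolutions and the 4x AA factor are simply the ones from the list above) works them out:

        #include <cstdio>

        int main() {
            const double px720  = 1280.0 * 720.0;   // 921,600 pixels
            const double px1080 = 1920.0 * 1080.0;  // 2,073,600 pixels

            // How many more pixels 1080 has, and how much less time per pixel
            // you get if the total frame time stays the same.
            double pixelRatio = px1080 / px720;        // 2.25x
            double lessTime   = 1.0 - px720 / px1080;  // ~55.5% less time per pixel

            // Total samples: 720p with 4x AA versus 1080p with no AA.
            double samples720AA = px720 * 4.0;  // 3,686,400 samples
            double samples1080  = px1080;       // 2,073,600 samples

            std::printf("1080 has %.2fx the pixels of 720p\n", pixelRatio);
            std::printf("=> %.2f%% less time per pixel at the same frame rate\n", lessTime * 100.0);
            std::printf("720p + 4x AA: %.0f samples vs 1080p no AA: %.0f samples\n",
                        samples720AA, samples1080);
            return 0;
        }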
    A few elaborations:

    Any game could be made to run at 1920x1080. However, it is a tradeoff. It means that you can show more detail (although you need larger textures and models to really get this benefit) but it means that you have much less time to run complex pixel shaders. Most games can’t justify running at higher than 1280x720—it would actually make them look worse because of the compromises they would have to make in other areas.
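
    To put that time budget in rough numbers (the 30 and 60 fps targets here are an illustration of mine, not something from Bruce's mail), this sketch divides one frame's worth of time evenly across the pixels:

        #include <cstdio>

        // Nanoseconds available per pixel if the entire frame budget went to
        // per-pixel work (ignoring vertex work, post-processing, CPU time, etc.).
        double nsPerPixel(double width, double height, double fps) {
            double frameNs = 1e9 / fps;
            return frameNs / (width * height);
        }

        int main() {
            std::printf("720p  @ 60fps: %.1f ns per pixel\n", nsPerPixel(1280, 720, 60));   // ~18.1 ns
            std::printf("1080p @ 60fps: %.1f ns per pixel\n", nsPerPixel(1920, 1080, 60));  // ~8.0 ns
            std::printf("1080p @ 30fps: %.1f ns per pixel\n", nsPerPixel(1920, 1080, 30));  // ~16.1 ns
            return 0;
        }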

    1080p is a higher bandwidth connection from the frame buffer to the TV than 1080i. However the frame buffer itself is identical. 1080p will look better than 1080i—interlaced flicker is not a good thing—but it makes precisely zero difference to the game developer. Just as most Xbox 1 games let users choose 480i or 480p, because it was no extra work, 1080p versus 1080i is no extra work. It’s just different settings on the display chip.
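
    A quick way to see the "identical frame buffer, different link" point in numbers (the 32-bit pixel format and 60 Hz refresh are assumptions made for the sake of the sketch):

        #include <cstdio>

        int main() {
            const double width = 1920, height = 1080;
            const double bytesPerPixel = 4;  // assuming a 32-bit RGBA frame buffer
            const double refreshHz = 60;     // assuming a 60 Hz display

            // The frame buffer the game renders into is the same either way.
            double frameBufferMB = width * height * bytesPerPixel / (1024 * 1024);

            // What goes over the cable per second differs: progressive sends every
            // line on each refresh, interlaced sends only half of them (one field).
            double progressivePx = width * height * refreshHz;        // 1080p60
            double interlacedPx  = width * (height / 2) * refreshHz;  // 1080i60

            std::printf("Frame buffer: %.1f MB either way\n", frameBufferMB);
            std::printf("1080p60 link: %.1f Mpixels/s\n", progressivePx / 1e6);
            std::printf("1080i60 link: %.1f Mpixels/s\n", interlacedPx / 1e6);
            return 0;
        }

    The link carries roughly twice the pixels per second for 1080p, but the roughly 8 MB buffer the game draws into is unchanged, which is why it costs the developer nothing.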

    Inevitably somebody will ask about field rendering. Since interlaced formats display the even lines on one refresh pass and then the odd lines on the next refresh pass, can’t games just render half of the lines each time? Probably not, and even if you could you wouldn’t want to. You probably can’t do field rendering because it requires that you maintain a rock solid 60 fps. If you ever miss a frame it will look horrible, as the odd lines are displayed in place of the even, or vice-versa. This is a significant challenge when rendering extremely complex worlds with over 1 million pixels per field (2 million pixels per frame) and is probably not worth it. And, even if you can, you shouldn’t. The biggest problem with interlaced is flicker, and field rendering makes it worse, because it disables the ‘flicker fixer’ hardware that intelligently blends adjacent lines. Field rendering has been done in the past, but it was always a compromise solution.
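
    To make the "miss a frame and it looks horrible" point concrete, here is a toy simulation (nothing resembling a real console pipeline) of a field renderer that blows one deadline. The TV alternates fields on every refresh no matter what, so a late field lands on the wrong set of scanlines:

        #include <cstdio>

        int main() {
            // One missed deadline at refresh 3; everything else is on time.
            bool missedDeadline[8] = {false, false, false, true, false, false, false, false};
            int  renderedField = 0;  // which field the renderer has ready (0 = even, 1 = odd)

            for (int refresh = 0; refresh < 8; ++refresh) {
                int displayField = refresh % 2;   // the TV scans even, odd, even, odd...
                if (!missedDeadline[refresh])
                    renderedField = displayField; // on time: the matching field is ready
                // if we missed, renderedField is still the previous refresh's field

                std::printf("refresh %d: TV scans %s lines, content is the %s field%s\n",
                            refresh,
                            displayField  == 0 ? "even" : "odd",
                            renderedField == 0 ? "even" : "odd",
                            displayField != renderedField ? "  <-- mismatch" : "");
            }
            return 0;
        }

    Refresh 3 puts even-field content on the odd scanlines, which is exactly the half-line-offset glitch Bruce describes.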
     
  2. bug77

    bug77 Banned

    Messages:
    3,468
    Likes Received:
    0
    GPU:
    Palit 8800GT
    By his arguments, you're better off rendering at 640x480 with 32xAA...
     
  3. AlecRyben

    AlecRyben Guest

    Messages:
    7,740
    Likes Received:
    0
    GPU:
    5x580 2x590 2x780Ti 1x970
    Of course, that's exactly what I am running! BD
     
  4. Spezzy

    Spezzy Master Guru

    Messages:
    396
    Likes Received:
    0
    GPU:
    eVGA 7800GTX 256 525/1400
    I run 2048x1920 with 200000000000000xXx Vertical Anisotropic Anti-Aliasing sync on. Jeez! What's with you guys?! lol
     
