Guru3D.com Forums

Microsoft's definition of 1080i vs. 1080p
  (#1)
gulbane
Master Guru
 
Videocard: eVGA 6600GT 128MB
Processor: AMD Athlon 64 4000+ San Diego
Mainboard: ASUS A8N-SLI Premium
Memory: 2 gigs OCZ Platinum PC3200 2-3-2-5
Soundcard: Sound Blaster Audigy 2 Value
PSU: Antec 500W Smart Power 2.0
Microsoft's definition of 1080i vs. 1080p - 10-24-2006, 12:05 | posts: 293

Do the gurus on this board have any thoughts on this? Anyone think that 1080p is the "future" for console gaming? Should I bust my ass and get a 1080p HDTV and spend $1000 more, or pick up a more feature-filled 1080i rig and be happy?


--------------------------------------------------------------------------
Clarifying Thoughts on High Definition Game Rendering

I was talking to Bruce Dawson, one of our senior software design engineers here, about some questions I had around 1080i and 1080p. Frankly, I was particularly curious about why Sony has continued harping on 1080p as being "TrueHD", especially since the 360 has enabled 1080p output as well (coming soon to homes near you!) I was trying to figure out if I was just missing something, and his emailed answer was particularly clear and helpful to me, and since there's nothing confidential here I thought I'd share it with you.

The really interesting statistic that popped for me is how much less time a game console has to render a 1920x1080 scene versus a 1280x720 scene. (Remember this is on the same console, whichever one you like. This is not a comparison of different consoles' rendering capabilities.) Simply put, for a 1080i/p game the console has about 55% less time per pixel to render any special effects, anti-aliasing, illumination, etc. than for a 720p game. Yes, even Resistance has fallen off the bandwagon and admitted they can't hit 1080i/p as previously claimed. (It also helps explain why Gran Turismo HD is so underwhelming.)

Anyway, Bruce's text is below. Hope it helps clarify a few things for you!

Many developers, gamers, and journalists are confused by 1080p. They think that 1080p is somehow more challenging for game developers than 1080i, and they forget that 1080 (i or p) requires significant tradeoffs compared to 720p. Some facts to remember:

2.25x: that’s how many more pixels there are in 1920x1080 compared to 1280x720
55.5%: that’s how much less time you have to spend on each pixel when rendering 1920x1080 compared to 1280x720; the point being that at higher resolutions you have more pixels, but they necessarily can’t look as good (the arithmetic is worked through in the sketch after this list)
1.0x: that’s how much harder it is for a game engine to render a game in 1080p as compared to 1080i—the number of pixels is identical so the cost is identical
There is no such thing as a 1080p frame buffer. The frame buffer is 1080 pixels tall (and presumably 1920 wide) regardless of whether it is ultimately sent to the TV as an interlaced or as a progressive signal.
1280x720 with 4x AA will generally look better than 1920x1080 with no anti-aliasing (there are more total samples).
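A minimal sketch of the arithmetic behind those figures (not part of Bruce's mail; the resolutions and the 4x AA factor are the ones quoted above, nothing else is assumed):

[code]
# Pixel-count arithmetic for 1280x720 vs. 1920x1080.

def pixels(width, height):
    return width * height

p720 = pixels(1280, 720)      # 921,600 pixels
p1080 = pixels(1920, 1080)    # 2,073,600 pixels

ratio = p1080 / p720          # 2.25x more pixels at 1080
time_cut = 1 - p720 / p1080   # 0.5555..., the ~55.5% less time per pixel

# Total samples: 720p with 4x anti-aliasing vs. 1080 with none.
samples_720_4xaa = p720 * 4   # 3,686,400 samples
samples_1080_noaa = p1080     # 2,073,600 samples

print(f"Pixel ratio 1080/720:     {ratio}x")
print(f"Per-pixel time reduction: {time_cut * 100:.1f}%")
print(f"720p 4xAA samples: {samples_720_4xaa:,} vs 1080 no-AA: {samples_1080_noaa:,}")
[/code]

Which is why 720p with 4x AA ends up with nearly twice as many samples as 1080 without it.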
A few elaborations:

Any game could be made to run at 1920x1080. However, it is a tradeoff. It means that you can show more detail (although you need larger textures and models to really get this benefit) but it means that you have much less time to run complex pixel shaders. Most games can’t justify running at higher than 1280x720—it would actually make them look worse because of the compromises they will have to make in other areas.

1080p is a higher bandwidth connection from the frame buffer to the TV than 1080i. However the frame buffer itself is identical. 1080p will look better than 1080i—interlaced flicker is not a good thing—but it makes precisely zero difference to the game developer. Just as most Xbox 1 games let users choose 480i or 480p, because it was no extra work, 1080p versus 1080i is no extra work. It’s just different settings on the display chip.
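To put rough numbers on the link-bandwidth point, here's a quick sketch; the 60 Hz refresh rate is my assumption for illustration and isn't stated in the mail:

[code]
# Why 1080p is a higher-bandwidth link than 1080i while the frame buffer
# stays identical. 60 Hz refresh is assumed for illustration only.

WIDTH, HEIGHT, REFRESH_HZ = 1920, 1080, 60

framebuffer_pixels = WIDTH * HEIGHT  # same buffer for 1080i and 1080p

# 1080p: each refresh carries a full 1080-line frame.
progressive_pixels_per_sec = WIDTH * HEIGHT * REFRESH_HZ

# 1080i: each refresh carries one 540-line field (even or odd lines).
interlaced_pixels_per_sec = WIDTH * (HEIGHT // 2) * REFRESH_HZ

print(f"Frame buffer: {framebuffer_pixels:,} pixels either way")
print(f"1080p link:   {progressive_pixels_per_sec:,} pixels/s")
print(f"1080i link:   {interlaced_pixels_per_sec:,} pixels/s (half the rate)")
[/code]

Either way the game renders into the same 1920x1080 buffer; only the scan-out to the TV differs.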

Inevitably somebody will ask about field rendering. Since interlaced formats display the even lines on one refresh pass and then the odd lines on the next refresh pass, can’t games just render half of the lines each time? Probably not, and even if you could you wouldn’t want to. You probably can’t do field rendering because it requires that you maintain a rock solid 60 fps. If you ever miss a frame it will look horrible, as the odd lines are displayed in place of the even, or vice-versa. This is a significant challenge when rendering extremely complex worlds with over 1 million pixels per field (2 million pixels per frame) and is probably not worth it. And, even if you can, you shouldn’t. The biggest problem with interlaced is flicker, and field rendering makes it worse, because it disables the ‘flicker fixer’ hardware that intelligently blends adjacent lines. Field rendering has been done in the past, but it was always a compromise solution.
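For illustration, a hypothetical sketch of what field rendering would actually select each refresh; the even/odd scanline split and the per-field pixel counts come from the paragraph above, and the helper name is made up:

[code]
# Field rendering: draw only the even scanlines on one refresh and only the
# odd scanlines on the next. The counts match the "over 1 million pixels per
# field (2 million pixels per frame)" figure quoted above.

WIDTH, HEIGHT = 1920, 1080

def scanlines_for_field(field_parity):
    """Scanline indices to render this refresh (0 = even field, 1 = odd)."""
    return range(field_parity, HEIGHT, 2)

pixels_per_field = WIDTH * len(scanlines_for_field(0))  # 1,036,800
pixels_per_frame = WIDTH * HEIGHT                        # 2,073,600

print(f"Pixels per field: {pixels_per_field:,}")
print(f"Pixels per frame: {pixels_per_frame:,}")
# Miss a single refresh deadline and the wrong field is shown in place of the
# other, which is why the mail argues field rendering isn't worth it.
[/code]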
   