1080i vs. 1080p: What's the Difference?
Progressive
(1080p) video is considered better than interlaced (1080i), but it's not always
clear why; here's what's actually happening on your TV screen.
Today's HDTVs can
display beautiful, 1,920-by-1,080-pixel video, but the actual quality of what
you're viewing depends on the source material. A lot of the time, you're not
seeing exactly 1080p. In fact, most TVs today have two modes with similar
names: 1080i and 1080p. Both have the same screen resolution, so what's the
difference between the two? Here are five things you need to know:
1080i video is "interlaced." 1080i video is often said to play back at 60 frames per second, but that's a bit deceptive, because it's actually broadcast at 30 frames per second. The TV then displays each frame in two passes: the first pass draws a 1,920-by-540 field made up of the even scan lines, and the second pass draws a 1,920-by-540 field made up of the odd scan lines. This process is called interlacing, and it contributes to a sense of motion and reduces perceived flicker.
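To make the even/odd field split concrete, here's a minimal Python sketch (using NumPy, with a made-up `frame` array standing in for one 1,920-by-1,080 video frame) showing how a full frame divides into the two half-height fields a 1080i signal carries:

```python
import numpy as np

# A stand-in for one full 1080-line frame: 1080 rows x 1920 columns
# (a real frame would hold pixel data, e.g. YUV or RGB values).
frame = np.arange(1080 * 1920).reshape(1080, 1920)

# Interlacing splits the frame into two half-height fields.
even_field = frame[0::2, :]  # rows 0, 2, 4, ... -> 540 x 1920
odd_field = frame[1::2, :]   # rows 1, 3, 5, ... -> 540 x 1920

print(even_field.shape)  # (540, 1920)
print(odd_field.shape)   # (540, 1920)

# A 1080i TV draws these two fields one after the other;
# a 1080p source sends all 1,080 rows in a single pass.
```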
1080p
video is called "progressive scan." In this format, 1,920-by-1,080-pixel high-definition
movies are progressively drawn line after line, so they're not interlaced. On
paper, that may not seem like a huge deal. But in the real world, what you end
up seeing looks sharper and more defined than 1080i, particularly during scenes
with a lot of fast motion.
Sometimes 1080p is termed "Full HD" or "True HD," to distinguish it from 1080i or 720p video. Blu-ray discs contain 1080p video at 24 frames per second, which the player then displays at 30 frames per second on screen using a method known as 3:2 pulldown.
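As a rough illustration of how 3:2 pulldown stretches 24 film frames across 30 video frames (60 interlaced fields) each second, here's a small Python sketch; the frame labels are made up for the example:

```python
# 3:2 pulldown: film frames alternately contribute 2 fields and 3 fields,
# so 4 film frames (A, B, C, D) fill 10 fields = 5 video frames.
film_frames = ["A", "B", "C", "D"]

fields = []
for i, f in enumerate(film_frames):
    repeat = 2 if i % 2 == 0 else 3  # alternate 2, 3, 2, 3, ...
    fields.extend([f] * repeat)

print(fields)
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']

# Group fields into interlaced video frames (2 fields per frame).
video_frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
print(video_frames)
# [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
# Every 4 film frames become 5 video frames: 24 fps x 5/4 = 30 fps.
```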
Data compression can confuse the issue. Sometimes cable companies will deliver a 1080i picture, but then compress the data significantly to use less bandwidth. The result can mean smeared details or pixelated color gradations in certain scenes. It's still technically HD, and it still looks better than standard-definition cable, but it's not as good as it could be.
This also happens with 1080p streaming Internet video, but in that case, it usually depends on the speed of your data connection. In fact, Blu-ray is currently the only practical format for watching lots of pure 1080p content. Even the latest Apple TV, which supports 1080p streaming, does so in a compressed format that loses a bit of quality (although it still looks quite good).
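To see why every delivery method compresses 1080p to some degree, here's a back-of-the-envelope Python calculation of the raw, uncompressed bitrate (assuming 8-bit RGB at 30 frames per second; the delivery bitrates are illustrative round numbers, not measurements of any particular service):

```python
# Back-of-the-envelope: raw bitrate of uncompressed 1080p video.
width, height = 1920, 1080
bits_per_pixel = 24        # 8-bit RGB, no chroma subsampling
frames_per_second = 30

raw_bps = width * height * bits_per_pixel * frames_per_second
print(f"Uncompressed: {raw_bps / 1e9:.2f} Gbit/s")   # ~1.49 Gbit/s

# Typical delivery bitrates (rough, illustrative figures):
bluray_bps = 40e6   # Blu-ray video tops out around 40 Mbit/s
stream_bps = 8e6    # a common 1080p streaming bitrate

print(f"Blu-ray compression: ~{raw_bps / bluray_bps:.0f}:1")    # ~37:1
print(f"Streaming compression: ~{raw_bps / stream_bps:.0f}:1")  # ~187:1
```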
Both formats look similar on smaller TVs. As a general rule, you need a larger TV to notice the difference between 1080i and 1080p. Depending on your eyesight, you can probably pick up the difference on a 32-inch LCD if you're particular about it. But most consumers don't really see a marked difference until at least a 42-inch screen, if not larger. In fact, many people are perfectly happy with 720p HDTV sets even at larger sizes; we recently named one, the 51-inch Samsung PN51E490B4F, our best choice for budget large-screen HDTVs.
1080p isn't even the best anymore. Technology never stands still, of course. Five years from now, you'll probably just want Ultra High Definition (aka 4K) video instead. (For a closer look at 4K video, check out "What Is Ultra HD?") But for now, if you're a videophile who appreciates a sharper picture, 1080p is definitely the way to go, both in HDTV capability and in the source material you're viewing.