A Guide to HDTVs

Introduction

One of the most desirable pieces of technology steadily infiltrating homes across the world is flat panel High Definition Television (HDTV). These TVs retail with a perplexing range of names, features, sizes and technical specifications, guaranteed to confuse all but the most tech-savvy buyer. Even after you purchase your exciting new HDTV and get it home, working out how to get the most out of it and ironing out the kinks can be a chore.

I'm sympathetic to the fact that most people aren't necessarily interested in the details and tech jargon surrounding HDTVs. They just want to know how to select the best display, and how to set it up for optimal viewing enjoyment. Unfortunately things are never that simple. There's a lot of conflicting and often deliberately misleading information available on the Internet which makes things very difficult. This is compounded by the TV manufacturers filling up technical spec sheets with vague marketing terms and dubious claims. Furthermore, there is currently no perfect display technology; they all have pros and cons, and require certain compromises.

The only way to make an educated decision, both at the time of purchase, and subsequently when adjusting the settings on your TV, is to develop a good understanding of the fundamental workings of HDTVs, combined with careful consideration of your own particular circumstances and tastes. This guide helps you to achieve just that, with information ranging from the very basic to more advanced topics for HDTV buyers and owners alike.

Video Basics
If you're not familiar with the terminology frequently thrown around when discussing HDTVs, then now's the time to get a handle on the basics of how video material is played back before going any further. Below is a very condensed run-through of the key concepts.

The picture on a modern HDTV is made up of lots of individual still images called Frames, shown in rapid succession to create the illusion of moving video. A frame is a single still image, like a digital photo. And just like any digital image, it's made up of lots of individual dots called Pixels, which are the smallest unit of graphical information. The Resolution of a video image is measured in pixel width x pixel height, with the most common resolutions for digital video being: 640x480, 720x480 and 720x576 for DVD; and 1920x1080 for Blu-ray. The resolution is often shown in shorthand notation such as 480i or 1080p, referring to the pixel height of the image, and whether it is Progressive (p) or Interlaced (i) video - more on these last two shortly.

LCD and Plasma flat panels have a fixed number of pixels on their screen, and this determines their Native Resolution, again measured in pixel width x pixel height. For example, a 1080p flat panel display has a 1920x1080 native resolution. For any TV to be considered "High Definition", it must be able to natively display a 720p, 1080i or 1080p image. In marketing speak, HDTVs limited to a 720p maximum (e.g. a 1280x768 native resolution) are referred to as "HD Ready", while those which can do 1080p natively are called "Full HD".

Where the digital video source being played back on the TV has a resolution which doesn't match the TV's native resolution, the source video will be automatically rescaled up or down by the DVD or Blu-ray player, or by the TV itself, to best fit the screen. Furthermore, if the video maintains its original Aspect Ratio - that is, the ratio of its width to its height - such that it isn't squashed and doesn't have portions cut off, then you may see black bars to the sides or above and below the image.
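The rescaling and black bar behavior described above can be sketched as simple arithmetic. The function below is an illustration only (it is not from the guide, and real players and TVs use more sophisticated scalers): it fits a source frame into a panel's native resolution while preserving the aspect ratio, and reports the size of any resulting bars.

```python
# Sketch: fit a source frame into a panel's native resolution while
# preserving its aspect ratio, and report the resulting black bars.
# Resolutions are illustrative examples from the text.

def fit_with_bars(src_w, src_h, panel_w, panel_h):
    """Scale the source to the largest size that fits the panel, then
    return (scaled_w, scaled_h, side_bar, top_bar) in pixels."""
    scale = min(panel_w / src_w, panel_h / src_h)
    scaled_w = round(src_w * scale)
    scaled_h = round(src_h * scale)
    # Bars appear on whichever axis the scaled image doesn't fill.
    side_bar = (panel_w - scaled_w) // 2   # left/right "pillarbox" bars
    top_bar = (panel_h - scaled_h) // 2    # top/bottom "letterbox" bars
    return scaled_w, scaled_h, side_bar, top_bar

# A narrower 4:3-shaped source on a 1920x1080 panel gets side bars:
print(fit_with_bars(1440, 1080, 1920, 1080))  # (1440, 1080, 240, 0)
```

The same function shows why a 1920x1080 source on a 1280x768 "HD Ready" panel is scaled down to 1280x720, with thin bars above and below.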

Original video content can be shot at varying framerates of 24, 25, 30, 50 and 60 Frames Per Second (FPS). The majority of movie content is filmed at 24 FPS. This is a low framerate, and if left unaltered, during fast action it can appear choppy, and can also produce noticeable flickering. To appear more pleasing to the eye, the video needs to undergo some changes. One way to do this is to adapt the original video frames to a higher Refresh Rate, measured in hertz (Hz), which is the number of times per second the screen updates the image it displays. Framerate and refresh rate are not always the same. In a cinema for example, the projector will actually flash (refresh) each frame of a 24 FPS movie two or three times per frame, resulting in a 48Hz or 72Hz refresh rate, which reduces the perception of flickering that a 24Hz image would otherwise show. The primary benefit of a higher refresh rate is that it leads to less visible flickering, while a higher framerate results in smoother motion.
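The cinema example above boils down to one multiplication: because each frame is repeated a whole number of times, the refresh rate is simply a multiple of the framerate. A trivial sketch (not from the guide):

```python
# Sketch: a projector repeats each film frame a whole number of times,
# so the refresh rate is always a multiple of the frame rate.

def refresh_rate(fps, flashes_per_frame):
    """Refresh rate in Hz when each frame is flashed N times."""
    return fps * flashes_per_frame

print(refresh_rate(24, 2))  # 48 (double-flash projection)
print(refresh_rate(24, 3))  # 72 (triple-flash projection)
```

The same arithmetic explains the 96Hz and 120Hz figures mentioned later for native 24p playback on HDTVs (24 x 4 and 24 x 5 respectively).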

On traditional Cathode Ray Tube (CRT) TVs, it was originally decided for reasons to do with mains power frequencies to use 60Hz as the standard refresh rate in NTSC countries, such as North America and Japan, and a 50Hz standard in PAL countries, which include most of Europe, China, Africa and Australasia. There is some trickery involved in converting a 24 FPS movie into the 50Hz or 60Hz standard as relevant.

On an analog CRT television, the image is actually composed of Fields, not frames. A frame is a whole image, while a field is part of an image. To save on bandwidth in broadcast television, a process known as Interlacing was used, whereby each frame on a TV screen was actually composed of two separate fields, each containing half a frame. One field showed only the odd-numbered lines of one frame, while the other showed only the even-numbered lines of the next frame. So two slightly different half frames (fields) would be interlaced together, and when shown rapidly in sequence on a phosphor-based CRT, the human eye didn't notice the interlaced fields. The benefit of this method was that it doubled both the effective frame rate and the refresh rate without requiring any extra broadcast bandwidth, resulting in much less flicker and smoother perceived motion than if the original source was unaltered.

But how is a 24 FPS movie actually converted into a 50Hz or 60Hz refresh rate? The number 24 doesn't divide evenly into either 50 or 60. In PAL countries, it commonly involves speeding up the 24 FPS movie to 25 FPS, which is only a 4% increase in speed and thus not noticeable to most viewers. When doubled via interlacing, that 25 frames per second becomes 50 fields per second (50Hz), which is the PAL standard.

Things are more complex for NTSC video. If the source is 24 FPS, a process known as Pulldown is used, also known as 2:3 Pulldown or 3:2 Pulldown. Instead of generating a fixed two fields from every frame, pulldown employs an alternating pattern: 2 fields from one frame, then 3 from the next, then 2 again, then 3, and so on. This 2:3:2:3 field pattern repeats every four frames. With 10 fields being generated for every 4 frames, this equals 60 fields per second (60Hz) for every 24 frames per second (24 FPS), which accomplishes the required conversion. The main problem with pulldown is that it introduces some Judder, whereby the uneven repeating field pattern can make motion appear slightly jerky at times.
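The field counts behind both conversions can be verified with a few lines of arithmetic. This sketch just restates the numbers from the text; it is not part of the guide:

```python
# Sketch of the field arithmetic behind PAL speed-up and NTSC 3:2 pulldown.

# PAL: speed the 24 FPS film up to 25 FPS (a 4% increase), then
# interlacing produces two fields per frame -> 50 fields/sec (50Hz).
pal_fps = 25
pal_fields_per_second = pal_fps * 2
print(pal_fields_per_second)  # 50

# NTSC: 3:2 pulldown alternates 2 and 3 fields per film frame.
pulldown_pattern = [2, 3, 2, 3]           # repeats every 4 film frames
fields_per_cycle = sum(pulldown_pattern)  # 10 fields per 4 frames
# 24 frames/sec = 6 pattern repeats per second -> 60 fields/sec (60Hz).
ntsc_fields_per_second = (24 // 4) * fields_per_cycle
print(ntsc_fields_per_second)  # 60
```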

This process changes slightly on HDTVs, because modern digital displays always show whole frames, and don't generate fields made up of partial frames. This is known as Progressive scan video, and it provides a smoother image than interlaced video. On an HDTV, any interlaced video (e.g. 1080i broadcasts) must be converted to progressive via a process known as Deinterlacing. This deinterlacing is not perfect, and depending on the method used, may result in some visual glitches known as artifacts. As a rule, progressive video is smoother and clearer than interlaced video, especially for fast motion.

Fortunately, video content stored on Blu-ray discs and DVDs is usually encoded in the original 24 FPS progressive scan format, also known as 24p. This means no deinterlacing is required, but conversion to 50Hz or 60Hz (and multiples thereof, such as 100Hz or 120Hz) using speeding up and/or some form of pulldown may still need to occur. There is an alternative available for film purists who own an HDTV and a Blu-ray player which are both capable of native 24p playback: the movie can be played back at its original 24 FPS without any conversion such as pulldown. The TV may still refresh each frame multiple times to achieve a higher refresh rate to reduce flicker (e.g. 48Hz, 96Hz or 120Hz), but the original film frame rate is unaltered.

The above is of course a highly simplified summary, and there are a lot of complexities, nuances and omissions which videophiles will undoubtedly point out. For now though, it's enough if you feel you have a reasonable understanding of what's covered above. We will expand upon some of these topics later in this guide.

Screen Size & Viewing Distance


Before considering anything complex, let's first address the most commonly-asked question when people go to purchase a new HDTV these days: "How big should I go?" Unfortunately the most commonly-provided answer is: "The bigger the better!" This is not a universal truth - bigger is not always better; indeed bigger can sometimes be worse, especially if you sit close to a large screen. There is no precise scientific value for the distance you should sit from a particular sized screen, but several very general rules are commonly cited:

3 x Picture Height - Measure the actual height of the screen area, then multiply that by three to determine the viewing distance.

1.5 - 3 x Diagonal Size - Use the diagonal size of your screen - which is what the listed screen size actually refers to - and multiply it by 1.5 and 3 to get the minimum and maximum recommended distances respectively.

THX Viewing Angle - THX recommends that the TV screen take up 40 degrees or less of your field of view to give an immersive experience. Divide your diagonal screen size by 0.84 to get the viewing distance required to meet this recommendation.
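The three rules above are simple enough to compute yourself. The following sketch assumes a 16:9 screen measured by its diagonal in inches (the picture-height formula is mine, derived from basic geometry, not from the guide):

```python
# Sketch of the three rule-of-thumb viewing distance calculations,
# for a 16:9 screen whose size is given as a diagonal in inches.
import math

def picture_height(diagonal):
    """Height of a 16:9 screen from its diagonal (inches)."""
    return diagonal * 9 / math.hypot(16, 9)

def three_x_height(diagonal):
    """'3 x Picture Height' rule."""
    return 3 * picture_height(diagonal)

def diagonal_range(diagonal):
    """'1.5 - 3 x Diagonal Size' rule: (min, max) distance in inches."""
    return 1.5 * diagonal, 3 * diagonal

def thx_40_degree(diagonal):
    """THX 40-degree rule: diagonal / 0.84, per the text."""
    return diagonal / 0.84

d = 50  # a 50" screen
print(round(three_x_height(d)))  # 74 inches (about 6 feet)
print(diagonal_range(d))         # (75.0, 150.0) inches
print(round(thx_40_degree(d)))   # 60 inches (about 5 feet)
```

Note how much the rules disagree with each other for the same screen, which is part of the reason the guide treats them only as rough starting points.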

To make things easier, you can also use a Viewing Distance Calculator which takes into account a few of these types of recommendations. Fill in the relevant details at the top of the calculator and click the Calculate button. Click the 'Switch to metric units' button if you want to use meters instead of feet, or simply remember that roughly 3.3 feet = 1 meter.

Now here's the controversial part: viewing distance calculators may tell you that, to take one example, you can sit up to 6.5 feet away from a 50" TV and still fully resolve all the detail on a 1080p screen. However this advice should not necessarily be taken literally to mean that you should sit 6.5 feet or less from a 50" screen. Similarly, the 'THX recommended viewing angle' result from the calculator says that at a 6.5 feet viewing distance, a 58" screen is actually recommended, which reinforces the mindset that bigger is better. Big screen flat panels are getting cheaper by the day, and people are now automatically opting for the largest screen size they can afford based on this sort of advice. This is not the correct approach. Instead, the three most important factors you will need to take into account when considering screen size and viewing distance are: pixel structure, personal taste, and the source material you will typically view.

Pixel Structure: Flat panel TVs are fixed-pixel displays. This means that a 1920x1080 HDTV for example has a total of 2,073,600 pixels, regardless of its screen size - whether 32", 42", 65" or 103", the same number of pixels are on the TV. So the larger the screen size, the larger the individual pixels. Keep in mind also that a 720p display has fewer pixels than a 1080p display (typically 1280x768 = 983,040 pixels). If you sit close enough to any fixed-pixel display, it will become obvious to your eyes that the image is actually composed of small dots or rectangles. This is also known as Screen Door Effect, because it looks like the image is being viewed through the fine mesh of a screen door.
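The "larger screen means larger pixels" point can be made concrete by computing the pixel pitch - the physical width of one pixel - for a few screen sizes. This is an illustrative sketch (the formula assumes a 16:9 panel and is not taken from the guide):

```python
# Sketch: a fixed 1920x1080 panel has the same 2,073,600 pixels at any
# screen size, so pixel size grows with the diagonal. Pixel pitch here
# is the width of one pixel in millimetres, assuming a 16:9 screen.
import math

def pixel_pitch_mm(diagonal_inches, horizontal_pixels=1920):
    """Approximate horizontal pixel pitch for a 16:9 panel."""
    width_inches = diagonal_inches * 16 / math.hypot(16, 9)
    return width_inches * 25.4 / horizontal_pixels

for size in (32, 50, 65):
    print(size, round(pixel_pitch_mm(size), 2))
# Larger screens have larger pixels, so you must sit further back
# before the pixel structure stops being visible.
```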

If you can see any hint of pixel structure on the screen at any time, you're sitting too close to the TV. Move back to the point where you can't see any pixel structure, and that is your true minimum viewing distance.

Personal Taste: Image size is in large part a subjective choice, which is why some people sit in the first few rows of a cinema, some sit in the middle, and some sit towards the back. The THX Viewing Angle recommendation in the Viewing Distance Calculator further above takes this into account by providing a 'Maximum THX viewing distance' number, which corresponds to the THX requirement for a movie theater's screen to have at the very least a 26 degree viewing angle when viewed from the back seats. So you can use that recommendation instead if you're the type who sits towards the back of a theater, for example. Otherwise the only viable option is for you to visit a store and find out how close you can get to a particular screen before the sheer size of it becomes overwhelming for you, and you find it uncomfortable to track the image with your eyes. It's true that most people become used to a larger screen over time, however this is a matter of degree. You will almost certainly adjust to a screen which at first seems slightly larger than you expected. But don't automatically expect to become accustomed to a 65" screen at five feet for example, despite a viewing distance calculator or other people telling you otherwise.

Source Material: This is the most important consideration, and one which all the distance calculators and viewing recommendations often fail to take into account. While you may be able to easily afford a very large screen, and indeed not feel the size to be overwhelming at all, the reality is that the majority of source video material today is actually laden with subtle and not-so-subtle image quality flaws, even when viewed using the best quality players and TVs available. See the Source Material section for more details and examples of the common flaws visible in all forms of digital video. Though present on most Blu-rays, these issues are exacerbated on DVD and poor quality Digital TV broadcasts. There are ways of improving image quality and reducing these flaws, which we examine in the Calibration section. Ultimately however, it is a fact that viewing any image on a larger screen and/or closer to the screen will mean you will notice more of any flaws in the image. Only a very small percentage of the best quality Blu-ray transfers will stand up to closer scrutiny without noticeably exhibiting these sorts of flaws. Keep the issue of source material quality foremost in your mind before automatically opting for a larger screen, because quality should be more important than size when it comes to HDTV. If a reasonable proportion of your viewing is of mediocre quality, particularly digital TV and DVD sources for example, then it would be best to opt for the size which ensures cleaner image quality across a range of sources, not just the biggest screen you can afford.

OK, so where does that leave you with your choice of screen size?
The viewing distance rules and calculators we discussed earlier are still applicable, however remember that they only provide you with a range within which you might be comfortable with a particular screen size. Ideally, you should take an average quality live action movie disc which represents your commonly viewed source material (not the best Blu-ray you own) with you to a store and ask them to play it back for you on various TV sizes. Since the quality of the movie on the disc is already known to you, what should become apparent is the difference in the quality of the TVs, the distance at which particular screen sizes exhibit more flaws, and the point at which the image becomes overwhelmingly large to you, difficult for your eyes to follow comfortably, or shows hints of pixel structure on the screen.

Finally, if you're concerned about how a particular TV will fit into your existing display area, then check the manufacturer's website, as they usually list the width, height and depth of each TV in its specifications. Some manufacturers even have applications which help you to better visualize how a TV will look in your viewing environment. Panasonic's free Viera AR Setup Simulator App for example allows you to use an iOS device with a camera, such as an iPhone or iPad 2, to simulate the placement of one of their Viera TVs in your viewing environment.

When it comes to size and viewing distance, any calculated minimums or maximums should only be taken as a general guide. None of these calculators or rules, nor the individuals who give you advice, take into account your commonly viewed source quality or your personal taste. Don't be goaded by others into buying a bigger TV just for the sake of size over quality, or to keep up with the Joneses. By the same token, be aware that it is highly likely that you will adapt to a TV which is at least one size larger than you would initially prefer.

LCD vs. Plasma


Let's turn to perhaps the biggest area of contention and confusion for the average HDTV buyer: the type of display technology to choose. At present there are two major types of display technology which are commonly used in the consumer HDTV arena: Plasma TV and LCD TV. Within the LCD category, there are various types of displays, including the more recent LED-backlit LCD displays.

It should be noted at this point that the Projectors category, which includes Rear Projection TV (RPTV), is not discussed in this article. Front projectors are a perfectly valid choice for home theater enthusiasts, especially those who want the largest screen real estate. But they are not included in this guide simply because they are neither as convenient nor as versatile as flat panel displays. Projectors are more suited to specialist home theater applications requiring fairly strict control over the light in the viewing environment. Rear projection TVs are more convenient in this respect, but have been almost completely phased out of production, and are also not discussed. A fourth display type which you may have heard about, called Organic Light Emitting Diode (OLED), is extremely expensive and not yet produced in large enough screen sizes to be considered a genuine proposition for HDTV buyers. OLED is discussed in more detail under Future Technology in the Conclusion section of this guide.

So it boils down to the classic LCD vs. plasma war which rages daily on the Internet. As mentioned in the Introduction to this guide, there is currently no perfect display technology, and none is on the horizon. This guide doesn't pretend to resolve the LCD vs. plasma debate; each of these display types has its pros and cons, and requires a level of compromise.
The only way you can make the choice is by understanding the technologies involved, the technical specifications and what they really mean, the various quirks and issues of each, and thus the suitability of these displays to your particular circumstances.

The Underlying Technology

As we'll soon come to see, every pro and con for LCD and plasma derives from the underlying technology they use. These two display types may look similar when sitting side-by-side in a store, but they're very different in the way they approach the reproduction of video.

Plasma

A Plasma TV earns its name from the fact that the primary component used to generate light output is a highly charged gas known as plasma. A plasma screen is made up of a grid array of small gas-filled cells which are very similar to neon lights. Each plasma pixel consists of a set of three of these phosphor-coated cells called sub-pixels, one for red, green and blue respectively. The charged gas in a cell lights up the phosphor coating inside it, producing the relevant colored light. These lit pixels shine through a glass panel to display an image.

Plasma TV has a lot in common with the traditional CRT TV many of us grew up with. In a CRT, an electron gun at the back of the set shoots beams at a glass screen coated with phosphors to produce light, and there are red, green and blue phosphors to produce a color image.

LCD

An LCD TV has a grid array of small liquid crystals which can each change shape, twisting to allow varying amounts of the light shining from behind them to come through and help produce an image on the screen. Each pixel in an LCD display is made up of three sub-pixel crystals, one for red, green and blue respectively. The light shining from behind the crystals is called Backlighting, and traditionally, LCD displays have used Cold Cathode Fluorescent Lamps (CCFL). The introduction of Light Emitting Diodes (LED) as the backlighting for LCD displays has improved their performance in several areas, but has somewhat inaccurately earned them the title of LED TVs. This gives the impression that the screen is made up of LED lights, which is false. The correct term is LED-backlit LCD TV, or LED-LCD for short. There are two main types of LED-backlit LCD displays (Full Array and Edge-lit), and one associated feature (Local Dimming) which can apply to either, as covered below:

Full Array LED - The LED lights are situated just like the lighting in a normal CCFL-backlit LCD TV, in an array across the back of the screen. The image quality results are similar to CCFL-backlit LCD, but there may be better screen uniformity and better color reproduction.

Edge-lit LED - The LED lights are situated at the edges of the panel, and their lighting is then projected towards the middle of the screen and distributed via a diffuser. This is the most common configuration, and the main benefit is that it allows for very thin LCD TVs, at the cost of screen uniformity.

Local Dimming - An important factor which can be used with either type of LED backlighting, local dimming provides the best results when combined with a Full Array LED-backlit screen. It basically allows portions of the backlighting to be independently dimmed or brightened depending on the scene. This provides much better contrast ratios and black levels on an LCD screen.

So in summary, plasma is a self-emitting technology because it creates light directly within each pixel. LCD works on the opposite principle, with each pixel filtering the light from a source shining from behind the pixel. The commonality between plasma and LCD-based displays is that they're both relatively thin, hence the name flat panel, and they're both fixed-pixel digital displays with red, green and blue sub-pixel structures. It's their differences however that make them more or less suitable to certain applications, and this is what we examine throughout the rest of the guide.

Brightness, Black Level & Contrast Ratio


A major determinant of the image quality of any display is its brightness, black level and contrast ratio. These related factors affect how much depth and detail the image on an HDTV screen appears to have. A display where blacks look more like greys, where the image isn't particularly bright, or where the picture looks flat, is not a particularly desirable one. Unfortunately choosing the right display is not just a case of picking the one which is the brightest, or which looks to have the darkest blacks in the showroom. For one thing, the typical TV store's display area is quite bright, which favors TVs which are brighter, but which may not have good black levels; it is difficult to distinguish a TV's true contrast ratio in a bright environment. There are also a range of tricks of the trade manufacturers use to enhance these aspects of any TV, at the cost of other areas of image quality. Let's understand the fundamentals first before getting onto those.

The Brightness of a display is measured by its Luminance, usually presented as a figure in either candela per square metre (cd/m2) or foot-lamberts (fL), where 1 fL = 3.426 cd/m2. You will see various reviews or technical specifications quoting the maximum and average brightness of a display, sometimes with very high values of up to 500 cd/m2 or more. In practice a target value of between 80 and 120 cd/m2 is suitable for most displays in a normal viewing environment. While an LCD can typically provide greater maximum brightness than a plasma, both display technologies can usually achieve sufficient brightness to suit most people. Keep in mind however that when discussing the brightness of a TV, there are two other factors to consider: the amount of ambient light in your viewing environment, and the contrast ratio of your TV - we'll discuss these shortly.
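Since reviews quote luminance in either unit, it helps to be able to convert between them using the 1 fL = 3.426 cd/m2 relationship above. A minimal sketch (not from the guide):

```python
# Sketch converting between the two luminance units in the text,
# using 1 fL = 3.426 cd/m2.
FL_TO_CDM2 = 3.426

def fl_to_cdm2(fl):
    """Foot-lamberts to candela per square metre."""
    return fl * FL_TO_CDM2

def cdm2_to_fl(cdm2):
    """Candela per square metre to foot-lamberts."""
    return cdm2 / FL_TO_CDM2

# The 80 - 120 cd/m2 target range expressed in foot-lamberts:
print(round(cdm2_to_fl(80), 1), round(cdm2_to_fl(120), 1))  # 23.4 35.0
```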

One important difference between plasma and LCD is that while the liquid crystal in an LCD can twist to varying degrees to allow different amounts of light to filter through, and hence vary its brightness that way, plasma phosphors are either lit up brightly (on) or dark (off) at any one moment. Once lit up, they only stay lit for the merest fraction of a second. By using Pulse Width Modulation (PWM) to pulse the amount of current flowing through the cell, the phosphors are lit up hundreds of times a second to maintain brightness, and by varying the width of the pulse so that each phosphor stays on for slightly more or less time for each pulse, the level of brightness of the image can be varied.
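The PWM idea above amounts to varying a duty cycle: the cell is only ever fully on or off, so perceived brightness is the fraction of each pulse period the cell spends lit. This is a deliberately simplified model (real plasma drive schemes are considerably more complex), purely to illustrate the principle:

```python
# Simplified model of pulse width modulation: a plasma cell is only
# ever fully on or off, so average light output is the fraction of
# time the cell spends lit (its duty cycle) over many pulses a second.

def perceived_brightness(pulse_on_time, pulse_period):
    """Average light output as a fraction of maximum (0.0 - 1.0)."""
    return pulse_on_time / pulse_period

# Same pulse rate, different pulse widths -> different brightness:
print(perceived_brightness(0.5, 2.0))  # 0.25, a dim pixel
print(perceived_brightness(1.8, 2.0))  # 0.9, a bright pixel
```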

Black Level is another oft-quoted but not fully understood metric which is a critical element of good image quality. The ability to create darker blacks allows a TV to have a higher contrast ratio, a term which will be explained shortly. The darker the blacks, the greater the appearance of depth and richness in the image shown. Black level is actually not particularly complex; it's just a measure of the level of brightness of a display when showing video black. It is usually a very low luminance value, such as 0.004 fL (0.013 cd/m2), but it rarely reaches 0.0 cd/m2 (true black), because most HDTVs can't achieve this.

To confirm this for yourself, show a black screen on a typical LCD or plasma TV in a pitch black room, and you will still see some light coming from the screen. So why isn't black on an HDTV actually equivalent to zero luminance, which is the total absence of light?

On an LCD-based display, black is simulated by the twisted crystals of the panel being completely shut, along with a polarized layer behind the crystals, to prevent light from the backlight filtering through when not required. Yet precisely because the backlight is always on, and the structure of the crystal array is not perfect, some amount of light will leak through the crystals and be seen - this is discussed further in the Screen Uniformity section. More recently, with the advent of local dimming backlighting, some LCD-based displays can switch off portions of their backlight to produce close to true black in parts of the image which require total darkness. Unfortunately this method isn't perfect, as there may be haloing of light around any brighter parts of the image.

On a plasma display, each pixel can be independently switched off to remove its light output, and since there is no backlight, in theory a plasma can produce true black. The reality is that each plasma cell has to be constantly pre-charged so that it can respond quickly enough when light output is required, giving plasma its extremely fast response time. The side-effect of this pre-charging is that there is always some residual glow in the pixels, and thus true black is usually not possible on a plasma. On average though, plasmas provide much darker black levels than LCDs.

It should be noted that what some consider the king of black levels, the traditional CRT TV, does not necessarily achieve perfect black either. A CRT's black is darker than either plasma or LCD, primarily because a CRT can simply have its electron beam avoid lighting up particular portions of the screen. However when displaying any scene containing brighter elements, some stray light may affect the dark areas of the screen. In other words, when a CRT is showing an all black screen, black levels are pretty much true black, but when displaying a normal scene containing a mix of brighter and darker elements, black levels on a CRT are not true black, and may be similar to or even worse than a plasma screen.

Now that we understand the way HDTVs can display brightness and the lack of it on the screen, it's time to look at a metric which is supposed to show the range between these extremes on any TV. Contrast Ratio measures the difference in the luminance between the whitest image and the darkest image that a display can show. It's usually presented in a format such as 4,000:1 - this example would indicate that the whites on this display can be up to 4,000 times brighter than the blacks. The main benefit of a high contrast ratio is that in a scene containing both bright and dark elements, a TV can reproduce both elements correctly. That is, the dark areas will look suitably dark, while the bright areas will remain bright. Displays with poor contrast ratios will give more of a "washed out" image due to less of a difference between dark and bright areas.
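A contrast ratio follows directly from the white and black luminance measurements discussed earlier. The sketch below uses illustrative values, not measurements of any real TV:

```python
# Sketch of how a (static) contrast ratio follows from the measured
# peak white and black luminance of a display, both in cd/m2.

def contrast_ratio(white_cdm2, black_cdm2):
    """Ratio of peak white to black luminance, e.g. 4000.0 for 4,000:1."""
    return white_cdm2 / black_cdm2

# 120 cd/m2 whites over 0.03 cd/m2 blacks gives a 4,000:1 ratio:
print(round(contrast_ratio(120, 0.03)))  # 4000
```

Note how sensitive the ratio is to the black level: halving the black luminance doubles the contrast ratio, which is why black level matters so much to perceived image depth.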

Unfortunately manufacturers quickly became aware that consumers were paying attention to contrast ratio figures, and since there is no enforced standard for how to consistently measure it, contrast ratio figures have now been elevated into ridiculously high numbers, such as 5,000,000:1 or 9,000,000:1. This can be achieved, for example, by taking measurements from a pixel when it is completely switched off, then comparing it to the pixel when it is lit to the maximum possible level of brightness, even though this in no way represents real-world contrast in normal scenes consisting of both bright and dark elements. As we see in the Motion Handling section, a similar approach is taken to Response Time measurements. It makes the contrast ratio numbers you see in technical specifications virtually meaningless.

In order to achieve darker blacks and whiter whites, which in part justify these unrealistic figures, a technique known as Dynamic Contrast Ratio is now frequently used in HDTVs. The way it works is that the display constantly alters the brightness of the entire image, reducing screen brightness for scenes which are predominantly dark, and increasing the overall brightness for scenes which are mostly bright. The true measure of contrast ratio, also known as Static or Native Contrast Ratio, should provide the difference between the darkest and brightest luminance possible in the same scene - and most displays don't have native contrast ratio capabilities anywhere near the dynamic contrast ratio numbers. A dynamic contrast ratio has several unwanted side-effects, including greyer blacks in bright scenes, and washed out whites in dark scenes. Additionally, depending on how it's implemented, the constant shift in overall panel brightness may become noticeable to the viewer, resulting in what's often described as Floating Blacks, Fluctuating Brightness or Fluctuating Gamma. If the TV has an option to disable dynamic contrast then you can turn it off, however some TVs do not have any such option available.
Furthermore, for plasma owners, something known as the Automatic Brightness Limiter (ABL) can't be turned off. This is a protective feature built into plasmas to control power consumption, since on a plasma brightness is directly related to the amount of power consumed. In scenes with a high Average Picture Level (APL) - that is, scenes which have a high proportion of bright elements - the ABL will reduce the overall brightness output of the plasma panel to stabilize power consumption; conversely in scenes which have a lower proportion of brighter elements (low APL), the overall brightness of the scene is allowed to be higher. For example, a full white screen is not going to have as much luminance as a small window of white on a dark background, precisely due to ABL.

From this discussion we can gather an important fact: most HDTVs, whether LCD or plasma, now use some type of dynamic contrast ratio. This can have annoying side-effects, and can render contrast ratio figures meaningless. So how does someone make sense of all of this? The answer is to take into account your viewing environment, combined with the measured black level and maximum luminance of a display as typically given in reviews. These factors will be sufficient to make a determination, as explained below.

Even the best of us can't see differences in the range of brightness at any one time beyond a notional 1,000:1 contrast ratio. Our eyes work by having a form of built-in dynamic contrast ratio: depending on the ambient lighting of our surroundings, we can detect a narrower or wider range of differences, up to that 1,000:1 ratio. As our surroundings become darker or brighter, our iris adjusts to allow more or less light in, and this affects our perception and general sensitivity to brightness and darkness at any point in time. For example, in a pitch black room, if someone shines a weak torch in your eye it can effectively blind you; in a bright sunlit room, the same torch would have much less effect. Similarly, in a pitch black room, you will notice some light coming from a black screen on even the best HDTV, while even a small amount of ambient lighting in the room can make that same screen look completely black.

Taking advantage of this property of our eyes, if most of your viewing is done in a brighter environment, such as a sunlit room, then a display capable of higher levels of brightness, such as an LCD-based TV, is advisable. The black levels on such a display may not necessarily be great, but your perception of black levels will also be reduced in a bright environment, making it less of an issue. Plasma TVs suffer more than LCDs from having their image "washed out" when subjected to bright ambient lighting. Many plasmas and some LCDs come with special coatings on the screen, known as Anti-Reflective (AR) Filters, designed specifically to counter reflections and glare and thus help preserve a good image under bright light. While the AR filter helps, in practice it still doesn't fully prevent plasmas from suffering more than LCDs in a bright environment.

In darker environments, our perception of light becomes heightened, and thus blacks can look more like greys if the TV doesn't have a good black level. For this reason, plasma is more appropriate for those who do most of their viewing in a darker environment given its superior black level. Alternatively, if you have an older plasma or an LCD TV with relatively poor black levels, or indeed any TV viewed in near darkness, you can install what is known as Bias Lighting - soft ambient lights situated behind the TV which greatly improve perceived contrast on the TV. For those who have mixed viewing habits in both dark and bright environments, an LED-LCD with a full-array local dimming backlight is a reasonable compromise, capable of both good blacks at night and higher brightness levels during the day.

Color Reproduction
Modern TVs work on the principle of mixing the primary colors Red, Green and Blue (or RGB for short) to reproduce a wider variety of colors. This is called the RGB Color Model. By lighting up some or all of the small red, green and blue elements within each pixel, the combination is perceived at a distance by our eyes as a single color for each pixel. The full range of possible colors derived from combinations of red, green and blue is collectively known as the Color Space of the RGB model.

The diagram above shows the spectrum of visible colors as a horseshoe shape, with an RGB Color Space represented by the triangle within it. Each point of the triangle is a primary color - blue on the lower left, green at the top, and red to the right - thus the entire content of the triangle represents the range of colors possible by mixing these three primary colors. The point near the centre of the triangle is the color of white, which in this case conforms to the D65 standard, giving a color temperature of around 6,500 Kelvin - see the Calibration section for details.

The actual number of colors available in any digital image displayed on a TV depends on how many gradations of red, green and blue have been used to sample that image. This is also known as its Color Depth and is measured in Bits Per Pixel. For example, an image with 24 bits per pixel color depth allows 8 bits of data for each of the three sub-pixels, or color channels as they're known. Since a bit is the lowest form of computer data, and has only an on or off value, 8 bits equates to 2^8 = 256 permutations of data. In this case, this means that 8 bits per channel allows red, green and blue to each have up to 256 gradations. Therefore a total of 256 x 256 x 256 = 16,777,216 combinations of RGB color are available to be used for each pixel in the image, providing the potential for very natural-looking color.
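The color depth arithmetic above can be checked with a couple of lines of Python:

```python
def gradations(bits_per_channel):
    """Gradations per channel double with every extra bit: 2^bits."""
    return 2 ** bits_per_channel

def total_colors(bits_per_channel, channels=3):
    """Total RGB combinations at a given per-channel depth."""
    return gradations(bits_per_channel) ** channels

print(gradations(8))     # 256 gradations each of R, G and B
print(total_colors(8))   # 16777216 (24 bits per pixel)
print(total_colors(10))  # 1073741824 (30 bits per pixel, "Deep Color")
```

Notice how quickly the totals grow: adding just 2 bits per channel multiplies the available colors by 64.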

Digital video is actually a bit more complex than this, but for now the key point to understand is that the bits for each color channel add up to the total bits per pixel, and the higher these are, the greater the range of colors which can be used. Be mindful of the difference between bits per pixel and bits per channel when you see references to color depth. If someone is talking about "8 bit" RGB color, for example, it needs further clarification: 8 bits per pixel RGB only allows for a maximum of 256 colors in total, while 8 bits per channel (= 24 bits per pixel) allows for a maximum of 16,777,216 colors. The difference is significant.

There are two limiting factors when it comes to color perception and reproduction. Firstly, the human eye can generally distinguish around 10 million colors at most. Secondly, Blu-ray content is currently color encoded at a maximum of 8 bits per channel, which equates to 16.7 million colors. Newer standards such as Deep Color and xvYCC allow for up to 16 bits per channel, but this extra depth is effectively superfluous, as no standard source material currently uses it. Almost all recent LED-LCD and plasma TV specifications show support for 8, 10 or 12 bits per channel.

Remembering that 8 bits per channel is 16.7 million colors, and given that everything from the content to the displays has more than enough color depth for our eyes to perceive smooth natural color, there shouldn't be any problems, right? Unfortunately, as you may know by now, things are never that simple. The range of colors that a TV can reproduce is only a portion of the full color space, and is known as the TV's Color Gamut. There are differences in this color gamut due to the way each type of TV creates colors, as well as the way its settings are configured. On an LCD, each pixel is divided into red, green and blue sub-pixels.
Each one of these colored sub-pixels is a liquid crystal which can be individually twisted to allow varying amounts of light through to form different colors in the pixel. The type and quality of backlighting plays the most important role in an LCD's color reproduction capabilities however, since the precise color spectrum of the light being cast by the CCFL or LED backlight restricts the range of colors the pixel can accurately reproduce. Use of a newer generation of CCFL backlights has improved the color reproduction capabilities of LCDs, and this has been further improved with the introduction of RGB LED backlights, which use a combination of red, green and blue LEDs to allow a more accurate color gamut.

The image above shows a close-up of the pixel structure of an LCD panel, where red, green and blue sub-pixels can clearly be seen. On a plasma, each pixel again has red, green and blue sub-pixels, which are actually red, green and blue phosphor cells that can be independently lit as required. There's no backlight to affect the color output, but because plasma phosphors must be pulsed hundreds of times per second to adjust their brightness, as we saw in the Brightness, Black Level and Contrast Ratio section, this affects the number of color gradations they can natively reproduce.

To overcome this, plasmas use a technique called Dithering to achieve higher color depths. Dithering involves simulating a color in a pixel by rapidly switching that pixel between two similar colors. This increases the perceived color gamut of a display, at the expense of potentially introducing some visible noise into the image. These "dancing pixels", as some people call them, are not to be confused with film grain, which although similar in appearance is inherent to most movie sources, and is discussed further in the Source Material section. The effect can also be significantly reduced by calibrating your TV, as discussed in the Calibration section. Dithering is not unique to plasmas; some LCDs use it in a similar fashion to simulate an increased color depth.

This leads us to the concept of Color Accuracy. HDTVs need to have a color gamut which matches the Rec. 709 standard as closely as possible in order to display colors as they are intended by those who produced the source material. The standard defines the exact values for the primary colors red, green and blue, as well as the secondary colors cyan, magenta and yellow, within the color space, along with the correct Color Temperature of white. If these values are not set correctly on a TV, the image may appear to have a particular tint to it, and colors may appear unrealistic.
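The temporal dithering technique described above is easy to simulate. This is a simplified sketch, not how any particular panel's drive electronics actually work: a pixel that can only natively show two neighbouring shades fakes an in-between shade by flickering between them, and the eye averages the flicker out:

```python
import random

def temporal_dither(target, lo, hi, frames=10000, seed=0):
    """Approximate a target shade the panel cannot natively produce by
    rapidly alternating a pixel between the nearest reproducible shades
    below (lo) and above (hi), in proportion to where the target sits
    between them. Returns the time-averaged (perceived) shade."""
    rng = random.Random(seed)  # seeded so the sketch is repeatable
    p_hi = (target - lo) / (hi - lo)
    shown = [hi if rng.random() < p_hi else lo for _ in range(frames)]
    return sum(shown) / frames

# A pixel limited to levels 128 and 136 can still "show" level 130
# by displaying 136 about a quarter of the time.
print(temporal_dither(130, 128, 136))  # close to 130
```

The visible noise mentioned above is exactly this flicker: on a real panel the averaging happens in your eye rather than in code, and at close range the alternation can be seen as dancing pixels.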
Different picture mode presets on a TV will have different color accuracy characteristics, and in the past few years manufacturers have started including at least one picture mode which comes close to having accurate colors out of the box, such as THX or Movie mode.

In any case, all HDTVs can have their color gamut adjusted using the right instruments, as discussed in the Calibration section. The goal is to bring the color gamut as close as possible to the standard. As a side note, one feature which TVs frequently tout these days is a color gamut that is greater than the standard, and various color settings on recent TVs also try to achieve this. It sounds like a good thing at first, but in practice it can be the opposite: a wider color gamut can result in colors which are less accurate and more saturated than they should be. The gamut therefore still needs to be adjusted back to the standard, and these color settings typically need to be disabled.

If the source material has a greater color depth than the display can natively support, or if it was encoded at a lower color depth, and/or if the settings on the TV are badly configured, then you may see a phenomenon known as Posterization, also called False Contouring or simply Color Banding. This exhibits itself in the form of distinctly visible color gradations or bands in certain parts of a scene. When displaying the best possible content, most recent displays, both LCD and plasma, will show little if any noticeable posterization. Most color banding you may notice is inherent to some extent in the source, and the reasons for this are discussed further in the Source Material section.

The screenshot above demonstrates a scene which has noticeable posterization when examined closely (click to enlarge), particularly visible in the cloudy region at the top. The only real way to determine the level of color banding on any TV is to read various reviews, and to also test it for yourself by doing comparisons using a disc of your own with known image quality in the most accurate picture preset on different TVs.

The accurate reproduction of color on an HDTV is a complex task, and most TVs don't do it perfectly. Plasmas generally have an advantage over traditional LCDs in producing a more accurate color gamut. They also produce a color image more familiar to our eyes, because their phosphor-based technique closely resembles that of CRT TVs. LED-LCDs with full-array RGB backlighting are now also capable of producing accurate colors which meet the standard. In both cases however, nothing short of calibrating the TV will produce truly accurate colors, as manufacturers tend towards inaccurately bright and oversaturated colors out of the box, mainly to make TVs seem more vivid when displayed in stores.

Viewing Angles
If you're the sole viewer of an HDTV, then for most intents and purposes the viewing angle is not a major issue, as long as you can situate yourself directly in front of the TV at the optimal position and keep yourself there at all times. If however your viewing position is such that you can't always be centered and exactly perpendicular to the screen surface, or if there are multiple people who will be viewing the TV at the same time, then viewing angle becomes a greater concern. The comparison is relatively straightforward: plasmas allow for viewing the screen from any angle (i.e. up to 180 degrees vertical or horizontal) with virtually no deterioration in black level, brightness, color reproduction or crispness of the image. The AR filters on plasmas can result in a slight reduction in brightness at wider viewing angles, but this is usually not major.

LCD TVs - and here the use of LED backlighting makes no difference to this issue - all suffer from a noticeable degradation of the image when viewed at progressively larger angles to the side of, above or below dead center. This is simulated in the images above. The viewing angle within which image quality remains optimal can be around 30 degrees from center at best. Once again, this is due to the way the technology works. On a plasma, each individual pixel directly casts its own light from the screen. On an LCD, the source of light is the backlight behind the twisted liquid crystals, so light is not emitted directly by the pixel but from slightly behind it. This results in lower brightness at an angle.

The precise way the LCD technology works, and hence the exact viewing angle capability of any particular LCD TV, depends on the panel type. The three most common panel variants are:

- Twisted Nematic (TN) Panel - The technology most commonly found in LCD computer monitors, because it is the lowest-cost solution with the fastest response time (i.e. lowest input lag for gaming), at the expense of the worst viewing angles and poorer contrast ratios and color reproduction.
- Vertical Alignment (VA) Panel - Improves on TN panels in most respects, offering the best contrast ratios, better color reproduction and wider viewing angles, but with slower response times.
- In-Plane Switching (IPS) Panel - Found under various names like Super IPS and IPS Pro, this provides the widest viewing angles and the best color reproduction, with only slightly slower response times than a TN panel (which can still result in input lag) - but it is the highest-cost technology. Variants of the IPS panel are the preferred choice for good quality LCD screens.

Regardless of which panel type is used however, LCD TVs still cannot provide anywhere near the viewing angles of a plasma. Viewing angle cannot be altered by changing any settings or by any other method, since it's a physical limitation of the panel technology used. Be sure to check the viewing angles of any TV in a store before purchase if this is of any concern to you.

Image Retention, Burn-in & Dead Pixels


Image retention, burn-in and dead pixels are maladies which can afflict all HDTVs. Depending on your usage patterns and the source material you commonly view, these issues may determine which type of HDTV you should go for, and how you use it over time. But first we need to cut through the myth and hysteria that surrounds these issues, particularly image retention and burn-in. Since this is an area of major concern for many people, we'll cover it in detail in this section.

Image Retention and Burn-in

On any phosphor-based display, such as a traditional CRT TV or a modern plasma TV, displaying static image content continuously, or frequently, for long enough periods can result in the individual phosphors in the screen ageing unevenly, and even becoming permanently damaged. To begin with, when a static image is displayed on a plasma TV, the phosphors which are subjected to the greatest voltage (brightness) may build up and retain a charge for a slightly longer period. This usually results in a ghost-like residue of the image which was just displayed, most noticeable against solid backgrounds. This form of Image Retention is extremely common and usually harmless; it can typically be seen, for example, if you stop a movie during the end credits, where bright white text has been displayed against a black background - a residual image of the white text may be seen for a few seconds afterwards. As long as the pixels are then subjected to moving images of varying brightness and color, this sort of image retention usually washes away within a matter of seconds.

The photo above shows image retention on a plasma caused by the static HUD of the Fallout 3 video game. The letters 'CND' and 'AP', as well as a long horizontal bar and two shorter vertical bars, are seen most clearly as a faint pink residue at the bottom of the white background.

The next degree of image retention occurs if you view content with static portions for extended periods. This ages the phosphors unevenly, so that those subjected to higher levels of brightness lose their luminance faster, and hence become slightly dimmer over time compared to those which have not been driven as hard. The best example of this is if you continuously watch movies or TV shows with black bars around them, without viewing much else in the way of full-screen content. Over time, the areas where the black bars are displayed will appear as slightly brighter patches compared to the rest of the screen during regular viewing. Alternatively, you may frequently watch a TV channel which has a fixed station logo, and over time a ghostly image of that logo will be seen over other content. This type of image retention is more serious, and may even be a form of burn-in, but in many cases will go away over a longer period of viewing a variety of normal full-screen content.

The image above simulates the effect on a TV screen which has run a lot of 4:3 content. The screen now has lighter patches on the left and right sides, corresponding to the areas where black bars have been shown for extended periods.

The final stage, in which phosphor damage becomes permanent and irreversible, is called Burn-in. This looks very similar to the image retention discussed above, but is a more serious issue because it is effectively permanent. It occurs when a static image, or an image with static portions - particularly a very bright one - is displayed for extended periods with no change. The phosphors become damaged to the point where they can no longer change their color or brightness sufficiently, and a very clear image will be permanently etched into the screen.

Contrary to popular belief, LCD-based TVs can also suffer from temporary image retention, and even a form of permanent image retention similar to burn-in. As we discussed earlier, LCD panels consist of a grid of liquid crystals which change shape to allow light from their backlight to filter through. If a static image is shown for sufficiently long periods with no rest, these crystals can retain a charge in such a way as to keep their twisted shape and not return to their natural rest state. Although usually temporary, in extreme cases it can become permanent, leaving a fixed image much the same as burn-in on plasmas. On an LCD this burn-in phenomenon is known as Image Sticking, and is described in more detail in this Philips White Paper. An example of severe LCD image sticking is shown here.

The good news is that both plasma and LCD technology have improved significantly over the past few years to the point where image retention/image sticking/burn-in is not a major issue. Even as far back as 2005, this Pioneer White Paper demonstrated that after 48 continuous hours of displaying a static menu image on a plasma screen, the serious image retention was subsequently removed with 24 continuous hours of regular movie playback. Since then, the use of Pixel Shifting technology to imperceptibly move around the content of the screen regularly so that each pixel is not displaying the exact same content for long periods, as well as improvements in the design of the actual screen structure, have made modern plasmas very resistant to burn-in.

Still, temporary image retention and burn-in are a fact of life on any HDTV. Plasmas are much more susceptible to it of course, but it can occur on LCDs, though it takes much more abuse for it to happen. It depends on the quality of the display and the frequency, duration and brightness with which static or repetitive content is shown. To reduce temporary image retention, and prevent burn-in or image sticking on both plasmas and LCDs, you should take the following steps:

- The most obvious measure: reduce the amount of content with static portions, such as TV shows with fixed channel logos or tickers, games with bright fixed screen elements, static menus, desktops or wallpapers, and movies or TV shows with black bars. Vary the type of content you view so that no portion of the screen displays the same image for any extended period of time. There is no hard and fast rule as to how many hours this "extended period" can be - it may be as little as a few hours, or as long as several days or weeks, and varies depending on your settings and the quality of the display.
- Use the Aspect Ratio function of your TV or DVD/Blu-ray player to zoom in any content which has static portions like black bars, tickers or logos, if you intend to view it for an extended period or repeatedly over time.
- Do not use very high levels for the Contrast control, and avoid any preset picture modes which do so, such as Dynamic mode (often dubbed "Torch mode"). More details of what these settings do are in the Calibration section.
- If Pixel Shifting or Pixel Orbiting is an option on your TV, make sure it stays on. Be aware however that certain picture presets can automatically disable this orbiting function. Refer to the TV manual for more details.
- If a Scrolling Bar or some other specific anti-burn-in feature is available, run it only if you see more serious image retention which does not disappear after a few minutes. It is not necessary, or even advisable, to run this function frequently otherwise, as normal viewing of a wide variety of content is the best form of image retention removal and protection.

- Enable any Screen Saver functionality on your TV, DVD or Blu-ray player so that it kicks in when the display or player is idle for a period of time. This prevents burn-in if you fall asleep in front of your TV, for example. Refer to the relevant user manuals for details.
- On a plasma, conduct a "Break-in" period during at least the first 100-200 or so hours, when the phosphors are new, burn more brightly, and are thus more susceptible to burn-in. See further below for details.
- On an LCD, turn off the power for 8 hours after every 24 hours or so of cumulative viewing to prevent retained charge in the crystals. For example, turn the TV power off overnight after every week or two of normal viewing. Also power off the TV if it's to be left unused for long periods.
- On an LCD, avoid high ambient temperatures, as heat accelerates the degradation of the liquid crystals in the display. The higher the temperature, the greater the risk of image retention and image sticking. Provide greater ventilation and cooling around the TV.

The general rule is that the brighter the static portions of an image, and the longer they are continuously displayed without any break, the greater the chance of long-term image retention or permanent pixel damage. Protect your investment by playing it safe, but don't go overboard either. Viewing a movie with black bars for a few hours, for example, is perfectly fine on any display and is precisely what they are designed to do; constantly viewing a TV station with a bright fixed logo and a high Contrast setting, on the other hand, is a recipe for potential disaster.

Break-In

Break-in is a term used to describe the process of ageing the phosphors on a plasma TV safely during the initial 100-200 hours in which they are most susceptible to being damaged. This is because phosphors burn brightest when new, increasing the risk of burn-in. This somewhat-dated Panasonic FAQ recommends a break-in period of 100 hours, while this Samsung FAQ recommends up to 250 hours. Professional ISF calibrators will recommend anywhere up to 500 hours of phosphor ageing before an instrumented calibration is performed, to let the phosphors settle and achieve uniform brightness. There doesn't appear to be any precise figure, but the common denominator in all of this is that it makes sense to play it safe for the first 100 hours at the very least. During this period, try to avoid displaying any static content, don't display any black bars, and don't use an extremely high Contrast setting, nor picture presets which have a high brightness level, such as Dynamic mode.

There are a few ways you can get through the break-in period with a minimum of fuss. The simplest is to just watch a variety of regular content in a conservative picture mode, such as Cinema, Movie or THX mode, at or close to the default settings. If viewing a movie or TV show with black bars, or fixed station logos or tickers, use the Aspect Ratio function on the TV or player to zoom in to the point where the bars, logos or tickers are no longer visible. Do this for several hours a day over a month and you're in the clear. You can accelerate the process, and make it even safer, by using this free Break-in DVD, which runs a series of uniform color patterns designed to age the phosphors evenly. Routinely run the DVD overnight, for example, and combined with regular viewing, after a couple of weeks you'll easily reach over 100-200 hours of break-in.

Although what you've read so far may alarm you, modern plasmas and LCDs are actually very resistant to being permanently damaged by burn-in or image sticking as long as they're not abused. Break-in is simply a sensible precaution, not an absolute necessity. A note of caution though: although plasmas are ideal for gaming because of minimal input lag, especially in Game mode (see the Motion Handling section for more details), anecdotal evidence suggests that long-term image retention is still a problem when gaming on a plasma. It's easy to lose track of time while gaming, and as a result you may play continuously for several hours at a time. The typically very bright static portions of a game image, such as the Heads Up Display (HUD) elements (score, health bar, crosshair etc.), may leave stubborn image residue, or in some cases become burnt into the display. As such, gaming during the initial break-in period is not wise. Subsequently, you should either reduce the opacity of any static elements like the HUD, or remove them altogether where the option is available, and also break up your gaming sessions by frequently switching to regular content for a few minutes, or running the scrolling bar every hour, for example. Once again, it is better to be safe than sorry.

Fixing Severe Image Retention

If you already have some form of image retention or image sticking on your HDTV, try the following:

- For mild image retention, display a full screen of regular moving content. This should eventually wash away any signs of the retained image within a few minutes to a few hours. For example, run a movie with no black bars or static logos for a couple of hours and see if this removes or reduces the retained image. If you see an improvement, repeat as necessary.
- For more stubborn image retention, if you have a Scrolling Bar or similar function on your TV to wash away image retention, run it for 5-10 minutes to see if it reduces the problem. Repeat as necessary, and combine with the first method above to remove the image retention.
- For serious image retention, run full-screen regular content with no black bars or static logos for an extended period, such as 24-48 hours continuously, to reduce or remove the image retention.
- On an LCD, aside from running a full screen of regular moving content for an extended period, at some point turn off the TV, unplug it from the power, and leave it off for 24 hours to see if this helps resolve the issue.

If none of the steps above resolves the image retention, then it is quite likely a form of burn-in or image sticking which is permanent for all intents and purposes. Image retention, burn-in and image sticking are explicitly not covered under the manufacturer's warranty, as these issues are risks inherent to the technology. However, if you notice stubborn image retention or image sticking/burn-in after a relatively short period of displaying a static image, then it points to an actual fault with the TV, and should be actionable under warranty.

Dead Pixels

Although the Dead Pixel phenomenon is usually attributed to LCD technology, it can also occur on plasmas. This is because "dead pixel" is a general term used to cover any form of pixel defect, and there are various ways in which a pixel in any fixed-pixel display can be defective. In a modern Full HD TV with a resolution of 1920x1080 there are 2,073,600 individual pixels, and each pixel has sub-pixel elements of red, green and blue, so the potential for pixel defects is great. A pixel can have one of two general faults:

- Stuck Pixel: The pixel has failed and is stuck permanently on. Depending on how many of the sub-pixels within the pixel have failed, the pixel will take on a particular color. For example, if the red, green and blue sub-pixels have all failed and are stuck on, the stuck pixel will be a white dot. If only the red sub-pixel has failed and is stuck on, then the pixel will appear as a red dot against certain backgrounds.
- Dead Pixel: The entire pixel has failed completely and is stuck in the off position, meaning it does not emit any light. It will appear as a black dot.

The photo above shows a stuck pixel on a plasma TV. Against a white screen it appears as a reddish dot when examined closely, but is not visible at normal viewing distance.

To detect any faulty pixels on your display, run several different solid color slides, such as those found in the Break-in DVD linked to further above. Any problematic pixels will soon become obvious when the screen is closely examined against various uniform backgrounds. The bad news is that a dead pixel is permanent and unfixable without replacing the entire panel. The good news is that in some cases, stuck pixels can reportedly be fixed by various methods, such as those listed for LCD panels here. However, be wary of dubious claims made by some software as to their capability to fix pixel faults, and don't risk damaging your screen by doing anything more risky than gently massaging the area. The easiest solution is to run the TV with a variety of content to see whether any faulty pixel fixes itself through normal usage.

In practice, the majority of pixel faults are unfixable, and since manufacturers may not replace your display if only one or two faulty pixels are found, it may be best not to go looking for them. Unless faulty pixels are numerous or noticeable, they are typically invisible at normal viewing distances while viewing normal content, and hence not worth seeking out and becoming obsessed with, especially as any replacement stands a fair chance of having the same or even more pixel (or other) faults. Technological limitations mean that manufacturers have no way of guaranteeing that every one of the pixels on a multi-megapixel display will function perfectly, particularly for consumer-grade products, which are always a compromise between cost and quality. Faulty pixels are usually covered under the manufacturer's warranty, which specifies the number, and sometimes the location, of faulty pixels required for a panel to be considered defective and thus eligible for a warranty claim. This varies from manufacturer to manufacturer, so be sure to read their dead pixel policy before purchase if this is a concern.
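If you'd rather make your own test slides than use a disc, the sketch below generates full-screen solid-color images in the simple PPM format, which needs no third-party libraries. This is just one convenient way to do it; in practice you would likely convert the slides to PNG or JPEG before loading them onto a USB stick for your TV:

```python
def write_solid_ppm(path, rgb, width=1920, height=1080):
    """Write a full-screen solid-color slide as a binary PPM image.
    Displaying pure red, green, blue, white and black slides in turn
    makes stuck sub-pixels (bright dots) and dead pixels (black dots)
    stand out against the uniform background."""
    with open(path, "wb") as f:
        # PPM "P6" header: magic number, dimensions, max channel value
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        # One 3-byte RGB triplet repeated for every pixel
        f.write(bytes(rgb) * (width * height))

for name, rgb in [("red", (255, 0, 0)), ("green", (0, 255, 0)),
                  ("blue", (0, 0, 255)), ("white", (255, 255, 255)),
                  ("black", (0, 0, 0))]:
    write_solid_ppm(f"slide_{name}.ppm", rgb)
```

Examine each slide up close: a colored or white dot on the black slide suggests a stuck sub-pixel, while a black dot on the white slide suggests a dead pixel.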

Motion Handling
As we briefly covered in the Video Basics section of this guide, the standard for most video is 50Hz or 60Hz, which usually translates to 50 or 60 frames per second respectively on a progressive scan HDTV. Taking 60 FPS as an example, this means that every second an HDTV shows 60 frames of video to form a moving image. This equates to a new frame of video being displayed every 16.67 milliseconds (ms), where 1,000ms = 1 second. The way in which they handle motion is a point of major difference between LCD and plasma TVs. When handled inadequately, you will experience issues such as motion blur, smearing, judder, phosphor trails or input lag.
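The frame-interval arithmetic above generalizes to any frame rate. A quick sketch in Python (the function name is our own, purely illustrative):

```python
# Time each frame stays on screen: interval_ms = 1000 / frames_per_second
def frame_interval_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (24, 50, 60):
    print(f"{fps} FPS -> a new frame every {frame_interval_ms(fps):.2f} ms")
```

At 60 FPS this gives the 16.67 ms figure quoted above; at 24 FPS (film) each frame lingers for a comparatively long 41.67 ms.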

On an LCD-based display, the total time taken for a pixel to change from black to white to black again is called the Response Time and is measured in milliseconds. The type of backlight used is not relevant to this measure, since it reflects the time taken for the liquid crystals to twist. Most modern LCD TVs advertise response times of 4 ms (120Hz models) or 2 ms (240Hz models), and in more recent cases, even as low as 1 ms (480Hz models). On a phosphor-based screen, such as a traditional CRT TV or a plasma TV, the response time is effectively well under 1 ms, with the latest plasmas having pixel response times which are near-instantaneous (around 0.001 ms). This is because once a phosphor is lit up, it can be instantly switched off again. One of the reasons for this rapid response time is that plasma cells are kept in a constant state of pre-charge. This rapid response time is also where the marketing term 600Hz Sub Field Drive comes from: for 60 frames per second of video, the plasma uses Pulse Width Modulation to control the power to the phosphors, pulsing them 10 times per frame to maintain a bright image - 10 pulses per frame x 60 FPS = 600Hz. This does not equal 600 frames per second though; it is simply 60 FPS with 10 flashes per frame. See the Brightness, Black Level & Contrast Ratio section for more details of plasma pre-charging and Pulse Width Modulation. In theory, since only a 16.67 ms response time is needed for 60 FPS video, both the faster LCDs (4 ms or less response time) and any plasma TV (less than 1 ms response time) should be able to easily reproduce motion without any issues whatsoever, right? Wrong. There are issues inherent to both technologies, as well as to the source material, which prevent this.
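The "600Hz" figure is thus plain multiplication rather than a true frame rate. A minimal sketch in Python (function name ours):

```python
# Marketing math behind "600Hz Sub Field Drive": light pulses per second,
# not distinct video frames per second.
def subfield_rate_hz(pulses_per_frame: int, fps: int) -> int:
    return pulses_per_frame * fps

print(subfield_rate_hz(10, 60))  # 10 pulses per frame x 60 FPS
```

However many times each frame is pulsed, the video itself remains 60 distinct frames per second.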

LCD TV manufacturers often quote response times which measure the time taken to switch between two arbitrary colors, such as two tones of grey, or whichever other color transition is fastest, rather than the correct but slower black-white-black time. This means the response time they provide in their technical specifications represents the optimal speed with which certain transitions occur, not the average or worst possible speed. Even the fastest LCD TV with the lowest quoted response time may exhibit motion blurring or other motion artifacts at times, simply because the crystals can't change fast enough through certain transitions. While plasmas are extremely fast and don't have any such issues with response time, there are two potential drawbacks to plasma's phosphor-based technology. The first is that a plasma pixel isn't constantly lit up the way an LCD pixel is, and these "gaps" in the image when a phosphor momentarily loses its brightness, however brief, can result in some people perceiving a form of flickering, especially on brighter backgrounds. The second issue stems from the way phosphor-based colors are mixed in a pixel: certain portions of those colors fade away more rapidly than others, leaving a typically green or yellow trail behind a moving object on the plasma screen. More details, including ways of reducing these phosphor trails, are in this article. In practice the majority of people do not really notice any flickering or phosphor trailing, especially as newer 3D-capable plasma TVs also have faster phosphors. However, a small proportion of people find one or both of these issues very noticeable and quite annoying, so it is something to look out for on plasmas.

The heart of the motion handling issue lies in the way our eyes perceive images. Due to a property of the human eye known as Persistence of Vision, we retain a brief afterimage of what we've seen for around 10 ms or more after it's disappeared. While watching a plasma this doesn't cause any problems; the phosphors can be turned on and off almost instantaneously, and they do not retain an image (afterglow) for more than 1 ms at most. In fact, as a plasma pixel momentarily goes dark between refreshes, our persistence of vision helps to smooth things out by retaining the image to fill in that gap. On an LCD, the afterimage in our eyes can cause issues: the LCD pixels can't change quite as quickly, and they remain a particular color until they change - they usually don't blank out or switch off between frames. The result is that on an LCD, the persistent image in our eyes can actually clash with the residual image on the screen, and there is a greater chance of perceived motion blurring if there is even the slightest difference between the two.

The screenshot above is split into two portions: the left side shows ideal motion handling, while the right side simulates the motion blur which can occur if motion is handled poorly. Manufacturers are well aware of the inherent motion handling drawbacks of LCD panels, and as response times have dropped, they have introduced various techniques to increase the refresh rate and reduce image persistence problems on LCDs. This is why 120Hz and 240Hz LCDs are becoming more common. One or more of the following techniques may be implemented on a particular LCD TV to achieve higher refresh rates and/or frame rates and thus achieve smoother motion:

Inserting black frame(s) between original video frames - This mimics phosphor-based displays, which briefly turn off phosphors between each refresh.
Motion Interpolation - By inserting a number of artificially-generated intermediate frames between each real (original) frame of video, the perceived frame rate increases.
Strobing the backlight - The backlight flashes instead of remaining always on, a method typically used to double the perceived refresh rate, similar to the way a cinema projector flashes the same frame multiple times, as mentioned in the Video Basics section.

While these techniques can noticeably improve the appearance of motion on LCDs, they can also introduce artifacts to the image. Motion interpolation in particular can make motion appear unnaturally smooth, earning the nickname "Soap Opera Effect" in reference to the fact that it makes cinematic content look like it's been shot on the video cameras used to make daytime soap operas. Fortunately, motion interpolation can usually be turned on or off as desired.

Motion handling is most important when viewing fast action content, such as sport or action movies. If your viewing consists of a significant amount of these types of sources, then you should consider plasma over LCD, since despite improvements in LCD technology, plasma's motion handling remains superior and more natural to the eye.

Pulldown Judder and 24p
One aspect of motion handling which can occur on any display technology is loosely known as Judder. As discussed under the pulldown conversion process in the Video Basics section of this guide, converting 24 FPS movie material into the 60Hz-based NTSC standard for video playback will result in some frame rate unevenness known as judder: a slight jerkiness which is particularly visible as the camera pans around. It occurs on both plasma and LCD, because it's independent of the way the screen refreshes itself.

For those who don't like pulldown judder, playing back movies at their original unaltered frame rate is possible. This Native 24p option must be explicitly supported by both your HDTV and Blu-ray player, so check the technical specifications of both. Since movies on DVD and Blu-ray are typically encoded at 24 FPS (24p), native 24p playback - if your TV and player support it - allows you to see movies without any frame rate alteration through pulldown. The downside is that 24 FPS is not a particularly high frame rate, which is why the various methods of converting it to the higher 50Hz or 60Hz standards were implemented in the first place. Native 24p content will have noticeable motion judder/frame break-up during fast camera pans. It may also have additional flickering, depending on the refresh rate at which your TV handles native 24p content. Whether plasma or LCD, some TVs will flash each frame multiple times to create a higher refresh rate in multiples of 24, such as 48Hz, 72Hz, 96Hz or 120Hz. A higher refresh rate will reduce flickering, but because the source is still playing back at 24 FPS (as intended), motion problems due to the low frame rate will always remain. Some HDTVs also add the option of motion interpolation in native 24p playback, which will smooth out this motion judder. In some ways this defeats the purpose of watching native 24p, since the original 24 frames per second are now being altered through artificially generated frames inserted into the video, and once again, Soap Opera Effect may be noticeable. In terms of settings, your choice essentially comes down to which of the three common forms of source motion issue you prefer: pulldown judder, 24p judder, or the artificial smoothness of motion interpolation. Because pulldown has been used for so long on TV, most people have become used to it and don't really notice it. Furthermore, pulldown judder also provides a mild form of motion blurring which covers up 24p judder.
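The refresh-rate multiples mentioned above can be checked with a small sketch (Python, function name ours): a panel can show 24p without pulldown only when its refresh rate divides evenly by 24.

```python
# Whole flashes per 24p frame at a given panel refresh rate, or None when the
# rate is not a multiple of 24 and pulldown (uneven frame repetition) is needed.
def flashes_per_frame(refresh_hz: int, fps: int = 24):
    return refresh_hz // fps if refresh_hz % fps == 0 else None

for hz in (48, 60, 72, 96, 120):
    print(hz, flashes_per_frame(hz))
```

48Hz, 72Hz, 96Hz and 120Hz repeat each frame a whole number of times (2, 3, 4 and 5 respectively), while 60Hz does not - which is exactly why 3:2 pulldown, and its judder, exists.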
See this article for a more detailed explanation of why pulldown judder may actually be preferable to native 24p judder.

Input Lag
For anyone who wishes to use their HDTV as a display monitor for a gaming console such as an Xbox 360 or PlayStation 3, or for PC gaming, an additional factor needs to be considered: Input Lag. This manifests itself as a slight delay between pressing a button on your controller or keyboard and the resulting action within the game on the screen. It's most noticeable in fast-paced action games or games where precise timing is absolutely critical. Despite the extremely fast response times of plasma TVs, and the increasingly fast response times of LCD TVs, input lag can still occur on all types of HDTVs, more so than on LCD computer monitors. This is because the video image on an HDTV typically undergoes various transformations to improve its appearance. This image processing can include rescaling the image, removing jagged edges, reducing noise, adjusting motion, sharpening the picture and enhancing the colors. All of these add some form of delay to the display of the image. Some of these image processing options can be disabled in the display options - see the Calibration section for details. Most recent HDTVs also come with a specific Game mode designed to reduce input lag by automatically disabling as much of the unnecessary image processing as possible; it may noticeably reduce image quality, but it will also reduce any input lag. There is no easy way to accurately measure and compare the input lag on a display, particularly as input lag can be as little as 30 ms or less (i.e. 30/1000ths of a second). There are various tools for measuring input lag, including this Input Lag Test, this Online Monitor Test and the Human Benchmark. The best method is to actually try a display out before purchasing it, but a close second-best is to read various reviews and user feedback, as these should tell you the degree to which input lag is actually noticeable on a particular TV.

Screen Uniformity
The concept of screen uniformity refers to how consistently the image is displayed across the screen. A uniform screen displays an image which remains consistent in terms of clarity, color and brightness at all points across the display area. There are fundamental differences in the screen uniformity of plasmas and LCD-based displays. On a plasma display, each phosphor-based pixel emits its own light through a glass panel. Plasmas typically have excellent screen uniformity, as each phosphor can burn equally brightly. However, a plasma TV's screen uniformity is affected by various factors, the most significant being the quality of the glass panel and the way in which the phosphors age. The glass panel on modern plasmas often has some form of special anti-reflective (AR) filter coating designed to reduce reflections and improve black levels. However, the AR filter can be of varying quality, and may also not be applied uniformly. This can result in Dirty Screen Effect (DSE), which as the name implies, gives the screen the appearance of being dirty. The term is used somewhat loosely to describe a phenomenon which occurs when displaying panning scenes containing large areas of light/pastel colors; very faint horizontal or vertical static bands of dusty appearance can be seen during such pans. This is more common on plasmas, as it is usually due to slight unevenness in the glass panel or the AR filter applied to the glass, but it may also occur due to small differences in the brightness of pixels. In rare instances the screen may actually be dirty, in which case you can try turning the TV off and wiping it clean with a microfiber cloth and distilled water. Uneven phosphor ageing, image retention and burn-in can all affect screen uniformity on a plasma, and are covered in detail under the Image Retention, Burn-in & Dead Pixels section of this guide. But aside from user-initiated image retention or burn-in, plasmas can also have inherent pixel faults which result in non-uniform areas. Such issues, usually referred to as patches, blobs or clouds, are typically areas of the screen where the pixels are biased towards a particular color, and are most noticeable when viewing a uniform background, particularly solid white or mid-grey.

The photo above shows a non-uniform screen. On an all-white background, this screen has a somewhat dirty appearance, with banding clearly visible. However, as bad as this looks, it may not noticeably affect the image when watching normal content. Plasmas also suffer to varying degrees from an issue known as Line Bleed, a term used to describe an effect whereby a ghostly shadow-like extension of a bright image can be seen. The most common example is when an image with bright horizontal lines, such as window blinds or the lines on the TV's built-in menus, can be seen extending beyond the boundaries of its original area. This occurs on plasmas because power is not evenly distributed along a row or column of phosphor cells, and thus can affect nearby cells. Line bleed is an inherent part of plasma technology and is not considered a fault unless severe.

The photo above demonstrates line bleeding, visible as extensions of the lines in the background on the face of the woman in the foreground. Some of these issues may be reduced over time on a plasma as the phosphors age, and all of them can be minimized to some extent if the TV is calibrated correctly, as covered in the Calibration section. Aside from line bleed, other issues such as clouding, blobs, banding or DSE are faults, and if noticeable during normal viewing they should be reported to the manufacturer and covered under warranty. In short, plasmas should not have any significant screen uniformity issues so long as they are not abused. On an LCD-based display, uniformity is a greater concern, because each pixel does not generate its own light; the light comes from a larger backlight of some kind and is filtered through each pixel. The location of the backlight - whether it is an edge-lit display or a true backlit display - and how that light is diffused across the screen are one factor in determining the extent of screen uniformity issues on an LCD-based display. Another is the type of LCD panel used - see the LCD panel types covered under the Viewing Angles section - and whether any pressure was placed on the screen during manufacturing or subsequent transport. These determine how much of the backlighting leaks through the closed crystals. The standard uniformity issues which may be encountered to varying degrees on any LCD-based display are typically described with one or more of the following terms: backlight/screen bleed, clouding, flashlighting, spotlighting, or banding. They all refer to different manifestations of the way in which light from the backlight shines through the liquid crystals even when they're fully closed (i.e. when displaying black). These issues appear as lighter patches against darker backgrounds, most visible when viewing a completely black screen in a dark environment. More specifically, backlight bleed, flashlighting and spotlighting usually refer to patches of brighter light coming from the edges of the screen, as though a torch or spotlight is shining outward from the edges, or the backlight is "bleeding" light from the corners. Clouding and banding more commonly refer to what appear to be large clouds or thick bands of lighter patches, most visible around the middle of the screen.

The two photos above demonstrate different LCD screen issues. The first one shows clouding, with bluish patches visible on a dark screen; the second photo shows backlight bleed, visible along the edges of the screen on a dark background. This article has more photos which provide further examples of screen uniformity issues on LCDs.

There is another form of light bleeding from the backlight: a halo of light visible around brighter objects. Local dimming LED-LCDs have been able to provide excellent black levels; however, because this local dimming occurs by zones rather than per-pixel, the TV cannot control light in a perfectly precise manner. This can result in a halo-like effect, most noticeable around white objects shown against black backgrounds.

In general, the more expensive full-array LED-LCD panels with local dimming show fewer screen uniformity issues, while the cheaper LCD panels, particularly those which are edge-lit, are more likely to show them. Important aspects include the design of any diffuser used to distribute the backlighting and the quality of the LCD matrix itself. Any professional review of an LCD panel should include an indication of screen uniformity, and in some cases even a summary of readings taken across various points of the screen, showing the results in terms of deviation from expected luminance. Aside from reading various reviews and testing any potential LCD TV for yourself, there is no way to ensure that it will not have screen uniformity issues, as again, these are inherent to LCDs to varying degrees. This means that unless your uniformity issues are particularly severe, you may have problems claiming them as a fault and hence getting recourse under warranty. If you already own a TV which has screen uniformity issues, the only solutions are to:

Reduce the brightness of the backlight on LCDs.
Avoid watching it in a very dark environment.
Install bias lighting behind the panel to decrease the perception of subtle brightness imperfections across the screen - see the Brightness, Black Level & Contrast Ratio section for details.

You should not obsess over screen uniformity, as most recent HDTVs will exhibit some form of uniformity-related problem. In practice there are few perfectly uniform displays, especially at larger screen sizes. The key issue is whether the problem is noticeable during normal viewing, and not just in artificial tests such as solid color slides which always highlight even the smallest uniformity issues. In general plasma beats LCD in terms of screen uniformity, as uniformity issues are inherent to LCD technology. On a plasma you have a greater chance of being able to return or replace the TV under warranty due to any uniformity issues, as these are not considered normal for plasma technology.

Power Consumption
As concern over the environment grows, so do concerns over the amount of energy used by HDTVs, particularly as the screen sizes keep increasing. Consumers are also worried about hidden costs in purchasing an HDTV, and don't want to pay large energy bills for the privilege of watching big-screen content. Manufacturers have responded to these concerns by introducing new technologies which have lowered the power consumption of both LCD and plasma TVs.

Plasma TVs are notably less efficient than LCD TVs. On a plasma each pixel emits its own light, and as resolutions and screen sizes - and hence the number and size of the pixels - have increased, plasmas consume more power to create a bright image. On an LCD, since it is the backlight which consumes power to create light, changing the number or size of the pixels filtering the light from the backlight does not have as large an impact on power usage. The backlight on an LCD is also a very efficient and extremely bright source of light, so its brightness can be increased without consuming large amounts of additional power. Both plasma and LCD have been subject to constant improvement to reduce their power consumption. On plasma, the introduction of new technology such as Neo PDP has allowed for larger phosphor cells which emit more light at lower power. On LCD, the use of LED backlighting, which is more efficient than CCFL backlighting, has further reduced power consumption. A range of "Eco" features have also been incorporated into TVs to reduce average power consumption, such as adjusting the brightness based on ambient lighting, though this can come at the cost of image quality. In short, these changes mean that the latest HDTVs can consume much less power than their predecessors, and the difference between the two technologies is now smaller than it has been in the past. On both plasma and LCD, the amount of power consumed varies depending on the brightness of the image. Furthermore, the relatively low manufacturer-quoted power consumption figures usually assume the Eco features are enabled, which as noted can reduce image quality. Therefore any real-world comparisons between the two technologies need to be made with the TVs calibrated to similar optimal levels of image quality and brightness. Comparisons must also be made at similar screen sizes, since larger screens will use more power, all other things being equal.

To see the actual difference, let's compare two CNET reviews of close-performing TVs at the same 55" screen size: the Panasonic TC-P55VT30 high-end plasma, and the Sony Bravia XBR55HX929 high-end local-dimming LED-LCD. The plasma tested at 133.11 watts at default settings and 285.63 watts calibrated; the LED-LCD tested at 127.96 watts at default settings and 61.39 watts calibrated. This is a typical pattern for these two technologies - plasmas are generally dimmer at default settings, particularly with the Eco-related options enabled, while LCDs are typically overly bright at default settings. So when calibrated, plasmas tend to use more power, and LCDs less. Both will also use more power in 3D mode, which requires more brightness to compensate for the loss of light due to the dimming effect of the 3D glasses - see the 3D section for details.

Back to our comparison, and the difference seems very large after calibration: 285.63 watts for plasma vs. 61.39 watts for LED-LCD. When the cost difference is calculated, based on US energy charges of around 11.5c per kWh and a usage pattern of 5.2 hours per day, it equates to $62.71 per year for the plasma vs. $13.66 per year for the LED-LCD. Over 5 years it would cost roughly $314 to run the plasma, while the LED-LCD would cost $68; an advantage of $246 to LED-LCD. So despite the improvements, there's still no contest in terms of power efficiency: plasma is notably more costly to run than an equivalent LED-LCD. However, when we examine total cost of ownership, the figures reverse. The plasma in our example retails for around $2,125, while the LED-LCD retails for $2,839. Add in the power usage costs and the total cost over 5 years is: plasma $2,439, LED-LCD $2,907; an advantage of $468 to plasma over the 5 year period. Even over a 10 year period the plasma will still be ahead by around $223. This data gives you various options, depending on your priorities:

If reducing your carbon footprint is the primary concern, then without a doubt a new LED-LCD is the way to go, ideally at a smaller screen size.
If ongoing running costs are a significant factor, then in practice the figures are not particularly staggering and either plasma or LED-LCD is fine - whether $62 or $13 a year, we're not talking large sums of money either way. Also keep in mind that if you use your TV for less than an average of 5 hours a day and/or you purchase a screen smaller than 55", the total cost will be less than shown in our examples.
If total cost of ownership is foremost in your mind, plasma is significantly cheaper due to its much lower initial purchase price.

Importantly, these comparisons don't take into account image quality differences and the various issues discussed throughout the rest of this guide. They also assume the TV will be calibrated; in reality individuals may use dimmer or brighter settings according to taste and viewing environment, which can change the comparisons. You also need to consider the fact that older HDTVs use more power, so even upgrading from an old to a new TV of the same type will yield noticeable improvements in energy efficiency if you don't wish to switch technologies. Finally, if your carbon footprint and general environmental impact are a major issue for you, then energy comparisons are often of secondary consideration, because they don't take into account the full impact of the manufacturing methods and the energy used to create these TVs, nor their disposal methods and any long-term environmental harm their components may do.
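The running-cost figures in this comparison can be reproduced with simple arithmetic. A hedged sketch in Python - the exact energy rate used in the original calculations isn't stated, so we assume roughly 11.57c/kWh, which lands within a few cents of the dollar figures quoted above:

```python
# Annual running cost: watts x hours/day x 365 days -> kWh per year x rate.
# The default hours and rate below are assumptions matching the comparison
# in this guide; substitute your own wattage, usage and local tariff.
def annual_cost(watts: float, hours_per_day: float = 5.2,
                rate_per_kwh: float = 0.1157) -> float:
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

print(f"Plasma, calibrated (285.63 W): ${annual_cost(285.63):.2f}/yr")
print(f"LED-LCD, calibrated (61.39 W): ${annual_cost(61.39):.2f}/yr")
```

Plugging in your own measured wattage, viewing hours and local electricity rate gives a personalized estimate.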

3D
The use of 3D in movies first became popular in the 1950s, utilizing a method known as Stereoscopy which merged two images taken at slightly different angles into one film. When viewed through a simple pair of glasses with two different colored lenses, the image appeared to have the three dimensional quality of depth.

In the 21st century, 3D has been resurrected for mainstream movies and now home theatre viewing once again through the use of special glasses. As before, the 3D movie is made up of two slightly different images, however the glasses used are more complex. In the simpler method used in the 1950s, the two separate images were different colors, and hence the different colored filters on the glasses directed the correct (left or right) image to the appropriate (left or right) eye. The tinted glasses didn't allow for optimal image quality, so the latest methods use glasses which are polarized (passive) or have special shutters (active) to filter out the images and direct the correct image to your left or right eye as appropriate. More on the glasses shortly.

One point of confusion we need to clarify for those new to 3D TV is the fact that a 3D HDTV is not made solely for viewing 3D content. An HDTV which supports 3D is just the same as a regular 2D HDTV; the 3D functionality is simply an additional feature which can be enabled on demand. In fact 3D TVs are typically slightly better at 2D content than their non-3D cousins, because a 3D TV requires higher refresh rates and faster phosphors, and these benefit 2D content as well. So there are no drawbacks to purchasing a TV with 3D capability, except perhaps the cost.

Aside from a 3D-capable TV, there are other requirements to make 3D work. There are additional 3D-specific parameters in the video source which require a 3D Blu-ray player (or a Sony PlayStation 3) to decode, and if connected via an AV receiver, it too must have 3D support (part of the HDMI 1.4 High Speed standard) to pass on these parameters without changing them. If your receiver doesn't support 3D - typically models made before 2010 do not - then you can still play 3D content via the receiver by purchasing a 3D Blu-ray player with two HDMI outputs: one output goes from the player to the TV for video, the other goes to the receiver for audio. The HDMI cables themselves should support HDMI 1.4, but in practice older HDMI 1.3 cables will often be fine as long as they have sufficient bandwidth (i.e. are rated as "High Speed") - see the Calibration section for more details on HDMI cables. Finally, the Blu-ray disc itself must be encoded for 3D for the optimal effect, though some TVs and Blu-ray players do allow for 2D to 3D conversion of normal video material, with varying results.

The biggest choice to make regarding 3D TV is whether you opt for passive or active 3D technology. A 3D TV will usually only support one of these methods, not both, and each method requires a different pair of glasses. Passive technology involves the use of glasses which have been polarized to block different kinds of light from each eye. The primary advantage of passive is that the glasses are lightweight and inexpensive, and work in a fuss-free manner, similar to 3D in the movie theatre. However, passive 3D means that each eye only receives part of the full 1080 lines of resolution, and only when combined across both eyes does the image achieve 1080p. This may sound like a technical quibble, but in reality it can result in a noticeable interlacing effect. Active 3D uses complex battery-powered glasses which employ LCD screens in their lenses to successively dim the left or right eye in sync with the image shown on the screen. The image is clearer than passive 3D, and some say it has a more convincing 3D effect, but the glasses are much more costly and bulkier, and can suffer from periods when the TV and the glasses lose sync. One last consideration is that virtually any passive 3D glasses will work with any passive 3D TV, but at the moment, active 3D glasses use proprietary technology and will only work with the same brand of active 3D TV. This is set to change in the future as 3D glasses become standardized.

In terms of the differences between LCD and plasma: plasma can reproduce each side of the image (for each eye) at a higher refresh rate than an LCD, which allows for smoother motion perception in 3D mode. The faster phosphors which 3D plasmas have also allow for a reduction in phosphor trailing, which benefits both 2D and 3D material. Because of their speed advantage, plasmas also exhibit less of a 3D phenomenon known as Crosstalk, which occurs in periods when the left and right images are not fully isolated, and a faint ghostly duplicate image appears in the scene. On the other hand, LCDs are capable of greater brightness, which is useful since 3D requires brighter imagery to compensate for the dimming effect of the glasses.

There are some other considerations when it comes to 3D:

Roughly 10% of the population cannot perceive 3D images, so 3D TV will have no impact on them. Try out 3D before purchase to see if you are one of these people.
A reasonable proportion of people also become nauseous, or experience headaches or fatigue, when viewing 3D content, particularly for an extended period. To reduce the negative impacts of 3D TV, manufacturers recommend sitting no closer than a viewing distance which is around three times the height of the TV. For a 50" screen which is just under 25" high, that means a viewing distance of 75" (6.3 feet) or more is recommended.
3D content is still relatively limited - at the time of writing, around 70 3D Blu-ray movies are available, and a few 3D TV Channels are broadcasting around the world. There is however the option of converting 2D to 3D if your equipment supports it.
Power consumption rises when using 3D, as you need brighter settings to compensate for the way the glasses dim the image when filtering for each eye.

In many respects, the choice of whether to buy a 3D TV is not an important one, because manufacturers are now adding 3D capability to their mid to high-end TVs by default. Even if you don't want it or won't use it, it's there anyway. Your choice therefore is whether you specifically want to opt for a passive or the more common active 3D TV, and whether the TV comes with any glasses or you have to purchase them separately - this can become quite expensive if you have multiple family members who want to view 3D at the same time. The only way to determine how good 3D looks, which type you prefer, and whether it's worth the additional investment in glasses, is to demo it in a store or at a friend's home. Be aware though that the 3D effect and any drawbacks such as crosstalk will also vary by content. For more details on 3D TV see this 3D FAQ.
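The three-times-screen-height guideline is easy to compute for any screen size. A small sketch in Python (function name ours), using the 16:9 geometry of an HDTV panel:

```python
import math

# Height of a 16:9 panel from its diagonal, then the 3x-height minimum
# viewing distance some manufacturers recommend for 3D.
def min_3d_distance_in(diagonal_in: float, multiple: float = 3.0) -> float:
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # 9 / sqrt(16^2 + 9^2)
    return multiple * height_in

d = min_3d_distance_in(50)
print(f'50" screen: sit at least {d:.1f} in ({d / 12:.1f} ft) away')
```

The exact geometry gives about 73.5 inches (6.1 feet) for a 50" screen, slightly less than the rounded 75-inch figure above, which treats the panel as a full 25 inches high.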

Source Material
Possibly the most important aspect of HDTV, but one which is too often ignored or misunderstood, is the quality of the source material. Displaying poor quality source material on even the best TV can result in disappointment, while displaying superb source material on a mediocre TV can make it seem much better than it really is. A very simple way to express this concept is "Garbage In, Garbage Out"; an HDTV's performance is only as good as the weakest link in the chain.

Contrary to popular belief, the advent of HD Video in the form of Blu-ray Disc has not meant that source quality is now unequivocally superb. To understand why, we need to briefly examine how a movie goes from source to disc to your screen, and the alterations and inevitable compromises which occur along the way. A typical movie is shot on film using an analog process. The difference between analog and digital video is important:

Analog video is obtained through a chemical process which captures and stores fluctuations in light as images on film. The level of detail of the stored image depends on the type of film and the camera technology used to capture it. An analog film image isn't stored at any particular resolution, but can be measured to have an equivalent resolvable level of detail of up to 6,000 lines or more.

Digital video is a digitally encoded sample of an original image. Its overall quality is determined not just by resolution, but by how much video data is encoded in the sample, measured in bits per second and known as its Bit Rate.

Most movies are shot on analog film of varying quality at 24 frames per second; this low frame rate is an industry standard resulting from decisions made early in the history of motion pictures. The actual film on which the movie is recorded is called the Master, and has the highest possible quality, given it's the original source. At this point, the key factor in the quality of the movie is the film format which was used. Using 70mm or 65mm film for example allows for higher image quality than standard 35mm film. Similarly, different types of 35mm film can be used depending on a film's budget and the look a director is after. This means that the amount of detail, film grain, the appearance of colors, and so on will vary from movie to movie based on the physical characteristics of the film medium and camera lenses used. These quality issues are all inherent in the source and will never go away unless the film is deliberately altered from its original form.
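To make the Bit Rate concept concrete, here is a minimal sketch which derives the average bit rate of a digital video file from its size and running time (the function name and the 40 GB / two-hour figures are purely illustrative, not taken from any particular disc):

```python
def average_bitrate_mbps(size_gb, duration_min):
    """Average bit rate in megabits per second (decimal units: 1 GB = 10^9 bytes)."""
    total_bits = size_gb * 1e9 * 8       # bytes -> bits
    seconds = duration_min * 60
    return total_bits / seconds / 1e6    # bits/s -> Mbit/s

# A hypothetical two-hour movie occupying 40 GB of a Blu-ray disc:
rate = average_bitrate_mbps(40, 120)     # ≈ 44 Mbps average
```

The same file size spread over a longer running time means a lower bit rate, and generally lower quality - which is why disc capacity matters so much.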

The master negative then typically goes through a Telecine process to scan and convert it into a format suitable for playback, also referred to as a transfer. Most common these days is a digitization process known as a Digital Intermediate (DI), which is done before the film is finalized. Either way, since the film is being digitized at some point, a resolution must be chosen for the resulting digital copy. These digital scans are frequently referred to in shorthand as a 2K Scan (e.g. a 2048x1536 resolution), a 4K Scan (e.g. 4096x3072), or even higher. Note that unlike the shorthand used to denote HDTV resolutions such as 1080p, these refer to the pixel width, not the pixel height, of the digital image. The digital scan is rarely the cause of any significant quality issues, as even a 2K scan has more resolution than Blu-ray's 1920x1080, and much more than DVD's 720x480.

If the film is showing any form of physical degradation, it may be cleaned and restored first. During the process of creating the final digital master, the film can also be remastered, allowing for a range of changes to the original which will be reflected in the final copy. For example, decisions may be made to digitally clean or alter the film, changing its look in terms of grain or color reproduction, or to include/exclude scenes which may or may not have been in the original master. These changes can definitely affect the image quality of the Blu-ray or DVD which we see.

Finally, the digitized master has to be converted into data which can physically fit onto a DVD or Blu-ray disc. A dual-layer DVD can hold up to a maximum of 8.5 GB of data, while a dual-layer Blu-ray disc can hold up to 50 GB. The original digital scan of a movie is uncompressed and at high resolution, and thus can be many hundreds of GB in size. So it must be reduced in size, typically using a video compression standard such as MPEG-2 for DVD, or H.264/MPEG-4 AVC for Blu-ray. This compression results in some loss of detail, the degree of which depends on how large the original movie data was before being forced down to less than 50 GB or 8.5 GB to fit on a disc. This, by the way, is the fundamental reason for Blu-ray's image quality improvement over DVD: it can hold almost six times more data, allowing for a higher resolution and bit rate.
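To see why compression is unavoidable, it helps to run the numbers. This hedged sketch estimates the uncompressed size of a film at Blu-ray resolution (assuming 24-bit color and ignoring audio, so treat it as a rough lower bound) and the compression factor needed to fit it on a 50 GB disc:

```python
def raw_size_gb(width, height, fps, minutes, bits_per_pixel=24):
    """Uncompressed video size in GB (decimal units), video only."""
    total_bits = width * height * bits_per_pixel * fps * minutes * 60
    return total_bits / 8 / 1e9

# A hypothetical two-hour film at Blu-ray's 1920x1080, 24 fps:
raw = raw_size_gb(1920, 1080, 24, 120)   # ≈ 1075 GB uncompressed
ratio = raw / 50                         # ≈ 21:1 just to fit a 50 GB disc
```

Even before audio and overhead, the encoder must discard or approximate over 95% of the raw data, which is why encoding quality and bit rate choices matter so much.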

As a result of the various processes involved in creating and transferring any movie on Blu-ray, DVD or Digital TV, the following quality flaws may be exhibited:

Compression Artifacts: Almost all consumer digital video content has been compressed from the original source to fit a particular media size or bandwidth limitation. This compression results in visual anomalies, the noticeability of which depends on the method and degree of compression, and the size of your screen. The effect is similar to taking a digital photo and reducing it to a much smaller JPEG file to obtain a manageable file size. One common form of visible anomaly is Macroblocking, which appears as large low-resolution blocks within the image. It is most commonly seen in Digital TV transmissions, which use a lower bitrate to save bandwidth, as well as in pirated movies, which use lower bitrates to achieve smaller file sizes for faster downloading.

The screenshot above from the movie Avatar is shown in minimally compressed form on the left, and with much greater compression on the right. Looking at the small image above, you may notice little if any difference between the two - this highlights that your screen size, and how close you sit to it, have a major impact on the visibility of artifacts, as discussed earlier. Click to expand the image and you should notice the difference: the compressed side shows visible loss and blurring of detail and macroblocking, most prominent on the blue tubes at the top left.

Film Grain: Most movies are shot on film stock which contains some level of Film Grain inherent to the process. Only more recent content shot entirely on digital cameras, such as the movie Avatar, may be devoid of such grain. A good way to check the level of film grain in any Blu-ray movie before purchasing it, or to see if it matches the level you see on your TV, is to look at the screenshots in the disc reviews at Blu-ray.com. People often blame their TV for dancing grain, when in reality it is frequently the grain in the original source which is visible, as shown in the screenshot below.

Ageing Film: Older movies are more prone to film grain, specks of dust, discoloration, blurriness and other visual anomalies resulting from the physical deterioration of the original film stock over time. This is made worse if a cheaper film format was used. The popularity or cultural significance of a film determines whether it is financially viable to do a full restoration on it, as a proper restoration which remains faithful to the original can run into the millions of dollars. Some films may never be properly restored, and even those that are may still show some age-related quality issues.

Poor Remastering: Some movies are only "restored" by altering the digital scan, not by restoring the original master. This opens them up to being altered in ways which are not faithful to the original source. The most common issues which occur when restoring movies during remastering are due to excessive Digital Noise Reduction (DNR) and Edge Enhancement (EE). DNR is designed to reduce unwanted image noise due to scratches, dirt, excessive film grain, and image processing factors. Edge Enhancement artificially enhances the sharpness of an image, similar to the way the Sharpness control works on your TV. Ideally, these methods shouldn't be used at all, or used very mildly, as they are not really a substitute for cleaning and restoring the original. A physical restoration is the only way to increase the detail in a film without using artificial methods. Unfortunately, since the average consumer expects a clean and crisp look to movies on Blu-ray, DNR and EE are sometimes employed in a heavy-handed fashion, introducing visual anomalies to the film, such as loss of fine detail, and haloing around edges. A recent and very controversial example of this is the remastering of the first Predator movie.

The screenshot comparison above shows that the original Predator release on Blu-ray, on the left, has noticeable grain. The newer Ultimate Hunter Edition release of Predator on Blu-ray has been digitally altered from the original, such that it is much cleaner and devoid of grain. However, if you click to enlarge the image and look closely, the new version has also been brightened and the colors altered (e.g. look at the hat). And due to excessive use of DNR, Carl Weathers' face now looks waxy and artificial. This is an example of remastering which is not faithful to the source.

Rescaling: If a movie or TV broadcast is encoded at a resolution which does not exactly match your TV's native resolution, then the image must be rescaled to fit the screen. This scaling is done automatically by a Video Scaler at some point, whether within the media player, the AV receiver, or the TV itself. For example, a Blu-ray player will upscale a DVD from its original resolution of 720x480 pixels to 1920x1080 pixels. However, the upscaling process can't add detail to the original information, so no matter how well it's done, there will always be some reduction in image quality from spreading a fixed amount of video data over a larger number of pixels. Different video scalers handle upscaling with varying degrees of image quality loss. The impact of scaling on an image can include softening, jaggedness on edges, and effects similar to compression artifacts.
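The pixel arithmetic behind upscaling is straightforward. This illustrative sketch shows the per-axis scale factors when a 720x480 DVD frame is stretched to 1920x1080 (ignoring aspect-ratio handling for simplicity) - the roughly sixfold increase in pixel count is detail the scaler must interpolate rather than recover:

```python
def scale_factors(src, dst):
    """Per-axis scale factors, plus the ratio of output to input pixel counts."""
    sw, sh = src
    dw, dh = dst
    return dw / sw, dh / sh, (dw * dh) / (sw * sh)

fx, fy, fp = scale_factors((720, 480), (1920, 1080))
# fx ≈ 2.67, fy = 2.25, fp = 6.0 - six output pixels per source pixel
```

Every one of those extra pixels is an estimate made by the scaler's interpolation algorithm, which is why upscaled material softens rather than gains detail.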

In general, the exact same transfer of a movie will always look better on Blu-ray, because it requires roughly six times less compression than on DVD; in turn, the DVD version will often look better than the Digital TV version, because bandwidth limitations often result in lower bitrates on Digital TV than on DVD. However, each and every movie will differ in appearance and overall quality based on the various factors covered in this section. The Blu-rays with the best overall quality are known as "Reference Quality", because they can be used as a reference source or baseline for comparing the quality of other Blu-rays. These reference quality Blu-rays will generally look good on virtually any display, which is a major reason why you may see them used as demo material in most stores. To get an indication of the most visually impressive Blu-ray sources, you can refer to this Blu-ray.com List of discs reviewed as having the highest-rated video quality, or this Blu-ray Picture Quality Tiers listing on AVS Forums.

Keep in mind however that just because a Blu-ray looks clean and crisp doesn't mean it's actually a good transfer, or free from flaws. Filmmakers over the years have used a variety of methods and different types of film to capture movies. Some movies are supposed to look grainy, or have overly bright or muted colors, or appear blurry. This is what the director intended when making the film, or it was part of the limitations of the production method used, and altering or removing these aspects means the transfer is no longer faithful to the original source. This may seem like an esoteric point of relevance only to film geeks, but in fact it's a practical one: you can't generate new information out of nothing. If the source has limitations, the best you can do is attempt to physically clean and restore the source to try to reveal more of the information already in it, or use subtle digital restoration to recreate the original look. Heavy-handed digital alteration of the transfer to make it look cleaner and sharper just for the sake of making it look "high definition" typically only adds artificial information which obscures the original information in the source.

Aside from the Predator debacle mentioned earlier, two more examples illustrate this point. The first is the movie 28 Days Later. This film was heavily criticized when released on Blu-ray as having terrible picture quality. What many people do not realize is that the original movie was intentionally filmed on a standard 2001-era hand-held consumer digital video camera, so it will never look any better. As such, the Blu-ray version is actually a very good and faithful transfer of a relatively poor quality video source.

The second example is The Godfather Collection on Blu-ray. This classic trilogy was completely restored for Blu-ray through a very expensive and time-consuming process overseen by the director Francis Ford Coppola. The result is extremely faithful to the look of the films when they were first released in theaters. This includes a lot of film grain and a brownish/yellowish tinge, which some have mistaken to mean the film is a poor Blu-ray transfer. This is definitely not the case. This is quite possibly the best these movies will ever look. Any alterations to these movies to remove grain or change the color will actually remove, obscure or distort the original information in the source rather than enhance it. A comparison of the Godfather releases is available here.

Importantly, if the source material contains any of the flaws discussed in this section, whether faithfully reproduced from the original or the result of bad remastering or low bitrates, they will be seen on even the best HDTV, and the blame should not be placed on the TV. Any flaws will also become more evident on a larger screen and/or at closer viewing distances. Furthermore, the same source flaws may be more visible or exaggerated if your TV's settings are inaccurate. All of this means that you should consider the quality of your commonly viewed source material, and any inherent flaws it may have, before deciding on a particular TV to purchase, how close to sit to it, and before diagnosing any perceived image quality problem on your TV.

Some more general source-related tips:

Black Bars: All movies have an Aspect Ratio which describes the ratio of the width to the height of the actual movie image. Although Blu-ray discs have a 1920x1080 resolution, and DVDs have 720x480 resolution, the actual portion of that resolution which contains an image may be smaller for some movies or TV shows. Where this is the case, black bars will appear above and below, or to the sides of the image, when it is played back on an HDTV. Check the Technical Specs on the IMDB page for a film to see the original aspect ratio of the movie as shot, and also check the Blu-ray.com page to see both its original aspect ratio and the aspect ratio as transferred to disc. You can also find the aspect ratio typically specified somewhere on the back of the Blu-ray or DVD case.

Where a movie or TV show is indicated to have a 4:3 or 1.33:1 aspect ratio, then black bars will appear to the sides of the movie or TV show when played back on a widescreen TV; a 1.85:1 or similar aspect ratio source will typically fill the entire screen; a 2.40:1 or similar ratio will result in black bars above and below the image on the screen. This is completely normal and stems from the fact that most movies are originally designed for viewing in a cinema, which typically has a wider screen than a standard 16:9 widescreen TV, while older TV shows and very old movies were designed for the squarer traditional 4:3 non-widescreen TVs and cinema screens respectively. The only way to make these images fill the entire screen on an HDTV is to expand them using the Zoom or Aspect Ratio functionality of your TV or DVD/Blu-ray player. This will both reduce image quality and crop off parts of the image. Note that if you see very small black bars or thin white lines or dots to the sides or above and below any image, you may also wish to turn on the Overscan feature on your TV as covered in the Calibration section.

Region Coding: All Blu-ray movies and DVDs are region coded to control the way in which they are distributed globally. DVD region codes are specified here and Blu-ray region codes here, and they are not the same. Some Blu-ray players and DVD players are actually region-free when it comes to DVDs, so they should be able to play DVDs purchased from anywhere in the world; you will need to determine this for yourself, as there is no master list of such players. However, the majority of Blu-ray players are not region-free for Blu-ray playback, so you need to be careful when purchasing Blu-rays from overseas. You can often find extremely good deals on Blu-rays from places like Amazon.com or Amazon.co.uk, so the best thing to do is check the RegionFreeMovies site to determine the actual region code(s) for a particular Blu-ray title, to ensure it can be played on your Blu-ray player. Some discs are region free, meaning they do not enforce any region coding and can be played anywhere in the world.

Blu-ray Playback Issues: There are some quirks to Blu-ray playback which require further explanation. The two most common complaints are longer loading times for Blu-ray discs than for DVDs, and the inability to stop and then resume from where you left off on certain Blu-ray discs. The loading time issue relates to the fact that during startup, the player needs to preload some information before the disc commences playback: Blu-ray discs contain complex software that must load and execute to provide advanced functionality compared with DVDs. Loading times vary by hardware, with newer generations of player generally loading faster, and players with greater processing power, such as the PS3, also being faster. The stop and resume issue is not specific to certain players; it relates to the fact that some Blu-ray discs are programmed with Java to provide additional functionality, and discs which are so programmed cannot simply be stopped and resumed - the resume function must be supported within the software on the disc, not by the player. Some discs allow you to use the Bookmark button on your player's remote to tag a particular section of a disc so that you can resume from there at any time. Discs which don't use Java can be stopped and resumed just like a DVD.

Blu-ray Firmware Updates: As noted, Blu-ray players need to execute complex programming on the discs, and as newer versions of this programming are included on recently-released discs, incompatibilities may arise which result in problems. Furthermore, the player's own software may be optimized by the manufacturer over time to improve performance or add new functionality. You should therefore check regularly for updates to the firmware - the built-in software - on your Blu-ray player. Most players allow you to do this via a wired or wireless Internet connection on the player itself, but you can also check your manufacturer's website for updates, then download and apply them manually by first burning them to a CD or DVD. Updating your firmware is an important step in ensuring trouble-free source playback.

Finally, calibrating your TV is the best method of accurately reproducing the source image and removing any unintended flaws which your TV is introducing to the source, and we look at this in the next section.

Calibration
You may have heard the term before, but just to be clear, Calibration of an HDTV is the process of configuring your TV's settings so that it provides the most accurate and pleasing image quality. Given the large number of brands, models and settings available across HDTVs, there's no way for me to provide recommended settings for each TV. In any case, using other people's settings is usually not optimal, given everyone has different viewing environments, source components and tastes. The best way to correctly configure your TV is with a combination of understanding what the various basic settings do, knowledge of the general procedure to run through, and awareness of the inherent limitations of each type of technology, as covered in the rest of this guide.

Throughout this section you will repeatedly see the term "accurate" used. Calibration of a TV is an attempt to get it as close as possible to accurately reproducing source material the way it was intended. This is usually done by changing your settings so that the results on your screen match known standards for HD video material. The result is that if an actor is wearing a pale red shirt in a movie for example, a calibrated TV will show that shirt in exactly the same shade of red you would see if you were looking at the actor in person. If that pale red shirt comes across as a neon bright red shirt on your TV, it's not accurate.

In reality, some people will not like an accurate image, for a range of reasons. This depends in large part on what a person has been conditioned to seeing. More often than not, the average person will be used to oversaturated colors, and will prefer cooler rather than warmer color temperatures. Thus when their TV is calibrated to the correct Rec. 709 standard, whites will appear slightly yellowish to their eyes, and colors may seem muted. Similarly, the correct Contrast and Brightness settings may not represent a bright and vivid enough image, and the correct Sharpness may not look crisp enough to their eyes at first. Since personal taste is involved, there is scope for you to adjust the settings to reach a reasonable compromise between accuracy and the image which most pleases your senses. However, it is always best to start off with the correct settings and then fine-tune them as you wish.

Initial Steps
There are several initial considerations before doing any type of HDTV calibration:

Make sure the TV is set to Home mode, not Shop or Dynamic mode, if such modes are available on your TV - check your manual for details. Shop or Dynamic mode is primarily designed for getting the customer's attention in a brightly lit store environment, and is not accurate, safe or energy efficient for home use.

On a plasma, you should first age the phosphors for at least 100-200 hours, ideally up to 500 hours or more if possible, before any calibration - see the Image Retention, Burn-in & Dead Pixels section for more details. Any calibration done before this break-in period is completed may have to be redone as the phosphors continue to settle.

Perform the calibration in the lighting you normally use to watch the TV. That is, don't calibrate the TV in bright daylight if you mostly watch at night, and vice versa, because this will result in incorrect settings for your viewing environment. If you watch the TV in both brighter and darker lighting, then calibrate two separate picture presets to suit each environment.

Cables
An important aspect of getting optimal image quality is the cabling you're using. This requires a brief run-through for those unfamiliar with the different types. The hierarchy of common cabling, from worst to best, is as follows:

Composite: This type of cable is among the most basic, and carries video on a single cable (typically yellow). It may be accompanied by two other cables (typically red and white) for the left and right audio channels respectively. This analog cable is adequate for standard definition video up to 576i (720x576), but should be avoided on HDTV setups as it cannot carry high definition video. You can however use composite cabling for connecting simple audio-only devices to the TV or AV receiver.

S-Video: A single analog cable carrying video in standard definition format up to 576i (720x576). It is not much better than Composite cabling, and should be avoided on HDTV setups as it does not support HD video.

Component: Much better than S-Video or Composite, a Component cable splits the video signal into three separate components, but carries no audio. These analog cables can support high definition video up to 1080i, and at times 1080p (1920x1080), but may face restrictions since they do not support HDCP (see further below).

Optical: Toslink, known more commonly as Optical Audio, is an audio-only digital cable. It carries no video data, and supports most audio formats, but not the highest quality TrueHD and DTS-HD formats.

DVI: Digital Visual Interface (DVI) is a digital cable type which carries only video, not audio. It supports Full HD 1080p, and resolutions up to 2560x1600 at 60Hz. It also supports HDCP and provides the same quality video as HDMI, so it is fine to use DVI where you have a limited number of HDMI ports available.

HDMI: High Definition Multimedia Interface (HDMI) is a digital cable which carries both video and audio on the one cable. It supports Full HD 1080p, and resolutions up to 4096x2160 at 24Hz, or 2560x1600 at 75Hz. It also supports high definition audio, including the highest quality Dolby TrueHD and DTS-HD audio standards. It supports HDCP and is currently the optimal choice for HDTVs.

If you use a non-digital cable you may experience noticeable degradation in image quality on an HDTV. More importantly, the High-bandwidth Digital Content Protection (HDCP) encrypted copy protection method implemented on Blu-ray discs means that DVI or HDMI cable use is essential. Using cables and devices which do not support HDCP can in many cases result in the content either not being played back at all, or being played back at reduced resolution; this is up to the content provider to determine. The HDCP "handshake" as it's known - the verification of HDCP secure status between the playback device and the TV - is also one of the reasons why you may see your screen briefly blank out one or more times when turning on the player or initiating playback of a Blu-ray disc.

The quality of HDMI cabling is another important issue which needs further explanation, as it is a major source of confusion. This confusion has arisen due to what's been dubbed the HDMI Cable Scam: cable manufacturers who create inordinately expensive HDMI cabling with dubious promises of quality improvements over cheaper cables. Various reviews such as this one, this one and this one back up the fact that in general, cheaper HDMI cables perform the same as expensive ones. As this Analog vs. Digital Cable article explains, since HDMI is transferring information in digital form, if the HDMI cable experiences any interference or signal degradation, the resulting transmission errors will show up as freezing or dropped frames, and other very noticeable glitches. This is similar to what happens when you have a poor quality signal when watching a Digital TV broadcast. In other words you will definitely know if your HDMI cable is not performing well for any reason - it won't show up as marginally increased grain, or less vibrant colors for example. The subtle improvements in image quality which expensive HDMI cabling claims to bring are non-existent, and hark back to the days of analog cabling.

In short, do not buy expensive HDMI cabling, especially for shorter cable runs of up to 6 feet (2m). HDMI cables retailing for as little as a few dollars each are just as good. As long as the cable is certified to meet the HDMI standard, and is classified as High Speed (the capability formerly labelled HDMI 1.3 or 1.4), it should have sufficient bandwidth to meet all of your needs, including full 1080p 3D TV, without any problems, and with exactly the same image quality as the most expensive cabling. The only potential benefits of more expensive cables are more robust connectors, and better shielding to prevent signal interference on very long cable runs.

Make sure you finalize the source components and cable connections before calibrating your TV, as changes in the source or cabling, or changing the ports on the TV to which you connect your components, will alter the accuracy of your settings. For best results, you should have every video output going directly from your source to your TV via HDMI cables, as this prevents the image being rescaled or otherwise reprocessed multiple times, and hence avoids any loss in quality between the source and the screen. If this is not possible, you can connect your various components to an AV receiver and connect the receiver via a single HDMI cable to your TV. In reality there are a multitude of component/receiver/TV configurations which are possible, and it will require some thought and planning on your part to ensure that (a) the signals go through the least number of transitions before reaching your TV; and (b) each component, as well as any AV receiver intermediary, is configured so that the signal is not manipulated in any way between the source and the destination. Refer to your receiver, component and TV manuals for more details.
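For a sense of why certified High Speed cabling comfortably carries 1080p, this rough sketch computes the raw pixel data rate of a 1080p60 signal. Note this is a simplification: the actual HDMI link rate is higher because of blanking intervals and encoding overhead, so treat the result as a lower bound rather than the true link bandwidth:

```python
def video_bandwidth_gbps(width, height, fps, bits_per_pixel=24):
    """Raw (pre-link-encoding) pixel data rate in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

rate_1080p60 = video_bandwidth_gbps(1920, 1080, 60)  # ≈ 3.0 Gbps of pixel data
```

A cheap certified High Speed cable has headroom well beyond this, which is why a working budget cable delivers bit-for-bit the same picture as a premium one.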

Basic Calibration

The picture controls most of us would be familiar with include the Picture mode, and the Contrast, Brightness, Color and Sharpness controls. In addition to these, the Tint, Gamma and Color Temperature controls should be considered among the settings you can configure in a basic calibration. The term 'basic calibration' refers to the fact that this isn't an instrumented method as such - anyone can perform these tests and set the appropriate value by eye alone. It doesn't provide optimal results, but it should provide relatively pleasing results with a minimum of fuss. For a proper instrumented calibration see Advanced Calibration later in this section.

To assist in basic calibration, you will need HD test material which provides an objective way of measuring the accuracy of the image on your TV. The best free test material I've found is the AVSHD 709 Calibration Disc. Download the AVCHD version from the link above and burn it to a DVD for use on your Blu-ray player. Another source of calibration test material for either a Blu-ray player or a DVD player is the free THX Optimizer, which comes on several THX-certified Blu-ray and DVD movie discs as listed here. You can also buy specialized calibration discs such as the Digital Video Essentials HD or Spears and Munsil Blu-rays. The instructions below refer to the AVSHD method.

Picture Mode: Choosing the correct picture preset, also known as the picture mode or viewing mode, forms the foundation for all your other settings, so it must be selected carefully. Modern HDTVs come with several such modes, usually accessible at the top of the TV's Picture menu. It is important to note that each preset isn't simply another storage point for your basic settings; the various picture modes have each been factory calibrated with fundamental differences in the color reproduction method they use, the types of video processing applied to the image, and the overall level of image brightness - even at the exact same basic settings across all presets.

The presets come with various names such as Dynamic, Vivid, Cinema, True Cinema, Movie, Film, Standard, Custom, THX, Game and so forth. At least one of these modes on most recent TVs should be close to being accurate (i.e. meeting the Rec. 709 standard). You will have to do some research to find which mode this is on your particular TV. For example, on recent Panasonics this will be the THX, Professional or True Cinema mode; and on Samsungs this is the Movie mode. It's very important to select the correct mode if you want an accurate reproduction of colors and optimal image quality, so take your time finding the right one before you proceed with any other changes.

Brightness: This control effectively determines what is known as the Black Level - the brightness or darkness with which blacks are presented. Higher values will result in a washed-out milky look in darker areas of the image, while reducing black level too much results in what is known as Black Crush - deeper looking blacks at the cost of loss of detail in darker areas.

The correct value can be determined using the AVSHD calibration disc. On the AVSHD main menu select the 'Basic Settings' item at the top and the very first screen you will see is used to determine the correct brightness. Adjust your Brightness control so that only the bars numbered 17-25 at the right of the screen are visible and flashing. You can then press the Chapter Forward button on your remote to go to the next screen, where a correct Brightness setting should allow you to see at least bars 19-28 flashing in the middle dark area of the screen. Once you've found the correct Brightness setting, you may wish to lower it one or two notches further. This can help produce a slightly deeper look to the image and will also reduce some noise in the image, without any significant loss of detail in black areas.
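The bar numbers on these patterns correspond to 8-bit video levels, where 16 is reference black in the standard "limited range" video encoding. The logic behind the test can be sketched in a few lines (a simplified model for illustration, not the disc's actual behavior):

```python
# Consumer video uses "limited range" levels: 16 = reference black,
# 235 = reference white. Bars at or below 16 are "blacker than black"
# and should stay invisible; bars just above 16 (17-25) should be
# faintly visible when Brightness is set correctly.

REF_BLACK, REF_WHITE = 16, 235

def normalized_level(video_level):
    """Map an 8-bit limited-range video level onto a 0.0-1.0 signal."""
    return max(0.0, min(1.0, (video_level - REF_BLACK) / (REF_WHITE - REF_BLACK)))

for bar in (10, 16, 17, 25):
    n = normalized_level(bar)
    state = "invisible (at or below reference black)" if n == 0 else "faintly visible"
    print(f"bar {bar}: signal {n:.4f} -> should be {state}")
```

A Brightness control set too high effectively lifts the sub-16 bars above zero, which is why they become visible and blacks look milky.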

Contrast: This control generally determines how bright the screen will be. It is also known as the White Level, in that it determines how bright whiter areas will appear. Its impact will differ on LCDs which have a separate Backlight brightness control. On such LCDs, the Backlight brightness setting determines the overall brightness of the screen, while Contrast will control how much detail is visible in white areas. Use a combination of both settings when conducting the tests below.

Go to Chapter 3 of the Basic Settings section of the AVSHD disc. The numbered bars from 230-234 should always be flashing; if not, there's something wrong with the way your devices are outputting the image and you will need to investigate further. You can raise Contrast until only the bars 230-244 are visible, at which point any more Contrast will be too much. However this is not necessarily the ideal level; there are several important considerations when setting Contrast:

Eye fatigue and discomfort - A Contrast which is set too high, particularly in darker viewing environments, will result in eye strain due to the excessively bright nature of whites. So one test of whether you have the correct Contrast on your TV is whether it remains comfortable to watch after extended viewing.

Image Retention and Burn-in - On plasmas, using a Contrast which is set too high greatly increases the chance of image retention and possibly burn-in. If you frequently see clear signs of image retention on the screen after brighter scenes, and they don't disappear after a couple of seconds, then your Contrast is likely too high.

Energy usage and buzzing - A higher Contrast setting will mean the TV is using more energy, and this can also increase the background buzzing noise on plasmas.

The correct level of Contrast is difficult to give as a precise value. The aim is to have a reasonably bright image, but one in which fine details in white areas are not lost and the image is not uncomfortable to view. As long as bars 230-234 remain visible in the AVSHD test mentioned above, you can set any reasonable value which suits your personal circumstances. Avoid an extremely high Contrast (and/or Backlight brightness) setting, as it is harmful to both your eyes and your TV.
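The role of the 230-234 bars can be illustrated with a toy model of white clipping (the clip points below are illustrative, not measured from any real TV):

```python
# When Contrast is too high, the display clips levels near reference
# white (235) to the same output, so the near-white bars stop being
# distinguishable and their flashing disappears.

def display_output(video_level, clip_at=235):
    """Toy display transfer: levels at or above clip_at render identically."""
    return min(video_level, clip_at)

# With Contrast set correctly (clipping only at reference white),
# bars 230-234 all render as distinct shades:
distinct_ok = len({display_output(v, clip_at=235) for v in range(230, 235)})
# With Contrast too high (an effective clip point of 230), they merge:
distinct_bad = len({display_output(v, clip_at=230) for v in range(230, 235)})
print(distinct_ok, distinct_bad)  # 5 distinct shades vs 1
```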

Color: This control is also known as Color Saturation, and controls how bright or dull colors appear to be. The higher the Color control, the richer the same color will appear. However this control can't correct major deviations in color accuracy, such as when whites and greys appear overly green or yellow - for that you must first use the correct Picture mode as discussed earlier, then use the Tint control if available. The process to adjust the Color control is covered below.

In the Basic Settings section of the AVSHD disc, go to Chapter 4 where you will see a screen which contains several colored bars and flashing color boxes within them. To complete this test, you will need to view the image through a special blue filter, available in one of several ways:

If your TV has a 'Blue Only Mode', turn that mode On for this test.
Buy the THX Blue Filter Optimizer Glasses.
Buy the Lee Calibration Filter Pack.
Buy or borrow a calibration disc like Spears and Munsil or DVE (either the HD or non-HD version), as they contain the blue filter.

With the test image viewed through the appropriate blue filter, adjust the Color control so that the grey bar with the flashing blue box, and the blue bar with the flashing grey box (marked 'Color' at the far left and right of the screen), both come as close as possible to being uniform solid blue bars with minimal flashing visible in the boxes. Once this is done, you can refine the colors further by using the Tint control if available. Adjust Tint so that the flashing magenta (pinkish red) and cyan (light blue) bars and boxes marked 'Tint' also reduce their flashing. You may have to readjust the Color control after adjusting Tint.

In practice this is a rough but reasonable way to arrive at a desirable color saturation. Once you have set it via this method, the Color control will inevitably need to be fine-tuned as you view different source material, until you reach a good compromise. Make sure you use source material for which you are already familiar with the intensity of the colors used, as color reproduction varies by source. Pay careful attention to reds, pinks and oranges in particular: if these appear to "flare" around the edges, and people's faces look slightly sunburned, then Color is set too high. Similarly, objects like trees and grass should have a natural appearance, and not look plasticky or neon-like.

Ultimately, color is one of the trickiest things to get right, especially as most TVs have some inherent color bias towards red or green (even in their most accurate picture mode). This means dialing down the Color control can make some colors look right while simultaneously leaving others over- or under-saturated. This is where Tint helps, by adjusting the overall balance of red, green and blue.

Sharpness: This control is widely misunderstood. Most people realize that it controls the crispness of the image, but don't fully comprehend that on an HDTV, it is an artificial technique which adds undesirable elements to the image to make it look sharper. Just like Edge Enhancement discussed under the Source Material section, as the Sharpness control is raised, the image is altered so that where edges are detected, subtle highlights are added around them to raise the perceived contrast around the edge. This makes edges look sharper, but it creates white halos around them and can also reduce detail in other areas of the image. Essentially, raising Sharpness doesn't draw out any more information from the video source. The amount of detail in the source remains constant - all Sharpness does is to highlight the more prominent details at the cost of loss of more subtle detail, along with the introduction of visual glitches. For this reason, it is recommended that Sharpness be set as low as possible.
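To see concretely how sharpening creates halos, here is a minimal sketch of a one-dimensional unsharp mask applied to a step edge; the overshoot and undershoot it produces are exactly the bright and dark outlines described above:

```python
# Sketch: a simple unsharp mask. Each pixel is pushed away from its
# local average, which exaggerates edges but creates values brighter
# than anything in the source (a "halo") on one side of the edge,
# and darker values on the other.

def sharpen(signal, amount=0.5):
    """1-D unsharp mask: pixel + amount * (pixel - local average)."""
    out = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, len(signal) - 1)]
        local_avg = (left + signal[i] + right) / 3
        out.append(signal[i] + amount * (signal[i] - local_avg))
    return out

edge = [0.2] * 5 + [0.8] * 5          # a dark-to-bright step edge
sharpened = sharpen(edge)
print(max(sharpened) > 0.8, min(sharpened) < 0.2)  # overshoot and undershoot
```

Note that the sharpened signal contains no information that wasn't in the original edge; it only exaggerates the transition.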

On the AVSHD disc, Chapter 5 of the Basic Settings section has a screen where you can examine the impact of the Sharpness control. Look closely at the edges of any dark line or area - there should be no hint of white halos or ghostly outlines of any type. Also look at the dark curved lines - they should be smooth along their length. Keep lowering Sharpness until the outlines, and the white dots in the middle of the main box, start to appear blurry. On a decent HDTV using a Blu-ray player connected via DVI or HDMI, this should not occur even at a Sharpness setting of zero.

Initially, because you may be used to higher sharpness in the image, you may find reducing the Sharpness control will make things look too soft to your eyes. You will have to allow yourself time to get used to the reduction in artificial edge enhancement. A Sharpness of zero is ideal, as this allows an HD image to come through unaltered. However some people will find this setting a bit too blurry when viewing DVD material, even after allowing themselves to adjust. In that case I recommend using a Sharpness in the range of 5% to 10% as a compromise. This prevents DVDs from becoming too soft, and also adds a small amount of crispness to Blu-rays without any noticeable artifacting. It can also help counter the slight softening effect which Overscan introduces, if you use it.

Tint: This control should only be set in combination with the Color control, and using the appropriate test screen, as covered further above.

Color Temperature: This control determines the temperature of white, which according to the relevant D65 standard should be equivalent to around 6,500 K (Kelvin), similar to ambient daylight at midday. If the Color Temperature is overly "warm" - that is, well below the 6,500 K target - whites and greys will start to take on a yellowish or reddish tinge. At Color Temperatures well above 6,500 K, known as "cool", whites and greys will progressively turn more bluish, especially around the 9,000 K mark.

Using the correct Picture mode should get you close to the correct 6,500 K color temperature - typically this is a THX, True Cinema or Movie mode, as discussed earlier. Color Temperature presets of Cool, Normal and Warm in some picture modes also allow you to get close to an accurate level by selecting the Warm or low value. However, because most people are used to a cooler, slightly bluish temperature for white, an accurate setting may look incorrect and slightly reddish at first. You will need to allow your eyes to adjust, and if you still have doubts, try the Normal or middle value as a compromise. The only way to precisely set Color Temperature is to do an Advanced Calibration.

Gamma: This control isn't available in all picture modes, and even if it is, it shouldn't be adjusted unless you understand its complex nature. Known more formally as Gamma Correction, it basically determines the gradation of brightness on your TV - the amount of brightness for any particular level of light specified in the source.

The reason Gamma exists is because of the way the human eye perceives light. Our eyes are much more sensitive to changes at lower levels of light than to the same changes in brighter areas. So while an imaging device like a camera typically has a linear (1:1 or straight line) relationship between light input and output, our eyes have an exponential (power or curved line) relationship, similar to Gamma. Source material such as movies takes advantage of this property by encoding brightness to use more data at the darker end, and less at the brighter end. This results in more efficient use of RGB data, such that an 8 or 10 bit color depth with Gamma encoding can be just as effective as a 12 or 14 bit color depth without it - see the Color Reproduction section for details on color depth.
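A quick sketch of the arithmetic involved (assuming an idealized 2.2 power-law encoding and 8-bit quantization):

```python
# Gamma encoding stores V = L**(1/2.2) before quantizing, which devotes
# far more of the 256 available code values to the dark range, where the
# eye is most sensitive.

GAMMA = 2.2
LEVELS = 256  # 8-bit encoding

def codes_below(luminance, encoded=True):
    """How many of the 256 code values fall at or below a given luminance (0-1)."""
    v = luminance ** (1 / GAMMA) if encoded else luminance
    return round(v * (LEVELS - 1))

# The darkest 10% of the luminance range gets roughly 3x as many code
# values with gamma encoding as with a plain linear encoding:
print(codes_below(0.1, encoded=True), codes_below(0.1, encoded=False))
```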

On traditional CRT monitors, this source Gamma encoding is automatically compensated for by the display's inherent input/output characteristics: a CRT has a nonlinear relationship between voltage input and light output which is roughly the inverse of the gamma encoding in the source, and the two cancel each other out, resulting in a perceptually smooth gradation of brightness. An HDTV is not like a CRT however, and has a more linear relationship between input and output. So a form of Gamma correction must be applied on an HDTV to convert the source Gamma encoding into something which visually looks smooth. This is what the Gamma control on an HDTV does - it determines the shape of the exponential curve used for this Gamma correction. There is a lot of debate regarding the correct value for the Gamma control, but in practice the standard target is around 2.2 to 2.35.

In plain English: setting Gamma too high will make the image generally darker than the source intended, and will reduce detail in dark areas, while setting Gamma too low will give an overly bright and washed-out image. For correct image reproduction, set your Gamma control to the available option closest to the target range, such as 2.2 or 2.4. This must be done in combination with the correct black level and white level, as covered under the Brightness and Contrast controls further above, so that the gradation of brightness on your TV matches what the source material intended. For more details see this Gamma FAQ.
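As a simple illustration of why the Gamma value matters, here is the effect of different display gamma exponents on a mid-grey signal, using the simplified transfer function output = input ** gamma:

```python
# A higher gamma exponent pushes mid-tones down (a darker image);
# a lower exponent lifts them (a brighter, more washed-out image).

def display_luminance(signal, gamma):
    """Simplified display transfer: output luminance = input signal ** gamma."""
    return signal ** gamma

for gamma in (1.8, 2.2, 2.6):
    out = display_luminance(0.5, gamma)
    print(f"gamma {gamma}: a 50% input signal renders at {out:.2f} luminance")
```

This is why the same source looks noticeably darker at gamma 2.6 than at 2.2, even though the black and white endpoints are unchanged.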

Image Processing: There are a multitude of options on any modern HDTV designed to ostensibly improve the image by manipulating it in one way or another. The same option may also vary in name from brand to brand, as some manufacturers give them proprietary titles. As such, there are too many of these options to individually cover in this guide. The general rule is this: turn off as many of these image processing enhancements as you can, because with few exceptions, they will reduce image quality rather than improve it. This is because they are all post-processing features, so called because they apply image processing after the source image has been received in the TV, altering the source from its original form. You should of course experiment to see what each option does, and refer to the manual and do further research for details, but again, the majority of these extra features can and should be turned off.

Below are a few of the more common image processing options you may find:

Edge Enhancement - Also known as Resolution Enhancer or a similar such name, this is much like your Sharpness control as covered earlier in this section, and the edge enhancement applied to some sources as covered in the Source Material section. You can experiment with using it at a mild setting if you find a poor source to be overly blurry, such as low quality Digital TV broadcasts. Otherwise it is best left Off.

Noise Reduction - Also known as P-NR or DNR, this can similarly be used at a mild setting on very poor sources, such as excessively grainy old films, to smooth (blur) the image slightly, but again, it is best left off if you don't want to lose detail in the image.

Color Enhancement - Options such as Vivid Color, Deep Color and the like will extend the color gamut and oversaturate the image, resulting in a loss of natural tones. None of the common sources such as Blu-rays or DVDs can actually use an extended color gamut - see the Color Reproduction section for details. You can experiment with these options if you wish, but for strict color accuracy turn off all such color enhancements.

Motion Smoothing - These options come under a wide variety of proprietary names, and their fundamental principles are discussed in the Motion Handling section. If running a plasma, you should disable all forms of motion interpolation or motion smoothing to avoid an artificially smooth look to motion. On an LCD, you may need to keep some level of motion interpolation enabled if motion on your TV looks blurry or undesirable without it.

Overscan: Rescales the image by expanding it slightly to ensure that there are no gaps at the edges of the image. You can test for the degree of Overscan currently used on your display by going to Chapter 5 of the Basic Settings section of the AVSHD test disc. Look at the edges of the screen to see how much overscan has been applied. If the option to disable Overscan is available to you, consider the fact that some material has been made with the assumption that Overscan will be used. Thus if overscan is disabled, you may see small flashing white lines or dots and/or thin black bars around the edge of these images. This may cause distraction, and on plasmas, may unevenly age the phosphors around the edges of the screen. For more recent TVs, especially high-end TVs with reasonably good video scalers, enabling Overscan will result in only a very mild softening of the image which may be imperceptible. Any form of rescaling reduces image quality though, so if you want the sharpest image possible, disable Overscan.
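The pixel cost of overscan is easy to quantify. Assuming an illustrative overscan of 2.5% per side on a 1080p panel:

```python
# Overscan zooms the image so a border is pushed off-screen, then the
# remaining source pixels are rescaled back up to the panel's native
# resolution. The 2.5% figure below is illustrative; actual overscan
# varies by TV.

def visible_source_pixels(width, height, overscan_pct_per_side):
    """Source pixels remaining after an overscan crop on each side."""
    keep = 1 - 2 * overscan_pct_per_side / 100
    return round(width * keep), round(height * keep)

w, h = visible_source_pixels(1920, 1080, 2.5)
print(w, h)  # 1824 x 1026 source pixels stretched back to 1920x1080
```

That rescaling of a 1824x1026 crop back to 1920x1080 is the source of the mild softening mentioned above.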

The final result of your calibration efforts may not necessarily please you entirely. This is because the methods covered above are limited by the imprecision which comes with doing everything by eye. You are also adjusting the settings to a standard which, although accurate, may give a look you are not used to seeing on a TV. You do have some scope to adjust the settings to better suit your tastes, even if they become less accurate as a result. I don't recommend exaggerated settings such as garishly bright colors, extremely high or low gamma values, or an extremely bright contrast, but in the end, if you understand what each control is doing and what the correct target is, then it is up to you as to how much you deviate from it - whether for personal taste, or as part of the compromises necessary to overcome the limitations of your particular TV.

If over a period of time and with some fine-tuning you are still not satisfied with the image quality of your TV, then instead of obsessing over it and spending countless hours continuing to attempt to tweak it by eye, it may be time to consider the instrumented approach, which we look at below.

Advanced Calibration
An advanced calibration differs from a basic one in a critical way: it uses an instrument to take objective measurements from your display. This ensures that your results are as close to the standards as possible, and removes a lot of the guesswork and repetition involved in calibrating by eye. If conducted properly, the results of an instrumented calibration can provide a noticeable improvement over the basic method.

The easiest and best way to do an advanced calibration is to hire a professional calibrator who has been certified by the Imaging Science Foundation (ISF) to perform the process on your TV.

A list of ISF calibrators in your area can be found here. There are two main drawbacks to this method. The first and most obvious one is the fee: a professional calibration will cost several hundred dollars or more. Although the results will ensure that you are getting the most out of your TV, it may not be economical to spend that much on a calibration. The second is that professionally calibrated settings do not necessarily transfer from one TV to another, even of the exact same brand and model. This means that should your TV need to be replaced or repaired under warranty, for example, you may have to pay to have it calibrated again. This also applies if you have multiple TVs in your home, as each one has to be calibrated separately.

Another option which can provide extremely good results, but which requires time and effort, is a do-it-yourself (DIY) calibration. You will still need to outlay some funds to purchase the necessary equipment, however this outlay will typically be a fraction of the price of a professional calibration, and will also give you the tools and knowledge to perform multiple calibrations on your own TVs and computer monitors, and those of friends and relatives, at no further cost. The process is the same regardless of whether it is a plasma or LCD. The requirements to conduct a DIY advanced calibration are as follows:

Instrumentation: You must purchase or borrow an instrument which can accurately measure the red, green and blue components of light coming from a display. This is a critical requirement as part of setting the White Balance controls for an accurate grayscale, covered further below. The cheapest such instrument is a Tristimulus Colorimeter, or colorimeter for short. While there are several types of colorimeters available, the most popular being the Spyder and i1 range, the i1 Display LT is the best balance of price and accuracy. It has a slightly more expensive brother called the i1 Display 2, however the hardware is identical to the LT version; only the software (which we don't require) is different. If you have more money to spend, a Spectroradiometer (also called a Spectrophotometer) like the i1 Pro is generally more accurate than a colorimeter, but costs much more. For the most part, any colorimeter, particularly an i1 Display LT/2 should be more than adequate. Try to avoid purchasing a used meter of any type as their accuracy can deteriorate over time, especially if stored improperly.

Calibration Software: Although your colorimeter may come with calibration software of its own, it will typically not be appropriate for our needs. This guide, and many other guides and forums on the Internet, assume the use of the free Color HCFR software, as pictured above. This software needs to be installed on a PC or laptop in order to interpret and display the information received from the colorimeter. The alternative is the CalMAN calibration software, which is not free, although it is a bit more intuitive to use than Color HCFR; this bundle consisting of CalMAN and an i1 Display 2 colorimeter is reasonably good value, for example, if you want to go down that route.

Calibration Disc: Your TV needs to display specific test images in order for the colorimeter to take appropriate measurements. Fortunately the free AVSHD 709 calibration disc we used in the Basic Calibration section further above is useful for advanced calibration as well. You can use Digital Video Essentials HD or the Spears and Munsil disc instead if you already have one of them.

Customizable Settings: Your TV must have a set of White Balance or Grayscale calibration controls of some type, so that you can adjust at least two of the three colors in the red, green and blue mix that combine to produce a color image on an HDTV. The basic set of these controls is called a 2-point white balance, and allows for low end (known as RGB Brightness, Cutoff or Bias) and high end (known as RGB Contrast, Drive or Gain) adjustment. Some TVs come with a 10-point white balance which allows even greater control, and this is desirable. Refer to your TV's manual for the precise name and location of these controls in your menus. If you cannot find these controls, then you will have to access the Service Menu to use them; that can be quite risky, may void your warranty, and is beyond the scope of this guide. The settings in the Service Menu are best left to a professional calibrator.

Calibration Guide: The actual process of calibration is quite detailed and has too many variables and instructions to cover in the space allotted in this guide. Instead I recommend running through this detailed Greyscale & Colour Calibration for Dummies guide. You can also refer to this Basic Guide to Color Calibration if you want advanced details on the concepts and methods involved.
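As a taste of what the white balance adjustment involves, here is a sketch of the arithmetic behind a 2-point correction. The colorimeter readings below are entirely hypothetical, and real calibration software works with CIE XYZ measurements rather than raw RGB ratios; this only illustrates the underlying idea of balancing the three channels:

```python
# A correctly calibrated greyscale has red, green and blue in equal
# balance at every grey step. The calibration process measures each
# channel and nudges the Gain (high end) or Cutoff (low end) controls
# until the deviations approach zero.

def rgb_balance_error(measured):
    """Percent deviation of each channel from the average of all three."""
    avg = sum(measured.values()) / 3
    return {ch: 100 * (level - avg) / avg for ch, level in measured.items()}

# Hypothetical reading at a bright (e.g. 80%) stimulus window: red runs hot.
errors = rgb_balance_error({"R": 1.08, "G": 1.00, "B": 0.95})
for ch, err in errors.items():
    action = "reduce" if err > 0 else "raise"
    print(f"{ch}: {err:+.1f}% -> {action} this channel's Gain control")
```

The same measurement is then repeated at a dark stimulus window against the Cutoff/Bias controls, and the two ends are iterated until both converge.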

The primary benefit of an instrumented calibration, whether performed by you or by a professional calibrator, is that it takes all of the guesswork out of getting accurate colors. This means that once calibrated, you can view any source material on your TV and be confident that any variations in color you see are as intended, rather than due to poor settings. The image will also have a much more natural look to it, and any image flaws will be minimized. Regardless of which calibration method you choose, whether basic or advanced, calibration of some type is necessary on any HDTV to achieve optimal image quality. Without it, even the best TVs can provide a relatively poor quality and unsatisfying image.

Conclusion
Before I conclude this guide, I want to cover a few issues that always seem to pop up when HDTVs are discussed. The most contentious of these is the decision between plasma and LCD. This guide has focused on plasma and LCD technology because these two display technologies between them make up the bulk of the HDTVs on sale today. However over time, people have come to question the viability of plasma as a display type, and some even consider it an outdated or dying technology. This might seem to make the decision much easier, because ruling out plasma means you're left with picking the right LCD. Not so fast.

As this report from the display technology research company DisplaySearch shows, plasma is far from dying: plasma shipments actually increased from under 3 million per quarter at the start of 2009 to over 5 million per quarter at the end of 2010. This report, this article and this article also confirm the strong sales of plasmas in 2010 and 2011. Both plasma and LCD suffered a slight reduction in growth in the first quarter of 2011, as noted in this article, due to oversupply, but plasma still showed a 6% annual growth rate, just below LCD's 9%. There is no doubt that LCD dominates the overall market with 80% market share, of which 35% were LED-LCDs of some type in the first quarter of 2011. The key point is that despite LCD's dominance, plasma is not dying. In fact, on most audio/visual enthusiast forums and in many professional reviews you will find a strong preference for plasma due to a range of advantages it has over LCD, including better blacks, better viewing angles, and much faster response times which provide more natural motion handling, as well as a more accurate color gamut.

A large part of LCD's dominance despite its higher price is due to the consumer perception that LED-LCDs are "new technology" and plasmas are "old technology", a misguided view which is encouraged by sales staff due to the higher prices (and hence larger potential margins) on LCD TVs. The average consumer's somewhat outdated notions of plasma's drawbacks include exaggerated claims of image retention and burn-in problems, and odd myths such as plasmas needing to be regassed, having a shorter life span, or consuming hundreds of dollars a year in power. Hopefully this guide has dispelled some of these myths. Right now the latest high-end plasmas not only provide excellent image quality, but are also bright, thin and efficient. As we saw in the Power Consumption section of this guide, plasma also provides a lower total cost of ownership than LCD, even when measured over a 5-10 year period running the TV at 5 hours a day. In short, do not rule out plasmas due to hearsay or mistaken assumptions.

In terms of HDTV prices, there is insufficient demand to maintain high prices, and these are continuing to fall as noted here. Competition is now intense, and although we as consumers obviously want lower prices, the quality control on all displays also seems to be suffering as companies engage in a ruthless war to slash costs while also improving panel efficiency and providing more features.

This leads me to my next point regarding choice of brand and extended warranty, another couple of points of contention. Basically, do not fall prey to the "it can't happen to me" syndrome. HDTVs are complex electronic devices, and anecdotally appear to be experiencing more faults than ever before as quality control declines. It is strongly recommended that you spend some time in several prominent audio/visual enthusiast forums like those listed in the More Information section below.
You will be able to determine what the current crop of issues is with your chosen TV, and read user experiences with customer support for various brands. This research will not only save you headaches in the long run by steering you away from a potential lemon; if you are an existing owner, it will also let you know the workarounds and fixes you can implement to address any issues you're experiencing. It also arms you for more productive interaction with customer support, because knowledge is power. Customer support at all brands has a habit of trying to minimize or ignore certain issues, and having some facts and links handy to back up your claims speeds up the process.

Furthermore, not all faults will exhibit themselves within the first 12 months of an HDTV's life, while it's covered under the manufacturer's warranty. Some TVs, as part of the cost-cutting measures which severe competition brings, are effectively defective by design and may experience component failure within 2-3 years, for example. Depending on how much your TV cost, and how much extended warranty will cost you, it is usually sensible to invest in at least a couple of extra years of extended warranty. A good rule of thumb is that no more than 10% of the price of the TV should be spent on extended warranty, and you must read the terms and conditions carefully before signing up, as they make or break the usefulness of any extended warranty.

Future Technology
Future developments are an extremely important factor in any HDTV purchasing decision. Everyone wants to know what's just around the corner, and whether it's worth waiting for the next big breakthrough or new technology that will render existing TVs obsolete. Based on announcements from the major manufacturers, plasma and LCD will both continue to improve. However, they both have inherent technological limitations which mean improvements will be marginal rather than breakthroughs. At the moment, there is a great deal of expectation surrounding the use of Organic Light Emitting Diode (OLED) technology in big-screen TVs. OLED essentially combines the positive characteristics of plasma and LCD displays. It is extremely thin, but has no backlight and emits light directly from the screen. Cells which are not active can also be completely switched off. In short, the advantages of OLED include:

Wider viewing angles than an LCD.
Deeper blacks due to the lack of a backlight.
Response times approaching a plasma, at around 0.01 ms.
Highly flexible and light-weight, allowing for screens as thin as 3 mm.

The key disadvantages of OLED at the moment are the high cost of production, its relatively short life-span, and much greater susceptibility to uneven pixel ageing and burn-in. There are other disadvantages too, but basically, OLED is not going to be a perfect display technology either, at least not in the immediate future. The main benefits of OLED are that it offers an extremely thin and light display, opening up the potential for new form factors and applications as the technology is developed.

At the time of this writing in late 2011, big-screen OLED displays are not widely available. Most are under 15 inches, and restricted to applications such as smartphone displays using AMOLED. LG has announced a 55" OLED TV for 2012, but with its smaller 31" OLED TV brother priced at over $9,000, it's obvious that a 55" model will be incredibly expensive, especially when compared to plasma and LCD TVs. In fact one manufacturer has recently said that large OLED TVs won't be feasible until 2014 at the earliest due to high costs and low yields.

A quick mention should also go to Surface Conduction Electron Emitter Displays (SED), which at one point promised to be the next big thing in display technology, giving CRT-like image quality in flat panel form. However in mid-2010 Canon Inc., the primary developer of the technology, halted development for home use of SED TVs, so it is as good as dead right now.

In other words, the next big thing is not quite around the corner, and certainly not without its disadvantages and a heavy-duty price tag. For now, plasma and LCD TVs can provide superb image quality at a much more reasonable cost, and are also quite thin and efficient. There's no reason to sit and wait in fear of something waiting in the wings to make your new HDTV completely obsolete.

Another consideration is the evolution of source material. It was only in 2008 that the Blu-ray vs. HD DVD war came to an end, with Blu-ray emerging as the victor. However, this NPD Report shows that only 15% of US consumers used a Blu-ray player in the year to March 2011, compared to 57% using a standard DVD player over the same period. While this is an improvement over previous years, given Blu-ray was officially released in 2006, the uptake has been very slow. As we've seen in the Source Material section, Blu-ray definitely has advantages over DVD, allowing much higher bit-rates for video material, though on smaller screens and standard definition TVs the advantage is almost impossible to detect. In any case, there is no question that HDTV owners should invest in a Blu-ray player to get the most out of an HDTV, especially as the prices of both the players and the discs continue to drop.

It needs to be kept in mind, however, that Blu-ray is simply an evolution of video storage, just as DVD and VHS were before it. It will not remain the best format for long, simply because, as we discussed in the Source Material section, movies shot on analog film - even old ones - are capable of much higher resolutions than 1920x1080. So perhaps in the next 5-10 years, as HDTVs become capable of even higher resolutions, a new video format far exceeding Blu-ray will emerge to give us an even higher fidelity source, and force us to once again repurchase all our favorite films.

At the moment though, what is likely to occur is an increasing divergence between physical discs and movie download/streaming services: that is, movies stored on Blu-ray, as opposed to those delivered via the Internet through a service such as Netflix. Although very convenient, streaming services do not provide the same quality as a Blu-ray disc, as demonstrated here for example. For the best possible quality, maximum convenience, infinite replayability and freedom from viewing restrictions, a Blu-ray disc is still the best choice. This may change as both bandwidth and streaming services improve, though it's unlikely that streaming will exceed Blu-ray quality anytime soon.

One last area of curiosity for HDTV users is developments in 3D TV. Many people have expressed concerns over the inconvenience and cost of the special glasses required to view 3D content. A common desire is glasses-free 3D, which is currently available in the form of Autostereoscopic 3D. This technology isn't perfect: because the display itself, rather than a pair of glasses, attempts to direct the relevant image to the appropriate eye of the viewer, viewing positions can be limited. As this article points out, the technology is not yet effective at providing convincing 3D, and is unlikely to be implemented widely for many years to come.

As an aside, it's interesting to note that 3D was a method used in the 1950s to lure people back to the cinemas after TV started gaining popularity. The same strategy is being tried again in the 21st century, not only to get people back into cinemas, but also to drive sales of HDTVs. Not surprisingly though, 3D at home isn't proving to be particularly popular, as pointed out in this article and this article for example. Just as the 3D fad died off in the '50s, there's potential for it to run out of steam in the new millennium - at least until some breakthrough such as holographic projection comes along. For now it's there as a feature on most recent high-end TVs, whether you use it or not, and provides some novelty value.

Any major developments in HDTV technology will be noted here in future updates to the guide, but for now things seem relatively stable, and the ability to enjoy movies on a big screen at home is within the reach of more people than ever before.

4K Ultra High Definition

Updated: December 2012

A recent major development is the accelerating adoption of the 4K Resolution standard by electronics manufacturers. The Consumer Electronics Association (CEA) announced in late 2012 that this new 4K standard would be known as Ultra High Definition, or Ultra HD, to distinguish it from the existing HD notation for TVs.

Current HDTV screens have a 2K resolution (1920x1080), remembering that as we discussed in the Source Material section, the 2K or 4K notations refer to the horizontal resolution. So 2K refers to displays that are approximately 2,000 pixels wide, such as 1920x1080 HDTVs, while 4K refers to displays with an approximate width of 4,000 pixels, such as 3840x2160, which is the Ultra HD standard.
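The relationship between the 2K and 4K standards is easy to verify with a quick pixel-count calculation:

```python
# Pixel counts: "2K" Full HD vs "4K" Ultra HD
full_hd = 1920 * 1080    # 2,073,600 pixels
ultra_hd = 3840 * 2160   # 8,294,400 pixels

ratio = ultra_hd / full_hd
print(full_hd, ultra_hd, ratio)  # → 2073600 8294400 4.0
```

Because Ultra HD doubles the resolution in each dimension, it has exactly four times the total pixels of Full HD, even though its horizontal "4K" label is only double the "2K" figure.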

Some 4K displays already exist, such as Sony's 84" XBR 4K Ultra HDTV, priced at $25,000. While 4K TVs have around four times the pixel count of current 2K TVs, there are several reasons why these displays will not be replacing your current HDTV anytime soon:

Price. As noted above, initial 4K TVs are around 10 times the price of an average HDTV. It will take quite some time before the prices of Ultra HD displays fall to more palatable levels, especially as current HDTVs continue their dramatic decreases in price.

Size. As discussed in the Screen Size & Viewing Distance section of this guide, there is an optimal viewing distance for every screen size, based on a combination of taste, resolution and source material. This chart takes the point a bit further by demonstrating that to see the full benefit of 4K resolution over the current 2K resolution, you need to sit 5 feet away from an 80" screen, or 8 feet away from a 140" screen. Even to see some of the extra detail from the higher resolution of 4K, you either need to sit very, very close to a standard-sized TV (e.g. 5 feet away from a 60" screen), or at a normal viewing distance of say 8 or 9 feet, you need an extremely large screen (i.e. 80-90" and larger), which many people simply could not fit into their viewing area, much less want to.

4K content. Aside from some demo material, there are currently no movies released in native 3840x2160 resolution. This will change over time as studios start shooting films in 4K, and some existing 35mm movies have already been scanned in at this resolution. There is also no medium that presently holds 4K material, as the current Blu-ray standard only goes up to 2K (1080p) resolutions. Of course there is no reason why the Blu-ray standard could not be revised to include 4K, and streaming content can be provided at any resolution right now. However, Blu-ray discs store a maximum of around 50GB of content, which is a tight fit for longer movies at 4K, and streaming content providers already reduce quality to save on bandwidth, so quadrupling resolution, and hence bandwidth requirements, is not high on their list of priorities at the moment.

Consumer Acceptance. As we saw with the DVD to Blu-ray transition, most consumers strongly resisted the push to adopt the Blu-ray standard introduced in 2006. It's only in the last few years that the Blu-ray format has become more widely accepted and risen in sales. This is because in the initial few years after its release, most people didn't have the hardware or the screen sizes to fully appreciate the benefits of 1080p, and couldn't justify the hefty cost of an upgrade.

In short, although consumer electronics manufacturers will be pushing the 4K standard, in large part to ensure that consumers buy more of their wares, the reality is that there is no pressing need for 4K at the moment. While higher resolution displays are certainly nice, and in many respects 4K Ultra HD is an inevitable step after 2K HD, in my opinion it will be many years before 4K becomes mainstream. First, we will need the 4K content. Second, hardware prices will need to fall to more acceptable levels. And third, consumers will need to become accustomed to larger displays, typically around the 80"+ size, and at closer viewing distances. All of this will take time.
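The viewing-distance figures quoted above can be sanity-checked with a rough calculation. A common rule of thumb (an assumption here, not a figure from this guide's chart) is that normal 20/20 vision resolves detail down to about one arcminute, so a pixel is only visible while it subtends at least that angle:

```python
import math

def max_useful_distance_ft(diagonal_in, horiz_pixels, aspect=16 / 9):
    """Farthest viewing distance (in feet) at which a single pixel still
    subtends about one arcminute - roughly the limit of 20/20 acuity.
    Beyond this distance, extra pixels are effectively invisible."""
    # Screen width from the diagonal, for a 16:9 panel by default
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horiz_pixels
    # Small-angle approximation: distance = pixel pitch / angle (radians)
    return pixel_pitch_in / math.radians(1 / 60) / 12

# On an 80" screen, 4K detail is only resolvable within about 5 feet,
# while 1080p pixels blend together beyond roughly 10 feet.
print(round(max_useful_distance_ft(80, 3840), 1))  # → 5.2
print(round(max_useful_distance_ft(80, 1920), 1))  # → 10.4
```

These rough numbers agree with the chart's claim that you need to sit around 5 feet from an 80" screen to get the full benefit of 4K; at a typical 8-9 foot viewing distance, an 80" 1080p panel already delivers close to all the detail the eye can resolve.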

The upshot is that even if you can afford a 4K Ultra HDTV right now, and even with good upscaling of existing 2K content to 4K, the benefit will be imperceptible at normal viewing distances, even on a larger screen. I'd suggest a 5-10 year timeframe before 4K becomes standard.

More Information

Aside from researching the Internet using a search engine like Google or Bing, there are several good sources of the latest information on HDTVs. The most prominent of these are as follows:

High Def Junkies - A US-based A/V enthusiast forum. See in particular the Plasma (PDP) Displays and LCD & LED Displays sub-forums.

AVS Forums - Another US-based A/V enthusiast forum. See in particular the Plasma Flat Panel Displays and LCD Flat Panel Displays sub-forums.

Home Theater Forum - A mature forum for discussion of home theater and movies. The Blu-ray sub-forum in particular contains excellent discussions and reviews of Blu-ray films, by professionals such as noted film restoration expert Robert A. Harris.

AV Forums - A UK-based A/V enthusiast forum. See in particular the Plasma TVs and LCD and LED LCD TVs sub-forums under the Video Electronics>TVs sub-category.

Blu-ray.com - The main site is a good source of information on Blu-ray releases. Discussions on their forums, on the other hand, focus mostly on superfluous gimmickry such as slipcovers and collectibles.

FlatpanelsHD - A European site which provides a range of reviews on the latest HDTVs.

HDTVTest - A UK site which provides a range of reviews on HDTVs.

CNET - A US-based site which provides reasonable reviews of a range of HDTVs.

There are of course a range of other sites and forums where you can find discussion and reviews of HDTVs, but bear in mind that many of them can be of limited value in providing useful information, or worse, may give you misleading or biased advice. Read as widely as possible to get a better idea of what's true and what's simply myth or false information.

Further Reading

You may have noticed that I have linked extensively to Wikipedia for various terms throughout this guide. However, Wikipedia is not the main source used to research it. I have linked to Wikipedia primarily because it is a stable, reasonably up-to-date, accurate (for technical topics) and concise source for key concepts. In the past, when I've linked to other sites for key references, I've found that the links change or die very quickly and require constant maintenance. Wikipedia's stability is therefore a primary factor in its use throughout this guide.

In any case, I strongly urge readers not to rely solely on Wikipedia, and to research and read widely on any of the topics covered here. In addition to the sources linked throughout the guide, the following articles were among those used during research and may be of interest to you:

Frame Rate Basics
What is Deinterlacing? Facts, Solutions, Examples
What Exactly is ATSC?
Contemporary LCD Monitor Parameters: Objective and Subjective Analysis
LCD Response Time: Is Faster Always Better?
The Contrast Ratio Game
Image Sticking in LCD and LED HDTVs
What Does 600Hz Sub Field Drive Mean?
How Plasma Displays Work
LED Backlighting
xvYCC and Deep Color
LCD-Plasma Display Technology Shoot-out
Contrast Ratios Pt. II
2K, 4K, Who Do We Appreciate?
Understanding Gamma
Ten Lies in HDTV Sales

Conclusion

I hope you've found this guide informative. This guide is not sponsored by any hardware or software manufacturer or advertiser. If you wish to see more of these types of guides and articles, consider purchasing the TweakGuides Tweaking Companion Deluxe Edition, both as a form of support and to gain a comprehensive PDF guide to understanding and optimizing Windows. More importantly, if you find the guide useful, please spread the word by linking to it on forums, blogs or websites so that more people become aware of it and can potentially benefit from it.