If you’re considering buying a new TV, you can be forgiven for having questions, possibly many of them. Several big-screen TV technologies—Ultra HDTV, high dynamic range (HDR), OLED—have been introduced over the past few years. At the same time, some older ones, including 3D and plasma, have faded from prominence. There have also been changes on the connectivity front, with new versions of the HDMI standard added to sets to accommodate new, higher-bandwidth signal formats.

The good news is that changes in TV tech have settled down to a degree, making buying a set a relatively safe bet today. The number of display types has been effectively reduced to two, LCD and OLED, and many new Ultra HDTV models are now compatible with HDR. Also, virtually all sets that were introduced in 2016 sport at least one HDMI 2.0a input—the latest version—along with the necessary copy protection to enable viewing new formats like Ultra HD Blu-ray.

LCD or OLED?

The most common TV display technology, one that’s offered by all manufacturers large and small, is LCD. The main benefit to LCD is light output: Picture brightness can be cranked high enough to accommodate situations such as watching sports in a well-lit room. At the same time, many high-end LCD TVs feature backlight modulation technology, which lets them achieve the deep black levels and high contrast ratios that are favorable for watching movies. Another LCD benefit is cost, with 50-inch Ultra HD models currently selling for as low as $500.

Big-screen LCD TVs create images using an array of LEDs that beam light through a grid of liquid-crystal pixels, with the liquid-crystal material alternately twisting and untwisting to regulate the amount of light passing through. There are two main LED backlight types: edge-lit and full-array. Edge-lit LCD models employ LEDs positioned on the sides or the top and bottom of the panel; they emit light into a light guide/diffuser that serves to illuminate the entire display.
Full-array models, in contrast, use a grid of LEDs distributed across the back of the LCD panel. In both cases, local dimming can be employed to alternately dim, shut off, or boost the brightness of specific LED “zones” (a grouping of LEDs), dynamically enhancing contrast based on image content. Full-array models, however, provide greater precision than edge-lit ones in this regard, since the LEDs distributed across the screen allow for specific areas in the image to be addressed—right down to the individual LED level, in the case of Sony’s new Z9D series sets (see review on page 36 of this issue).

While LCD combines the benefits of high brightness and high contrast potential, and does so at a reasonable cost, the technology has some downsides. The main one is the limited viewing angle from models that use a Vertical Alignment (VA) display panel—a type found in the majority of high-end LCD sets. Models using an in-plane switching (IPS) display panel provide a significantly wider viewing angle, though they can’t achieve the same depth of black and thus have more limited contrast compared with VA displays.

Apart from being almost impossibly thin, OLED TVs are best known for being able to display deep, rich blacks. There’s a solid technical reason for that. Unlike LCD, OLED doesn’t require a backlight; the pixels in the display contain an electroluminescent organic material that generates light in response to electric current. Since each pixel is its own light source, it can be individually switched off to create pure black or modulated to create fine gradations of gray. Because OLED TVs don’t require a backlight, they also have a wide viewing angle, with pictures retaining uniform contrast and color from any vantage point in the room. At present, the main downside to OLED is cost: 55-inch Ultra HD models from LG, the only manufacturer currently selling OLED TVs, start at $2,800.
As far as video performance goes, OLED sets also have limited brightness compared with the best LCD displays, which can deliver about twice as much light output. OLED models introduced in 2016, however, are showing improvements in brightness, and the technology’s native ability to achieve deep blacks results in contrast ratios on par with or even greater than those of LCD models, even when displaying HDR content. Another potential downside of OLED—one that has been exaggerated by LCD manufacturers—is the possibility of image retention when the TV displays static images for extended periods of time, such as the graphics around a news crawl or the frame of a video game. While instances of burn-in are rare, it’s enough of an issue that OLED TV manuals warn against “image sticking.” Image retention is not an issue with LCD TVs.

Questions surrounding the life span of the organic material used for OLED pixels have dogged the technology since its inception. LG’s implementation, however, uses white OLED subpixels with red, green, and blue filters to create full-color images—a process that, the company claims, increases longevity and ensures that all the pixels age at precisely the same rate. How long can an OLED set last? According to LG, its OLED sets now have a life span of 100,000 hours, which equates to decades of viewing.

One thing LCD and OLED have in common is that both display types are available in a curved-screen format. While TV manufacturers claim that the curve provides a more “immersive” experience, with most screen sizes you would need to sit uncomfortably close to the TV to get any such benefit. Otherwise, there’s no performance advantage to a curved-screen TV; opting for one over a flat-screen model is purely a matter of style.

Ultra Is Better

HDTV is still the standard used by TV broadcasters and cable/satellite providers, but Ultra HDTV represents the future for both traditional program providers and streaming services.
A key difference between HDTV and UHDTV is resolution: Whereas HDTV provides 1920 horizontal x 1080 vertical pixels, UHDTV offers 3840 x 2160. That’s more than 8 million pixels—four times as many. You’ll sometimes see Ultra HD referred to as “4K.” Technically, Ultra HD has slightly fewer pixels than the 4K-resolution displays employed for commercial digital cinema, but for the purpose of shopping for a consumer TV, the terms are essentially interchangeable.

Just one short year ago, there wasn’t enough Ultra HD content to justify the purchase of an Ultra HD set over a regular HD set. Now the situation has changed, mostly due to the arrival of Ultra HD Blu-ray players and discs, but also because of a sharp increase in content on streaming services like Netflix, Amazon Instant Video, and Vudu. So there’s no reason other than budgetary concerns to opt for a regular HDTV over an Ultra HDTV.

Another difference between HDTV and UHDTV is the potential for the latter to display images with 10-bit color. This provides a significant boost in the number of possible colors a set can display: 1 billion versus the 16.8 million on regular HDTVs, which employ 8-bit color. The UHDTV standard’s wide color gamut means that many sets now come close to matching the P3 color space used in professional digital cinemas. The benefits are twofold. First, the P3 color pool includes some colors we haven’t previously been able to display accurately with the old Rec. 709 HDTV standard, notably deep red. Second, more colors mean finer gradations within objects or areas of the image and therefore fewer noticeable banding artifacts.

High Dynamic Range

Along with displaying enhanced color, many new Ultra HDTVs have the ability to handle high dynamic range (HDR). When a movie or TV show is mastered for HDR, it includes metadata that instructs the TV to extend its maximum contrast capability so that both highlights and shadows show greater intensity and wider gradations of detail.
Sources of HDR content include Ultra HD Blu-rays and the same streaming services that currently offer Ultra HD content.
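The resolution, color-count, and life-span figures cited in this article are easy to sanity-check with a few lines of arithmetic. The short Python sketch below does exactly that; note that the 5-hours-per-day viewing figure used for the life-span estimate is our own assumption, not a number from LG.

```python
# Back-of-the-envelope checks for the figures quoted in the article.

# Resolution: HDTV vs. Ultra HDTV pixel counts
hd_pixels = 1920 * 1080        # 2,073,600 pixels
uhd_pixels = 3840 * 2160       # 8,294,400 pixels -- "more than 8 million"
print(uhd_pixels // hd_pixels)  # exactly 4x as many pixels

# Color depth: 8 vs. 10 bits per channel, across three channels (R, G, B)
colors_8bit = (2 ** 8) ** 3     # 16,777,216 -- the "16.8 million" figure
colors_10bit = (2 ** 10) ** 3   # 1,073,741,824 -- the "1 billion" figure

# OLED life span: 100,000 hours at an ASSUMED 5 hours of viewing per day
years = 100_000 / (5 * 365)     # roughly 55 years -- "decades of viewing"
print(round(years, 1))
```

Even at a heavier 8 hours of daily viewing, the same arithmetic works out to more than 34 years, so LG’s “decades” characterization holds up under any plausible usage pattern.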