Everything You Need to Know About 4K HDR TVs

If you’re in the market for a new TV, there's a good chance it will be a 4K set with a feature called high dynamic range, or HDR for short.

Shoppers should be particularly excited about HDR, which is now found in the vast majority of midsized to large sets. While 4K TVs are all about greater picture detail, not everyone notices the extra detail from normal seating distances. In contrast, TVs that do a good job with HDR video can present brighter, more vivid images with greater contrast and a wider array of colors, much closer to what we see in real life.

But CR’s testers have found that not all TVs with HDR perform equally well. In fact, it's become one of the key differentiators among the sets in our TV ratings. Here’s what you need to know to make a good choice.

What Is HDR, Exactly?

In music, high dynamic range refers to the difference between the softest and loudest parts of the composition. In video, it’s about increasing the contrast between the brightest whites and the darkest blacks a TV can produce.
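
To put a rough number on that idea, display contrast is often expressed as the ratio of the brightest white to the darkest black a screen can produce. The figures in this quick calculation are invented for illustration, not measurements of any particular set.

```python
# Contrast ratio is peak brightness divided by black level, both in nits.
# These values are made up to illustrate the arithmetic, not real test results.
peak_brightness_nits = 1000.0   # a bright specular highlight on a capable HDR set
black_level_nits = 0.05         # a deep shadow on the same set

contrast_ratio = peak_brightness_nits / black_level_nits
print(f"Contrast ratio: about {contrast_ratio:,.0f}:1")   # about 20,000:1
```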

“When done well, HDR presents more natural illumination of image content,” says Claudio Ciacci, who heads the Consumer Reports TV testing program. “Though HDR demands a higher peak brightness from the TV, it doesn’t mean it has to present a blindingly bright image to the viewer. It simply means the TV has the brightness headroom needed to present the various elements in an image—a shadowy cave, sunlit facial highlights, a brightly lit lightbulb—at the brightness level that is required.”

When HDR is at work, you’ll notice the texture of the brick on a shady walkway or nuances in the white clouds in a daytime sky.

You’ll also see brighter, more realistic “specular highlights,” such as, say, the sun’s reflection off a car’s chrome bumper or an airplane wing. With HDR, those flashes of light pop; without it, they don’t stand out nearly as much.

HDR TVs typically produce more vibrant, varied colors, too. That’s because HDR is often paired with another newer TV technology, called wide color gamut, or WCG.

Think of it as giving your TV a larger box of crayons to play with. Standard HDTVs can display about 17 million colors. Those with WCG can display up to a billion.
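
Those familiar figures track with color bit depth. Standard HD video uses 8 bits for each of the red, green, and blue channels, while HDR formats typically use 10 bits, and a quick back-of-the-envelope calculation shows where the roughly 17 million and 1 billion numbers come from:

```python
# Every on-screen color is a mix of red, green, and blue. With 8 bits per
# channel there are 2**8 = 256 shades of each, and each combination of the
# three channels is a distinct color.
shades_8bit = 2 ** 8      # 256 shades per channel (standard HD)
shades_10bit = 2 ** 10    # 1,024 shades per channel (typical HDR)

print(f"8-bit palette:  {shades_8bit ** 3:,} colors")    # 16,777,216 (~17 million)
print(f"10-bit palette: {shades_10bit ** 3:,} colors")   # 1,073,741,824 (~1 billion)
```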

But you don’t get to enjoy all that fantastic contrast and color every time you turn on the TV. You have to be playing a movie or TV show mastered to take advantage of HDR and WCG. Regular broadcast TV, such as you get from your local networks, is still playing catch-up for both 4K and HDR content, but 4K shows and movies are now readily available from many streaming video services.

Are All HDR TVs Created Equal?

In a word, no. Our tests show that not every TV with HDR written on the box produces equally rich, lifelike images.

First of all, TVs are all over the map when it comes to picture quality, HDR or no HDR. But there are also challenges specific to this technology.

Most notably, a TV must be bright enough to really deliver on HDR. To understand why, you need to know your “nits,” the units used to measure brightness.

Better-performing HDR TVs typically generate at least 600 nits of peak brightness, with top performers hitting 1,000 nits or more. But many HDR TVs produce only 100 to 300 nits, which is really not enough to deliver an HDR experience.

With an underpowered TV, the fire of a rocket launch becomes a single massive white flare. With a brighter television, you’d see more intense, lifelike flames, as if you were really there.

“The benefits of HDR are often lost with mediocre displays,” Ciacci says.

How to Tell a Great HDR TV From a Bad One

Unfortunately, you can’t just read the packaging—or even rely on how the picture looks in the store.

Though some TVs carry an Ultra HD Premium logo, indicating that they’ve been certified as high-performance sets by an industry group called the UHD Alliance, not all manufacturers participate in the program. LG and Samsung do; Sony and Vizio don’t.

You can’t rely on a TV’s claim of peak brightness, either. Most of those measurements are recorded using a standard industry test pattern, called a 10 percent window, that evaluates the brightness of a small box against a completely black background. But companies can use other methods to produce peak brightness numbers.
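
For a sense of what that pattern looks like: a 10 percent window is just a white box, usually centered on an otherwise black screen, that covers 10 percent of the display's area. The quick sketch below works out the geometry on a 4K panel; it's our own arithmetic for illustration, not an industry test script.

```python
import math

# A 4K UHD frame is 3840 x 2160 pixels.
width_px, height_px = 3840, 2160
total_pixels = width_px * height_px

# A "10 percent window" lights up 10 percent of that area, typically as a
# centered square, while the rest of the screen stays black.
window_pixels = 0.10 * total_pixels
side_px = math.sqrt(window_pixels)

print(f"A 10% window on a 4K screen is roughly {side_px:.0f} x {side_px:.0f} pixels")
# roughly 911 x 911 pixels
```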

What to do instead? If you're a CR member, check our TV ratings and buying guide. We now have separate scores for UHD picture quality and HDR performance.

We measure brightness differently from the testers at most other organizations. You know that 10 percent window pattern? We don’t think it’s a realistic way to determine a TV’s brightness during a regular TV show or movie. That’s why Consumer Reports developed its own brightness test patterns, placing that white 10 percent window against a background of moving video. That gives us a much better idea of the set’s real brightness.

If you look through our ratings, you’ll see that the TVs with the best HDR often tend to be the priciest. But there are also some good choices for people who want to spend less. We think HDR performance will continue to be the big differentiator among 4K TVs throughout 2021, but don’t be surprised if more lower-cost sets start to deliver a satisfying HDR experience, too.

Types of HDR

When you shop for a TV, you may see references to different kinds of HDR. There are several variations on the technology. It can be useful to understand these different HDR flavors, but if it’s more information than you want, don’t worry. TVs with any type of HDR can all work well, depending on the specific television model.

HDR10 has been adopted as an open, free technology standard, and it’s supported by all 4K TVs with HDR, all 4K UHD Blu-ray players, and all HDR programming.

A number of TVs, including models from LG, Sony, and Vizio, plus Roku TVs from several brands (including Hisense and TCL), also offer Dolby Vision, which is being promoted as an enhanced version of HDR10. It's also available on some Amazon Fire TV, Apple TV, Google Chromecast, and Roku streaming players, and from streaming services including Amazon Prime Video, Disney+, Netflix, and Paramount+.

Among its advantages, Dolby Vision supports “dynamic” metadata, which allows the TV to adjust brightness on a scene-by-scene or frame-by-frame basis. By contrast, HDR10 uses “static” metadata, setting brightness levels once for the entire movie or show.
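
To make that distinction concrete, here's a simplified sketch of what the two kinds of metadata might carry. The field names and numbers are invented for illustration and don't follow the actual HDR10 or Dolby Vision specifications.

```python
# Static metadata (HDR10-style): one set of brightness hints for the whole title.
static_metadata = {
    "brightest_pixel_nits": 1000,         # brightest highlight anywhere in the film
    "brightest_average_frame_nits": 400,  # brightest frame, averaged across pixels
}

# Dynamic metadata (Dolby Vision / HDR10+ style): hints that change scene by
# scene, so a dim cave and a rocket launch can each be handled on their own terms.
dynamic_metadata = [
    {"scene": "night exterior", "frames": (0, 1199),    "peak_nits": 150},
    {"scene": "rocket launch",  "frames": (1200, 2399), "peak_nits": 1000},
    {"scene": "dim interior",   "frames": (2400, 3599), "peak_nits": 80},
]
```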

There’s also a newer technology, called HDR10+, that uses dynamic metadata much like Dolby Vision. It was developed by Samsung, Panasonic, and 20th Century Fox, and is available in some 4K TVs from Hisense, Samsung, TCL, and Vizio. It's also supported in some Amazon Fire TV, Google Chromecast With Google TV, and Roku streaming players.

You can find TV shows and movies in HDR10+ on Amazon’s 4K Prime Video streaming video service, Paramount+, Google Play Movies, and YouTube. It can also be found in some 4K UHD Blu-ray discs, mainly from 20th Century Fox.

Both Dolby Vision and HDR10+ have offshoots of their respective technologies, called Dolby Vision IQ and HDR10+ Adaptive, that can be found on some newer TVs. These features allow your TV to make small adjustments to the brightness, contrast, and color of the images based on the ambient brightness of the room.

One advantage of the dynamic metadata in both Dolby Vision and HDR10+ is that it can help a midlevel TV that doesn’t have the brightness levels of a top-tier model adapt the content to the set’s limitations. Using a process called “tone mapping,” the metadata can guide the TV to make scene-by-scene or frame-by-frame adjustments according to brightness, color, and contrast variations in the content.
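
Here's a deliberately simplified sketch of what a tone-mapping curve does. Real TVs use far more sophisticated, proprietary processing guided by that metadata; this just shows the basic idea of squeezing content mastered for a very bright display into a set with a lower peak brightness instead of letting highlights clip to flat white.

```python
def simple_tone_map(scene_nits: float, tv_peak_nits: float = 600.0) -> float:
    """Compress a scene brightness value (in nits) into a TV's range.

    A basic Reinhard-style curve for illustration only; it is not how any
    particular TV or HDR format actually performs tone mapping.
    """
    n = scene_nits / tv_peak_nits
    return tv_peak_nits * (n / (1.0 + n))

# Content mastered with a 4,000-nit highlight still lands under this TV's
# 600-nit ceiling, keeping some gradation instead of clipping to pure white.
for nits in (100, 600, 1000, 4000):
    print(f"{nits:>5} nits in the source -> {simple_tone_map(nits):6.1f} nits on screen")
```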

Many TVs now also include support for one more HDR format, called HLG, short for hybrid log gamma. If it’s adopted for the next generation of free over-the-air TV signals, which will follow a standard called ATSC 3.0, you’re likely to hear more about it. (It will be important for those who get TV through antennas, which are making a comeback.)

Many newer TVs have built-in support for HLG, and others can receive it via firmware updates if necessary.

Yes, that all sounds complicated.

But there’s some good news. First, your TV will automatically detect the type of HDR being used in a given movie or show and choose the right way to play it. (Often you’ll see a little flag on the TV screen showing the type of HDR that’s playing.) No fiddling required.

Second, as noted above, the type of HDR doesn’t seem to be too important right now. Based on what we’ve seen in our labs, a top-performing TV can do a great job with any of these HDR formats.

Our advice: Instead of fretting over the type of HDR, simply buy the best TV you can, especially since TV manufacturers can update their sets to support additional formats if they become more popular. That’s what Vizio did last year when it added HDR10+ capability to sets going back to 2018.