
HDR Grading on The Mandalorian

by alexis
The Mandalorian

I’ve been thinking a lot about HDR over the last three years, not only as a colorist who’s taken it upon himself to try to explain why this new development is so exciting to a wider audience, but as a director who’s done three projects intended for HDR mastering and who’s keen to make sure that what’s done on set gives the most efficient starting point for the eventual HDR grade. As I’ve spoken to folks about HDR, I’ve always tried to make clear that one of the things I like about how this new display technology is rolling out is that there are no rules for how it’s used.

There’s no one-size-fits-all approach to grading HDR, and there are no hard and fast rules for how to map different highlights to different levels, just like there are no rules for choosing specifically what values different shadow levels are supposed to appear at. Grading HDR is a matter of creative decision-making, which is incredibly exciting for everyone who exercises control over narrative imagery.

That’s not to say there aren’t technological limitations that are important to keep in mind in terms of what percentage of the image can be graded up to HDR highlight levels and accurately reproduced on consumer displays. Display capabilities are a moving target, as new TVs with new capabilities come out every year. Prudent colorists make themselves aware of what consumer televisions are capable of, and of how the HDR mastering standard they’re adhering to (such as Dolby Vision, HDR10, or the HDR10+ family of standards) deals with out-of-bounds levels, because this partially informs how one chooses to distribute one’s brightest pixels.

For many display technologies, having too many HDR-bright pixels at too high a level triggers Automatic Brightness Limiting (ABL) to limit power consumption and protect the panel. This means you effectively have an HDR highlights pixel budget: a percentage of the image that you can distribute among low, medium, and high levels of HDR highlights. At least, that’s how I choose to look at it in the work I’ve done, and it’s served me well.
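To make that way of looking at it concrete, here’s a minimal sketch in Python of measuring such a budget. The nit bands, the test frame, and the choice to measure the per-pixel peak channel are all my own illustrative assumptions, not values from any standard or from any show’s pipeline.

```python
import numpy as np

# A rough sketch of the "highlights pixel budget" idea: given a frame whose
# pixel values are already in absolute nits (e.g. decoded from a PQ master),
# report what fraction of the picture falls into each band of highlights.
# The band boundaries below are illustrative, not a standard.

def highlight_budget(frame_nits: np.ndarray) -> dict:
    peak = frame_nits.max(axis=-1)            # per-pixel peak channel, in nits
    total = peak.size
    bands = {
        "SDR range (<= 100 nits)": (0.0, 100.0),
        "low HDR highlights":      (100.0, 300.0),
        "medium HDR highlights":   (300.0, 600.0),
        "high HDR highlights":     (600.0, 1000.0),
    }
    return {name: float(((peak > lo) & (peak <= hi)).sum()) / total
            for name, (lo, hi) in bands.items()}

# Example: a mostly dim frame with one small specular patch near 1000 nits.
frame = np.full((1080, 1920, 3), 40.0)        # midtones at 40 nits
frame[:60, :60] = 950.0                       # a tiny, very bright highlight
print(highlight_budget(frame))                # the top band is ~0.17% of pixels
```

In practice a colorist would judge this on scopes rather than script it, but the idea of budgeting how many pixels sit in the brightest bands is the same.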

In my experience, this isn’t the worst thing, because these limitations reinforce the very purpose of HDR; HDR grading is about using the additional headroom to make highlights more energetic, more varied, more detailed, and more saturated, instead of having to compress, clip, or desaturate them as we must to fit all highlights into the limited range of SDR. HDR is not about making the overall picture brighter, just making the brightest parts of the image brighter, as desired.

This focus on HDR being about better highlights also means that the shadows and midtones, which are nearly always the majority of a dramatically exposed image, remain down in the good old SDR range of values that cinematographers and colorists are so used to. Let me say that again: it’s typical for the shadows and midtones of SDR and HDR images to be similar.

That’s not to say that there hasn’t been an evolution of thinking as everyone gets more experience grading HDR images. For example, there’s been a wide consensus among professionals I’ve spoken with that images with HDR highlights benefit from somewhat brighter “diffuse white” levels. Diffuse white, or “reference white,” defines the level of light reflecting off a flat, matte white surface (with no specular highlights) that reflects evenly at all wavelengths. Think of a white sheet of paper, a white t-shirt, or a matte white wall. Slightly elevating reference white ensures that, to the viewer, ordinary matte whites still read as white relative to the even “whiter” hard white highlights that HDR can add to an image. You don’t want a flat white t-shirt looking gray.

This thinking is reflected in BT.2390, which recommends using a linear scaling operation to increase the level of Standard Dynamic Range material so reference white hits 200 nits when you mix SDR and HDR media together, as when you cut archival SDR footage into new HDR material in a documentary. This way, the SDR material doesn’t look so dingy next to the popping highlights of HDR. This thinking is also reflected in BT.2408, which recommends a 203-nit reference white target in one’s HDR grading. Yes, this means that the higher shadows and high midtones may get a little bit brighter (depending on how you grade), but this is in the interest of having the rest of your image not seem dingy compared to the HDR highlights being sprinkled around it. Absolute black stays down at 0 percent, and the darker shadows more or less stay where they are (depending on how you like to grade). In my experience, this is a good general guideline, although your implementation will vary depending on your creative decision-making and the scene at hand.
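As a back-of-the-envelope sketch of what that kind of mapping does (my own simplification, assuming SDR diffuse white sits at a nominal 100 nits and applying a plain gain in display light rather than a full transfer-function conversion):

```python
# Illustrative only: place SDR material into an HDR timeline by a simple linear
# gain on display light, so SDR reference white (assumed to be 100 nits) lands
# at roughly the BT.2408 reference white of 203 nits. Real conform work would
# go through proper OETF/EOTF conversions rather than this shortcut.

SDR_REFERENCE_WHITE_NITS = 100.0   # assumed nominal SDR diffuse white
HDR_REFERENCE_WHITE_NITS = 203.0   # BT.2408 reference white target

def sdr_to_hdr_display_light(sdr_nits: float) -> float:
    """Linearly scale SDR display light so diffuse white sits near 203 nits."""
    return sdr_nits * (HDR_REFERENCE_WHITE_NITS / SDR_REFERENCE_WHITE_NITS)

print(sdr_to_hdr_display_light(100.0))   # 203.0 -> diffuse white, no longer dingy
print(sdr_to_hdr_display_light(2.0))     # 4.06  -> shadows rise only a little
print(sdr_to_hdr_display_light(0.0))     # 0.0   -> absolute black stays at black
```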

So, there are increasingly recommendations about how to redistribute the SDR-ish values (shadows, midtones) within an HDR grade in which the increased dynamic range is being used to create significant differentiation between different kinds of highlights, but they’re just that: recommendations. Many programs are choosing to master to a maximum level of 1000 nits for HDR highlights, since that’s a reasonable average of what consumer televisions can do at the moment (whether or not this is the right thing to do is an entirely different article). With 800 additional nits of highlight range to choose from, you can now have dim, medium, and bright highlights that are 200 nits apart from one another, as opposed to highlights that differ by at most 10 nits or so if you’re using the top 90 to 100 percent of an SDR image for that same range, from diffuse whites to sun glints. But how you use this range is entirely image and content dependent.

Getting back to The Mandalorian: when I read the Ars Technica post about the HDR grading in The Mandalorian being “fake HDR,” I chuckled. I get what I believe the author is trying to say, which is that the way HDR is used in the show is not spectacular enough to make owning an HDR television feel worthwhile to audiences who want to turn the video up to 11. If all the author had said was that they didn’t like the grade and wished the highlights were brighter, that’s an opinion reflective of the author’s tastes, and I’d have no reason to argue. Taste is taste. Whatever.

However, citing television display capabilities and a “test” that, to quote the author, “heat-mapped the image for the YouTube video to make it clear how bright each part of the image is…” in order to observe that “…at no point did any part of the image in The Mandalorian—even highlights like blaster fire, a forge of molten metal, or the Sun—appear at more than 200 cd/m²,” and thus accusing the program of being “faux HDR,” is misguided. The author goes on to write that “the image looks awfully dim and isn’t living up to expectations.” Having lived through the “Battle of Winterfell” debate, I can safely say that if every episode of The Mandalorian looks too dark to the author, they should check that their TV is working properly; it looks nicely exposed to my eye.

HDR display specifications only govern the capabilities a display needs in order to reproduce HDR images properly. These specifications say nothing about how you should grade any given program, beyond a general description of the purpose of HDR, with an implied warning not to overdo things if you want what is promoted as the HDR effect of perceptually spectacular highlights in direct comparison to lower-level shadows and midtones (in other words, don’t just scale the brightness of the entire image up).

Even if you’re grading a program with scenes that are using HDR’s full highlight-popping capabilities, creatively speaking you’re not going to do so in every shot of every scene. A typically graded program will and should have wide variation in how one chooses to use HDR, depending on the content of a given scene and the look and mood the director, cinematographer, and colorist are going for.

In fact, there’s no requirement that you use maximum-strength HDR highlights at all, much less one forcing you to use some arbitrarily high level just to use the available range. How you use these levels is entirely up to your creative team.

One of the things I’ve learned grading my own directorial efforts is that the extreme dynamic range afforded by HDR makes it tempting to boost perceived contrast so far beyond what cinema viewers are used to that the image begins to look like live television. To make one’s dramatic images play into audience expectations of what a capital M movie is supposed to look like actually requires a lot of restraint, and I find that when grading HDR I pay an enormous amount of attention to grading my shadows to strike just the right balance between darkness and black, and to counterbalance the distribution of the highlights I’m choosing to map. Just because you can do certain things doesn’t mean you should. But again, these are purely aesthetic choices and opinions. There’s no rule about any of this, nor should there be. And I’m sure that as time goes on, our collective opinions about what “looks cinematic” will evolve as well. New fashions in image-making and grading will emerge, and the wheel of punditry will turn.

I’ve watched every episode of The Mandalorian that’s been released so far, and it’s clear to me that its use of HDR is deliberate. In my opinion, the grade is a nice nod to the look and feel of the original movies, being classically restrained in its distribution of shadows, midtones, and saturation while sprinkling moderate HDR highlights into the image that are compatible with the overall vibe they’re selling. No part of the grade has distracted me from simply watching the series, and yet overall it looks great on my LG OLED TV. I want to congratulate cinematographers Barry “Baz” Idoine and Greig Fraser, colorist Steve Scott and his collaborators Charles Bunnag and Adam Nazarenko at Company 3 (apologies for getting these names wrong earlier), and all the people who were collectively responsible for all aspects of the image during the shoot and in post, on their clever blend of old and new.

So, is the lighting and grading good? That depends on each viewer’s opinion and I’m not going to argue that point (I think it is). Does this restraint make it objectively bad HDR? No. That presupposes there’s a “good” amount of HDR to use, which is like complaining they didn’t use as much saturation as they could have, or that every scene didn’t maximize the dynamic range of the audio mix.

I have spoken.

12/8/19 – Edit: I corrected the names of colorists Steve Scott, Charles Bunnag, and Adam Nazarenko at Company 3. I also added links to the recommendation papers I cited (thanks to Marc Wielage for tracking these down!)
12/9/19 – Edit: Edited the fifth paragraph for a typo and for clarity.
12/20/19 – Edit: Edited the sixth paragraph to fix a typo.


8 comments

Gavin Greenwalt December 7, 2019 - 7:22 pm

I believe the criticism wasn’t that the HDR was bad but that it was nonexistent. 200 nits is as dark as or darker than how most viewers watch SDR material. So if your TV is SDR and 300 nits, then watching The Mandalorian peaking at 200 nits will actually result in a flatter, darker image than watching the SDR grade stretched to whatever backlight setting 99% of consumer sets are set to for white. We as professionals know that outside of a theater almost nobody actually watches films at 100 nits. People have daylight and lamps in their living rooms, so they turn their LED backlights up to 300 nits+. Tablets, laptops, and phones are set even higher. I think my computer monitor peaks at 500.

I think it’s a valid question: is SDR packaged in Dolby Vision, like how Apple TV outputs everything as a DV container, where we should head? I think yes, so that everything is consistent, but I don’t like the idea of a delivery claiming to be HDR if it’s literally SDR in, say, a Dolby Vision package. Just like I’m OK with 8-bit material in a 32-bit EXR for consistency of file formats… I wouldn’t advertise the material as 32-bit material, though. We saw the same thing with 4K material. It’s not bad to have 2K films, and I would rather have a UHD 4K distribution of a film mastered in 2K/SDR because of better compression, but don’t pretend the film is in 4K when it’s just an upscale with no additional resolution.

Reply
alexis December 8, 2019 - 10:23 am

In terms of viewing conditions, assuming Disney+ is feeding Dolby Vision image streams correctly, if you’re watching The Mandalorian on an SDR television or device, then you should be getting an SDR stream with a specifically trimmed SDR grade whose highlights fit within the 0-100% signal range, which the television should be linearly scaling to whatever peak output it’s set to, as with any SDR signal. Getting a flatter image from linearly scaling the HDR image into an SDR range is not how Dolby Vision works (assuming your Dolby Vision viewing device(s) work properly). My opinion of the grade was formed by watching the program via the Disney+ app on my LG OLED TV, with the TV left to its calibrated setting. I understand that Apple TVs have had issues feeding the correct data to HDR televisions in the past, so it’s something I’ve skipped doing this generation, and the onboard app’s handoff of streamed HDR to the LG has so far worked flawlessly. I’ve consistently watched the program at night with a living room lamp turned on off to the side, to illuminate the room but not reflect on the TV, which is a pretty textbook case of the “moderately lit living room” situation that colorist suites are meant to approximate. Thus, I’ve watched the program in the environment it was designed for, and had a good experience. I can’t speak for those watching on iPads or phones with manually adjusted brightness in bright daylight, but I’m guessing they’re not getting an optimal experience. However, imho that’s a separate issue.

Reply
Simon Walker December 8, 2019 - 11:46 am

Great article Alexis! Good points about considering ABL during the grade and honoring the look and feel of previous material in the series/franchise.

Reply
Mierzwiak December 9, 2019 - 5:16 am

@Gavin Greenwalt

“I believe the criticism wasn’t that the HDR was bad but that it was nonexistent.”
Nonexistent by what standard? They didn’t even compare it with the SDR stream.

“200 nits is as dark as or darker than how most viewers watch SDR material.”
And that’s the reason why some people are constantly whining about HDR being dark or unwatchable, but it’s their problem, not the colorists’. It doesn’t mean we should suddenly change industry standards and grade SDR for what, 300? 500 nits?

“I don’t like the idea of a delivery claiming to be HDR if it’s literally SDR in, say, a Dolby Vision package”
The Mandalorian is not SDR in a Dolby Vision package. It looks different from SDR; there’s more range, with more highlight detail.

“It’s not bad to have 2K films, and I would rather have a UHD 4K distribution of a film mastered in 2K/SDR because of better compression, but don’t pretend the film is in 4K when it’s just an upscale with no additional resolution.”
Many UHDs based on 2K masters look significantly better than Blu-ray, with better reproduction of grain, more detail, etc. Of course, how it’s advertised is a different story; I cringe every time I see on a back cover that a UHD is “4 times sharper” than HD.

Reply
Gavin Greenwalt December 9, 2019 - 5:31 pm

“Nonexistent by what standard?”
What I heard (and we’re all talking secondhand about people complaining, since we ourselves aren’t complaining) is that there is no separate HDR grade; it was graded once, and the “HDR” was the SDR grade inside of a DV package.

“The Mandalorian is not SDR in a Dolby Vision package. It looks different from SDR; there’s more range, with more highlight detail.”
Then the people complaining about the grade are wrong. That’s the only complaint I’ve heard about The Mandalorian HDR grade: that it’s the same grade, just in a different container.

“It doesn’t mean we should suddenly change industry standards and grade SDR for what, 300?”
Doesn’t it though? There might be 100 people on Earth actually watching TV at 100 nits. When 99.9999999% of your viewers are watching at 250-300+ nits, then we should be targeting the de facto standard that every single viewer is actually watching at, not the unrealistic standard that will only ever be seen on the colorist’s reference monitor. There is nothing wrong with 300-nit SDR. This isn’t like mastering to a pair of crummy earbuds, where you lose substantial amounts of dynamic range in the mix. Most viewers have been getting quasi-HDR for years just based on the fact that their TV was set to 80% of its backlight’s max output. If you took the 99.999999999% of TVs that are running at 300 nits of white and played back the show in HDR peaking at 200 nits, people are going to believe their eyes and say that the SDR output is the “HDR” display and the HDR display is the SDR display… and they’re not wrong.

These streaming services will presumably never be shown in a darkened theater. You’ll be lucky if they’re shown in a darkened room. And even if they’re in a darkened room, I don’t think I’ve ever witnessed a TV in my entire life actually set to peak 100 nit white. Never ever.

Reply
alexis December 9, 2019 - 7:44 pm

I’m hoping to get some information at some point directly from the colorists who worked on this, because speculation about the intent of the colorists is pointless.

What I can say is that, having graded programs myself using the Dolby Vision process, there’s no such thing as “wrapping SDR in an HDR package.” Every clip in a program being mastered for Dolby Vision needs to be analyzed and trimmed in one or more specific-nit-output passes, which results in artistic trim metadata being saved for every clip in the program and stored within the program deliverable. This trim metadata is what lets a Dolby Vision-enabled television fit the range of a graded image to the capabilities of any given SDR or HDR television, with artistic guidance as to what the final image is supposed to look like. That’s the whole point of Dolby Vision, and it’s something that I know the folks at Company 3, where this program was finished (one of the biggest postproduction houses in Hollywood), are abundantly aware of.

Regular SDR televisions linearly scale an SDR signal to the nit range of light being output by the panel. There is no color management. If your ordinary SDR television is outputting a peak of 450 nits, then whatever was mapped to 700 mV (100 IRE) of signal in a program’s SDR grade gets mapped to 450 nits, with the whole signal from 0 to 100 percent being stretched linearly and any signal above 700 mV (100 IRE) getting clipped. The result will be a very bright image unless it was deliberately graded dark in SDR.

Furthermore, a straight HDR signal viewed in SDR looks log-encoded, desaturated, and dull, because the entire 0-10,000 nit range supported by ST.2084 must be encapsulated within a 10- or 12-bit file that’s compatible with ordinary video equipment. When you look at an HDR signal on an SDR television, it doesn’t just look dark, it looks desaturated and horrible, because it requires an HDR television to remap and present that signal correctly. No matter what, you have to somehow trim an HDR image to SDR, or vice versa, if you want something that looks even boringly ordinary. Even if you use a tone-mapping algorithm to make such a transformation automatically, an adjustment is being made; it’s not going to be the same grade.
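As a rough illustration of why (this is just the published ST.2084 PQ inverse EOTF, nothing specific to this show or its deliverables), familiar nit levels land surprisingly low in a PQ-encoded signal:

```python
# ST.2084 (PQ) inverse EOTF: absolute luminance in nits -> normalized 0-1 signal.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

for nits in (100, 203, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> signal {pq_encode(nits):.3f}")
# 100 nits (SDR diffuse white) encodes to roughly 0.51, and 1000 nits to roughly
# 0.75, so an SDR display that naively treats PQ code values as a conventional
# gamma signal shows a flat, dim, desaturated-looking picture.
```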

I’m guessing (I don’t know for sure) that Disney+ is delivering a separate stream for non-Dolby Vision SDR televisions, and that stream will occupy a conventional SDR signal range, which should stretch out like any other SDR signal to whatever that television is outputting. This isn’t “quasi-HDR,” it’s just a really bright SDR image. HDR is a specific tonal response curve for mapping the shadows and highlights appearing in an image, to create a specific kind of contrast when output to an appropriately color-managed HDR display. What you put on that curve is a creative decision.

It’s possible that some people who are complaining are using streaming box and display combinations that aren’t correctly configured (or even compatible), and that’s causing things to go wrong for them, but that has nothing to do with how the program is being graded or mastered, and confusing one for the other would be incorrect.

Reply
Joel Hjerten December 29, 2019 - 11:24 am

The Mandalorian’s restrained use of the available range, topping out at only 200-300 nits, makes me question why they even use the HDR container at all when there is so little difference from the SDR version. I have a very capable Sony Z9F that can render 1800 nits, but it does bloom a little bit and has slightly raised blacks in HDR mode; because of this, the show actually looks better in SDR on my TV. But as the market is getting saturated with sub-1000-nit OLEDs, and you have one and think it looks good on it, it makes me a bit sad, as the 4000-nit movies I’ve seen are spectacular. I think they took the artistic mutedness too far. Such missed potential.

Reply
alexis December 29, 2019 - 3:44 pm

If you have a consumer television, you’re not seeing the 4000 nits that some movies are being mastered to. You’re seeing either a tone-mapped version with peaks as high as your particular television can handle (as in Dolby Vision), or a version with the top highlights clipped or rolled off in whatever way your TV does it (with HDR10). The best HDR consumer televisions I’m aware of today are doing peak highlights on the order of 1500 nits (these are usually LED-backlit LCDs, so those peaks sit relative to the lighter absolute black those TVs produce), so that’s not nothing, but the dimmer-by-comparison peak highlights of OLED (800-odd nits, I believe) are relative to the significantly darker blacks that OLED is capable of. Since contrast is judged relative to the black level of a display, that makes OLED still look spectacular even though LCD can drive harder highlights. 300 nits is still a heck of a lot brighter than SDR, and while I get that you wish it were more spectacular, they’re still doing a grade that you can’t do with SDR.

Eventually, there will be no SDR televisions, only HDR televisions that encompass SDR programs, along with programs whose creators decide they don’t need or want to grade using abundant HDR levels. At that point, it won’t be about what “container” (mastering standard, really) is being used; it’ll be a matter of the creator’s aesthetic.

Reply

