About Those New Mac Pros…

To be honest, the only reason I’m writing this is because I feel a tad guilty about my first post-Mac Pro announcement tweet. The one where I was really upset about the inclusion of ATI GPUs instead of Nvidia. Understand, for the last three years I’ve been fully immersed in applications for which the messaging has consistently been “we’re heavily optimized for Nvidia and CUDA, and the current version of OpenCL on OS X sucks.”

What Apple did a terrible job of making clear in their initial presentation was that a new version of OpenCL will apparently make it far easier for software developers to achieve stellar performance on this new hardware. And I’m not drinking Kool-Aid; outside of the various announcements that Blackmagic and Adobe have made, I’ve had several email exchanges with folks I trust who are in a position to know, but who are sadly hidden behind multiple layers of NDAs. The responses I’ve gotten have been incredibly optimistic.

So that makes me feel better.

I don’t care about the new Mac Pro being forward thinking or progressive or whatever. I don’t care about it being the future. I care about it being fast, reliable, and affordable. Ideally, I’m hoping that I’ll want to buy it because it’ll have the best bang for the buck in terms of price/performance, a characteristic of the best releases of former generations of Mac Pros.

I don’t care what the thing looks like. They can put the internals in a shoebox if the machine runs quickly, quietly, and reliably.

I don’t even care that much about the expandability story, given my own personal use case. Frankly, its two high-end GPUs will be far better than what I’ve got now in either of my current two Mac Pros, and if the integrated architecture sacrificed slots in order to move bits around faster, that’s fine with me. I’ve already got a spaghetti tangle of a USB hub, USB audio interface, and external RAID connected to my current Mac Pro, so all the new one would add is a Thunderbolt video box, which I frankly prefer to an internal card since I can move it around and use it with multiple computers, desktop and portable, plus an external box for my Red Rocket, which would likewise become more portable; that’s kind of interesting when you think about it.

The only thing I lose is the illusion that I, personally, will someday buy a PCIe expander and fill it up with four of the highest-end GPU cards in order to have a mega-processing behemoth. For those who actually would do such a thing, this is a true loss, but it’s a dream I can’t afford. Two GPUs will do me fine if the architecture is right.

So I’m happy to wait and see. Time will tell what its true performance will be once it ships.

There’s only one thing: if Apple’s going to solder everything together in a non-upgradable form factor, they’d better update that fucking product every year to keep it current. And it’d help if the price were low enough that an annual or two-year upgrade cycle is an attractive prospect for power users (I used to upgrade my Mac Pros every other year, back when there was something worth upgrading to). I’m tired of waiting in silence for next-generation hardware to help me do my work, and it would be nice to feel that Apple has as much regard for users of this new form factor four years from now as they do for their current laptop and iOS customers.

Color Correction Handbook 2nd Edition: Grading theory and technique for any application.
Color Correction Look Book: Stylized and creative grading techniques for any application.

Some Things I Saw at NAB 2013

The 2013 National Association of Broadcasters (NAB) show was so very busy that only now has the dust settled enough for me to write anything. Unfortunately, I hadn’t the time to visit everyone I wanted to, but amid the madness I did manage to catch up with a few of the companies who make products that interest me. While there were some major announcements that grabbed a lot of attention, here are some of the smaller pieces that you may not have noticed.

One of the reasons the show was so hectic for me this year was that it came just as the short film I’ve been directing has been winding its way through post. Just before leaving for Vegas, I posted a two-minute teaser online to have something to show off while the overall 12-minute film is finished. The inestimable Brian Mulligan, who’s been contributing compositing to my film (he’s responsible for the dimensional doorway effect), was at the Autodesk booth showing how he created this and several other effects using Smoke 2013 on the Mac.

Brian Mulligan (right) showing how to make a burning doorway between dimensions in Smoke 2013

While Autodesk wasn’t showing anything brand spanking new just yet, a “technology collaboration” was announced between Autodesk and Blackmagic Design that should eventually be great news for folks using products from both of these companies. There’s plenty of synergy I’d like to see; time will tell.

Speaking of Blackmagic, the upcoming DaVinci Resolve 10 was the other reason I was so busy. They piled in so many new feature announcements that I couldn’t even cover them all in the 25-minute Supermeet 2013 presentation I gave.

The video shows most of the headline features, including:

  • a continued focus on application interoperability;
  • online-oriented editing features such as timeline audio tracks, 3-point editing, and a unified trim tool;
  • text generators;
  • integrated optical-flow processing of slow-motion speed effects;
  • a completely revamped windowing interface with bezier drawing, unlimited window support within one node, a new gradient window, and window naming;
  • all-new tools for splitting color channels in different color spaces for individual adjustment;
  • and support for OpenFX plugins, allowing Resolve’s capabilities to be expanded with whatever compatible plugins you want to use.

All of this just scratches the surface, however, and I’ll be demonstrating even more announced features at various events in the coming months (starting with an appearance at the BOSCPUG on May 29th). I’ll be demonstrating the new live on-set grading tools, more online editing features, the new optical-flow based noise reduction and motion blur features, and more.

Speaking of OpenFX plugins, GenArts announced version seven of their enviable Sapphire plugin collection. Anyone doing serious work in postproduction has either used or wanted to use these plugins, which are compatible with every major plugin format in use and are also available in the OpenFX format, meaning Resolve 10 users will have access to the phenomenal optical glows, lens flares, and video/film damage effects that you know and love.

In version 7, GenArts has added a new Beauty plugin for fast, targeted, edge-aware skin-tone smoothing; a new general-purpose Edge Aware Blur for blurring low-detail portions of an image while retaining edge detail; and an update to their glow filters that allows for the addition of animated atmospheric noise as part of the effect (providing an illusion of volume). In addition, they’ve improved their Lens Flare engine, undertaking a project to shoot real flares through a wide variety of popular and vintage lenses and rebuilding their flare elements library from terabytes of these scanned source images.

This year, I happened by the Eizo monitors stand, and noticed that they have a pair of LCD-based displays they’re hoping will appeal to the video postproduction crowd. The Eizo CG246 (a 24-inch LED edge-lit display) and CG276 (a 27-inch CCFL-backlit display) both feature DVI-D, 10-bit DisplayPort, and 10-bit HDMI inputs for convenient Rec.709 monitoring.

The Eizo CG276 27″ monitor

An additional feature of these monitors is a built-in Konica-Minolta colorimeter that pops up from the bottom bezel, and takes color measurements via built-in calibration software that can be invoked manually, or scheduled for routine automatic calibration.

The Eizo display’s built-in Konica-Minolta colorimeter

After the calibration routine has been completed, a convenient window can be summoned that shows how the calibrated result lines up with the designated target brightness, white point, and gamut, all built right into the display via its internal menus.

Eizo calibration report

In terms of gamut, I’m told they boast 100% of Rec. 709 and sRGB, 97% of Adobe RGB, and 92% of DCI P3, all of which are reasonable given their respective price points of $2400 (CG246) and $2700 (CG276). I’m told the black levels of these displays are respectable, although it was impossible to tell in the predictably wretched viewing conditions on the floor. There are trade-offs, though, as there are no built-in HD-SDI inputs available for more conventional facility installations. I’ve heard that Eizo has a great reputation among photographers, and I’d venture to say these look like excellent displays if you’re an editor or compositor, or if you primarily do grading for web video, with a bit of work for video output here and there.

One last note, in keeping with the general theme of 4K throughout the show floor, Eizo was showing a 4K prototype that’s being adapted from one of their high-resolution air traffic control displays (Eizo also makes displays for a variety of other niche markets), so one might hope that they could eventually come up with an affordable 4K display solution.

Speaking of display technology, Flanders Scientific came out with two sets of new displays. The color-critical CM series consists of the 17-inch CM171 ($3,295), the 24-inch CM240 ($4,995), and the 32-inch CM320TD ($5,495). The CM series has 10-bit panels that display native 1920×1080 video. Of these, the two that are probably of most interest to the colorist, due to their size, are the CM320TD and the CM240.

The 32-inch Flanders Scientific CM320TD

A 32-inch display is a reasonable size for a display with good off-axis viewing, in a medium-sized color grading suite, in which you’ll be working supervised with clients sitting behind you. Thus, it’s tempting to think that, at a mere $500 premium over the 24-inch CM240, the CM320TD is an easy choice. However, be aware that there are key differences between these two displays that you may or may not find important.

  • The CM320TD is capable of passive stereoscopic 3D, has a glossy screen, and has a higher 1,600:1 contrast ratio. Its panel is native 10-bit. It’s also an LED edge-lit display, for which no warm-up period is necessary for critical viewing. However, this results in a narrower gamut than the CM240; the CM320TD displays 100% of Rec.709, but it covers a smaller portion of DCI P3 than its 24-inch counterpart.
  • The CM240 is not stereo capable, has a matte screen, and has a 1,100:1 contrast ratio. It uses FRC to achieve 10-bit performance, which you’d likely never notice. Using CCFL fluorescent backlighting, its native gamut is wider than that of the CM320TD, covering approximately 97% of the DCI P3 color space; however, CCFL needs a warm-up period of approximately 30 minutes to fully stabilize.

Other than the size, if you care about glossy versus matte, or stereoscopic 3D-capable versus not, then you’ll have a decision to make. If you care about the difference in P3 gamut, that’s fair, but be aware that both monitors can be switched to the DCI-P3 standard using the proper color transform, white point, and gamma settings, giving you a preview of how this transformation will affect your image.
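
Incidentally, for anyone curious what that “proper color transform” actually is: in linear light it’s just a 3×3 matrix derived from the two gamuts’ primaries. Here’s a rough sketch of the derivation (my own illustration, assuming a shared D65 white point for simplicity; a true DCI projection setup also involves the DCI white point and 2.6 gamma, which this omits):

```python
# Derive the 3x3 matrix mapping linear Rec.709 RGB into P3 RGB.
# Chromaticity coordinates: (rx, ry, gx, gy, bx, by) per gamut.
REC709 = (0.640, 0.330, 0.300, 0.600, 0.150, 0.060)
P3 = (0.680, 0.320, 0.265, 0.690, 0.150, 0.060)
D65 = (0.3127, 0.3290)  # shared white point (a simplifying assumption)

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def invert3(m):
    # Classical adjugate / determinant inverse for a 3x3 matrix.
    (a, b, c), (d, e, f), (g, h, i) = m
    adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
           [f*g - d*i, a*i - c*g, c*d - a*f],
           [d*h - e*g, b*g - a*h, a*e - b*d]]
    det = a*adj[0][0] + b*adj[1][0] + c*adj[2][0]
    return [[x / det for x in row] for row in adj]

def rgb_to_xyz(prims, white):
    """Linear RGB -> XYZ matrix from primary/white chromaticities."""
    def col(x, y):  # chromaticity (x, y) -> XYZ column, Y normalized to 1
        return [x / y, 1.0, (1.0 - x - y) / y]
    cols = [col(prims[0], prims[1]), col(prims[2], prims[3]),
            col(prims[4], prims[5])]
    m = [[cols[j][i] for j in range(3)] for i in range(3)]
    # Scale each primary so that RGB (1, 1, 1) lands exactly on the white point.
    s = mat_vec(invert3(m), col(*white))
    return [[m[i][j] * s[j] for j in range(3)] for i in range(3)]

# XYZ is the common meeting ground: go 709 -> XYZ, then XYZ -> P3.
M_709_TO_P3 = mat_mul(invert3(rgb_to_xyz(P3, D65)), rgb_to_xyz(REC709, D65))
```

Since both gamuts here share the same blue primary and white point, pure 709 blue maps to pure P3 blue; the interesting action is in red and green, where 709’s less-saturated primaries land well inside the P3 gamut.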

Flanders also announced the new BM series of lower cost, LED edge-lit, 8-bit displays. The BM210 is a 21.5-inch model ($2,495), while the BM230 is a 23-inch model ($2,995). While not being marketed as color-critical displays, and having a lower 1,000:1 contrast ratio, these are nonetheless great-looking displays covering 100% of Rec.709, and will be right at home in any video village or editing suite.

It’s also worth mentioning that both the CM series and the BM series feature Flanders’ CFE2 Color Fidelity Engine, which allows for the use of two 64-sided LUTs, one for calibration, and a second one for applying “looks” in the field. CFE2 is also compatible with LUTs generated by LightSpace CMS, and in fact Flanders and Light Illusion have announced “LightSpace for FSI Monitors,” a lower-priced ($2,500) version of LightSpace CMS specifically for use with FSI monitors for calibration and LUT generation, facilitating a wide variety of workflows. Probes Flanders recommends include the Minolta CA-310 and the Klein K10-A (more on that later).

Incidentally, I did an interview with Larry Jordan on Digital Production BuZZ in which I erroneously mentioned the existence of HDMI input on the Flanders displays. Afterwards, I went back and chatted with Bram Desmet at Flanders as I couldn’t believe I had gotten that wrong. It turns out that, while there is not in fact an actual HDMI connector on these displays, their DVI connector is pin-compatible with HDMI.

The connections available on the back of the Flanders Scientific CM320TD 32" Display

The connections available on the back of the Flanders Scientific CM320TD 32″ Display

This means you can connect, for example, the micro HDMI output of the upcoming Blackmagic Pocket Cinema Camera to any of the Flanders displays via a simple adaptor. And I mention the Blackmagic Design camera for a reason—Flanders Scientific also added BMD-Log “standard” and “full” monitoring modes, so you can monitor a normalized image even while shooting using the film log setting of this family of cameras. This is in addition to the C-Log and S-Log modes the monitors already support. In conjunction with built-in video scopes and the new ability to display two separate video signals side by side via two simultaneously connected inputs, these are incredibly flexible displays for field use.

And by the way, that Blackmagic Pocket Cinema Camera is ridiculous; I can’t wait to get my hands on one. A Super 16-sized sensor shooting 1920 x 1080 video with a Micro Four Thirds lens mount on a pocketable body, recording compressed CinemaDNG raw media to affordable SD cards, for $995? Unbelievable.

Connecting the BMD Pocket Cinema Camera to a cinema zoom is a “Where’s Waldo”-esque exercise

After learning of LightSpace and Klein compatibility, I had to pay them a visit, too. The Klein K10-A is an improvement on the original Klein K10 colorimeter, which they’ve been kind enough to provide me for classes I’ve done on monitor calibration. The K10-A has been out for some time, but I finally got the chance to ask Klein president Luhr Jensen just what’s better about it. The K10-A boasts a three-times improvement in low-light sensitivity over the previous model, and a longer focal length that’s also appropriate for cinema applications where you’re measuring the screen. Like its predecessor, the K10-A is appropriate for measuring CCFL- and LED-backlit LCDs, CRTs, plasmas, DLP projectors, and OLEDs, so it’s an extremely versatile instrument.

The Klein K10-A colorimeter, taking a reading

LightSpace, from Light Illusion, wasn’t announcing anything specifically at NAB (they’ve been announcing new features all year), but after some prodding Light Illusion principal Steve Shaw did mention a just-released improvement for identifying probe-induced errors, specifically for darker readings that can be problematic even for high-end equipment. LightSpace is able to use statistical analysis to identify spurious out-of-trend data and average it out of the final result.
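
Light Illusion hasn’t published the specifics of that analysis, but the general idea of averaging out spurious, out-of-trend readings can be illustrated with a simple median-based outlier filter (a toy sketch of my own, not LightSpace’s actual algorithm):

```python
import statistics

def average_without_outliers(readings, k=3.0):
    """Average repeated probe readings, discarding out-of-trend values.

    Flags any reading more than k median-absolute-deviations away from
    the median of the batch, then averages only the survivors. This is
    just an illustration of the concept, not LightSpace's method.
    """
    med = statistics.median(readings)
    mad = statistics.median(abs(r - med) for r in readings) or 1e-9
    kept = [r for r in readings if abs(r - med) <= k * mad]
    return sum(kept) / len(kept)

# Five repeated near-black luminance readings; one is spuriously high.
print(average_without_outliers([0.051, 0.049, 0.050, 0.250, 0.052]))
```

The point of using the median rather than the mean is that a single wild low-light reading barely moves the trend estimate, so it can be recognized and dropped before the final average is taken.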

Getting back to grading, SGO was showing great new features in Mistika 7, including an all-new (to them) curves interface.

RGB curves are new to Mistika 7

This isn’t just limited to the RGB curves (shown above) and the Hue vs. Hue, Hue vs. Sat, and Hue vs. Luma curves often seen in other grading applications. Mistika also includes Luma vs. Luma, Sat vs. Sat, and Luma vs. Sat curves. For those of you looking for a professional grading environment that has a “Vibrance” control, appropriate use of the Sat vs. Sat curve opens the door to all that and more.

Mistika 7 also has Luma vs. Luma, Sat vs. Sat, and Sat vs. Luma curves

Another thing Mistika was showing off was an extensive suite of tools for improving qualified keys. As you can see below, shrink, grow, gaussian blur, de-speckle, fill-holes, and median blur filters can be applied to refine your qualified key. Additionally, qualifiers can now be combined using blend modes. But that’s not all…

New qualifier key filtering in Mistika 7

Taking this one step further, Mistika provides the ability to adjust the lift and gain of the qualified key, as well as a “Key Curve” that lets you adjust the contrast of your key in fantastically specific ways. This is a terrific level of control more typically seen in a compositing application.

Keys can now be adjusted with lift, gain, and a separate curve control in Mistika 7

Mistika 7 also adds an interface for assigning individual settings to multiple simultaneous outputs, in order to apply different transforms, LUTs, or other effects to each individual output. For example, this lets you apply a transform and LUT to 709 output sent to a conventional HD display, while also applying a separate LUT or other adjustment to XYZ output being sent to a 2K projector. The UI has room for nine different output definitions, which can be used for monitoring and also for rendered output if you’re creating multiple masters.

You can assign individual LUTs and transforms to multiple simultaneous outputs in Mistika 7

With additional support for XML import, integrated DCP creation, and ACES, this is a good update for Mistika-using colorists.

Last, but certainly not least, FilmLight had some great announcements, starting with the sexy new Slate control surface. At $12,000, this is a more affordable Baselight-specific control surface than the Blackboard range, and it looks like it pairs well with their more affordably priced Baselight ONE (dropped to $46,000 without external storage). With its compact size, it’s a good fit for smaller suites and on-location work, and it connects via either Ethernet or USB.

The new, compact Slate control surface

Like its big brother the Blackboard 2, the Slate has remappable buttons (66 of them) with names and icons that change as you change modes. Along with 12 rotary encoders, and sets of six remappable buttons above each of the three trackball/ring controls, this is a serious control surface with great feel and solid build quality.

The Slate has remappable buttons with updating displays

It’s worth noting that, if the Slate is too rich for your blood, the Baselight ONE (and the Baselight Editions NLE plugins) are compatible with the Tangent Element, Wave, and Avid Artist Color panels.

Getting back to Baselight, FilmLight is clearly interested in making the Baselight ONE a more affordable and attractive option for smaller facilities. The diskless version of the workstation comes in a 4U tower that’s been engineered for quiet operation, for installations lacking a machine room. With an internal 2TB SSD cache, it’s meant to connect to an external SAN or NAS of your choice (although FilmLight’s FLUXStore is always an option) via Fibre Channel or 10Gig-E. FilmLight old-schoolers can still get a rack-mounted Baselight ONE with a built-in 28TB or 56TB RAID, but the rackmount and tower versions have identical performance, so it’s purely a matter of form-factor convenience.

Baselight also sees some nice improvements targeted at the experienced colorist. For example, a new Result Blending control lets you mix back to the original image, or any other specific layer, at any point in your grade. This also works in conjunction with each layer’s blend mode and source control.


The Result Blending control mixes any other layer into the current one

A related feature lets you layer any image into your grade. For example, you can use this feature to add texture to your grade using a film-scan of grain. As an open-ended control, there are all manner of things you could use this tool for.

Layering an image into your grade to add texture

FilmLight has put some effort into streamlining stereoscopic 3D workflows, as well. Stereo clips now appear within a single timeline, rather than requiring you to manage two separate timelines. A new color-matching algorithm does per-pixel color matching across the entire frame, simplifying the hassles of matching both eyes before getting into your real grading. Geometry matching can now be accomplished using track points, to account for situations where you need to deal with a moving shot with a flexing rig. On top of all that, automatic stereo correspondence handling has been added for shapes that you’re using for secondary work.

The FilmLight FLIP portable on-set grading workstation has been updated to be thinner, and now has the capability of communicating with compatible cameras via WiFi to, for example, copy metadata from the FLIP to the Arri Alexa, to be written along with the rest of the recorded data. This is all part of their “FilmLight at every stage” initiative, using the BLG (BaseLight Grade) format to copy grade metadata from set, through editorial and compositing (using Baselight Editions plugins), and finally through to be available for finishing inside of one of the Baselight grading workstations.

There were plenty of other announcements from Avid (new Media Composer features), Sony (updated OLED studio displays with wider viewing angles), Assimilate (demoing Scratch 8), and much more, all of which I sadly missed. But that’s okay, it’ll just give me more to see at IBC in a few months.

“The Place Where You Live” Teaser

Nina Ashton, a professor of physics, is abducted by her counterpart from an alternate dimension—one in which her husband has died. As her doppelganger takes her place, Nina struggles to rebuild the machine and reopen the gateway between worlds in order to regain the life that should be hers.

I’m very pleased to present a preview of the first two minutes of “The Place Where You Live,” my new science fiction short that’s working its way through postproduction, shot by fantastical shot.

We’re aiming for a May release, and I couldn’t be more thrilled with how it’s turning out thanks to the fantastic cast and crew, as well as the incredible talents of designer and animator Brian Olson, compositors Brian Mulligan, Aaron Vasquez, Joel Osis, and Christopher Benitah, and 3D artist BJ West.

Kelly Pieklo has begun working on the sound design and mix, which can be heard along with John Rake’s wonderful score. I also need to thank Autodesk for their sponsorship of this project (the entire program is being edited and composited on Smoke 2013), as well as Splice in Minneapolis for their hands-on support.

If you want to learn more, I’ve blogged about the production, and I’ve also blogged about post.

And keep your eyes peeled for my next major announcement, once the whole 12 minute film is ready for viewing!

Creative Looks Video Training


By popular request, I’m pleased to announce that I’ve done another video training title for Ripple Training, “DaVinci Resolve – Creative Looks.”

Whereas my “DaVinci Resolve Core Training” title provides an 11-hour tour of DaVinci Resolve, from workflow through each of the many tools, this title is a focused 90-minute exploration of the creative process. I’ve long made a point of saying that the whole reason to use a dedicated grading application, rather than filters with preset looks, is that a grading application’s more sophisticated toolset makes it possible for you to be the plugin, crafting custom styles to match the content at hand.

Since the scenes in every project have unique visual characteristics, I show you how to approach each of a variety of oft-requested image stylizations in a variety of ways, in the process unlocking the flexibility of the DaVinci Resolve toolset to quickly customize an image’s look. I demonstrate different methods of evaluating how to apply a given look to a scene, and explore how you might combine techniques to create a style that’s completely your own.

So if you know the basics of DaVinci Resolve and want to get a look at some more things you can do, please click the link and check out some of the sample movies. $29.95 for 720p movies (playable on anything), or $39.99 for full 1080p playback (suitable for a 3rd or 4th generation iPad).

Postproducing “The Place Where You Live”

It’s been a while since I’ve posted an update about my upcoming science fiction short, “The Place Where You Live.” In the months since the initial two-and-a-half day shoot, I’ve edited the whole thing together, and after several passes of polish, the final runtime comes to 12 minutes, excluding end credits.

The edit so far

That’s a bit longer than the now quaintly optimistic 8 minutes I was estimating up front (the shooting draft of the script itself is 9 pages). Having shown it to a few early audience members, I can say it’s a tight 12 minutes that moves along briskly, and I’m pleased. The cut is always the last draft of the screenplay, and being both the writer and the editor really hammers that home.

Once I finished the first cut, composer John Rake started putting together the music cues, seven of them in all, so I had something to play off of the visuals as I started my fine cutting. It was a great process bouncing cuts and cues back and forth as we zeroed in on the final timings, and I’m really happy with the score, which has a retro nod to some of the synth-oriented scores I’ve loved from the ’70s and ’80s.

Once I had enough of the edit done to know where I needed to add some narrative space, and how I wanted to handle the scene transitions, I did one last half-day shoot with DP Eli Ljung and AC Erica Wollmering to get some second-unit style driving, hallway walking, and videophone conversation shots accomplished, which allowed me to finish the cut once and for all.

Shooting additional exteriors

As I cut in Smoke 2013, I took advantage of the integrated compositing to assemble rough comps, both to identify the timings of each of the VFX plates and to help out test viewers.

For example, there are a number of shots that visualize heads-up-display technology for all of the computer screens. While I didn’t want to take the time to comp in a whole UI, I did have the reverse half of a videophone conversation that I needed to edit together. This was easy to rough in with a picture-in-picture effect built in Smoke’s 3D transform environment, using the Axis timeline effect to quickly motion track the window to the background for a true “floating window” effect. This is a level of VFX detail I’m not used to having in an NLE, and it made it a lot easier for my test audiences to pay attention to the story, and not the effects-in-progress.

Of course, using my primitive compositing as a reference, I was then able (using Dropbox, and with compositing supervisor Marc-André Ferguson’s help) to organize and hand off all of the plates to the team of artists and compositors who are remotely creating the final VFX, including the actual HUD graphics (graphic design by Brian Olson at Splice Here, compositing by Aaron Vasquez).

In-progress HUD shot, graphics by Brian Olson, compositing by Aaron Vasquez

Another compositing artist, seasoned Smoke veteran Brian Mulligan, has been creating the “doorway between dimensions” effect used throughout the piece, as well as rocking some of the more challenging digital duplicate shots of main character Nina and her doppelganger from another reality.

The dimensional doorway, in progress, compositing by Brian Mulligan

Meanwhile, 3D artist BJ West has been creating the set extension for the lab using 3D Studio Max, building what he’s been referring to as “La Machine.” First, illustrator Ryan Beckwith finalized his concept art for the machine.

Ryan Beckwith’s initial design

With that finished, BJ has been elaborating on the initial design, making modifications as I’ve been coming up with changes based on the needs of editorial.

The unfinished, untextured lab model by BJ West

BJ’s challenge has been to fit the geometry of everything so that it matches the space of production designer Kaylynn Raschke’s lab set. My goal is for a seamless digital set extension, and we’ll be sweating the details.

Fitting the Model to Production Designer Kaylynn Raschke’s set

At the moment, all of these VFX shots (57 of ’em) are still in process, but I’m aiming to get one segment in particular finished to the point where I can preview it during interviews at NAB. The rest of it will be completed by the end of April (I’ve just handed off the audio tracks to sound editor Kelly Pieklo at Splice Here, so we’re on our way).

So that’s where the project is now. It’s far and away the most technically challenging piece I’ve ever directed, but all of the elements are coming together beautifully, and I can’t wait to share the final product once it’s done.

Pimp Your Vectorscope Ride

It’s time for a new Vectorscope graticule. The graticule (sometimes called a reticle) is the overlay that presents targets, reference lines, crosshairs, and other guides to help when interpreting the trace or graph of a Vectorscope’s analysis. Older hardware-based Vectorscopes had the graticule silkscreened onto a plastic overlay, so it was fixed and unchanging.

Graticule from a Hitachi Vectorscope

Graticule from a Hitachi Vectorscope

I’ve used several different hardware and software-based Vectorscopes over the years. Speaking as a colorist and not a broadcast engineer, they’re useful for comparing saturation levels between clips, for comparing the angle of hue of specific features appearing in multiple clips, for QC checking to make sure the signal is within tolerance, checking for overall errors in hue, and creatively they’re useful for evaluating how much color contrast you’ve got in your image, and in what direction the average color balance or dominant color temperature of the scene is leaning. Once you learn to read the graph of a Vectorscope, there’s a lot you can see.

Despite all this utility, the average HD vectorscope graticule in this day and age of graphically drawn software scopes shows nothing but boxes to indicate each of the target hues found in the 75% color bars test pattern, sometimes a second set of boxes for 100% bars, usually a small (tiny) crosshairs to indicate the very center of 0% saturation, and maybe an In-phase indication line (or skin tone indicator line, depending on who named it). Other than that, you’re looking at a big black area with a blob of a graph at the center that shows you all the data.

The Final Cut Pro X Vectorscope

In the screenshot above, I chose the Final Cut Pro X Vectorscope as an example not to pick on it, but because it’s actually a nice implementation (especially now that they put the center crosshairs back in, which had disappeared during a previous update). However, it’s also an example of a brand new piece of 2013 software that’s implementing a vectorscope graticule that wouldn’t look out of place in the eighties.

More to the point, the following picture shows the graticule presented by my rather expensive Harris VTM-4100 vectorscope (when analyzing an HD signal). It’s got boxes to show the angle of each hue, and it’s got crosshairs, but that’s pretty much it as far as any kind of scale goes.

The graticule presented by a Harris VTM-4100 vectorscope

I understand the idea of simplifying the visuals of the scope. Folks with software Vectorscopes are likely using them for creative and comparative analysis, rather than as tools for signal alignment. Furthermore, a lot of the lines and indications from the analog days just aren’t meaningful anymore when examining a digital signal. I’m sympathetic to the goal of freeing the colorist’s eye from the unnecessary clutter of legacy scopes.

However, what I find I’m missing within the sparse landscape of the modern vectorscope is some kind of scale of hue to aid in the process of signal comparison. While the little standard box targets do a fine job of suggesting the direction of each of the primary and secondary hues, I’ve long wished for a more concrete guide showing the actual vectors of hue for reference at a variety of intensities of saturation. Given the little boxes’ distance from the traces of most shots of average saturation, one needs to essentially “eyeball” a given trace’s position relative to the angles of absolute red, or blue, or yellow.
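
For context on why those angles are worth marking: a software Vectorscope plots each pixel’s two color-difference components, so every hue lands at a fixed angle on the graph. The following sketch is my own illustration of one common way to compute that angle, using Rec.601 coefficients; it is not taken from any shipping scope, and real products may use Rec.709 coefficients or the legacy analog scaling, which shifts every angle slightly.

```python
import math

# Sketch of how a software Vectorscope might place a color on its graph:
# the angle comes from the two color-difference components. Coefficients
# below are Rec.601; real scopes may use Rec.709 or the legacy analog
# (B-Y)/(R-Y) scaling, which shifts every angle slightly, so treat the
# numbers as illustrative rather than normative.

def vectorscope_point(r, g, b):
    """Return (angle in degrees, magnitude) for an RGB value in 0..1."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) / 1.772   # scaled B-Y difference (horizontal axis)
    cr = (r - y) / 1.402   # scaled R-Y difference (vertical axis)
    return math.degrees(math.atan2(cr, cb)) % 360.0, math.hypot(cb, cr)

# Complementary hues sit 180 degrees apart at the same magnitude, which
# is why one hue line pointing through the center of the scope actually
# references two hues at once (e.g. red and cyan).
red_angle, red_mag = vectorscope_point(0.75, 0.0, 0.0)
cyan_angle, cyan_mag = vectorscope_point(0.0, 0.75, 0.75)
```

With numbers like these in hand, comparing a trace against the absolute angle of red stops being a matter of eyeballing the distance to a faraway 75% box.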

So I designed my own graticule.

Hue Vectors Graticule

My design goal was to find a clean, uncluttered way of providing hue angle guidance at a variety of levels of saturation, while retaining the useful guideposts we’ve come to rely on from previous designs. The image up top presents all of the options in my current design, though part of my recommendation is that there be an option to turn off the dotted skin tone indicator and the user-adjustable reference line; the simplified result appears below. Overall, I’m trying to present a more visible scale of the crucial angles of hue, while still keeping the graticule simple and immediately comprehensible.

Hue Vectors Graticule Simplified

Here is an explanation of the features of this design.

Hue Vectors Explanation

  1. 75% intensity tic marks that correspond to the same angles and center points of the hue boxes in traditional Vectorscopes. The intersection of the inner tic marks and hue vector lines shows the dead center of each hue at 75% intensity. The outer tic marks indicate the boundary of 100% intensity.
  2. Long lines that indicate the vectors of each of the primary and secondary hues, stretching from 100% intensity to 22.5%. These lines thin and fall off towards the inner 22.5% boundary, leaving the center of the vectorscope clear to provide an uncluttered view of the subtle traces that describe the most common levels of saturation in most average images, while the pointed tips still provide a useful reference of each hue when evaluating these smaller traces. The objective of these long hue lines is to provide angular reference indicators that are more easily seen and remembered when comparing the traces of differently shaped graphs for multiple images. Furthermore, these long lines all point towards the critical center of the display, providing clear visual guidance without the need for full vertical and horizontal crosshairs.
  3. A center crosshairs that indicates the crucial 0% saturated center of the graph, oriented along the Red/Cyan and Yellow/Blue axes. This orientation provides a clear warm/cool visual reference that will be useful for beginners, and for tired professionals and their clients at 4am. The crosshairs should be big enough to be seen clearly, but small enough not to impede the detail found in smaller traces.
  4. An optional dotted line indicating the traditional In-phase/Skin Tone vector. Dotted to reinforce the idea that this indicator is a reference, and not a rule.
  5. A user-adjustable vector reference that can be positioned at any angle and percentage. If you need to match a product color precisely from shot to shot to shot, a user-adjustable indicator that can be put right where you need the trace to be is exceptionally handy.
  6. My original design presented simple letters to indicate the hue of each vector, but Mike Woodworth suggested color-coding the outer 100% tic marks as well to provide a more immediately identifiable UI. The colors shown in the above PNG are exaggerated for clarity; with darker colors, I don’t find the colored tics distracting, but this can always be adjusted along with the brightness of the rest of the graticule. How customizable to make these sorts of display issues (turn off colors separately from letters?) is an implementation choice.
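
To make the geometry of points 1 and 2 concrete, here is a small sketch of where each hue line and tic mark would fall on a unit-radius graticule. The function names are my own, for illustration only, and are not from ScopeBox or any other product:

```python
import math

# Geometry sketch for the long hue vector lines: each line runs along
# its hue's angle from 22.5% of full scale out to 100%, leaving the
# center of the scope clear, and the 75% tic mark crosses it at the
# traditional hue box center. Names and structure are illustrative,
# not taken from any shipping implementation.

def hue_vector_segment(angle_deg, inner_pct=22.5, outer_pct=100.0):
    """Endpoints (inner, outer) of one hue vector line, centered at (0, 0)."""
    a = math.radians(angle_deg)
    inner = (inner_pct / 100.0 * math.cos(a), inner_pct / 100.0 * math.sin(a))
    outer = (outer_pct / 100.0 * math.cos(a), outer_pct / 100.0 * math.sin(a))
    return inner, outer

def tic_point(angle_deg, pct=75.0):
    """Center of the tic mark that crosses a hue vector at a given intensity."""
    a = math.radians(angle_deg)
    return (pct / 100.0 * math.cos(a), pct / 100.0 * math.sin(a))
```

A drawing layer would then stroke each segment with the thinning taper described in point 2; the taper itself is a rendering detail left out here.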

When I first worked this design up, I was of course concerned that it would never see the light of day. I know how busy developers are, and I was afraid that this would end up being a very low priority.

However, when I mentioned what I was doing to Mike Woodworth, developer of ScopeBox, he was genuinely interested. After some conversation, I decided to offer Mike the design free for inclusion in ScopeBox first, with the understanding that upon its release, I would immediately publish the design here on my blog under a Creative Commons license, so that any developer or company who wants to incorporate this graticule into their product (even commercial products) is free to do so. I want to remove any barrier to adding this to a Vectorscope implementation, but I also want to make sure that anyone can include it. Really, I just want to be able to actually use it in real products.

Concurrent with this article’s publication, Mike is releasing ScopeBox 3.2, which offers this graticule as an option with all the features described here (choose Hue Vectors from the Grat Style pop-up menu), alongside the previously available graticule. I’ve been running a variety of clips through it, and have used it lightly for a scene or two of grading, and can honestly say that I find it useful.

The Hue Vector option as seen in ScopeBox 3.2

I’ll share one observation about the ScopeBox implementation. I find I like to open the ScopeBox preferences and darken the default “Graticles Color” setting to obtain a nice, subtle graticule that is lightly visible without calling too much attention to itself (as you can see in the image above). With a darker graticule color, I find the Grat Intensity slider moves along a more reasonable scale of light to dark for my taste. The magic of user preferences is that you can set this up however you like.

ScopeBox Preferences

If you’re a ScopeBox user, give this option a whirl and by all means let me know what you think. I invite comment.

And if you’re a software developer or manufacturer of video scope software or hardware and this looks interesting to you, please download the PNG files in this article as a starting point, or give me a call. I mean what I say: this is a Creative Commons-licensed design, and I’ll be happy for you to incorporate it into your product. I want to see it used, and I want it to be available to anyone who wants to implement it, free of the briar patch of patent restriction. This is my first foray into trying to make a design available in this way, so here are the provisos, courtesy of creativecommons.org:

Attribution — You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work). A simple mention in the attribution front-matter of your documentation is fine: “Hue Vectors graticule designed by Alexis Van Hurkman.”

Share Alike — If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one. I would like improvements to this design to ripple out to the wider community. This applies only to the Graticule design, not to your entire product. Just as existing graticule designs aren’t copyrighted, I don’t want useful alterations to become themselves restricted.

Waiver — Any of the above conditions can be waived if you get permission from me. Click the contact page and drop me some mail, or give me a call if you’ve already got my number. I’m always happy to chat.

And finally, here’s a link to all the creative commons legalese.

Creative Commons License
Hue vectors graticule by Alexis Van Hurkman is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.
Permissions beyond the scope of this license may be available at http://vanhurkman.com/wordpress/?p=2563.



Flagging Autolinked Clips in Resolve

Flagged Clip

I know; it’s not the most clever title. However, once I fully understood the implications of how flags and marks work in Resolve 9, I discovered a practical use for flags that hadn’t previously occurred to me.

To clarify, I wrote in the Resolve manual that flags are intended to highlight a whole clip, while markers let you highlight individual frames within a clip. This is true; you can only apply one flag of a particular color to a clip, but you can apply several markers of a color to different frames within that clip.

What I hadn’t thought to emphasize, however, is that when you flag a clip, you’re really flagging the source clip within the media pool (in other words, the clip that appears in the Master Timeline). This means that, if there are several clips in an edited timeline that all connect to a single source clip in the media pool, flagging one of these clips results in you flagging them all.

At first, I thought this was a nuisance, until I realized the following:

(a) While flags exhibit this behavior, markers are specific to a particular timecode, which makes them specific to a particular clip. So, if you want to mark just one clip for future reference, you’re better off using a marker.

(b) Flags follow the same rules as auto-linked clips in timelines using Remote grades.

This latter behavior is what leads to a valuable tip—you can use flags to quickly isolate every other clip in the timeline that’s auto-linked to the current clip. This gives you a way to deal with situations where you’re not sure how many other clips will be affected by a grade you’re about to make when you’re working with automatically linked clips and Remote versions.

The following example shows a timeline using Remote grades, where the currently selected clip is auto-linked to other clips in the timeline. This means that any changes you make to the grade of the current clip will automatically ripple to all other clips that exhibit the little orange arrow (to the right of the timecode above each thumbnail).

A timeline with auto-linked clips

A frequent criticism of this behavior is that it’s impossible to know, at a glance, just how many other clips to the right and left of the visible area of the timeline are automatically linked. In particular, it’s not uncommon for there to be a handful of auto-linked clips that require a different grade; for example, a section of interview after an exposure adjustment has been made.

Using flags, there’s a simple way of filtering just the auto-linked clips. First, right-click the thumbnail of one of the auto-linked clips, and add a flag using the Flags submenu. In this case, I’m adding a blue flag.

Adding a flag using the contextual menu

Now, each auto-linked clip will have a blue flag attached to it, even auto-linked clips outside of the currently visible area of the timeline.


Flagged thumbnails in the timeline

Now, using the Timeline Filtering pop-up menu, you can filter out everything but the blue-flagged clips.

Filtering only the blue flagged clips on the timeline

This results in a shortened timeline that shows every single clip that is auto-linked to the current one.

Only the auto-linked clips are filtered, using flags

At this point, you can spot check the other clips to make sure they match, and you’ll know for certain just how many other clips, to the front and to the rear of the current one, will be affected by the operation you’re about to perform.

When you’re finished, choose Show All Clips from the Timeline Filtering pop-up menu.

Showing all clips again

If you want to get rid of the flags, you can choose Clear All from the flags submenu of the thumbnail contextual menu.

Clearing all flags

Keep in mind that this technique also works when your timeline is using Local grades. In the following example, the timeline is set to Local grades, as shown by the (L) underneath each thumbnail. The procedure is the same.

Filtering clips using Local grades

This means that, even if you’re grading each clip individually, you can still take advantage of Resolve’s built-in auto-linking to sort groups of related clips in the timeline.

So there you go: one more use for flagging and timeline filtering to help you stay organized when grading long timelines. I’ve been working as 2nd colorist on a History Channel program, and using Remote grades has been a real time-saver since there are so many repeated sequences of clips. This technique has been helpful in letting me keep track of auto-linking in situations where I want to check how many clips will be affected by a particular adjustment.

Special thanks to producer Neil Gobioff and Director Shawn Paonessa for thumbnails from their short, “The Bedford Devil.”


Resolve to Be Creative

It’s a party in your mouth

If you read this blog and follow me on Twitter, chances are you’re a creative person. Chances are, even if you do client work most of the year, the “real” reason you’re in your particular line of postproduction or production work is that you want to do creative projects of your own, whether it’s writing, acting, filmmaking, animating, game design, whatever.

It’s so easy to lose oneself in the day-to-day need to do the work to pay the bills that let you live to do the work yet another day. Boy, do I know it.

However, there is one thing that I’ve learned, and this year has been a rather unexpected case in point given the strange and wonderful mix of things I’ve been doing. Short of saying something like “make time to be creative” (easy to say, difficult to do), I think the best advice I could give would be something even simpler.

Do creative things.

That’s all. Just commit to doing creative things in whatever spare time you have to eke out. Read a book that makes you think. Play a game that resembles something you’ve been interested in doing. Watch more movies that challenge you, and discuss them with your friends. And then, do something with what you’ve learned. Write a one page synopsis of a new story that was inspired by something. Get out your digital camera or cell phone and make a one-minute movie of whatever. Edit something new. Composite something clever, even if it’s just one shot.

Make something small.

Because what I’ve learned is that creativity snowballs. And the more little creative things you do, the easier it will be to start undertaking bigger things. And the more of anything you do, the easier and quicker new ideas will come, if you’re paying attention.

However, very little creative motivation comes from doing only what you’re told.

So that’s my New Year’s wish for all of you. If you have creative aspirations, I wish you the energy to start doing small, interesting projects that are easily accomplished in a limited amount of time. And to keep doing them as long as you are able, in the hope that soon you’ll discover your activities have snowballed into something as big and interesting as you wanted your creative life to be.

That’s all I got. And now, off to struggle practicing what I preach.

All the best,



Winners of the Sieben the Cat Contest

Sieben the Cat

Here it is, the Final Contest Update

Here are the results of my #siebenthecat contest on Twitter: Sieben weighs 14.8 lbs. Consequently, @bellafaccie and @camera_stooge are the winners, guessing 14.9 and 14.5 lbs respectively! (please use the contact page to email me your addresses) Thanks so much to everyone for participating, it was fun.

As a consolation prize to all 43 entrants of #siebenthecat, I got Ripple Training to offer 30% off of my Resolve 9 title before Jan 5th (there’s a good use of the 12 days of Christmas for you).

I don’t want to spam everyone, so if you entered my little contest and still want to get my Resolve 9 Core Training video title, go ahead and use my contact page to send me your email address and twitter name; if (and only if) you’re on my entrant’s list of 43 twitter names, I’ll email you the code so you can get the discount.

Happy Holidays!


Shooting “The Place Where You Live”

It Takes a Village to Make a Movie

It took a little while for the dust to settle, but I wanted to share a little more about the narrative shoot I directed a couple of weeks ago. It’s been five years since my last narrative project, the feature “Four Weeks, Four Hours,” was in front of audiences on the film festival circuit. In that time, I’ve been hard at work developing other projects, none of which got to the point where I was behind the camera.

Given that my last project was a character-driven survival story shot on location in brutal desert landscapes with a distinctly verité aesthetic, I’d been wanting to do something a little more effects and action-oriented for my next project. In casual conversation with folks at Autodesk during an event, I was lucky enough to pique their interest in putting up a bit of financing for a narrative short in exchange for use of some of the footage in a book I’m to write for Wiley about Smoke. Shocked at getting a yes after so much time laboring in solitude, I quickly shifted gears and got back into production mode on relatively short notice.

I kicked out the first draft of “The Place Where You Live” at the beginning of October. When I showed it around to my usual suspects, it elicited an immediate, enthusiastic response, which gave me the confidence to put it forth as the project of interest, and to start putting together a production plan even as I refined the script through October. Happily, it was good enough to attract some great Twin Cities talent as I started assembling the cast and crew, the first two members of which were cinematographer Bo Hakala and actress Dawn Krosnowski.

DP Bo Hakala and lead actress Dawn Krosnowski

The story is that of a professor of physics who is abducted by her counterpart from an alternate dimension, one in which her husband has died. Her doppelganger changes places with her in order to get her husband back again, leaving our hero struggling to rebuild the machine that opens the gateway between dimensions to regain the life that should be hers.


“Ninja Nina” shooting her counterpart with a tranquilizer dart
Note: All grades have preliminary offline color

The story demanded two different sets that intersect via a “dimensional doorway.” My original idea, of securing a large enough stage so we could actually build two physically intersecting sets, ended up being impractical, as I couldn’t afford the appropriate time with the necessary space. Searching for alternate solutions, I found another venue with two smaller stages that would suit the two sets I needed to create, but this would necessitate greenscreen compositing to join the two locations together. On the other hand, this more effects-driven, two-location approach would give me the freedom to do some crazier visuals, and the lower cost would give us more time to build the sets, so the answer was clear.

The final greenscreen lab set

Moreover, a cheaper stage would allow us to secure it for a longer time in order to take the art direction farther. My wife Kaylynn Raschke is a stylist and set decorator (as well as actress and playwright) with over twenty years of experience who did wonders on my last film, so I naturally drafted her to slip into Production Designer mode to create the “high-energy physics lab” and “college professor office” sets. Given her additional background in narrative, Kaylynn brings a story-driven approach to her styling that I really appreciate. She also has amazing ways to stretch the ridiculously low budgets I have for art department expenses.

Production Designer Kaylynn Raschke on the raw stage for the lab

Working with a crew of builders and assistants over a week and a half, she created a pair of very distinct spaces that looked fantastic on camera.

Production Designer Kaylynn Raschke with builders Mark Storm and Ken Raschke

While the sets were being built, I divided my time in late November and early December (with the shoot scheduled for Dec 8–10) among filling holes in the crew, keeping on top of the details and logistics of key props and equipment, and working through the previsualization that would be critical to directing the lead actress against both herself and her body double on two different sets within any single given scene.

Once the sets were underway, I modeled them with accurate dimensions in Google’s SketchUp 3D application, and then fit them together as they would be in the movie, as superimposed locations joined by a “doorway.” This model let me put together my shooting diagrams, and start figuring out how I wanted to use this dual-location approach in my storyboards.

Shooting diagram for the first abduction scene

Previs using 3D models in SketchUp proved invaluable in letting me and Bo keep all of the differently joined angles in mind as the lighting and camera crews set up each position. Given the spaghetti-like order of the shot list we were using to maximize time, the four passes I spent putting the boards together myself ended up making it much easier to keep my director’s mind on the ball as we did a shot from scene 2, then a reverse from scene 6 in a different costume, then another shot from scene 8 in yet another costume, and so on during the two days of shooting on our sets.

Previs storyboards for the abduction scene

The week before the shoot was also the week I found the body double, Emily Muyskens (special thanks to Moore Creative Talent in Minneapolis), who would substitute for the lead actress’s alternate character in closeups. With both actresses now known quantities, Kaylynn put on her wardrobe hat and outfitted both women with matching costumes for the various stages the character and her doppelganger go through in the story. As a director/colorist, I really appreciate Kaylynn’s ability to keep the color choices for both set and costume in mind, an important consideration when you’re trying to create an overall look for a project.

Double Emily Muyskens and Actress Dawn Krosnowski

Friday before the shoot was prelighting day (lighting and grip equipment was provided by Tasty Lighting Supply, with Michael Handley working the shoot as Gaffer). Much as I was struggling to keep the budget down, Bo wisely convinced me to spend the money and take the time on Friday to do some preliminary lighting setups. That also ended up being the day the big greenscreen was put up on the lab set, so it was a full afternoon, with Kaylynn working late to apply the final touches to the dressing of the lab set.

Prelighting and dressing the lab set

The giant green screen in the lab would facilitate a nice chunk of CG set extension to visualize “the machine,” the design of which had already been preliminarily worked out by New York-based illustrator Ryan Beckwith (with whom I’ve been laboring on Starship Detritus). Ryan does a lot of professional storyboarding for commercial spots and features as well, so he’s uniquely qualified to create set-friendly designs.

Ryan Beckwith’s preliminary sketch of “the machine”

Day 1 of the shoot, Saturday, was lab set day, which was challenging for a variety of reasons. Much of the script takes place in the lab, and scenes with the “dimensional doorway” all look into the office set, so the camera setups had to be rigorously measured and kept track of so they could be aligned with matching plates to be shot the following day.

Measuring each shot

An additional complication was the need to balance the camera setups I wanted against multiple costume changes for each setup, since the actress was playing against herself in all scenes, and the double only worked in closeups that framed her face out. Miki Sautbine did a great job quick-changing hair and makeup, with Kaylynn doing double duty as wardrobe.

Dawn, Emily, and Hair/Makeup Artist Miki Sautbine

Molly Katagari also served double-duty as both Script Supervisor and Assistant Director, having done a masterful job of taking my shot list and hammering it into a daily schedule that allowed us to pack an incredibly ambitious amount of work into two long days.

AC Chris Hadland and Script Sup/AD Molly Katagari

With my previs boards to guide us, the majority of the day saw us slogging through all of the finicky effects shots in the lab, with Bo pulling out an amazing assortment of creative lighting techniques to bring Kaylynn’s set to life. I’ve never worked in lighting or grip (PA-ing in San Francisco didn’t really count), so my notes to Bo came from my colorist background as we discussed pools of light, where I wanted areas of shadow, and the general characteristics of the visuals that I wanted. Bo then took all of that five steps further utilizing an impressive variety of lighting instruments. I love being surprised, and Bo constantly presented me with visual options that contributed loads to the atmosphere.

Creative lighting cues

We were shooting with a RED ONE MX. Not the most cutting edge choice, but when we discussed camera packages versus budget (it sucks to be both the director and the bean-counting producer at the same time), Bo suggested going for a more affordable camera in order to free up funds for better lenses. Having graded RED ONE MX footage for clients, I was comfortable with the choice, and definitely in favor of the clarity better glass would provide, in our case a set of Zeiss Super Speeds (the whole camera package was provided by CineMechanics).

Set dog Penny helps DP Bo Hakala operate

I made it clear that I liked to shoot wide, and I’m a fan of composing shots with subjects at multiple levels of depth. We were on the same page there, and Bo used a set of 14mm to 85mm primes to great effect.

Nina Powering Up “The Machine”
Note: All grades have preliminary offline color

On top of that, I wanted to do one or two tricky shots. I obtained a chroma-green face mask in order to shoot plates with Dawn’s double for doing a head replacement. Honestly, it’s a brief moment and I wasn’t sure how well the composite would work, so I made sure to cover the same moment of dialog and action in a more conventional way as a closeup against Emily the body double. However, if it works, this’ll be a cool shot.

Nina dragging her alternate

There were four lab scenes in particular: two involving effects, and two that were simply within the environment (albeit with the giant greenscreen that will be replaced with the CG “big physics machine”). As the shots trickled in, I kept saying as encouragingly as I could that things would speed up once we got out of the effects thicket, and it was nice to be proven right. After hours spent on collections of individual shots, we slid right through the non-effects scenes with some fantastic dolly work and nice overlapping angles of coverage at the end of the evening.

Working through the evening

I started directing on 16mm, and there were plenty of times when we were down to a single 400′ reel, and every shot had to count. Those experiences, and my background editing video in the 90s and being given boxes of tapes to wade through, drilled the need for specificity and economy into me, so that even now, with stock effectively “free” thanks to endlessly swappable digital memory cards, I tend to keep the number of takes I shoot as low as possible, and I don’t generally let the camera run. Sure as hell makes logging and editing easier later.

Nina preparing to act

That also requires a terrific camera crew (AC Chris Hadland, Key Grip Joe Gallup, with DP Bo Hakala operating were fantastic) and excellent talent, and I have to give a major shout out to Dawn Krosnowski for giving me the most one-take shots I’ve ever had on a project. As confusing as the shot ordering could be, she took my notes and nailed moment after moment. Between the excellent camera work and performances, after three days of shooting I ended up with only 10 reels and 400GB of RED .R3D media. I admit, I took a quiet joy in giving DIT Dmitry Futoryan little to do.

AC Chris Hadland

On day two we finished the lab shooting with the very last of the effects, some mirror shots that let us virtually pull the camera back to match its position in the office set, a very nifty bit of geometric trickery suggested by Bo weeks before. Since the green screen in the lab set was right against the wall, the only way to achieve the distance from camera to subject necessary for the matching office shot was to shoot an angled reflection, for which I had a 4′ by 7′ mirror custom-made the week before. It worked great, although matching the actress’ eyelines to material we’d not yet shot was challenging.
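
The trick rests on simple path-length arithmetic: shooting into a mirror folds the optical path, so the lens sees the subject at the sum of the two legs. A toy sketch of the idea, with purely hypothetical figures (not measurements from this shoot):

```python
# Folded optical path via a mirror: the camera-to-subject distance the
# lens "sees" is camera-to-mirror plus mirror-to-subject, letting a
# cramped set fake the longer throw needed to match the office plates.
# All figures below are hypothetical examples.

def apparent_throw(camera_to_mirror_ft, mirror_to_subject_ft):
    """Total optical path length from camera to subject, via the mirror."""
    return camera_to_mirror_ft + mirror_to_subject_ft

# e.g. 6 ft to the mirror and 9 ft from mirror to actress reads on
# camera like a 15 ft throw.
throw_ft = apparent_throw(6.0, 9.0)
```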

A mirror shot
Note: All grades have preliminary offline color

With that concluded, we moved into the Office set, which was being prelit and dressed even as we wrapped the lab. While the lab had had more shots, we were now aided by actual matching plates to reference and camera height/distance/angle measurements we could use; still, the finicky effects work consumed much of the day.

Nina crossing over

An added benefit of the Office set was more dialog. The Lab scenes involved few lines; most of the conversations take place in the office, with the character chatting with people on heads-up video displays (to be added in later, which is why she keeps pointing at the air). Production recordist Tom Colvin came equipped with a fantastic audio setup. I wanted to shoot dual-system sound to keep the camera free of entanglements, and Tom did a great job using combined wireless and boom mics as necessary to capture ideal sound in our unusual environments.

Production Sound Recordist Tom Colvin upper left

Production Sound Recordist Tom Colvin upper left

I was up front that our stages, not being sound stages, were pretty poor acoustic environments, not least because we were in a band rehearsal area, and while the owner was fantastic and encouraged everyone to take the weekend off, there were still one or two bands that popped up at awkward times. However, the bands in attendance were lovely about giving us some space, and after reviewing the audio, it’s astonishingly usable, not least because of Tom’s efforts, so I have high hopes the need for dialog replacement will be minimal. I absolutely love having a dedicated sound professional on set.

Nina Speaking With an Element Not Yet Composited

Nina speaking with an element not yet composited

Again, once we hit the “non-effects” scenes, things went much faster, but by then much of the day had been spent, and I was between a rock and a hard place: forcing a second long day on a crew that was already going the distance, and giving me low rates on top of that. I hate forcing long days, and I especially hate it when I feel like I’m not really paying for it. On the other hand, the producer in me needed to get those scenes, so as soon as I knew we’d be going long, I made sure to have a chat with the whole crew to see if there was anyone who wouldn’t be able to go along with the schedule.

Always Something to Discuss

There’s always something to discuss

Happily, everyone was incredibly professional and had a fantastic attitude, and a few pizzas made it possible to continue through the evening. However, not wanting to be a complete jackass, I got down with my boards between takes and started weeding out unnecessary coverage in order to make time without losing scenes (I’d cut the script down to the bare essentials in my fourth draft, so losing anything now would mean sacrificing story).

Editing Down the Storyboards

Editing down the storyboards

I managed to collapse some angles together and mentally edit a path through the remaining two scenes that would need only five shots, including close-up inserts necessary for my “Twilight Zone” style ending. What I didn’t know at the time (Kay mentioned it when I was reviewing my footage) was that, while I was cutting shots, the DP was stealing insert shots he knew I wanted in-between camera setups. I owe him a bottle of something nice.

Bo Hakala and Chris Hadland Grabbing Shots

Bo Hakala and Chris Hadland stealing shots

Wrapping the office scene and packing up the grip truck ended up going even later than I’d anticipated (note to part-time producers: factor in how long it takes to load the grip truck!). It didn’t help that it started snowing that morning and didn’t quit all day, which is one of the many benefits of shooting in Minnesota in the winter. Consequently, I punted the early call the following morning to let folks get a bit more sleep before the final on-location shoot.

Ah, the joys of shooting in Minnesota winters…

Ah, the joys of shooting in Minnesota winters…

The next morning, I got up early, checked my budget, grabbed my binder, and headed to the location to let the owner of the business we were shooting at know that the crew would be coming an hour later. Rewarding myself for getting through the effects scenes with a nice café mocha, I waited in front of the business. And waited. And waited. A few phone calls later, I was convinced that I somehow screwed the pooch with the owner on following up about the shoot, and gave some quick thought to how to change the scene to shoot in a new location. Luckily for me, Kaylynn’s and my house was just a few blocks away (in fact, the actresses for that day were already there), so as crew started showing up, I redirected everyone to the new location, and Kaylynn arranged for some heavy-duty snow-blowing to make a path in our yard through the previous day’s blizzard.

I later found out that the business owner’s phone had fallen into a snow drift the evening before, and he was subsequently snowed in that morning, so while he wanted to come and let us in, he was trapped at home and had no way of reaching me. I felt bad for him, but was pleased to learn that it wasn’t ultimately an oversight on my part! At any rate, by the time Bo arrived, I had my new plan of coverage for the scene, and added a few lines of changed dialog to account for the difference.

Actresses Lana Rosario, Dawn Krosnowski, and Director Alexis Van Hurkman

Actresses Lana Rosario, Dawn Krosnowski, and me

Honestly, the new location (our living room), once Kaylynn got done re-dressing it, ended up working much better for the project, lending a more intimate vibe to a scene between friends. And Bo and I were both relieved to be able to shoot a real scene with no effects nonsense dragging us down. Dawn was joined by Lana Rosario, and we wrapped the scene and the shoot after an honest half-day.

Nina Visiting a Friend

Nina visiting a friend
Note: All grades have preliminary offline color

After that, there was much packing and writing of checks, and I was left with ten reels of fantastic footage that are crying out for editing and compositing, and a set that needed striking the following week. Ah, the glamour of independent filmmaking.

I'm a Hands-On Director

I’m a hands-on director

So that was our shoot. There’s still work to be done, in particular I’ve got a handful of establishing shots with the actress that I need to arrange, and I had to punt the reverse part of her conversation with the husband to another day (probably after the holidays), but the bulk of the project is in the can, and now the postproduction begins.

You can be sure that I’ll announce loud and clear when the movie’s ready to be watched. Until then, the director’s work is never done…

Color Correction Handbook 2nd Edition: Grading theory and technique for any application.
Color Correction Look Book: Stylized and creative grading techniques for any application.

Return of the Holiday Contest…

My Cat, Sieben

My Cat, Sieben

The Holiday Season is upon us once again, so merry merry, everyone. It’s certainly been an eventful year for me, and judging from the folks I keep track of on Twitter, it seems to have been so for a lot of folks.

To commemorate this year’s activities, I thought it would be fun to run another contest, this time offering one of two free Ripple Training USB memory sticks with my brand new DaVinci Resolve 9 Core Training on it (as seen below) to two lucky winners. I’ve been hearing lots of compliments from those of you who’ve already picked this up (and I thank you), but if you haven’t yet had the chance, this is your opportunity to learn more about how to use DaVinci Resolve 9 via 11 hours of show and tell, from me, for free. As in beer.


All you have to do is be one of two people to guess how much my cat Sieben weighs, in either kilograms or pounds (international entries welcome), and tell me via Twitter using the appropriate hashtag (#siebenthecat). The two closest guesses by midnight CST, December 24th (Christmas Eve) will win. I’ll announce the weight and the two winners on December 25th. Here are the rules–

  1. All guesses must be submitted via Twitter (@hurkman) and must bear the hashtag #siebenthecat
  2. I mean it, all guesses must be submitted via Twitter, with the hashtag #siebenthecat
  3. No emails.
  4. All guesses must be submitted by midnight CST December 24th. Weighing is on Xmas day. No appeals.
  5. I’ll ship it to you for free, but no guarantees on how long it’ll take if you’re not in the US.
  6. If you don’t tweet using the hashtag, I’m not obligated to include your entry as I may not find it, so don’t forget #siebenthecat

Good luck, and best wishes on a peaceful, happy holiday of your choosing!


Xmas Day Update—Well that’s a big oops. I was just informed that we do not actually own a scale in the Hurkman/Raschke co-prosperity sphere. Alas, the great #siebenthecat weigh-in will have to be postponed until December 26th, when stores open for me to buy a scale. Announcements of the two winners will commence at that time. My sincerest apologies for being such an inadvertent scrooge, although this gives me the chance to find a scale with two decimal places of precision to accommodate the great specificity of the guesses. Happy Merry Whatever, my friends.

Color Correction Handbook 2nd Edition: Grading theory and technique for any application.
Color Correction Look Book: Stylized and creative grading techniques for any application.

A Modest Proposal

It saddens me to post in response to such a horrific occurrence as the mass shooting in Connecticut. However, this tragedy brings up issues of national importance that we all should be contemplating, so I thought it would be worth writing out my contribution to the discourse here.

I just directed a short that has a gun; a tranquilizer-dart-shooting air pistol, but a gun nonetheless. As we were shooting the scene the weekend before this tragedy, there was some discussion about the “cool factor” of a gun in a movie. I’m being honest. As outraged as I’ve been that yet another mass gun murder has taken place, I’m still putting guns on screen. Because they’re cool. Because people like them in these morality plays we create and call entertainment. And because it served my narrative purpose.

While I fall short of calling for an outright ban on all guns, I have been fairly vocal on the social networking sites where I participate that I believe gun ownership should be subject to increased regulation. I use car ownership as my model, as it requires the training and testing needed to obtain a license, and regulation to the extent that certain infractions (repeated drunk driving, for example) rescind your right to drive. Furthermore, you don’t get to drive anything you want; your license entitles you to drive a certain type of vehicle, and that vehicle needs to conform to certain standards to be considered “street legal.”

My rationale is this: I’m not looking to take cars away, I just want to make sure that people who own cars know how to use them, that the cars they’re driving are safe for the purpose of transportation, and that their license for use is dependent on their behavior with their vehicle. Vehicles are dangerous, and we want drivers to be responsible.

Likewise with guns. Guns have one purpose: to put a bullet into a target, whether that target is a clay pigeon, a deer, or a person. Their inherent danger, both to their owners (accidents are not unknown) and to others, demands, I believe, the same level of training and oversight as ownership of an automobile. And I don’t want to hear about voluntary measures from anyone who’s not willing to let driver training and testing for licensing be voluntary as well.

Furthermore, I don’t believe that firearms with automatic actions, assault weapons, and large magazines are necessary for general civilian use. If you need a magazine with 15 rounds to hunt deer, you might be doing it wrong. And you should not require a Glock with 26 rounds to fend off an assailant; if so, I sure as hell don’t want to be around while you’re doing it.

We used a dummy prop in my movie, and the logistics of the effects meant that our actress didn’t actually have to point the fake gun at anyone at all, just at a green screen. Consequently, little oversight was necessary beyond my taking the time to confirm the fakeness of the gun, check that its chamber was clear, and make sure the darts being handled were properly capped.

However, were it a production necessitating actual firearms firing actual blanks around other actual actors, we would have required an arms master, multiple levels of checks and safety drills, and our insurance premiums for the shoot would have gone up.

Which leads to my actual question—what if gun owners were required to carry the same kind of liability insurance that car owners and filmmakers need to carry?

Regardless of who you are and what your track record is, if you’re a responsible filmmaker, you should carry liability insurance. There are dozens of things that can go wrong, and all kinds of ways that your cast and crew members can be hurt. If you don’t carry insurance beyond what your rental agreements require, you are being irresponsible. My small shoot, three days of principal photography, ended up carrying a burden of $1900 worth of insurance overall. I gladly paid, because I wanted to make sure that if the unthinkable went wrong and anything bad happened, the right thing would be done.

Insurance is a free market solution. Let private insurers handle gun owner liability, setting premiums using the same metrics they use for other insurable activities. If you’re uninsurable, then you don’t get to have a gun, because you probably shouldn’t.

As far as I’m concerned, this would be a ratification of the personal responsibility borne by gun owners. If you’re responsible, your premiums will be low, and your motivation to properly store and lock up your dangerous tool will be high.

Furthermore, this insurance would provide a fund to deal with the inevitable accidents and homicides that we in the United States are declaring we’re willing to accept as a consequence of widespread gun ownership. Make no mistake: if we are willing to accept civilian ownership of guns, we are saying that we accept the accompanying accidents and homicides committed as a result.

If the general consensus remains that gun ownership is, in fact, worth that price, then we owe it to ourselves to create a comprehensive, non-optional culture of gun safety and oversight, and to limit the potential for mass fatalities by setting realistic limits on what can be owned.

Ranchers, hunters, and enthusiasts can continue to keep those guns deemed acceptable for civilian availability. There are many legitimate uses, and people who need them. But if you’re a gun owner who’s against gun controls, you need to own up to the hazards these tools present, and you especially need to own up to the fact that not everyone is as responsible in their gun ownership as you are.

Color Correction Handbook 2nd Edition: Grading theory and technique for any application.
Color Correction Look Book: Stylized and creative grading techniques for any application.

Back in the Saddle Again

Those of you who’ve been following me on Twitter have undoubtedly been noting my now constant stream of preproduction tweets, so I figured it was time to stop being such a tease and share a bit of what’s going on. I’m producing and directing a short subject I’ve written called “The Place Where You Live.” I’m not going to spoil the plot, as you’ll have an opportunity to see it soon enough, but here’s a clue…

The headshot shows our lead, Dawn Krosnowski, who’s heading up a cast of three in this tightly scripted “Twilight Zone-esque” excursion into speculative fiction. Kaylynn Raschke, art director extraordinaire, has been overseeing construction of our two sets for this project.

Building in the Office Set

We’re saving money by shooting in a non-traditional space, in this case, a pair of stages usually reserved for band rehearsals. The tradeoff is that they’re odd spaces, but the advantage is that Kaylynn has had more time to work on the sets, and we haven’t needed such a huge crew.

From This…

To This

The last time I had a set built was 1991; since then it’s been all on-location shooting for me. However, it’s nice to be back in a mode where things are more tightly controlled. While not a feature, this is easily the most ambitious project I’ve worked on, with dual set compositing, lots of VFX, and a terrific crew headed up by cinematographer Bo Hakala.

So that’s what all the fuss is about. After six years of being in development with two other projects, it’s nice to be shooting something again. I’ll be posting updates once principal photography is finished, and more as we shift from production into post, and the post specialist in me gets to rue decisions that the director in me just had to make on the set.

Color Correction Handbook 2nd Edition: Grading theory and technique for any application.
Color Correction Look Book: Stylized and creative grading techniques for any application.

New Video Training for DaVinci Resolve 9

First, I want to express my sincere gratitude to everyone who’s been waiting so patiently. It’s a tough balance between waiting for DaVinci to nail everything down so my lessons are as up-to-date as possible, and getting things started in time for Ripple to be able to add the kind of polish and organization that makes their titles something special.

So here’s the announcement—Ripple Training has completed and made available my DaVinci Resolve 9 Core Training. Right now.

This is a completely new title, dramatically expanded to almost 12 hours of instruction covering nearly all of Resolve 9 from media ingest and project conform, through every grading tool that’s available, to final output for a variety of workflows. In total, 65 exhaustively chapter-marked and organized movies span the gamut of techniques that DaVinci Resolve 9 makes possible.

All new lessons on multi-node grading

This title is appropriate for both DaVinci Resolve and DaVinci Resolve Lite (the free version). And since both versions of Resolve now allow the use of unlimited nodes, this new title has all new lessons on multi-node grading for both practical and creative effect.

If you’re brand-new to Resolve, this series will walk you through all of the basics, and move you seamlessly from importing projects to making the grade. If you’re an old Resolve hand, Ripple Training’s fantastic chapter-marker organization makes it easy to zero in on just the features you’re interested in, and with 12 hours of content to choose from, you’re bound to learn something new. It’s like having a video handbook for using Resolve.

Each movie has extensive chapter markers, making it easy to find the information you’re looking for

I’ve worked hard to offer information and instruction that’s concise, yet clear to users of all levels of experience, that also includes interesting tips and techniques that I employ in my own work. I’m thrilled with how this title has turned out, and have to give everyone at Ripple my sincere thanks for doing such a great job of adding editorial polish and visual clarity via zoom-ins, graphics, and call-outs, to make every control and adjustment immediately obvious at every step of the way.

This entire collection of 65 movies is available at two resolutions—

  • 720p download ($79.99) for optimal playback on the iPad 1, 2, and mini
  • 1080p download ($89.99) for playback on desktops, laptops, and the iPad 3 and 4
  • If downloading is a hassle, you can also order the whole set, at either resolution, on a nifty Ripple Training USB thumb drive ($99.99)

The audio and visual quality have been vastly improved over the prior version – the 1080 version in particular really looks fantastic – making for an extremely enjoyable watching experience.

We’re deliberately keeping this affordable, despite the expansion, so that no one has any excuse not to add this to their reference collection, alongside the other great training resources that are available. So please, check out Ripple Training’s DaVinci Resolve 9 Core Training page, which has example movies, stills, and a complete table of contents. If you’ve been wanting to learn Resolve, or have just upgraded to Resolve 9 and want to learn how to get more out of it, you can get fast answers and support this blog of mine by ordering your copy today.

Thank you for your support.

Color Correction Handbook 2nd Edition: Grading theory and technique for any application.
Color Correction Look Book: Stylized and creative grading techniques for any application.

What Display Should I Buy? An Opinion Piece…

Lest you later accuse me of false advertising, I’m admitting right now that I’m not going to tell you what you should buy. Rather, I’ll deliver an epistemological monologue about choosing monitors that may accidentally be helpful to you. I welcome comment, especially from those of you who are genuine color scientists. Just please be nice to each other.

I’m talking color critical displays, and gird your loins, because this post is a long one.

Before We Begin

Okay. I lied. I’ll do you a favor and give you a recommendation that will free you from needing to read the rest of this article. Pull together $30,000 and buy a Dolby PRM-4200. It’s big (42″), it has nice deep blacks thanks to its insane backlighting technology, stable color, excellent shadow detail, multi-standard support, and the full gamut of both Rec 709 and P3 for DCI work, and it’s a giant piece of equipment that will impress everyone who comes into your suite.

Also, the European Broadcasting Union (EBU) has declared that the Dolby is replacing CRT as the new standard reference monitor for their compression codec testing (May 2012, Post Magazine). If it’s good enough for the EBU, it’s good enough for me, and nobody is going to complain about working with one of these. No, I haven’t tested it. Yes, I’ve seen it in a controlled environment, and based on unscientific observation it looked fantastic. I’m electing to trust that Dolby and the EBU have done their jobs, so you can buy yourself out of this whole debate. I’d get one if I could.

On the other hand, if like me you don’t have $30K to spend on a display, read on.

Plusses and Minuses

My friends, we are making ourselves crazy. To a certain extent, this is necessary. We are professional colorists, and we require excellence in our display technologies. However, in the pursuit of excellence, we have been set to the task of achieving a pinnacle of perfection, while being given imperfect tools.

First off, I am not a color scientist. I am a pro user, but when it comes to display technology, I rely on expert information from a variety of sources. At the end of the day, I never recommend any display I haven’t personally seen, but I don’t claim to have rigorously tested the calibration profiles of different displays, other than my own.

What I usually do is what any educated consumer does: poll experts I know, check with company representatives when possible, and lurk the TIG and other online forums like crazy, collating every specific opinion I read from other colorists in the field about what they see, what they like, and why.

It seems to me that shopping for a color critical display is similar to being an audiophile—you can make yourself crazy searching for the Nth degree of perfection. Unlike audio technology, though, displays are subject to concrete standards: Rec 601, Rec 709, or P3 dictating the gamut, a gamma setting that depends on your specific application (more on that here), and a specific peak brightness measured in foot-lamberts (more on that here).

However, where once there was a single display technology, CRT, that served the entire industry, now they are legion. LCD with LED backlighting, LCD with fluorescent backlighting, different models of Plasma, OLED panels, LCOS projectors, and DLP projectors all vie for the dollars of colorists furnishing their grading suites.

Furthermore, the actual light-emitting technology is just one aspect of a display. Then there is the hardware and software that transforms an HD-SDI signal into a streaming wall of photons within the appropriate gamut, at the appropriate gamma and foot-lamberts. And for every single display in existence, this requires some manner of calibration.

Speaking as an end user, display calibration is a frustrating field to follow. The frustration is this: you’re told to adhere to a standard, and theoretically that’s cut and dried. Here are the numbers; make the display match the numbers. In practice, getting your display to match those numbers is a pretty challenging task, different probes and software do this differently, the results have minor deviations from one another, and then everyone gets to quibble about whose delta-E is smaller. (Crudely put, delta-E is the measured difference between what your display is showing and what it should be showing, during a controlled calibration procedure.)
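
Crudely put as that is, the arithmetic behind the original CIE76 formulation is genuinely simple: it’s just the straight-line distance between two colors in CIELAB space. Here’s a minimal sketch (the patch values are invented for illustration, and real calibration packages typically use fancier variants like CIEDE2000):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 delta-E: the Euclidean distance between two colors in
    CIELAB (L*, a*, b*) space. Later formulas such as CIE94 and
    CIEDE2000 weight the terms differently, but the idea is the same."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# A target patch vs. a hypothetical measured reading from a probe:
target = (50.0, 10.0, -10.0)
measured = (50.5, 9.0, -9.2)
print(round(delta_e_76(target, measured), 2))  # → 1.37
```

A perfectly calibrated display would measure a delta-E of zero on every patch; in practice, the quibbling is over how far above zero is acceptable.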

Consider probes. I’ve had various conversations with different folks, and read many articles, and the general consensus is that (a) hugely expensive probes are accurate, (b) even with expensive probes, different folks have different favorites, and (c) below a certain threshold ($10K) different probes are good at measuring different ranges of tonality. So, as with all things, if you’re not spending a shed-load of money, then virtually any decision you make is a compromise of some sort, and you won’t get perfection. You’ll get something somewhere close.

Now, consider calibration software, whether stand-alone (a package such as LightSpace, CineSpace, or Truelight), or built into a display like the Flanders Scientific or TVLogic at the low end, or the Dolby PRM-4200 at the high end. Each of these vendors will tout the advantages of their system, and the excellence of their color scientists, in crafting the ideal algorithms for mastering the heavy math of measuring and transforming accurate color. They all employ smart folks, so how do you choose?

And we haven’t even gotten to the displays themselves. Again, in the absence of a single technology that everyone in our industry can agree on, we’re stuck comparing different trade-offs. Here’s what I perceive at the moment:

  • LCD (fluorescent backlight)—Plusses: Inexpensive (relatively), stable color, wide choice of vendors offering different sizes; Minuses: narrower viewing angles than Plasma, and a black level that’s comparatively light (how this is perceived depends on viewing conditions)
  • OLED—Plusses: Deep black level, stable color, appealing image quality; Minuses: even narrower viewing angles than LCD, 24″ is the largest realistically available as of this writing, and reports of a perceptual “magenta tinge” with older observers are worrisome
  • Plasma—Plusses: Wide viewing angle, inexpensive at large sizes (55″+), deep black level; Minuses: less stable color requiring more frequent calibration than other technologies (still better than CRT), slight crushing of data in the very darkest shadows, and an Auto Brightness Level (ABL) circuit that modifies images with brightness above a certain threshold, which is worrisome
  • Projection—Plusses: Wide viewing angle, huge image size, stable color, unique image quality matching the theatrical experience, “budget” models from JVC are inexpensive relative to size; Minuses: unique image quality matching the theatrical experience (it will never pop like a self-illuminated display), DCP-compliant high-resolution models are expensive and require infrastructure (cooling, a booth, etc.), and you need more space than a simple display.

So there you go. No one relatively affordable display technology has everything you want. Period.

What does this mean? Are we all screwed? Do we lose sleep because, whatever decision we made, we’re wrong and the programs we’re grading are all catastrophically off by some obvious margin?

No. Of course not. Across the world, there are hundreds upon hundreds of grading suites and thousands of multi-purpose video postproduction suites that are using what have been represented as calibrated, color critical versions of every technology I’ve just described. And they’re all producing hundreds of thousands of hours of programming and entertainment, much of which you probably watch on cable, at film festivals, in theaters, on your computer screens, and on your portable devices.

In my case, I’m using Plasma, my trusty VT30 series Panasonic display. Yes, it has the ABL circuit, and on Steve Shaw’s advice I’ve taken special care to adjust the calibration patch size to avoid its effects when I use LightSpace to generate LUTs for it. Is this circuit kicking in during some portion of whatever program I grade? Probably, but to be honest I’ve never noticed an artifact during a session, it’s never impeded my decision making, and I’ve never had a client come back with a program I’ve delivered and say “what the hell did you give me, Hurkman!” And yes, I realize I’m missing some values of detail down at the bottom blacks, but again it’s nothing that’s stopped me from making creative decisions, and I’ve got a set of HD scopes to show me the data of my signal to make sure I know what’s there for QC purposes. I also have a CRT that I baby, but it’s got the old phosphors, so it’s not really a match, though it’s close.

When I decided to go plasma (and I can’t stress enough that this is a personal decision that should be based as much on your particular clientele and needs as on your budget), it was based on the following: I like the blacks of plasma, I like having a larger display so my clients can comfortably sit behind me, and I like the viewing angle, so that several clients can sit off axis (even in my home suite, it happens).

What’s Everybody Else Using?

My decision was also based on a certain critical mass that I perceived—as of 2010 I was aware of many high-end post houses that installed plasma in their grading suites as hero monitors, and I decided if it was good enough for them, then it’d be good enough for me.

And at the end of the day, I believe that’s a fair call. I want to trust my display, and I want my clients to trust my display. If I’ve got a technology that’s deployed elsewhere, that provides consistency and some confidence for all involved with the process. That was the advantage of the old Sony CRTs; everybody had one, so nobody had any doubt. It’s the same reason you see Genelec speakers in every editing suite you’ve ever been in. Sure, they’re great speakers, but it helps that no client is going to walk into a room and say, “what the hell kind of speakers are you using here, anyway?”

Time and technology march on. When I was in New York, my clients were a bunch of indie filmmakers, and I used a calibrated JVC DILA projector to give them the theatrical experience. Now I’ve got a Plasma I’ve been using for a year and a half, as I have a smaller room and am doing more commercial/broadcast types of work. If I were to get something new today, I’m not sure what I’d get; maybe another Plasma, or possibly an LCD-based display to save a little space in my current independent suite, it’s hard to say.

We Need A Tolerance

I’m going to go out on a limb and say that I believe our industry desperately needs a SMPTE-recommended tolerance for Rec 709 displays. In other words, yes, we know what the standard is for HD video, but how far can we acceptably deviate from this standard before the warning buzzer goes off and we end up getting dunked in the clown pool?

And no, I’m not completely insane. There is precedent in the specified tolerance for DCI-compliant reference projectors; go out and find SMPTE Recommended Practice document RP 431-2 (this is also discussed in Glenn Kennel’s excellent Color and Mastering for Digital Cinema).

There are actually two projector tolerances given: a more conservative tolerance for review rooms (that’s you), and a slightly more liberal tolerance for theaters (the audience). For review rooms, gamma is allowed to vary by ±2%, and color accuracy is allowed to be within ±4 delta-E. I imagine this is to accommodate some slight aging of the projector bulb, but the point is, if you’re within this tolerance, you’re good.
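
To see what those numbers mean in practice, here’s a toy pass/fail check; the ±2% and 4 delta-E limits are the real review-room figures quoted above, but the function and its example inputs are purely my own invention:

```python
def within_review_tolerance(measured_gamma, target_gamma, delta_e,
                            gamma_tol=0.02, max_delta_e=4.0):
    """Check a display measurement against the review-room tolerances:
    gamma within ±2% of the target, color error within 4 delta-E."""
    gamma_ok = abs(measured_gamma - target_gamma) <= gamma_tol * target_gamma
    return gamma_ok and delta_e <= max_delta_e

# A projector measuring gamma 2.63 against a 2.6 target, with 3.1 delta-E:
print(within_review_tolerance(2.63, 2.6, 3.1))  # → True
print(within_review_tolerance(2.8, 2.6, 3.1))   # → False (gamma is ~7.7% off)
```

The appeal of a documented tolerance is exactly this: the answer becomes a simple yes or no, rather than an endless argument about whose numbers are marginally better.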

Absolute perfection, for us independent post folks, is cost prohibitive. I would also argue that it’s demonstrably unnecessary for professional work. I say demonstrably because Sony CRT monitors are still widely in use in high-end color grading suites across the world, and these CRTs are not set up to display Rec 709 primaries. They use SMPTE-C phosphors, which have different primaries, so the resulting gamuts closely align but do not perfectly match; most folks will tell you that the reds appear subtly different when compared. Here’s a plot of the tri-stimulus primary values of Rec. 709 and SMPTE-C, compared.
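To put numbers on how close the two gamuts are, here’s a quick sketch comparing the published CIE 1931 xy chromaticity coordinates of each set of primaries. The red primary differs by 0.01 in both x and y, consistent with the subtle red difference people report:

```python
# Published CIE 1931 xy chromaticity coordinates of each primary
REC_709 = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}
SMPTE_C = {"R": (0.630, 0.340), "G": (0.310, 0.595), "B": (0.155, 0.070)}

for p in "RGB":
    dx = SMPTE_C[p][0] - REC_709[p][0]
    dy = SMPTE_C[p][1] - REC_709[p][1]
    print(f"{p}: dx={dx:+.3f}, dy={dy:+.3f}")
```

Every difference is 0.01 or less in chromaticity terms, which is why the two gamuts overlap so heavily in practice.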

However, they’re really close. And the truth is, if you review a program on a CRT, then output it to tape, get in your car, drive across town, get a sandwich, drive the rest of the way to the other post house, load the tape, sit in a different suite with a Rec 709 display, and watch the program, you probably won’t notice any difference. The variation is only really going to be apparent if the two displays are sitting side by side; we humans don’t have particularly good scene-specific color memory, all color is relative to the other colors in the scene, and the interior of the gamut is likely to be more consistent anyway. So hooray.

Why am I bringing this up? Because if suites at the highest levels of our industry continue to employ display technologies with demonstrable variation from the Rec 709 standard, in pipelines involving other displays that adhere more closely to that standard, then there’s already an acceptable tolerance being informally used. It’s just not being admitted to or documented.

I’m not saying we shouldn’t strive for accuracy. I’m simply suggesting that the tolerance or margin for error that’s pragmatically acceptable needs to be carefully considered, ratified, and documented, so that we can all stop debating endlessly about what we should buy. Instead, a given display is either within tolerance or not, with enough wiggle room to allow for realistically minimal perceived differences between different technologies.

At that point, if a display or display type is outside the tolerance, then it’s easily discarded. And if it’s within tolerance, then you’re good to go, free of fear. The very thought of this lowers my blood pressure considerably.


By the way, if there’s already an acceptable tolerance for reference Rec 709 displays and I just don’t know about it, please enlighten me. I would love for this to be something that’s already defined.


Color Correction Handbook 2nd Edition: Grading theory and technique for any application.
Color Correction Look Book: Stylized and creative grading techniques for any application.