American Society of Cinematographers Technology Committee Progress Report 2016

Digital Object Identifier 10.5594/JMI.2016.2592338
By Curtis Clark, ASC; David Reisner; Don Eklund; Bill Mandel; Michael Karagosian; Eric Rodli; Steve Schklair; Gary Demos; Gary Mandle; Greg Ciaccio; Lou Levinson; David Stump, ASC; Bill Bennett, ASC; David Morin; Michael Friend; W. Thomas Wall

ASC Technology Committee Officers

Chair: Curtis Clark, ASC
Vice-Chair: Richard Edlund, ASC
Vice-Chair: Steven Poster, ASC
Secretary: David Reisner, D-Cinema Consulting

Introduction
ASC Technology Committee Chair: Curtis Clark, ASC

Following on from our 2015 American Society of Cinematographers (ASC) Technology Committee progress report, we have been proactively engaged with several key technology developments shaping our motion imaging future. Prominent among these are high dynamic range (HDR), digital cinema laser projection, and wide color gamut (WCG) beyond both BT.709 and DCI P3. The rapid advance of HDR being deployed for ultra-high-definition (UHD) television (TV) consumer displays, including the proposed BT.2020 WCG, has raised urgent questions regarding standards-based implementation, which filmmakers need to support their creative intent and ensure consistent display quality across multiple content distribution platforms.

The release of the Academy Color Encoding System (ACES) 1.0 has encouraged wider industry adoption of this crucial standards-based color management system that supports filmmakers' creative use of WCG with HDR and defines an important new expanded creative canvas.

The following reports from our subcommittees cover in detail the crucial work being done to address the array of motion imaging technology developments that are impacting the art of filmmaking.

The ASC Technology Committee is guided by its primary mission to engage and influence motion imaging technology developments in ways that better serve and protect the filmmaker's creative intent and the role of the cinematographer in realizing a creative vision that best serves that intent.

I would like to thank all those who devote their time and expertise to support the mission of the ASC Technology Committee.

Secretary's Comment
ASC Technology Committee Secretary: David Reisner

For the past 15 years, our industries have continued to change at a remarkable rate, and the ASC Technology Committee has helped find, guide, defend, and expand artistic options through those changes.

The most significant recent issue is HDR imaging. I immediately changed from an HDR skeptic to an HDR believer when I participated in the first HDR color correction experiments for the Image Control Assessment Series (ICAS). Used judiciously, HDR let us much better express the intent of the original footage, both for bright sunlit scenes (e.g., out the window of the diner) and for dark scenes (e.g., the bicycle ride into the nighttime alley and the subsequent festivity).

HDR is often immediately noticeable and desired by viewers. Some forms are being delivered by over-the-top (OTT) TV providers, who sometimes also have some control of TV settings/behavior, and others by UHD Blu-ray. However, although the UHD Alliance and the International Telecommunication Union (ITU) have published some HDR formats (e.g., hybrid log-gamma [HLG] and Dolby's perceptual quantizer [PQ]), we do not have a clear, single, consistent agreement on what HDR will be when ultimately delivered to the viewer. HDR capabilities differ between individual models of display, which makes it particularly difficult to deliver the artistic intent reasonably consistently. Image reproduction adjustments for significantly different display capabilities are particularly dependent on viewing environment, something we are only beginning to account for. We clearly still need additional work on the handling of out-of-gamut and out-of-dynamic-range images on a display. BT.2020 defines a very good WCG that nicely encloses Pointer's gamut (all colors reflected by the surfaces of real objects), so it makes a good goal. However, we need to figure out an appropriate set of primaries that are less prone to metamerism and an economically reasonable way to illuminate those primaries. At this point, BT.2020 is "aspirational": it describes a goal, not what we can widely build and deliver today. Partly for that reason, the current generation of consumer displays has taken P3 (essentially movie film's color gamut and digital cinema's minimum gamut) as the de facto standard and reference for color performance. BT.2020 implementation is an issue for the new generation of TVs and will likely be an issue for any future wider-gamut cinema.

For everyone who needs to deal with the issues of HDR, I strongly recommend a careful reading of this year's Advanced Imaging Subcommittee Progress Report, by Gordon Sawyer Award winner Gary Demos, whose expertise in motion picture imaging is exceptional. Demos is doing some of the most sophisticated and advanced exploration of HDR imaging characteristics and perception. It is in no way a "casual" read, but it raises a number of practical questions that should be considered. The ASC Technology Committee is exploring the possibility of research, testing, and demonstration to increase understanding and to find and test solutions on some of those issues. If you master content, make displays, or are involved in content delivery, I recommend that you read Demos' report and think about it carefully.

The next several years will probably show as strong an emphasis on digital motion picture camera lenses as on the camera bodies and imagers themselves.

Computational imaging (multiple sensors plus computation, rather than a single sensor plus a lens) is going to show up very widely very soon, appearing in cell phones this year and next. It provides a different enough set of imaging choices that it is not clear when, or if, it will play a significant role in motion picture or TV entertainment imaging. Some versions of computational imaging fit right in with the revolution in cloud-based storage and processing that we are presently in. In addition, on the computer side of imaging, it is possible that an as-yet undeveloped application of artificial intelligence "Deep Learning," cloud storage, cloud computation, and a different slice through Big Data may actually hold our best promise for automatically creating and delivering HDR artistic intent for our wide range of devices and environments.

Our emphatic thanks to outgoing ASC president Richard Crudo, ASC, and new president Kees Van Oostrum, ASC, and to the ASC membership and the staff and team at the ASC. The ASC was formed 97 years ago to help industry experts work together as a team to produce exceptional imaging and tell exceptional stories. The modern ASC actively continues that tradition through the everyday work of its members, associates, and staff.

UHDTV Subcommittee
Chair: Don Eklund
Vice-Chair: Bill Mandel
Second Vice-Chair: David Reisner

It has been a busy year for the subcommittee members and for the industries involved in UHDTV, which can largely be attributed to HDR and WCG. All major manufacturers now have HDR-capable display devices in the market, with a range of content now available from UHD Blu-ray and OTT, and testing under way with cable and satellite networks. Early adopters and average consumers alike can now experience UHD with the resolution, color, and HDR capabilities that have been the subject of much work in standards developing organizations (SDOs) and other organizations.

While the performance of first-generation displays and content has often been impressive, there have been few, if any, commercial examples of "live" HDR, which will almost certainly be a driving force in the adoption of UHD. Broadcasters, multichannel video programming distributors, SDOs, trade organizations, and others have been focused on the challenge of finding the most effective means of making WCG and HDR available to consumers. Live content presents unique challenges, given the priority of servicing existing high-definition (HD) customers and supporting a forward-looking strategy that includes "4K" (3840 × 2160), HDR, and WCG.

Demonstrations of UHD leave little doubt that the effort being expended is worthwhile. While it represents an incremental change in multiple parameters, the result is more than that. For the first time, consumers can get an experience that looks "real" (at least when objects are not in motion). It should be noted that, for filmmakers, rendering a picture that looks like reality is not necessarily what they seek. At a recent event hosted by the ASC, a sample group of cinematographers was impressed by examples of what a high-performance HDR system can do, but none said it was essential for their next project.

The tools are available to create UHD content; however, despite published standards, they do not all perform equally or accurately. Cameras, grading tools, compression, and virtually every component of the production and post-production system need to be studied and tested to validate UHD performance. Consumer products also have performance problems, mostly related to signaling, but as of this writing, they are being addressed by their respective manufacturers.

The UHDTV subcommittee has maintained an open door for advocates to come discuss and present related technology, devices, and tools. Recently, Dolby briefed our subcommittee on their ICtCp encoding, intended as a replacement for nonconstant-luminance Y'CbCr with HDR and WCG signals. ICtCp is being considered for inclusion in ITU BT.2390 "HDR-TV," which specifies parameters for HDR-TV signals to be used for program production and international program exchange, using either perceptual quantizer (PQ) or HLG HDR signals.
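For readers unfamiliar with the encoding, the following sketch (Python/NumPy) follows the ICtCp construction published in ITU-R BT.2100: linear BT.2020 RGB is rotated into a cone-like LMS space, PQ-encoded, and rotated into one intensity and two chroma axes. It is a simplified illustration built from the public matrices, not Dolby's implementation; the sample pixel is arbitrary.

```python
import numpy as np

# Published ITU-R BT.2100 matrices for the PQ flavor of ICtCp (coefficients /4096).
RGB_TO_LMS = np.array([[1688, 2146,  262],
                       [ 683, 2951,  462],
                       [  99,  309, 3688]]) / 4096.0
LMS_TO_ICTCP = np.array([[ 2048,   2048,     0],
                         [ 6610, -13613,  7003],
                         [17933, -17390,  -543]]) / 4096.0

# SMPTE ST 2084 (PQ) constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(y):
    """Linear light (0..1, where 1.0 represents 10,000 nits) -> PQ signal."""
    y = np.clip(y, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def rgb_to_ictcp(rgb_linear):
    lms = RGB_TO_LMS @ np.asarray(rgb_linear, dtype=float)  # cone-like space
    return LMS_TO_ICTCP @ pq_encode(lms)                    # I (intensity), Ct, Cp

# A neutral gray (equal R=G=B) lands at Ct = Cp = 0, unlike nonconstant-luminance
# Y'CbCr, which leaks luminance changes into the color-difference channels.
print(rgb_to_ictcp([0.01, 0.01, 0.01]))
```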

The challenges of delivering an ideal UHD experience from the camera to the consumer's display are many. Distributing a format with the capability of UHD provides ample opportunity to break the "looks real" experience with any mistake. Conversely, when UHD parameters are well utilized, the results are outstanding and represent the opportunity to transform the industry and reset consumer expectations.

Input to the group is welcome, provided that it can be shared on a nonconfidential basis (contact [email protected]).

Next Generation Cinema Display Subcommittee
Co-chair: Michael Karagosian
Co-chair: Eric Rodli
Co-chair: Steve Schklair

The Next Generation Cinema Display subcommittee is concerned with satisfying the creative intent of filmmakers with the emergence of new technologies for projection light sources and displays in cinemas. The subcommittee has over 70 members, representing all major projection technology providers, the major studios, and the creative and technology leaders within the motion picture community. Our goal is to provide expert review and guidance on the presentation quality of the cinematic image that best supports the filmmaker's aesthetic intent, with the added objective of influencing the adoption of emerging technologies that improve image quality.

The group's scope has expanded to include all emerging display technologies for cinema, including systems utilizing red, green, and blue (RGB) laser and laser-phosphor illuminators. RGB laser-illuminated projection systems have the potential for higher light levels and a wider color gamut, whereas laser-phosphor illuminators target the broad market with a lower cost of ownership than xenon. We continue to monitor the direct-view light-emitting diode (LED) market, with its potential for future high-resolution, WCG, and HDR large-screen displays.

To gain a deeper understanding of the laser illuminator market, in 2015 the group issued a Request for Information (RFI) to the several manufacturers of laser illumination systems, seeking information and recommendations concerning factors that affect performance, including primary selection and achievable dynamic range. The RFI also requested observational information concerning the visual artifacts of metameric variability and speckle. The majority of manufacturers, however, were not prepared to respond in detail. The common sentiment is that the laser illuminator market is in an early stage of development, and suppliers and/or specifications remain in flux. The good news is that there is room for guidance.

A more recent focus for our group has been the development of a test protocol for evaluating projected images in terms of the characteristics of interest for next-generation cinema. This includes practical limits for deep blacks and peak whites, practical primaries for a wider color gamut, and contrast. "Practical limits" include not only limitations imposed by environment and technology but also the point at which diminishing returns are perceived from improved performance. One might think of this as a visual obstacle course for both projectors and audience. A study group is working on this task as this report is being written.

The first phase of evaluations will utilize test charts and copyrighted content toward the determination of critical parameters and a trial baseline for higher performance. A second phase is targeted, in which a revised test protocol will be produced utilizing noncopyrighted content, with the goal of widely sharing the test protocol and test materials for the benefit of technology providers and other stakeholders worldwide. Other work for the group includes a study of new-generation cost-efficient light sources to evaluate their ability to generate the full P3 color space.

Advanced Imaging Subcommittee
Chair: Gary Demos
Vice-Chair: Jim Fancher
Vice-Chair: Bill Mandel
Secretary: David Reisner

The work of the Advanced Imaging subcommittee has continued to support the work of other subcommittees, including the UHDTV and Professional Display subcommittees. The emphasis has been on system architecture issues, which consider the entire workflow from scene capture (or synthesis) to presentation display.

Our research on high-quality delivery of artistic intent with the varied set of current displays and formats, as well as on these new approaches, continues. We are working on ways to further describe and demonstrate some of these issues and effects for a broader viewing audience.

Self-Relative Characterization

A methodology has been developed in which self-relative percentage deviation (percentage of deviation, compared to the magnitude of the sample value) is utilized to determine the visibility of undesirable artifacts, such as image contouring and quantization noise. This methodology provides a quick and simple approach to evaluating system architecture components, individually or in combination. System weaknesses and bugs have been found easily by testing codecs and transfer characteristics, evaluating self-relative variations as a function of brightness and/or color (usually using RGB). Ultimately, the comparison can be checked against the perceptual threshold, although existing 10-bit HDR systems lie mostly (or entirely) above that threshold. Deviations that have a pattern, such as contour bands on smooth gradients, have inherently different visibility than noise thresholds and noise-related buildup due to coarse quantization. However, self-relative percentage deviation, using absolute measures as well as one or more multiples of the standard deviation, provides a directly useful characterization for all types of potentially visible deviations. The perceptual-threshold self-relative percentage has proven broadly useful for evaluating HDR deviations of all types. Note that the threshold self-relative percentage is about 1/4% across most of the HDR brightness range and for most colors (a little higher for blue), coming up from dark gray through the brightest scene elements. However, typical testing of current HDR systems shows deviations much higher than this threshold. Self-relative testing not only shows the comparison versus the perceptual threshold over the brightness range but also points the way toward future improvements to the system architecture (see Fig. 1).
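As a concrete illustration of the metric (a minimal sketch, not the subcommittee's actual test code), the following computes self-relative percentage deviation for a linear HDR ramp after a toy 10-bit round trip; the power-law transfer standing in for a real codec is an assumption for demonstration.

```python
import numpy as np

# Self-relative percentage deviation: the deviation of a processed sample,
# expressed as a percentage of the original sample's magnitude.
def self_relative_pct(original, processed, eps=1e-9):
    return 100.0 * (processed - original) / np.maximum(np.abs(original), eps)

# Toy round trip: quantize a linear HDR ramp to 10-bit code values through a
# placeholder power-law transfer (a stand-in for a real codec/transfer chain).
ramp = np.geomspace(0.01, 10000.0, 4096)            # nits, dark gray to peak
code = np.round(((ramp / 10000.0) ** 0.25) * 1023) / 1023
decoded = (code ** 4.0) * 10000.0

dev = np.abs(self_relative_pct(ramp, decoded))
print(f"max deviation {dev.max():.2f}%, mean {dev.mean():.3f}% "
      f"(vs. the ~1/4% visibility threshold noted above)")
```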

Transfer Characteristic Considerations

There has been some testing of the PQ curve and, to a lesser extent, some preliminary testing of the HLG curve as transfer characteristics. Both of these curves are anchored by specific mappings to radiometric brightness that are specified in terms of candelas per square meter (cd/m², also known as "nits") at D65 neutral. Both the PQ and HLG curves allow brightness to be sent to displays in which the image brightness can exceed the capability of the display. In this case, the change in image appearance and desaturation behavior for these brighter-than-the-display regions becomes undefined and can be implemented differently for each model of display. Such a concept differs from BT.709 (having BT.709 primaries and gamma 2.22/2.4) and D-Cinema (having minimum-gamut P3 primaries and gamma 2.6). In the BT.709 and D-Cinema specifications, the value of logical 1.0 (e.g., 1023/1023 or 4095/4095) is mapped to display maximum. Furthermore, in D-Cinema, the display maximum is set to 48 nits (14 fL).
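To make the absolute anchoring concrete, here is a minimal sketch of the PQ curve using the constants published in SMPTE ST 2084. It shows that, unlike BT.709 or D-Cinema, logical 1.0 corresponds to a fixed 10,000 nits rather than to whatever the display can produce.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants from the published specification.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance in cd/m^2 (0..10,000) -> nonlinear PQ signal (0..1)."""
    y = np.clip(np.asarray(nits, dtype=float) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def pq_decode(signal):
    """Nonlinear PQ signal (0..1) -> absolute luminance in cd/m^2."""
    e = np.asarray(signal, dtype=float) ** (1.0 / M2)
    return 10000.0 * (np.maximum(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

# Unlike BT.709 or D-Cinema, logical 1.0 is a fixed radiometric level:
print(pq_decode(1.0))    # 10000.0 nits, regardless of what a display can do
print(pq_encode(100.0))  # ~0.508: a 100 nit white sits near mid-signal
```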

Wide Gamut

In BT.709 and D-Cinema, the color behavior is specifically defined only to the BT.709 primaries, or to the P3 "minimum gamut" in D-Cinema. The D-Cinema X'Y'Z' encoding (International Commission on Illumination [CIE] 1931 XYZ "tristimulus" values to the inverse gamma 2.6) allows colors to be specified outside of the P3 minimum gamut. However, the display or projector behavior there is undefined. The BT.709 xvYCC extension also allowed colors beyond the BT.709 primaries. Thus, the behavior outside of P3 for D-Cinema, and outside of the BT.709 primaries for BT.709, is inherently device specific and not standardized.

This is also the situation when using the BT.2020 WCG primaries. However, the BT.709 primaries and P3 primaries are specified in terms of "correlated color temperature" using CIE 1931 chromaticities. Thus, there is an infinite number of spectra that can be used to make any color (including neutral) in BT.709 and D-Cinema. When BT.709 (and CCIR 601) content was viewed primarily on cathode ray tube (CRT) displays, the display emission spectra for RGB were similar, from mastering through distribution. In D-Cinema, if the projector used for mastering was made using the same technology as the final presentation, the same consistency of spectra existed. However, when changing projection technologies, particularly light sources, there is potential for individual and average deviation from intended colors. Similarly, as flat-panel alternatives have replaced CRTs in BT.709 presentation, the emission spectra vary with each display technology. The inherent matching precision of a color specified in CIE 1931 chromaticity (or tristimulus) terms is reduced when the emission spectra vary between projectors or displays. This is because CIE 1931 color-matching functions (or any color-matching functions) are approximate by their nature as a statistical average of color perception.

It should be noted that BT.2020 does not specify a minimum gamut as did D-Cinema. The implied intent is that chromaticities all the way to the spectrum locus are specifically allowed in any given master. However, as with brightness, there is no definition of the gamut boundary clip for displays having a gamut smaller than the entire BT.2020 gamut. Thus, any colors specified in an HDR/BT.2020 master that extend beyond a given display's gamut will have undefined colors (not matching the mastered colors). Gamut reduction is known to be difficult; thus, this is potentially problematic when using colors wider than a typical color gamut (e.g., BT.709 or P3).

FIGURE 1. Sigma_compare RGB round-trip example.

BT.2020 primaries are very specifically defined as monochromatic red at 630 nm, green at 532 nm, and blue at 467 nm. However, there is a second definition using the corresponding CIE 1931 chromaticities. Unless a given display uses narrow-spectrum RGB emission at these wavelengths (e.g., using lasers), there is the same reliance on CIE 1931 color-matching functions as with BT.709 and D-Cinema. Even if lasers are used, individual (or average) color perception is significantly variable for monochromatic primaries. In other words, all colors specified as CIE 1931 chromaticities, as well as the BT.2020 monochromatic primaries, have inherently reduced color accuracy (versus every display being similar, as was the case with the CRT). This reduced color accuracy is due to weakness in the 85-year-old CIE 1931 color-matching functions, as well as to individual variation in color sensing and perception. Note that the subtended angular size of a color also affects color perception (due to the retina's "yellow spot" at the center of focus). This is the reason that the CIE created the 1964 "supplementary standard colorimetric observer" for 10° as an alternative to the 2° CIE 1931 color-matching functions. Age also affects the color perceived. The CIE 170-1:2006 "cone fundamentals" further make color perception parametrically variable by age and subtended angle.
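The gamut-containment question can be made concrete with a small sketch that tests whether a mastered chromaticity lies inside a primary triangle, using the published CIE 1931 xy coordinates of the BT.709, P3, and BT.2020 primaries; the sample green value is arbitrary.

```python
# CIE 1931 xy chromaticities of the standard primary sets (published values).
GAMUTS = {
    "BT.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "P3":      [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def inside(xy, triangle):
    """True if chromaticity xy lies inside the triangle of three primaries."""
    x, y = xy
    signs = []
    for i in range(3):
        (x1, y1), (x2, y2) = triangle[i], triangle[(i + 1) % 3]
        signs.append((x2 - x1) * (y - y1) - (y2 - y1) * (x - x1))
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

# A saturated green that a BT.2020 master may legitimately contain:
green = (0.20, 0.75)
for name, tri in GAMUTS.items():
    print(f"{name:8s} contains {green}? {inside(green, tri)}")
# BT.2020 -> True; P3 and BT.709 -> False. On those displays, what is shown
# depends on each device's unstandardized gamut clip.
```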

In the past, we have done some testing of the behavior of CIE 1931 when used with the cinema color gamut, brightness, and viewing environment. Similar testing may be of value in the very different gamut, brightness, and viewing environment of modern consumer displays.

Dark Behavior in HDR Systems

At low brightness, a small subtracted constant is commonly used (e.g., in BT.1886) as a crude ambient black compensation. This subtracted constant should probably be removed in an HDR system architecture (e.g., above 250 nits), particularly in light of deep-black displays, such as organic LED (OLED) displays, being a portion of the intended distribution displays.
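For reference, a minimal sketch of the BT.1886 reference EOTF, built from the published annex formula, makes the role of that constant visible; the Lw and Lb values are illustrative.

```python
# BT.1886 reference EOTF: L = a * max(V + b, 0)**2.4, where the small
# constant b lifts the curve to meet the display's nonzero black level.
GAMMA = 2.4

def bt1886(v, lw=100.0, lb=0.1):
    """Video signal V (0..1) -> luminance in cd/m^2, for white Lw and black Lb."""
    k = lw ** (1 / GAMMA) - lb ** (1 / GAMMA)
    a = k ** GAMMA                       # overall gain
    b = lb ** (1 / GAMMA) / k            # the black-compensation constant
    return a * max(v + b, 0.0) ** GAMMA

print(bt1886(0.0))               # ~0.1: signal black lands on the display black
print(bt1886(0.0, lb=0.0005))    # near zero: on an OLED, b does almost nothing
```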

It should also be a consideration that dark behavior for bright displays is inherently different from dark presentation, such as D-Cinema, or even historic BT.709/601 presentation. In D-Cinema, the darkest regions of the image cross the boundary of low-light vision. In the eye, bright portions of an image are perceived in color by the "cones," but dark regions (or dark scenes) are perceived without color by the "rods." D-Cinema at 48 nits (14 fL) crosses the cone/rod boundary. Cone bright color vision is called "photopic"; rod dark vision without color is called "scotopic." Images that cross the boundary, such as D-Cinema, are called "mesopic." (Generalizations of how and when scotopic vision and "Hunt" effect issues contribute are difficult for the dynamically changing brightness of moving images.)

Rod vision becomes prominent below about 0.1 nits (although there is variation between individuals). For D-Cinema, this is a high black at about 1/500th of the maximum 48 nits (14 fL). The transition region between rod and cone (color) perception is gradual across the range of 0.1 to 10 nits, although rod vision is weak when color vision begins at approximately 0.1 nits. However, there is a correlation between absolute brightness and perceived "colorfulness," an increase in perceived color that extends from 0.1 nits all the way up to daylight brightness. Thus, a 12 nit (3.5 fL) 3D presentation will appear less colorful than a 48 nit (14 fL) presentation. Note that Laboratory Aim Density (LAD) at 10% of peak white in a 12 nit 3D presentation will be 1.2 nits (0.35 fL). Color perception certainly extends to brightness below LAD in this case, but with less colorfulness.

In the other direction, with each stop (factor of 2) of brightness increase, chromaticities will appear more colorful. This is called the Hunt colorfulness appearance effect (named after the color scientist R. W. G. Hunt, who characterized it). A simple experiment at 50, 100, 200, 400, 800, and 1600 nits with identical chromaticities demonstrates that this colorfulness is a substantial appearance effect.

In addition to bright colors becoming more colorful in appearance, dark colors gradually move from rod vision up into photopic color vision. The 1/500th of peak white for a 2000 nit display will now be at 4 nits (if shown with the same pixel value proportions and chromaticities with respect to peak white). Even with reshaping of the brightness appearance curve, dark regions remain photopic. Given that dark image regions have thus become photopic on HDR displays, the use of desaturating S-curve processing becomes problematic. The implication is that color mastering and distribution concepts from D-Cinema and BT.709 cannot be extended into HDR. Rather, significant perceived color in dark image regions is required. Desaturated colors in dark regions adapted from D-Cinema or BT.709 masters will have an unnaturally dull appearance when presented directly on HDR displays. Repurposing standard dynamic range (SDR) masters for HDR must consider that dark regions in HDR are photopic.
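The arithmetic behind this point is easy to verify; the peak luminances below are examples.

```python
# Where "1/500th of peak white" lands relative to the ~0.1 nit level at which
# rod (scotopic) vision becomes prominent.
ROD_THRESHOLD = 0.1  # nits, approximate, per the discussion above

for peak in (12, 48, 1000, 2000, 4000):   # 3D cinema, D-Cinema, HDR displays
    deep_black = peak / 500.0
    regime = "photopic (cone)" if deep_black > ROD_THRESHOLD else "near rod vision"
    print(f"{peak:5d} nit peak -> 1/500th = {deep_black:6.3f} nits ({regime})")
# 48/500 ~ 0.096 nits sits at the rod boundary, whereas 2000/500 = 4 nits is
# firmly photopic: the dark regions of HDR images remain in color vision.
```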

This same issue has implications when attempting to extend a common master between HDR and SDR.

Parametric Appearance Compensation

Appearance variation is a significant issue given the variation in the capabilities of displays as well as the variation in the lightness of the rooms in which the displays are viewed. The colorfulness effect (a function of absolute brightness) was already mentioned. In addition, ambient room brightness affects the perception of dark regions of an image. Essentially, a brighter room surround makes dark scene regions appear significantly darker. Failure to compensate for room-surround appearance effects can cause dark portions of scenes to become darker and almost muddy in appearance. Facial expressions can even become invisible when viewed in a bright surround, having been mastered in a dark surround (in which faces are perceived as being brighter).

Both LAD and midwhite are perceived most consistently if varied as a function of absolute brightness. In practice, it was convenient to combine the ambient surround and absolute brightness compensation into a single parameter. Brighter displays need a darker LAD and midwhite, and brighter surroundings need a brighter LAD and midwhite.
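As a loose illustration of such a combined parameter (an assumption for demonstration only; the report does not publish its formula), one might write:

```python
# Illustrative only: this merely encodes the two stated directions with
# made-up strengths. Brighter displays want a darker LAD/midwhite, and
# brighter surrounds want a brighter LAD/midwhite.
def lad_fraction(peak_nits, surround_nits,
                 ref_peak=100.0, ref_surround=5.0, strength=0.05):
    """Return LAD as a fraction of peak white (reference condition: 10%)."""
    lad = 0.10
    lad *= (ref_peak / peak_nits) ** strength          # darker on brighter displays
    lad *= (surround_nits / ref_surround) ** strength  # brighter in brighter rooms
    return lad

print(lad_fraction(100, 5))      # 0.100 at the reference condition
print(lad_fraction(2000, 5))     # < 0.10: bright display, darker LAD
print(lad_fraction(100, 100))    # > 0.10: bright surround, brighter LAD
```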

RGB Ratio Preservation

It was found to be a convenient enabler for our test HDR architecture to adjust brightness independent of chromaticity. For example, this allowed preservation of photopic vision in dark regions. It also allowed support for a wide variety of brightness ranges (both maximum brightness and dark range). By maintaining the relative ratios of R, G, and B for each pixel, the chromaticity remains constant. Note that this constancy is independent of color-matching functions if/when the color primaries are spectrally specified (e.g., as mastering display primary emission spectra for each of R, G, and B).
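A minimal sketch of this ratio-preserving adjustment, with an assumed luminance estimate and a placeholder tone curve, might look like the following:

```python
import numpy as np

# Ratio-preserving brightness adjustment: scale each pixel's linear R, G, B by
# one common gain so the R:G:B ratios (hence the chromaticity) are unchanged
# while luminance is remapped.
def adjust_brightness(rgb, tone_fn):
    rgb = np.asarray(rgb, dtype=float)
    y = rgb @ np.array([0.2126, 0.7152, 0.0722])   # example luminance estimate
    gain = tone_fn(y) / np.maximum(y, 1e-9)
    return rgb * gain[..., None]                   # same scalar per pixel

pixels = np.array([[0.8, 0.2, 0.1], [0.05, 0.04, 0.10]])
out = adjust_brightness(pixels, lambda y: 0.5 * y)  # simple half-brightness
print(out / pixels)  # identical gain within each pixel: chromaticity constant
```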

The one exception to chromaticity preservation is colorfulness compensation, which must vary saturation to achieve constant perceived colorfulness. Applying this as a delta to an otherwise constant-chromaticity parametric architecture yielded a simple and effective means of unifying a variety of displays to approach a common mastered appearance. At the brightness that matches the mastering brightness, however, the chromaticity is unchanged, representing the mastered intent for colorfulness at that specific brightness.

The use of chromaticity preservation has the practical benefit of easier system calibration. Spectroradiometers and chromaticity meters become direct tools for calibrating and verifying imaging system components.

Where Information Resides

It is necessary to consider where the information related to parametric appearance compensation resides. The absolute brightness of a display is a function of the specific display type and its settings. The ambient surround is known only at a given display during viewing (and it will typically vary with the time of day). The implication is that appearance compensation requires that the signal received at each display be in a neutral form appropriate for such appearance adjustments. Furthermore, each display must algorithmically compensate using those parameters that can be determined only at that specific display.

Note that some HDR system architectures do not take into account this issue of where appearance compensation parametric information is to be found.

Mastered Appearance

In working on the sample parametric HDR system, an end-to-end camera-to-display appearance model was experimentally utilized. This is a type of optical-to-optical transfer function with aesthetic appearance adjustment. It was found that such an automatic appearance adjustment could yield a master in which scene grading adjustments were made with respect to a neutral grade, with no appearance bias. This astounding discovery, although preliminary, suggests that it might be possible to have a neutral scene-referred appearance as input to grading. This concept could possibly be further extended to synthetic (e.g., computer-generated) scenes, as well as composite scenes. Although there are many architectural components in an HDR system that is scene referred, such a system naturally extends the natural scene appearance to a range of displays. For example, even if the mastering display has limited range, the grading can include natural scene information (from cameras or synthetic scenes) beyond the mastering display's range. Although currently mostly untested, there may be application to extension beyond a mastering display's gamut as well.

It should also be prominently noted that the chromaticities defined in the master form a natural archival record of the color appearance, although the dynamic range itself is intended to vary with the ranges of presentation displays. Together with a sufficient characterization of the mastering display, an archival HDR master becomes thoroughly defined by preserving chromaticities from the master within the intended system architecture.

Looking Forward

It can be seen from this description that there has been good progress, but substantial architectural exploration and refinement remain. Our work suggests that the mastered appearance needs to be parametrically extended to provide that same appearance on diverse displays (and in their environments), to the degree that this is possible. This suggests that such parametric appearance adjustment, utilizing information residing exclusively at each presentation display, is a necessary part of an HDR architecture. The parametric algorithms utilized to accomplish this should perhaps be considered for standardization, since correct HDR appearance requires not only the correct distribution of the HDR master but also the correct parametric adjustment of that master to yield the intent of the mastered appearance.

We look forward to doing additional research and testing in these areas. We are also working on providing clear demonstrations of some of these effects and behaviors to help people working in these areas understand the effects of their imaging and equipment decisions.

Professional Display Subcommittee
Chair: Jim Fancher
Vice-Chair: Gary Mandle

The subcommittee has been supporting several activities involving both UHD and HDR display. In particular, the monitor subcommittee has supported demonstrations organized by the UHD subcommittee.

The monitor subcommittee has been investigating post-production trends with regard to newer technologies used for finishing and quality control. These trends show a rapid movement toward HDR production and the expansion of HDR displays. In particular, there is a trend toward using the SMPTE ST 2084 electro-optical transfer function, and there is increasing activity in 4K. Both liquid crystal display (LCD) and OLED displays are being used.

The subcommittee is in the planning stage of two separate future projects. The first project has identified the need to investigate the performance of some cell phone and tablet devices to determine their image accuracy. It is becoming common practice for directors and directors of photography (DPs) to use these types of devices for color reference and general imaging decisions during production. The subcommittee would like to investigate and measure these devices to see what their performance is on average, and to draft a document that would help operators understand the performance level to be expected from such a display.

The second project that the subcommittee is planning is drafting a document describing a method for an operator to determine whether a display is performing to the level required for the production. The method described would use a minimum of tools, so that anyone with knowledge of display calibration could quickly judge the accuracy of the display in the field and make adjustments if the display is not calibrated.

Motion Imaging Workflow Subcommittee
Chair: Greg Ciaccio
Vice-Chair: Tim Kang
Vice-Chair: George Joblove
Current Subcommittee Focus: ACES

For the last couple of years, the ASC Technology Committee's Motion Imaging Workflow subcommittee has continued to focus on helping to educate and guide industry professionals on Academy Color Encoding System (ACES) benefits, in parallel with the efforts of the Academy of Motion Picture Arts and Sciences (AMPAS) Science and Technology Council.

The subcommittee is composed of key individuals in a variety of positions involved in production and post-production, who provide valuable real-world feedback. Frequently, prominent cinematographers attend and contribute fresh perspectives.

ACES v1.0 was introduced at the end of 2014, and since then, a significant number of productions have used ACES. For a partial list, see www.shotonwhat.com/aces.

Our subcommittee continues to work in conjunction with the AMPAS Science and Technology Council and has created a clear and concise definition of ACES:

The Academy Color Encoding System (ACES) is becoming the industry standard for managing color throughout the life cycle of a motion picture or television production. From image capture through editing, VFX, mastering, public presentation, archiving, and future remastering, ACES ensures a consistent color experience that preserves the filmmaker's creative vision.

With so many new imaging advances being introduced concurrently (increases in spatial resolution, dynamic range, color gamut, etc.), it is vital to faithfully process and preserve creative intent by making sure that no bits are lost along the way. This is particularly important now that interest in HDR imagery has taken center stage, requiring a standard that can not only accommodate the extra range needed but also more easily produce the numerous versions needed for new and legacy sets of varying brightness capabilities.

Current subcommittee discussions include the creation of a section of the ASC site highlighting ACES benefits, including support documents and links to ACES-supporting manufacturer resources to help acclimate end users to the new standard. In addition, a list of ACES Product Partners is provided on the AMPAS site, shown below.

As ACES user experiences are shared within our industry, the practical benefits are being realized. Presenters and panelists at various trade events, such as the Hollywood Professional Association (HPA) Tech Retreat and CineGear, have presented real ACES benefits to attendees. At least one major studio has expressed great interest in integrating ACES into its production and post-production pipelines, as the benefits of ACES were realized in cost and time savings as well as in archiving.

More information regarding ACES is available at http://www.oscars.org/aces.


Digital Intermediate Subcommittee
Chair: Lou Levinson
Vice-Chair: Joshua Pines
Secretary: David Reisner

This is a timely moment for our subcommittee to report. Although we have been quiescent as a subcommittee for many reasons, it has become obvious to those of us privileged to lead this group that we can no longer rest on our laurels and need to become more active.

There are still minor issues with the ASC Color Decision List (CDL), either unresolved or having arisen due to more recent developments that were not anticipated at the time of its creation. Some issues have arisen using the ASC CDL in an ACES or wide-gamut context. The performance of the saturation operator out near the edges of wide-gamut color spaces needs looking into. The fix could be as simple as fixing the math a bit, or it may require a more involved solution. We are actively investigating our options.
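For context, the published ASC CDL operators (per-channel slope-offset-power, plus Rec.709-weighted saturation) are sketched below with the kind of wide-gamut sample that exposes the problem; the handling of negative values here is an illustrative assumption, since behavior outside [0,1] is exactly what is at issue, and the sample color is arbitrary.

```python
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])  # weights used by CDL saturation

def cdl_sop(rgb, slope=1.0, offset=0.0, power=1.0):
    """ASC CDL slope-offset-power, per channel. The signed power used here for
    negative values is an illustrative choice, not part of the specification."""
    v = np.asarray(rgb, dtype=float) * slope + offset
    return np.sign(v) * np.abs(v) ** power

def cdl_saturation(rgb, sat):
    """ASC CDL saturation: mix each channel toward Rec.709-weighted luma."""
    rgb = np.asarray(rgb, dtype=float)
    luma = rgb @ REC709_LUMA
    return luma + sat * (rgb - luma)

# A saturated wide-gamut color can carry a negative component when expressed
# in a smaller working space; raising saturation drives it further negative.
wide_green = np.array([-0.05, 0.90, 0.10])
print(cdl_saturation(wide_green, 1.4))   # R goes from -0.05 to about -0.33
```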

A topic that fell by the wayside but is now being discussed again is a way to allow for more complex user notes/metadata: nothing that requires action in the way of processing the images, but tools to help keep Leon Silverman's snowflakes off the driveway. There were a number of solutions proposed, but the current needs will require a solution of more complexity than we had been planning on. We are confident that we can keep the solutions inside the bounds of the CDL's simple but powerful philosophy.

An issue that seems to be moving to the fore is HDR imaging. There seems to be a general consensus that, while we all want to know about it and use it, there is no good, clear, universal definition of what it is and how it should be created and designed. We have proposed that all the relevant subcommittees of the ASC Technology Committee get together for an "HDR Fest" meeting and try to come to some conclusions about what to recommend to a community that is already going ahead and working in the HDR space, even if they do not know what it is. Among the topics could be the following:

- How are we defining an HDR image?
- What is the target or targets for a reference display?
- How do we define a reference display environment?
- How do we deal with HDR images on set?
- How do we exchange HDR images during content creation?
- How do we deliver HDR images to studios/content creators?
- What economic and studio political issues arise?
- What can we recommend regarding delivery to the consumer to yield the best and most accurate images possible on multiple platforms with diverse characteristics?

There is chaos and misinformation out there, and our subcommittee, with the entire ASC Technology Committee, would be a perfect place for the community to come for clarity. We have the knowledge and experience base in these matters, as well as a highly regarded, one-step-removed objectivity that this space seems to cry out for. It was once said about an ex-president who still had some eligibility left to serve: "He's tanned, he's rested, he's ready." While probably neither tanned nor rested, we are ready to once again help make content creation endeavors more predictable and to dedicate ourselves to helping creative authors both in the creation process and in what may come after as deliverables and archive.

Our thanks to the ASC for its support, as well as our best wishes for everyone going forward from here.

To get the current ASC CDL specification, send an e-mail to [email protected]; an autoresponder will send terms and instructions. For additional information about the ASC CDL or the Digital Intermediate subcommittee, contact Joshua Pines at [email protected], Lou Levinson at [email protected], or David Reisner at [email protected].

Camera Subcommittee
Chair: David Stump, ASC
Vice-Chair: Richard Edlund, ASC
Vice-Chair: Bill Bennett, ASC

The ASC Camera Subcommittee engaged in a number of community activities this year.

Metadata

We have been interacting with ARRI, both in Los Angeles and in Munich, to revisit camera metadata, particularly in the realm of lens data. There is re-emerging interest at numerous camera manufacturers, lens companies, and software companies in moving forward to unify metadata standards in cinematography, particularly lens data for inclusion in image files. Accordingly, we have supplied spreadsheets of metadata fields that the subcommittee previously vetted for consideration in designing an eventual standard.

Panasonic VariCam LT

On 5 April 2016, Panasonic visited the Clubhouse to introduce their new VariCam 35 LT camera. It is a lighter, smaller version of their VariCam 35 camera, with a Super 35mm 4K sensor, weighing 6 lb and capable of in-camera 4K recording with simultaneous proxies. It has the same dual ISO capability of 800 and 5000 as the larger version of the camera.

Panavision DXL

Panavision introduced their new DXL cinema camera at the ASC Clubhouse on 1 June 2016. Key specifications:

- Sensor: 16-bit, 35.5-megapixel complementary metal oxide semiconductor, provided by Red
- Digital resolution: 8192 × 4320
- Sensor size: slightly larger than VistaVision film; format 40.96 mm × 21.60 mm; diagonal 46.31 mm
- Maximum frame rate: 60 frames/sec at 8K full frame (8192 × 4320), 75 frames/sec at 8K 2.4:1 (8192 × 3456)
- Recording codec: 8K RedRAW with simultaneous 4K proxy (ProRes or DNx)
- Recording media: Red SSD (up to 1 h on a single magazine)
- File type: .r3d (supported in the RED SDK)
- Color profile: Light Iron Color (compatible with all popular gamuts and transfer curves)
- Weight: 10 lb, body only, without lens, battery, or accessories
- Lenses: directly motorizes Primo 70 lenses through wireless control

Radiant Images: VR Imaging Systems

Thirty percent of Radiant Images rentals are providing virtual reality (VR) camera systems and support. They offer more than ten different VR camera systems, ranging from simple arrays of GoPro cameras up to 17-camera arrays made from Codex Action cameras and RAW recorders. We learned one thing that was very informative: VR shots must be carefully designed to avoid people or objects crossing the "stitch boundary" between two cameras at a distance closer than 4 to 8 ft, depending on the system. If crossing occurs closer than that distance, a producer must be prepared to spend between $30,000 and $50,000 per minute of finished footage to fix the crossing errors. We learned that this is not a simple "plunk the camera down, roll it, then post to YouTube" system, as many naive producers would like to think it is.

Fujinon Lens Demo Day

Bill Bennett helped Fujinon optics organize a lens demonstration day at the ASC Clubhouse on 24 May 2016, where the entire line of Premiere, Cabrio, and 2/3 in. broadcast lenses was demonstrated for the ASC membership and other interested parties. We lit a scene in the Clubhouse bar, shooting with an ARRI Alexa Mini and monitoring live HDR images on two 2000 nit monitors: the Dolby Maui and Canon HDR monitors. Several other cameras by different manufacturers (Sony, Red, Panasonic, etc.), mounted with Fujinon lenses, were demonstrated in the Great Room. We set up a Chrosziel lens projector in the Boardroom to do a lens projection demonstration, moderated by Matt Duclos of Duclos Lenses. Fujinon also introduced the new 20-120 mm T3.5 Cabrio Premier PL lens, with removable zoom control.

Emerging Camera Technology

At the beginning of 2016, the Camera Subcommittee of the ASC received a request to evaluate a new prototype camera technology: the Lytro Light Field Cinema camera. As a matter of personal interest, Dave Stump has been following the development of Light Field capture and plenoptic computational imaging via the research papers on the topic that first came out of Stanford University; hence, after discussions with Lytro, Inc., The Virtual Reality Company (VRC), and Curtis Clark of the ASC, he agreed to work as DP for director Robert Stromberg to shoot a short test film with the Lytro Cinema prototype camera.

The short film Life was developed and designed by Stromberg, and shots were defined together with Lytro's executive producer Jeff Barnes to produce a test that could demonstrate the capabilities of the technology. Life is a visual poem that tells the story of a boy and girl as they traverse the stages of life from youth to old age, using colored walking paths as a metaphor for the various decisions that they make along the way. We knew that this project had to be ready for a National Association of Broadcasters (NAB) premiere, which gave us around 2 1/2 months from beginning to end. In light of where the prototype camera was in development, the undefined post-production pipeline, and the short window before NAB, this was a brave experiment.

Since one of the test's goals was to determine how the Lytro Cinema camera and an ARRI Alexa would work in the same workflow and how the Lytro Cinema camera could intercut with conventional footage, it was decided to capture some of the shots in Life on an Alexa.

Because we were going to be working with a prototype camera, Stump visited Lytro's headquarters in Palo Alto, California, to get a feel for the camera and gauge the lighting and cabling needs. The prototype camera had a variable length, extending from about 6 to 11 ft depending on the framing and refocusing range. The camera also had substantial weight; hence, from a planning perspective, we needed to adequately prepare for the mechanics of moving, panning, tilting, and dollying on set, but with proper preparation, the weight was not a major issue. We also had to take into consideration the parameters of shooting at very high frame rates, with a fixed T-stop lens at the sensor's native ISO, which was set at ISO 320. Given these circumstances, I went to Mole-Richardson to speak to Larry Mole Parker to spec out lamps that would give us the light levels to create the beams and shafts of light in smoke that were designed into the storyboards.

The shoot was designed to function in a traditional workflow. In addition to capturing the Light Field data at 755 raw megapixels with 16 stops of dynamic range, the camera also captures QuickTime ProRes files that allowed us to review each take immediately on set. The metadata in the Light Field is tied directly to the information in the preview capture; hence, the crew were able to make focus decisions on set in terms of focus range and could manipulate the camera just as one would on a traditional shoot. Focus pullers, cinematographers, and creatives can make traditional on-set decisions that get baked into the metadata of the files and then see those versions in the real-time preview, just like any other system. As we shot, proxy files were being uploaded to our editor at VRC, and he began cutting while we were still on set. Most of the shots in the short were captured between 72 and 120 frames/sec, enabling post-production adjustment of synthetic output frame rate and shutter angle. The camera is capable of up to 300 frames/sec, and the file size coming off of the camera is approximately 650 Mbytes/frame; camera data were moved through a 100 m fiber optic cable to a drive array. The eventual workflow design for the Lytro Cinema camera will be able to upload image files for storage and manipulation in the cloud.
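A quick back-of-envelope check of those figures (decimal units assumed) shows why a fiber tether to a drive array was necessary:

```python
# Data rates implied by the figure above (approximately 650 Mbytes/frame);
# shooting rates are those mentioned in the text.
MB_PER_FRAME = 650

for fps in (72, 120, 300):   # typical shooting range and the camera's maximum
    mb_per_sec = MB_PER_FRAME * fps
    print(f"{fps:3d} frames/sec -> {mb_per_sec / 1000:6.1f} Gbytes/sec, "
          f"{mb_per_sec * 60 / 1e6:5.2f} Tbytes/min")
```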

There were several remarkable shots in the test short, among them a wedding scene in which the couple is standing at the altar in front of a preacher. The actors were photographed on a little walkway with an arch of flowers behind them, with reflective Mylar confetti flying through the air in front of them. This shot was accomplished on a bare stage, with grips walking behind the actors while the scene was shot, rather than shooting them against a blue or green screen. From the Light Field data, depth extraction was used to create mattes for the couple and to composite them into the final matte-painting environment. Another notable shot featured a child actor playing baseball, rendered with an impossibly shallow stop of T.3 with an accordingly shallow depth of field, demonstrating the Lytro Cinema camera's ability to let the cinematographer select what to keep in focus in a scene and then selectively change the aperture of that focal plane to any desired f-stop in post.

Joint Technology Subcommittee on Virtual Production
Chair: David Morin
Co-Chair: John Scheele

Since the last SMPTE report, the Joint Technology Subcommittee on Virtual Production of the ASC Technology Committee continued its series of case studies on the broadening use of real-time computer graphics on set.

Case Study: The Walk

The Virtual Production Committee held a case study of The Walk on 15 December 2015 on the Sony lot. Introduced by Tom Rothman, the chairman of the Sony Pictures Entertainment Motion Picture Group, the case study featured a live interview with director Robert Zemeckis and a making-of presentation by Kevin Baillie from Atomic Fiction. The evening also included a showing of The Walk: VR Experience, which was experienced individually by most in the audience (Fig. 2). This was the tenth meeting of the committee, and we paid an appropriate tribute to Robert Zemeckis, who contributed more than most to the development of virtual production in its pioneering days by directing The Polar Express (2004) and the movies that he later made at his ImageMovers Digital studio, such as A Christmas Carol (2009). The case study highlighted how VR in production is gaining traction in the motion picture business and can be used successfully on a small production budget.

FIGURE 2. Intensive previsualization and on-set visualization were necessary to produce the vertiginous story on time and on budget.

The Virtual Production Track at FMX 2016

For the fifth year in a row, the Virtual Production Committee curated the renamed "Virtual Reality for Production" track at the Conference on Animations, Effects, Games and Transmedia (FMX) in Stuttgart, Germany. The track, held on 27 April 2016, was curated by Virtual Production Committee chair David Morin and showcased six case studies. The presentations covered the use of previsualization and virtual production on the films Pan, Gods of Egypt, The Martian, The Walk, and the newly released The Jungle Book, a milestone in virtual production (Fig. 3).

FIGURE 3. A series of six case studies on VR for production was presented in Stuttgart, Germany, on 27 April 2016.

VR for Production

As we mentioned in our last report to SMPTE, virtual production is VR and augmented reality (AR) for production. Massive investments in VR and AR are producing new visualization and capture devices that are becoming commercially available at a high rate and are being used in production (Figs. 4 and 5).

FIGURE 4. Visualization: The HTC Vive VR headset became commercially available in May 2016.

Future Activities

The Virtual Production Committee will continue to pursue its goal of educating and helping to define the new workflow and is currently planning its meeting #11, along with other ancillary events, the Virtual Production track at FMX 2017, and the possible addition of a new workgroup specifically on AR and VR. The subcommittee is planning to transform its proceedings at meeting #12, as per the original plan. More details on that transformation will become available during the upcoming year.

Participation is encouraged. Those interested may contact the following:
David Morin, Chair, [email protected]
John Scheele, Co-Chair, [email protected]

Digital Preservation Subcommittee
Chair: Grover Crisp
Vice-Chair: Michael Friend

As reported last year, preservation of moving images continues to be in a state of flux. As new workflows are developing in response to the breakdown of the standard model, challenges increase for archives and libraries, which are generally underfunded and find themselves trying to plan for a future that is technically uncertain. With the historical distribution methods in a state of disruption, it is not clear how to construct an economic model for the data archive.

We have entered a period of transition with an undefined time scale, an undefined objective for an imaging standard, and continually changing requirements. We have experienced the emergence of a platform-independent archive that serves a series of temporally available distribution scenarios. This new environment, which has been long expected, presents a challenge for the stabilization, retention, and representation of creative intent, not just across time but also on multiple incompatible platforms.

The image-making standard of the classical film era has been completely displaced by technical developments in digital imaging. The superiority of the new cameras, in terms of spatial resolution, temporal resolution, and image and color stability, is not simply a set of relative advantages for digital. It now emerges as a new form of image making that is no longer only an emulation of film characteristics but regularly exceeds film in most parameters and constitutes a new and undefined medium. HDR, high frame rate (HFR), and VR are quickly deploying in mainstream image making, and there is a consequent adjustment in production workflows to accommodate platforms and data products, which means new classes of data need to be captured in the archive. The implementation of ACES is just starting to transform practices in the moving image industry, with some studios beginning to implement ACES in all areas of image capture. The arrival of the BT.2020 color space and the display devices to support it is also having an effect on the archive, in terms of scale, file type, and metadata retention.

Because the capture, production, display, and archive environments continue to transform rapidly, it is difficult to predict how the archive will be deployed in the future. The model of a 35 mm original negative as the stable source of future images is a thing of the past, and black-and-white color separations on film—once the ne plus ultra of color retention—continue to attenuate as a viable methodology. The evolution from a static model of fixed elements toward the idea of high-quality, nonstandardized original data resources and a series of metadata components (which allow the transformation of these resources into specific and well-defined data products for specific uses) is now a reality. To capture, manage, and deploy a full range of production data, new pathways for acquisition will be necessary, and these may include accommodations such as capture of camera files and metadata directly from the set. Along with the collection of more complex sets of data, the issues of metadata to characterize essence and to transform it are becoming central to new archival scenarios. Standardization of metadata is being introduced through SMPTE, but there is a considerable amount of conceptual work to be done with regard to what is sufficient and necessary, as well as economically feasible, for the archive. There is interest in exploring the viability of the Archive eXchange Format (AXF), and several studios are testing whether this format can play a role in the management of archival resources.

Cloud storage is beginning to assume greater importance and urgency for archives, not just from the pure storage perspective but also in association with the trend toward the virtualization of post-production services (including those required for preservation and for the transition of data from preservation to distribution state) in a cloud environment. The proximity of storage-state data to online facilities that will perform required modifications is emerging as a dimension of cloud storage, despite the unresolved issues of cost prediction for services and withdrawal, security, access, and control.

FIGURE 2. Intensive previsualization and on-set visualization were necessary to produce the vertiginous story on time and on budget.

SSIMWave HDR Evaluation Working Group
Coordinator: W. Thomas Wall

The purpose of this ASC Technology Committee Working Group is to assess, evaluate, and improve the preservation of the original creative intent of HDR, WCG, ultra-high-resolution digital video imagery during its distribution and delivery, from grading suite to consumers, as judged by cinematographers and colorists.

With the introduction of HDR, WCG, 4K, and UHD resolution displays, and the upcoming delivery of such content to consumers freed from the constraints of traditional broadcast TV, the ASC has an opportunity to provide input and guidance on how such content delivery can best preserve the original creative intent of cinematographers. This Working Group was established at the end of 2015 to pursue these goals.

FIGURE 3. A series of six case studies on VR for production was presented in Stuttgart, Germany, on 27 April 2016.

FIGURE 4. Visualization: The HTC Vive VR headset became commercially available in May 2016.

The Working Group will, with industry partners, perform the following:

1) Establish a controlled testbed environment in which cinematographers and colorists can evaluate the quality of HDR, WCG, ultra-high-resolution digital motion picture imagery as delivered to high-end consumer displays, compared to that same imagery as seen and approved in a color grading suite.

2) Establish and execute a process to evaluate the efficacy of the Structural SIMilarity (SSIM) perceptual quality metric—as implemented in the latest SSIMWave Inc. quality metric software—to quantify the delivered quality of HDR, WCG imagery, as judged by content creators.

3) Evaluate the usefulness of SSIM perceptual quality monitoring software in the creative process.

4) Suggest improvements, where necessary, to such quality metrics and to the delivered image quality, to better preserve the original creative intent of cinematic imagery as viewed and approved in a color grading suite.

The SSIMWave software produces a grayscale map of the salient, noticeable differences between an original and an altered image, along with a numeric "Quality of Experience" metric score that indicates the level of perceptual difference between the two images. This software is currently used to evaluate quality only in the final stages of cable and internet BT.709 delivery chains, and it has been evaluated and calibrated based on tests using "typical viewers" as subjects.
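For readers unfamiliar with this style of output, the following minimal Python sketch produces the same two kinds of artifacts described above (a per-pixel difference map and a single summary score) using the open-source scikit-image implementation of SSIM. It is illustrative only: this is not SSIMWave's proprietary software, and the function name and map inversion are our own.

```python
# A minimal, illustrative analogue of the outputs described above, using
# the open-source scikit-image SSIM implementation. This is NOT the
# proprietary SSIMWave software; names here are our own.
import numpy as np
from skimage.metrics import structural_similarity

def ssim_score_and_map(reference: np.ndarray, delivered: np.ndarray):
    """Compare two RGB images normalized to [0, 1].

    Returns a single mean SSIM score (1.0 = identical) and a grayscale
    map in which brighter pixels mark larger perceptual differences.
    """
    score, sim_map = structural_similarity(
        reference, delivered,
        data_range=1.0,        # images assumed normalized to [0, 1]
        channel_axis=-1,       # last axis holds the RGB channels
        full=True,             # also return the per-pixel SSIM map
    )
    # Invert similarity into a "difference map": 0 = identical.
    diff_map = 1.0 - sim_map.mean(axis=-1)
    return score, diff_map
```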

The Working Group has been collaborating with Dr. Zhou Wang, the winner of the 2015 Primetime Engineering Emmy Award for his development of SSIM, professor in the Department of Electrical and Computer Engineering at the University of Waterloo, Ontario, Canada, and chief scientist and cofounder of SSIMWave, Inc.; Dr. Abdul Rehman, chief executive officer and cofounder of SSIMWave, Inc. and a postdoctoral fellow at the University of Waterloo; and Dr. Kai Zeng, chief technology officer of SSIMWave, Inc., to design and set up a testbed and an evaluation by experienced content creators.

Two documents have been created and agreed to by SSIMWave, Inc. and the ASC, describing their respective roles in this evaluation process and the methodology that will be used. The study will be limited to HDR (ST 2084 PQ encoded), wide color gamut (DCI P3 or BT.2020), UHD or 4K digital imagery at up to 60 frames/sec, and will evaluate how those images are perceptually altered from the beginning to the end of the internet OTT delivery chain by comparing the imagery as seen and approved in the color grading suite to that same imagery as delivered to a consumer UHD, HDR-TV display over high-speed internet connections. (We will evaluate neither different consumer display devices nor different manufacturers or models of TV sets. Nor will the initial study evaluate or compare HLG or other encodings versus ST 2084.) The evaluation will be based on how well the original creative intent, embodied in cinematic digital imagery as viewed in a color grading suite, is preserved—not just how bright or how sharp any given image appears—as judged by professional cinematographers and colorists.
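Because the study is limited to ST 2084 PQ-encoded material, a short sketch of the PQ transfer functions may help orient readers. The constants below come from the SMPTE ST 2084 specification; the function names and the normalized-array interface are our own convenience.

```python
# SMPTE ST 2084 (PQ) transfer functions. Constants are from the ST 2084
# specification; the function names are our own. PQ maps absolute display
# luminance in cd/m^2 (nits, 0-10,000) to a nonlinear signal in [0, 1].
import numpy as np

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(luminance_nits: np.ndarray) -> np.ndarray:
    """Inverse EOTF: absolute luminance (0-10,000 nits) -> PQ signal."""
    y = np.clip(np.asarray(luminance_nits) / 10000.0, 0.0, 1.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

def pq_decode(signal: np.ndarray) -> np.ndarray:
    """EOTF: PQ signal in [0, 1] -> absolute luminance in nits."""
    e = np.asarray(signal, dtype=np.float64) ** (1.0 / M2)
    y = np.maximum(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1.0 / M1)
```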

The evaluations have been designed to be consistent with the subjective user study protocol for perceptual quality assessment and will be performed as shown in Fig. 6.

FIGURE 5. Capture: Many 360° cameras became commercially available. Professional camera rigs are exploring new formats. Image courtesy of thefulldomeblog.com.

Based on input from internet content distributors, clips of well-calibrated, graded original ACES content will be transcoded, encoded, compressed, decompressed, and unencoded to reproduce how that content would be delivered to a consumer. Different levels of encoding and compression will reflect those used in typical, real-world distribution networks by distributors of high-quality HDR, WCG material. The SSIMWave software will then be used to generate difference maps and quality scores based on the perceivable differences between the two data streams. The original "as graded" and altered "as delivered" clips, and the SSIMWave metric results, will be stored in a database. Clips will be chosen to illustrate a variety of content found in narrative motion pictures and TV.
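To make the shape of this objective stage concrete, here is a hypothetical sketch of the evaluation loop. The bitrate ladder, the encode_like_distributor() helper standing in for a distributor's transcode/compress/decompress chain, the ssim_score() callable, and the database schema are all assumptions for illustration, not the Working Group's actual tooling.

```python
# Hypothetical sketch of the objective-evaluation loop described above.
# encode_like_distributor() and ssim_score() stand in for a real OTT
# processing chain and the SSIMWave metric; neither is a real API here.
import sqlite3

BITRATE_LADDER_KBPS = [8000, 12000, 16000]  # assumed example levels

def run_objective_stage(clips, encode_like_distributor, ssim_score):
    """Encode each graded clip at several levels, score it against the
    original, and store the results, as the text describes."""
    db = sqlite3.connect("hdr_eval.db")
    db.execute("""CREATE TABLE IF NOT EXISTS results
                  (clip TEXT, bitrate_kbps INTEGER, score REAL)""")
    for name, graded in clips.items():       # {clip name: "as graded" data}
        for kbps in BITRATE_LADDER_KBPS:
            delivered = encode_like_distributor(graded, kbps)  # "as delivered"
            score = ssim_score(graded, delivered)
            db.execute("INSERT INTO results VALUES (?, ?, ?)",
                       (name, kbps, score))
    db.commit()
    return db
```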

The testbeds will be set up in environments that emulate those found in color grading suites. Two Sony BVM-X300 HDR 4K reference monitors will be set up side by side to allow participants to view and compare the original and altered versions of each clip simultaneously. During each clip, the playback may be stopped or repeated as desired for close inspection. The participant will rate the degree of degradation between the "as graded" and "as delivered" versions and indicate the types of differences noted. The clip will then be played back again, this time with the results of the SSIMWave software displayed on a third (consumer) UHD monitor; the participant will evaluate and record how well the quality metric software's automated evaluation matches their perceived image alterations and the overall degree of quality degradation (Fig. 7).
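One way to picture the data captured per trial in this subjective stage is the record below. The field names and rating scales are hypothetical, chosen only to mirror the protocol just described, not the study's actual data schema.

```python
# Hypothetical per-trial record for the subjective stage described above.
# Field names and rating scales are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SubjectiveTrial:
    participant: str                 # anonymized cinematographer/colorist ID
    clip: str                        # clip identifier
    bitrate_kbps: int                # which "as delivered" version was shown
    degradation_rating: int          # e.g., 1 (severe) to 5 (imperceptible)
    difference_types: List[str] = field(default_factory=list)  # e.g., ["banding"]
    metric_agreement: int = 0        # how well the SSIMWave map matched, 1 to 5
```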

A report will be prepared by SSIMWave/University of Waterloo researchers, in collaboration with the ASC, describing the types of degradation and the degree to which they are detected by professional, experienced observers as altering the original creative intent. It will also report how closely the SSIMWave quality metric software reflects those professional judgments. Recommendations and suggestions will be solicited from participants, content creators, and content distributors, as well as the quality metric developers and researchers, as to how to improve and maintain creative intent throughout the delivery chain.

The Working Group has been set up with over 40 interested parties from the ASC and industry partners participating. SSIMWave, Inc. has agreed to devote considerable resources to this study and to ensure its scientific validity. We have begun involving HDR, UHD distributors, such as Amazon, with discussions ongoing for the participation of other content distributors. Industry partners, such as EFilm/Deluxe, FotoKem, Sony, Panasonic, and others, have expressed interest in participating in the evaluations or providing testbed facilities and/or equipment. The ICAS ACES files are being prepared for use as source material in the study, and other test materials are being evaluated.

The Working Group met for an initial review of the study process and began filling out the details of its implementation. It was made clear that participants want to be sure that the study will be of use to both creatives and HDR content distributors, educating all concerned about what happens once content leaves the grading suite and encouraging the active participation of creatives in discussions and decisions on how their images will be delivered. This effort is ongoing.

FIGURE 6. Test file preparation and objective evaluation stages.

Inquiries regarding the ASC Technology Committee should be sent to Delphine Figueras, [email protected].

About the Authors

Curtis Clark, ASC, studied theater at the Art Institute of Chicago's Goodman School of Drama and cinematography at the London Film School. After graduation, he began his career by shooting and directing numerous documentary films in Britain before transitioning to shooting feature films and TV commercials in Britain and the U.S. Following the success of his short narrative film "The Arrival," Clark more recently completed his highly praised short narrative film "Eldorado." He has just completed a short narrative film for Netflix entitled "Meridian."

Clark is chairman of the ASC Technology Committee. Since its inception in 2003, the Committee under Clark's leadership has achieved a series of notable successes, including its collaborative work with Digital Cinema Initiatives, LLC (DCI) to produce standardized evaluation material (known as StEM) for assessing the performance of digital projectors and other elements of DCI standards-based digital cinema systems, as well as the 2009 Camera Assessment Series and the 2012 Image Control Assessment Series. Clark has been an active contributor to the AMPAS working groups responsible for developing ACES (the Academy Color Encoding System).

The ASC Technology Committee, at Clark's instigation, embarked on the development of a groundbreaking project to create cross-platform data exchange for primary RGB digital color correction, known as the ASC CDL. The ASC CDL was recognized by the Academy of Television Arts and Sciences with a prestigious 2012 Primetime Emmy Engineering Award. Clark also received an AMPAS Technical Achievement Award recognizing his work developing the ASC CDL, and he was the recipient of the prestigious ASC President's Award in recognition of his creative and technology achievements. Clark is a member of the ASC and the ASC Board of Governors, as well as a member of AMPAS.

FIGURE 7. Subjective evaluation stage.


David Reisner received a 2014 Academy Technical Achievement Award and was recognized in a 2012 Primetime Emmy Engineering Award as a codesigner of the ASC CDL, which is used in the workflow of 95% of motion pictures, 70% of scripted TV, and 99% of visual effects turnover. He was a lead designer of the ASC-DCI Standard Evaluation Material used to determine the quality required for the deployment of digital cinema and was vice chair of the SMPTE working groups responsible for the digital cinema imaging and security standards. Ninety-seven percent of cinema screens worldwide now use digital cinema. Reisner also had leading roles in activities including the design and production of the ASC and Producers Guild of America (PGA) Camera Assessment Series and elements of ACES. He made one of the first proposals for the Virtual Print Fee model used to fund the digital cinema rollout. Reisner's "firsts" include programmable portable computers, a handheld video jukebox, and other computer and consumer electronics, and he originated very long instruction word (VLIW) computer architecture. He has produced concerts internationally and trained killer whales. Reisner is well published in books and technical articles and has spoken widely, including on manned space exploration at the 2014 International Space Development Conference. He is a member of SMPTE, the founding secretary of the ASC Technology Committee, an ASC associate, and a member of the Visual Effects Society; he has chaired committees for the Academy's Scientific and Technical Awards.

Don Eklund is the senior vice president for new format promotion with Sony Corporation of America. He has helped launch multiple consumer entertainment formats since starting his career with Sony. He co-developed and staffed the operation that launched DVD with Sony Pictures and went on to oversee the development of software tools and hardware systems that supported compression, authoring, and quality control for Blu-ray. Eklund also participates in industry standards organizations and consortiums that focus on next-generation entertainment.

Bill Mandel is the vice president for technology at Universal Pictures. He has been involved with the technology through which the studio distributes content for more than 20 years. He has worked on technology and licensing related to many service and format launches, including DVD, electronic sell-through, interactivity, and HDR. His interests lie in formats, video, audio, digital rights management, and networks.

Michael Karagosian is the founder and president of MKPE Consulting LLC, a Los Angeles-based consultancy for business development in entertainment technology. He is a 30-year veteran of the cinema industry and has been active in the digital cinema space for the past 11 years. Karagosian led the development of license-free standards for closed-caption systems in digital cinema. He served for eight years as senior technology adviser to the U.S.-based National Association of Theatre Owners. He is a member of the Board of Directors of In-Three and was an advisor to the U.K. Film Council in the U.K. government-financed rollout of digital cinema. In the late 1970s and early 1980s, Karagosian led the development of cinema and studio products at Dolby Laboratories. In the 1990s, he was a cocreator of the Peavey CinemAcoustics product line and led the development of networked audio and control systems for Disney theme parks. His company site is http://mkpe.com.

Eric Rodli has been involved in the management and development of entertainment technology since the late 1980s, when he became the president of Iwerks Entertainment, a pioneer in large-format film, motion simulation theaters, and other immersive technologies. He subsequently has held senior roles in a variety of entertainment and media organizations, including being a partner in the entertainment consulting practice of PricewaterhouseCoopers and president of Kodak's Motion Picture Division. He currently provides strategic advisory services to companies in the entertainment technology industry. He is an associate member of the ASC. Rodli received a BA in economics from the University of California, San Diego, and an MBA from the University of Chicago.

Steve Schklair, photograph and biography not available at the time of publication.

Gary Demos has been a pioneer in the development of computer-generated images for use in motion pictures, digital image processing, and image compression. He was a founder of Digital Productions (1982-1986) and received the AMPAS Scientific and Engineering Award in 1984, along with John Whitney Jr., for their work on "For the Practical Simulation of Motion Picture Photography by Means of Computer-Generated Images." Demos also founded Whitney-Demos Productions (1986-1988), DemoGraFX (1988-2003), and Image Essence, LLC (2005 to present). Demos received the AMPAS 2005 Gordon E. Sawyer Oscar for lifetime technical achievement. He is actively involved in the ASC Technology Committee and has worked on the AMPAS ACES project. Demos has presented numerous papers at SMPTE, given an SMPTE webinar, is an SMPTE Fellow, and received the 2012 SMPTE Digital Processing Medal. He is the inventor of approximately 100 patents.

Gary Mandle has been working with new display technologies for the last 35 years, including CRT, field emission display, large-field-of-view LED, LCD, liquid crystal on silicon, and OLED. His work has included the introduction of LCD for professional monitoring, the implementation of Sony's digital cinema projection systems, and the development of Sony's OLED reference monitor technology. His current focus is on RGB and white OLED technologies. Mandle has been published in several journals for the SMPTE, the Institute of Electrical and Electronics Engineers (IEEE), and the Society for Information Display (SID) and has been a contributing author to the CRT, LCD, and OLED sections of several textbooks. His other areas of work include the design of camera image stabilization systems and charge-coupled device sensor development, in which he holds multiple patents. He is a member of a number of industry organizations, including SMPTE, IEEE, and SID, and he is an associate member of the ASC.

Greg Ciaccio is a post-production professional focused primarily on finding new location-based technology and workflow solutions for motion picture and TV clients. Previously, Ciaccio served in technical management roles for the respective Creative Services divisions of both Deluxe and Technicolor. Key developments include the first DP Lights deployments for Technicolor and full near-set dailies solutions for Deluxe Television. Ciaccio is a member of SMPTE, the ASC Technology Committee, the AMPAS Science and Technology Council, the Hollywood Professional Association, and the Digital Cinema Society. He holds a BA degree in radio-TV-film from California State University, Northridge, where he currently teaches post-production part-time.

Lou Levinson is a long-time associate member of the ASC and the chair of the Digital Intermediate Subcommittee. A member of the ASC Technology Committee since its inception, he has been a frontline colorist from the "on the fly" analog era to today's advanced ACES-and-beyond digital pipelines, having worked with notables from Woody Allen to Rob Zombie. He is currently working for Apple in the San Francisco Bay Area.

David Stump is a working DP, visual effects DP, visual effects supervisor, and stereographer who has earned an Emmy Award, an Academy Award for Scientific and Technical Achievement, and an International Cinematographers Guild Award. He is currently the chairman of the Camera Subcommittee of the ASC Technology Committee and a member of the AMPAS Science and Technology Council, where he chairs the Next Generation Cinema Technology Work Group and participates in the AMPAS ACES project. Under his guidance, the combined efforts of the PGA and the ASC produced both the ASC/PGA Camera Assessment Series and the ASC/PGA ICAS, which are side-by-side comparisons of virtually all of the high-end digital cinema cameras against film. He has lectured and taught classes in cinematography and visual effects and has spoken at many conferences and trade shows, including the National Association of Broadcasters and the International Broadcast Convention.

Bill Bennett has been a cinematographer for more than 35 years, primarily shooting TV commercials for hundreds of major clients: Ford, Lexus, Coca-Cola, Apple Computer, American Airlines, McDonald's, and Budweiser. Bennett had the great honor of being the first cinematographer with a career consisting primarily of shooting TV commercials to be invited to join the ASC. In 2016, the ASC presented Bennett with the President's Award at the 30th Annual ASC Awards Show. He is currently serving as a vice president at the ASC. Bennett often advises ARRI, Zeiss, and others on equipment design.


David Morin is the president of David Morin, LLC, a diversified consultancy specializing in VR for production. Morin is also the chairman of the Joint Technology Subcommittee on Virtual Production, a joint effort of six Hollywood-based organizations: the ASC, the Art Directors Guild (ADG), the Visual Effects Society (VES), the Previsualization Society, the PGA, and the International Cinematographers Guild. He is also a past chair of the Autodesk Film CTO Advisory Council, a product focus group of large studio facilities, and a past co-chair of the ASC-ADG-VES Joint Technology Subcommittee on Previsualization, a committee that helped define the role of previsualization in the film industry.

Michael Friend is the director of the digital archive in Sony Pictures Entertainment's Asset Management group and teaches at the University of California, Los Angeles, in the Moving Image Archive Studies Program.

W. Thomas Wall is a retired computer systems designer and a professional photographer. He is chief technology officer at LightView.
