Adding a .3DL file to Hiero’s Viewer

1. Copy the .3dl file into the following directory:

/Applications/Hiero1.6v1/Hiero1.6v1.app/Contents/Plugins/OCIO/nuke-default/luts

2. Edit the following file in your favorite text editor:

/Applications/Hiero1.6v1/Hiero1.6v1.app/Contents/Plugins/OCIO/nuke-default/config.ocio

3. Look for the section of the file that specifies the display LUTs. It looks like this:

displays:
  default:
    - !<View> {name: None, colorspace: raw}
    - !<View> {name: sRGB, colorspace: sRGB}
    - !<View> {name: rec709, colorspace: rec709}

4. Add another view definition for your custom LUT (the complete displays section is shown below, after these steps):

    - !<View> {name: LogCRec709, colorspace: LogCRec709}

5. Go to the very end of the file, and add a colorspace definition:

  - !<ColorSpace>
    name: LogCRec709
    family: ""
    equalitygroup: ""
    bitdepth: 32f
    description: |
      Conversion from Alexa LogC to Rec. 709
    isdata: false
    allocation: uniform
    allocationvars: [-0.125, 1.125]
    from_reference: !<FileTransform> {src: LogC_to_Rec709.3dl, interpolation: linear}

6. Restart Hiero
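
For reference, after steps 3 and 4 the displays section of config.ocio should end up looking something like this (LogCRec709 is just the example name used above):

displays:
  default:
    - !<View> {name: None, colorspace: raw}
    - !<View> {name: sRGB, colorspace: sRGB}
    - !<View> {name: rec709, colorspace: rec709}
    - !<View> {name: LogCRec709, colorspace: LogCRec709}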

And that should do it.

On Quicktime and Gamma shifts

The desire of users in the post-production community has always been simple: when you encode a Quicktime movie, it should look the same on every computer you view it on as it did on the machine that encoded it. Has this ever worked? Not exactly. Here’s why.

PCs and Linux systems have long used a display standard known as sRGB, which corresponds to a gamma curve of roughly 2.2. The most significant exception has been the Mac: until the release of MacOS 10.6 (Snow Leopard), Macintosh systems used a display gamma of roughly 1.8.

Many common still image formats, such as JPEG and PNG, store gamma-encoded values: the gamma curve of the display is baked into the image data by the machine that encoded it. If the encoding was done on a PC, that curve is roughly 2.2; if it was done on an older Mac, it is roughly 1.8. Certain types of movie files, such as Quicktime and MPEG, also store their values this way. In theory, then, a JPEG image encoded on an older Mac would look dark on a PC, and the same image encoded on a PC would appear bright and washed out on the Mac.
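
Here is a quick back-of-the-envelope sketch of that mismatch, assuming the idealized 1.8 and 2.2 gamma curves above (plain Python, purely illustrative):

def encode(linear, display_gamma):
    # Bake the display's gamma into the stored pixel value.
    return linear ** (1.0 / display_gamma)

def display(encoded, display_gamma):
    # What the viewer's monitor effectively does to the stored value.
    return encoded ** display_gamma

mid_gray = 0.5

# Encoded on an older Mac (~1.8), viewed on a PC (~2.2): comes out darker.
print(display(encode(mid_gray, 1.8), 2.2))   # ~0.43

# Encoded on a PC (~2.2), viewed on an older Mac (~1.8): comes out brighter.
print(display(encode(mid_gray, 2.2), 1.8))   # ~0.57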

For still images, this problem was addressed years ago by the International Color Consortium with the development of ICC profiles. JPEG files, for example, are encoded using the international ISO/IEC 10918-1 standard. So when a piece of software like Photoshop displays a JPEG image, it is aware of the platform and operating system it is running on, and it compensates for this discrepancy as part of its image decoding process.

The same applies to many video standards, like MPEG. MPEG video is first converted into the Y’CbCr colorspace commonly used in television, then compressed and encoded. MPEG, however, is an open standard. The same is not true of Quicktime, which is a proprietary format developed and controlled by Apple.
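
For illustration, the basic Rec. 601 flavor of that RGB-to-Y’CbCr conversion looks something like this (a simplified sketch; real encoders also scale and quantize the result for broadcast ranges):

def rgb_to_ycbcr(r, g, b):
    # Rec. 601 luma weights; r, g, b are gamma-encoded values in [0, 1].
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) / 1.772   # blue-difference chroma
    cr = (r - y) / 1.402   # red-difference chroma
    return y, cb, cr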

In theory, Quicktimes are decoded with a gamma of 1.8, and encoded with a gamma of roughly 0.55 (the inverse of 1.8). This matches the legacy Macintosh systems that the format was originally developed for. The major issue here is that the Quicktime format is designed to be a simple container; as such, it does not mandate that any particular colorspace, profile, or gamma curve be used. A Quicktime file is a collection of atoms. Some of these atoms are tracks that contain raw video or audio data. Others are metadata that describe the content contained within the file. The raw data is assumed to be encoded with gamma 0.55, as mentioned above. However, certain atoms can contain gamma values, color transformation matrices, and, more recently, color profiles.
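
To make the “collection of atoms” idea concrete, here is a minimal sketch (plain Python; the file name is just an example) that walks the top-level atoms of a Quicktime file and prints their types and sizes:

import struct

def walk_top_level_atoms(path):
    # Each atom starts with a 32-bit big-endian size (which includes this
    # 8-byte header) followed by a four-character type code such as 'moov'.
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, atom_type = struct.unpack(">I4s", header)
            name = atom_type.decode("latin-1")
            if size == 1:
                # A size of 1 means a 64-bit extended size follows the type.
                size = struct.unpack(">Q", f.read(8))[0]
                print(name, size)
                f.seek(size - 16, 1)
            elif size == 0:
                # A size of 0 means the atom extends to the end of the file.
                print(name, "(runs to end of file)")
                break
            else:
                print(name, size)
                f.seek(size - 8, 1)

walk_top_level_atoms("example.mov")   # typically prints ftyp, mdat, moov, ...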

Since Quicktime is a closed standard, Apple can add atoms to the container and add support for those atoms to the software that it maintains. However, this becomes an ugly, nasty mess when you start thinking about third-party applications that may use older versions of the Quicktime libraries to encode, or to decode and display, these files. This is further compounded by the display gamma discrepancy between older Macs and PCs. Apple has devised various schemes over the years to compensate for this and to try to ensure a uniform appearance of color across multiple devices, from print to acquisition to display. For Quicktimes, Apple has created a series of atoms that define some sort of color transform, which I will explain in detail below.

The ‘gama’ atom

The first of these display metadata atoms to surface was the ‘gama’ atom. It was designed to describe the gamma curve that was used to encode the video data. Suppose we take a Quicktime file that was encoded on a PC. Provided the PC used Apple’s Quicktime library under the hood, the resulting file would have a ‘gama’ tag set indicating the PC encoding gamma. Open this file in Apple’s Quicktime player on both a Mac and a PC, and it would look the same on both monitors.

The problem for those of us in post production was always the Avid. Avid used the Quicktime API to simply read the raw video data in from the file. Since older versions of Avid were not aware of the ‘gama’ atom, they did not know to apply a slightly different gamma curve to the imagery to get it to display correctly. This resulted in film editors calling VFX vendors all over town with the exact same complaint: “How come your Quicktimes look dark on my monitor?” Frantic Films released a small utility called “QuicktimeGammaStripper”, which simply removed the atom from the file so the movie would look the same in the Quicktime player and in the Avid.

The ‘nclc’ and ‘colr’ atoms

Around the release of Snow Leopard, we started to see an ‘nclc’ atom instead of a ‘gama’ atom. For an incredibly detailed, make-my-head-hurt explanation, check out Apple’s Developer website. The idea behind this is incredibly well-intentioned: it is an attempt to compensate for how different display devices alter outgoing image data on its way from the filesystem to our eyes, and for how different digital camera sensors (and film scanners, for that matter) alter incoming data on its way from our eyes to the filesystem.

This becomes problematic when encoding Quicktime files out of Nuke. Nuke takes linear image data, puts it through a look-up table and a gamma curve, and displays it on the screen. When we apply the same LUT and gamma curve and write out a Quicktime, it should look the same in the Quicktime player as it does in Nuke, right? Wrong. Beginning with Nuke 6.0, The Foundry changed how Quicktimes are encoded internally. The result is that Quicktimes written out of newer versions of Nuke are tagged with the ‘nclc’ atom, and the Quicktime player applies a transformation matrix that results not only in a gamma shift, but frequently a hue and saturation shift as well. The work-around for this was to find an old Mac or Windows XP machine and use Nuke 5.2.
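
For what it is worth, the payload of that atom is tiny: as I understand Apple’s documentation, a ‘colr’ atom carries a four-character colour parameter type (‘nclc’ for video), followed by three 16-bit indices for the primaries, the transfer function, and the matrix. A rough sketch of decoding such a payload, assuming you have already located the atom’s data bytes:

import struct

def parse_colr_payload(payload):
    # 'colr' payload: 4-char colour parameter type, then (for 'nclc')
    # three big-endian 16-bit indices: primaries, transfer function, matrix.
    parameter_type = payload[:4].decode("latin-1")
    if parameter_type != "nclc":
        return parameter_type, None
    primaries, transfer, matrix = struct.unpack(">3H", payload[4:10])
    return parameter_type, (primaries, transfer, matrix)

# (1, 1, 1) is the usual tagging for Rec. 709 primaries, transfer, and matrix.
print(parse_colr_payload(b"nclc" + struct.pack(">3H", 1, 1, 1)))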

Sadly, Nuke 5.2 is no longer a viable solution, given the age of the application and the fact that it no longer runs on newer versions of Mac OS. Right now, the conventional wisdom seems to be to perform all of your color transformations in Nuke, but write out a sequence of PNG images with “raw data” checked. From there, the image sequence must be transcoded into a Quicktime by another piece of software. Robert Nederhorst has an excellent tutorial on how to do this with FFmpeg, an open-source (and free) tool. It is located here. Tweak Software also offers an incredibly full-featured and useful conversion tool called rvio. It is available for purchase for $199, and can be found here.
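
As a purely illustrative example (this is not Robert’s recipe, and the exact flags will depend on your FFmpeg build and your delivery spec), transcoding a PNG sequence into a Quicktime might look something like this, wrapped in Python:

import subprocess

# Hypothetical file names and settings; adjust the frame rate, input pattern,
# codec, and pixel format to match your own delivery requirements.
cmd = [
    "ffmpeg",
    "-r", "24",                 # frame rate of the image sequence
    "-i", "comp.%04d.png",      # input pattern: comp.0001.png, comp.0002.png, ...
    "-c:v", "libx264",          # H.264 encoder
    "-pix_fmt", "yuv420p",      # widely compatible pixel format
    "-crf", "18",               # near-visually-lossless quality
    "comp.mov",
]
subprocess.check_call(cmd)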

Yet another alternative would be to re-write the gamma stripper utility for newer versions of MacOS, and to also support the ‘nclc’ atom. I have done a bit of research on this, and will take a stab at it when I get some free time. There are three Quicktime APIs available from Apple, which I will detail below.

Quicktime

This is Apple’s legacy, C-based 32-bit API. It is built on top of Carbon, and has been around since the nineties; it pre-dates MacOS X. Apple reluctantly continues to support this code as it does not want developers to be forced to re-write massive amounts of code. However, Apple has indicated in no uncertain terms that the Quicktime API is nearing end-of-life, and as such no new development should use it.

QTKit

QTKit is Apple’s Objective-C based Cocoa API for Quicktime. It is supported in all newer versions of MacOS, and is 64-bit capable. Sadly, it is not very full-featured, and lacks much of the functionality that developers look for, including the ability to remove and adjust atoms.

AVFoundation

AVFoundation was originally introduced as part of iOS 3, as a new way for iOS devices to access and edit audio-visual content. It provides the type of low-level access that was missing from QTKit. With MacOS 10.7 (Lion), Apple ported this code from iOS to MacOS. It is not supported on platforms other than MacOS 10.7 and 10.8. Apple has stated that if you do not need to support legacy operating systems in your code, all Quicktime development should be done using AVFoundation going forward. It is built on top of Cocoa, and is fully 64-bit compliant.

It looks like I will have to write the gamma stripper using AVFoundation. Now, does anyone have any example code I can use? 🙂