Niagara Takes Flight

Project: Niagara Takes Flight - An On-site Color Grading Adventure

Niagara Takes Flight opens today! It's a beautifully immersive 4D adventure over Niagara Parks. I'm very grateful that my longtime collaborator Rick Rothschild tapped me again to make the images sing like the locations do in real life. Rick and I go all the way back to Soarin' Over California, when we first met. It's an honor to work with such a consummate professional and a true leader and innovator in the space.

I've spent a lot of time in the controlled environment of the theater at Warner Bros. Water Tower Color, but sometimes a project comes along that requires you to get out in the field. This was one of those times. "Niagara Takes Flight" is an immersive ride experience that uses four laser projectors and a Seventh Sense system to warp and project the content onto a massive, curved dome.

There's just no substitute for being in the venue. The interaction between those four projectors presents a unique set of grading challenges that you simply can't solve in a standard grading suite. My goal was to create a seamless, powerful, and truly immersive experience for the audience. To do that I had to take Water Tower Color to Niagara.

The Powerhouse Duo

My portable setup was built around a MacBook Pro M3 Ultra and a 12TB IOdyne Pro Data drive, configured in RAID 0 for pure performance. The disk's speed and small footprint made all the difference; honestly, there's nothing else on the market that would have let me travel so light and set up so fast.

My Peripherals

  • Control Surface: Tangent Elements. My first choice would have been a Slate, but the Elements' modularity made it perfect for packing.

  • I/O: A Blackmagic UltraStudio Mini 4K connected via Thunderbolt.

  • Storage: An IOdyne Pro Data 12TB array. Be sure to enable multipathing and connect the disk on two different buses, for example 1 & 3, not 1 & 2.

  • Misc: An Elgato Stream Deck for macros and a 4th gen iPad Pro used as a secondary display for scopes and stills via Sidecar.

Media and Timeline:

The footage was 60fps, 5760x4320 PIZ-compressed 16-bit half-float EXR files. I kept my Baselight project on the Mac's local database, but all the source media, caches, proxies, and renders lived on the Pro Data drive. The timeline was complex. We worked in a scene-referred grading space, which meant I wasn't doing any harm to the pixels, keeping the master as fluid as water until the final delivery targets for the theater were struck and frozen into ice. ACES archival elements and other client versions were also rendered, which ensures longevity for whatever systems may come along in the future.

Performance & Reliability:

Playback was a dream—60fps with no problem. I did have one render hiccup: the Mac went to sleep during a long render when it ran out of power, even though it was connected to the Pro Data. The solution was simple: disconnect the panels and breakout box, since they're not needed for rendering, and plug in the power adapter.

Shout Outs

Peter Postma and the Filmlight crew came through in a big way with early versions of Baselight M. It's such a pleasure to have that kind of power in my backpack. Another huge thank you to Mike Gitig and the entire IOdyne team for their support. I also want to thank NPC, Brogent, Rick Rothschild, and Mike Quigley for allowing me to contribute my small part to this spectacular attraction. I have a feeling it will be there for many years to come, so go check it out next time you're planning your family vacation! The falls are truly one of those bucket-list places you must visit. Nothing else like it on the planet.

LG Roadshow - A New Era of Home Entertainment: Ambient Light Compensation

Here is a promo video I was recently a part of, showcasing a technology I'm incredibly passionate about: ambient light compensation at the LG Roadshow. If you've ever wondered how we ensure the picture you see at home matches what we see in the color grading suite, this is a big part of the answer. In a way, a little bit of my eye is in every display that has this technology.

A huge thank you to Mike Zink, Annie Chang, Mike Smith and the UHDA for their tireless efforts in bringing this functionality to living rooms everywhere.

And a friendly reminder: please use Filmmaker Mode. It should be the default, in my opinion. If you care about seeing a film as the artists intended, make sure to turn it on!

Check out the video below!

ACES at Cine Gear Saturday June 7th

Join me on the ACES panel "Color Management That Can Up Your Game!" at Cine Gear Expo! Saturday, June 7th, 1:15 PM - 2:05 PM, Theater 1. Looking forward to a great discussion with Lynette Duensing, Patrick Renner, and Mark Weingartner, ASC. See you there!

Building a Retro CRT Effect: From Shadertoy to Baselight Matchbox Shader

Hey everyone! Lately, I've been asked to dive back into the world of retro aesthetics, and one thing that always stands out is the look of old CRT (Cathode Ray Tube) monitors. That characteristic scanline flicker, the subtle noise, and the slight color fringing – it's all part of a nostalgic visual language. I wanted to bring that look into Baselight (the last time I did this, it was for Mistika: https://www.johndaro.com/blog/2020/10/28/vhs-shader), so I built a custom Matchbox shader that accurately simulates these CRT artifacts. This post breaks down the process, from finding inspiration to creating a user-friendly tool.

The Inspiration: Shadertoy

My journey started on Shadertoy, a fantastic resource for exploring and learning about GLSL shaders. I found a great CRT effect shader (https://www.shadertoy.com/view/Ms3XWH) that captured the essence of the look I was after. It used clever techniques to generate scanlines, add noise, and even simulate chromatic aberration (that slight color separation you see at the edges of objects on old TVs).

However, Shadertoy shaders are self-contained and designed for a specific environment. To make this useful in Baselight, I needed to adapt it for the Matchbox framework.

From Shadertoy to GLSL Standard

The first step was to "translate" the Shadertoy-specific code into standard GLSL. This involved a few key changes:

  1. mainImage to main: Shadertoy uses the function signature mainImage(out vec4 fragColor, in vec2 fragCoord). Standard GLSL and Matchbox use void main(void). We also replace fragCoord with the built-in gl_FragCoord and output the color to gl_FragColor.

  2. Uniform Inputs: Shadertoy provides inputs like iResolution (screen resolution) and iChannel0 (the input texture) automatically. In Matchbox, we need to explicitly declare these as uniform variables: adsk_result_w, adsk_result_h, and src, respectively. We also added iTime as a uniform to control animation.

  3. Texture Sampling: Shadertoy's texture function becomes the standard texture2D in GLSL.

Here's a snippet illustrating the change:

Shadertoy:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord.xy / iResolution.xy;
    // ...
    vec4 tex = texture(iChannel0, uv);
    fragColor = tex;
}

Standard GLSL (for Matchbox):

uniform float adsk_result_w;
uniform float adsk_result_h;
uniform sampler2D src;

void main (void)
{
    vec2 uv = gl_FragCoord.xy / vec2(adsk_result_w, adsk_result_h);
    // ...
    vec4 tex = texture2D(src, uv);
    gl_FragColor = tex;
}
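
To make the translation more concrete, here's a simplified sketch of the kind of code that lives where the "// ..." is. This isn't the original Shadertoy author's code or my final shader, just an illustration of the three basic tricks (scanline modulation, per-line jitter, and chromatic aberration) written in Matchbox-style GLSL:

uniform float adsk_result_w;
uniform float adsk_result_h;
uniform sampler2D src;
uniform float iTime;

// cheap pseudo-random hash, used for the per-line jitter
float hash(float n) { return fract(sin(n) * 43758.5453); }

void main (void)
{
    vec2 res = vec2(adsk_result_w, adsk_result_h);
    vec2 uv  = gl_FragCoord.xy / res;

    // horizontal jitter: nudge each scanline by a small pseudo-random amount
    float line   = floor(uv.y * res.y);
    float jitter = (hash(line + floor(iTime * 60.0)) - 0.5) * 0.002;

    // chromatic aberration: sample red and blue slightly apart from green
    float ca = 0.0015;
    float r = texture2D(src, uv + vec2(jitter + ca, 0.0)).r;
    float g = texture2D(src, uv + vec2(jitter, 0.0)).g;
    float b = texture2D(src, uv + vec2(jitter - ca, 0.0)).b;

    // scanlines: darken alternating rows with a sine across the vertical axis
    float scan = 0.9 + 0.1 * sin(uv.y * res.y * 3.14159);

    gl_FragColor = vec4(vec3(r, g, b) * scan, 1.0);
}

The real shader has a bit more going on, and the magic numbers above get exposed as controls, which is the next step.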

Making it Controllable: Matchbox XML

The real power of Matchbox comes from its ability to expose shader parameters as user-adjustable controls. This is done through an XML file that describes the interface. I wanted to give users control over the key aspects of the CRT effect:

  • Scanline Width: How thick the scanlines appear.

  • Noise Quality: The granularity of the vertical noise (lower values create more distinct lines).

  • Noise Intensity: The amount of horizontal jitter.

  • Scanline Offset: The intensity of the vertical scanline displacement.

  • Chromatic Aberration: The strength of the color fringing.

  • Time: The speed of the animation.

To achieve this, I did the following:

  1. GLSL Changes: In the GLSL code, I replaced the const float variables that controlled these parameters with uniform float variables. This is crucial – it tells Matchbox that these values can be changed externally.

// Before (hardcoded):
const float range = 0.05;

// After (Matchbox controllable):
uniform float scanlineRange;

  2. XML Creation: I created an XML file (with the same name as the GLSL file) that defines the controls. Each control is specified using a <Uniform> tag. The most important attribute is Name, which must match the corresponding uniform variable name in the GLSL code.

<Uniform Max="0.1" Min="0.0" Default="0.05" Inc="0.001" ... Name="scanlineRange">
</Uniform>

The XML also includes attributes like DisplayName (the label in the Baselight UI), Min, Max, Default, Tooltip, and layout information (Row, Col, Page). These define how the control appears and behaves in Baselight.
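
To show how the two sides line up, here's roughly what the declarations look like on the GLSL side for the controls listed above. The names below are just illustrative; whatever you choose has to match the Name attribute of the corresponding <Uniform> entry in the XML exactly:

// Illustrative names only -- each must match a <Uniform Name="..."> entry in the XML
uniform float scanlineWidth;        // thickness of the scanlines
uniform float noiseQuality;         // granularity of the vertical noise
uniform float noiseIntensity;       // amount of horizontal jitter
uniform float scanlineOffset;       // strength of the vertical scanline displacement
uniform float chromaticAberration;  // amount of color fringing
uniform float iTime;                // drives the speed of the animation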

Putting it All Together

The final step was to place both the .glsl and .xml files in the /usr/fl/shaders directory. Baselight automatically recognizes the shader and makes it available in the Matchbox node. Pro tip: my shaders directory is a link to a network location. This way, all the Baselights can share the same shaders, and it makes updating easier.

Now, when I add a Matchbox node and select the CRT effect, I get a set of sliders and controls that let me tweak the look in real-time. I can easily adjust the scanline thickness, add more or less noise, and dial in the perfect amount of retro goodness.

Download the Files

You can download the complete GLSL and XML files for this Matchbox shader here:


Conclusion

This project was a great upgrade to my GLSL knowledge, demonstrating how to take a cool shader effect from a platform like Shadertoy and adapt it into a practical, user-friendly tool for Baselight. The combination of GLSL's power and speed with Matchbox's flexibility opens up a world of possibilities for creating custom effects that can be used throughout the entire post pipeline. It might be old tech, but it's still very useful today. I hope this breakdown inspires you to experiment with your own implementation. Let me know what you think, and feel free to share your own shader creations!

Best Practices - Remote Color Approvals on iPad Pro

Color Settings on iPad Pro

Introduction

In the realm of professional film and video production, maintaining color accuracy is paramount. The iPad Pro, with its exceptional display capabilities, has become a valuable tool for remote color approvals. However, to ensure optimal results, it's essential to configure your iPad Pro correctly. This guide consolidates information from various sources to provide comprehensive instructions for achieving color accuracy on your iPad Pro during remote sessions.

iPad Pro (5th Generation and Newer)

Display Settings

  1. Reference Mode: Turn ON

    • This automatically sets optimal color settings and locks them

    • Exception: Disable for Dolby Vision content review

  2. Additional Settings (Automatically Set by Reference Mode):

    • Brightness: True Tone OFF

    • Night Shift: OFF

    • Accessibility > Display & Text Size > Auto-Brightness: OFF

  3. Fine-Tune Calibration:

    • Recommended if you have a measurement device (like a spectroradiometer)

    • Measure a 100-nit gray patch and use the values in the Fine Tune Calibration menu

Brightness

  1. Reference Mode:

    • Brightness slider is disabled

    • HDR: Peak brightness at approx. 800 nits

    • SDR: Peak brightness at approx. 100 nits

  2. Dolby Vision Content:

    • Disable Reference Mode

    • Set brightness slider to 38%

    • Ensure other settings match what Reference Mode would set

  3. Colorist Calibration (if applicable):

    • Colorist compares iPad Pro to reference monitor to fine-tune brightness

  4. Shortcuts App:

    • Can be used to:

      • Determine the current brightness percentage

      • Create Siri shortcuts to set specific brightness levels

iPad Pro (2nd to 4th Generation)

Display Settings

  1. Appearance:

    • Dark Appearance: ON

    • Automatic Appearance: OFF

  2. Other Settings:

    • True Tone: OFF

    • Night Shift: OFF

    • Accessibility > Display & Text Size > Auto-Brightness: OFF

    • Accessibility > Display & Text Size > Reduce White Point: OFF

Brightness

  1. Colorist Calibration (Recommended):

    • Colorist compares iPad Pro to reference monitor to determine base brightness setting

  2. General Recommendations (If colorist comparison isn't possible):

    • SDR: Brightness slider at 50%

    • HDR:

      • Dark room: Brightness slider at 50%

      • Bright room: Brightness slider at 100%

Additional Notes for ClearView (Applicable to All Generations)

  • Streaming Apps: Download the required Sohonet ClearView 'Flex' app beforehand

  • Session IDs: Obtain the necessary session IDs or an invite from me or Paul by email ahead of the session.

Remember that these are guidelines, and some adjustments might be needed based on specific project requirements or for newer iPad models.


Let me know if you have any other questions or need further assistance!

Happy Grading,

JD

How to - Add a LUT in Avid Media Composer

There are several ways to load LUTs into Avid Media Composer:

1. Through Source Settings:

  • Right-click on a clip in your bin or timeline.  

  • Select "Source Settings."  

  • Under the "Color Encoding" tab, go to "Color Adapter Type."  

  • Click the dropdown menu and choose "User Installed LUTs." You can apply LUTs you have already loaded to the source clip and hit "Apply."

  • To load a LUT, click "Color Management Settings," then click "Select LUT File."

  • Browse to your LUT file's location (.cube or .lut format) and select it.

  • Click "Open" to load the LUT.

2. Through Color Management Settings:

Color Management Settings
  • Go to "Settings" in the menu bar.

  • Select "Color Management."  

  • Under the "Project" or "Shared" tab (depending on where you want the LUT to be available), click "Select LUT File."  

  • Browse for your LUT file and select it.  

  • Click "Open" to load the LUT.  

3. Using the "Color LUT" Effect:

  • Go to the "Effects" tab in the Effect Palette.

  • Under "Image" effects, find and drag the "Color LUT" effect onto your clip in the timeline.  

  • In the Effect Editor, click on the dropdown menu next to "LUT File" and choose your loaded LUT.

Avid LUT Effect on Filler track

Additional Tips:

  • LUTs should be in either .cube or .lut format to be compatible with Avid.

  • Make sure to place the LUT files in a location you can easily access and remember.

  • You can organize your LUTs by creating subfolders within the Avid LUTs directory.

  • Shared LUTs are available to all projects, while project-specific LUTs are only accessible within the current project.

For more detailed instructions and visual guides, you can refer to these resources:

I hope this helps! Let me know if you have any other questions.

John Daro's Take on the Future of Color Grading: AI, Cloud, and Remote Workflows


Hey everyone,

Post Magazine - Color grading trends: John Daro looks at AI, the cloud & remote workflows

Here is an interview from a bit ago where I shared my thoughts on some of the buzzwords in the color grading world:

  • AI models: This stuff is exploding. It's got the potential to change the game for colorists, but like any tool, it's how we use it that matters. Oh, and a good Nvidia GPU.

  • Cloud: Remote collaboration and cloud-based grading are becoming the norm. While there are challenges, the benefits for flexibility and efficiency are hard to ignore.

  • Remote Workflows: The pandemic forced us all to adapt. I think remote grading is here to stay, but it's important to find the right balance.

Always curious to hear what you guys think. Agree? Disagree? Hit me up and let me know. The best conversations happen when we challenge each other's ideas.

Let's keep pushing pixels and the boundaries of color grading!

-JD

Baselight X Grade: A Game-Changer? Hands-On Impressions

I'm not always the quickest adopter of fancy new tools – it takes some serious wow factor to shift my well-worn workflow. But folks, Baselight's latest update has me geeking out a bit over its new "X Grade" feature, demonstrated in this video: https://www.youtube.com/watch?v=R42j8JNmFIU. Here's my take as a working colorist:

What on Earth IS X Grade?

In a nutshell, X Grade combines the precision of curves with the intuitive color-wheel experience. It displays a 3D representation of your image's color information. You can:

  • Warp: Stretch and squeeze specific color areas of the image. Think pinpointing a too-reddish skin tone and shifting it more towards yellow.

  • Tweak: Isolate colors to boost or decrease saturation and luminosity.

  • Paint: Freehand selections to make adjustments, similar to qualifiers, but with this cool 3D visual aid.

Why This Isn't Just a Gimmick

Here's where I think it shines, especially compared to traditional tools:

  • Visualizing the Impossible: Sometimes, we know what change we want, but getting there with curves/color wheels is a fiddly nightmare. X Grade makes those subtle, targeted shifts easier to conceptualize.

  • Speed for Specific Tasks: Need to quickly fix a muddy green background? Isolate and tweak in X Grade, likely faster than precise keying.

  • Nuanced Skin Tones: The video hints this is a skin-tone savior. Being able to pinpoint and shift exact ranges within skin while seeing it visually? Sign me up.

X Grade Won't Replace Everything

Let's be realistic – this isn't magic. My good ol' curves and scopes aren't going anywhere. Here's where I see the limits:

  • Learning Curve: Like any powerful tool, mastery takes time. Don't expect instant transformative grades.

  • Won't Fix Bad Footage: As always, a well-shot image is still the foundation for top-notch color.

My Verdict: A Potentially Powerful Addition

I'm genuinely excited to put X Grade through its paces on upcoming projects. It feels like an intuitive tool with the potential to solve some of those stubborn grading situations with more finesse and speed. My advice: Don't sleep on this update – get experimenting!

Question for Y'all

Baselight users, have you tried X Grade yet? What are YOUR first impressions? Is there a specific grading challenge you think it might be perfect for? Leave your insights in the comments!

ShotDeck & the Art of the Steal: How References Fuel Your Vision

Hey colorists! John Daro here, back to talk about the tools that fuel our creative fire. Today's spotlight is on ShotDeck, a fantastic resource for visual inspiration, and how to leverage it alongside references to take your grading to the next level.

ShotDeck: Your Inspiration Library

ShotDeck is a goldmine of high-resolution stills from movies, meticulously tagged and categorized. Need a shot showcasing a specific lighting technique, camera angle, or emotional tone? ShotDeck has you covered. Here's how to use it effectively:

  • Spark Creativity: Stuck in a rut? Browse trending shots or explore genres outside your usual ballpark. Unexpected visuals can jolt your brain and ignite fresh ideas for your project.

  • Refine Your Vision: Have a general idea but need concrete examples? Search by tags like "low-key lighting," "cinematographer Roger Deakins," or even "melancholy mood." Narrowing down your search helps you articulate your vision more precisely.

  • Build Decks for Collaboration: Working with a director or DP? Create a shared deck with reference shots that embody the desired aesthetic. It fosters clear communication and keeps everyone on the same page.

Beyond ShotDeck: The Power of Personal References

While ShotDeck is fantastic, don't underestimate the power of your own curated references. Keep an eye out for:

  • Movies & Shows: Always be on the lookout for scenes that resonate visually. Take screenshots or note down the title and specific scenes for future reference.

  • Paintings & Photography: The world beyond film offers a wealth of inspiration. A captivating painting's use of light or a photographer's masterful color palette can translate beautifully into your grade.

  • Real-Life Observations: Notice the way sunlight filters through trees or the warm glow of streetlamps at night. These everyday moments can spark unique color ideas.

Using References Effectively: From Inspiration to Implementation

Now, how do you bridge the gap between a cool reference shot and your own project?

  • Identify Key Elements: Analyze the reference – what lighting techniques, color palettes, or compositional choices are at play? Break it down into manageable aspects for your grade.

  • Adaptation is Key: Remember, it's about inspiration, not replication. Adapt elements from the reference to fit your specific scene and story.

  • Use As a Springboard, Not a Blueprint: Your reference should ignite your creativity, not dictate your every move. Experiment and explore variations to achieve your own unique vision.

Bonus Tip: Don't Be Afraid to Mix It Up!

The beauty of references is that they can come from anywhere. Maybe a classic painting inspires a bold color contrast, while a shot from a gritty indie film influences your low-light approach. Don't be afraid to pull inspiration from unexpected places!

By combining ShotDeck's vast database with your own curated references, you build a powerful arsenal of visual inspiration. Remember, the best grades are informed by a strong vision, fueled by inspiration, and executed with your own unique touch. Now go out there, steal like an artist (but ethically of course!), and create something stunning.

Let me know in the comments below – what are your favorite ways to use references in your grading workflow?

8AE - Really Really

I was happy to help out my old friend 8AE on this one. My favorite aspect of this project was the custom shader we created for a distressed graphic rendering. She’s a super creative person and it was a blast getting the look to where she wanted it to be. Check out the video below.

Los Angeles native 8AE (pronounced "bae") is a two-time Grammy-nominated artist and songwriter. She co-wrote on Metro Boomin's Billboard #1, GRAMMY-nominated album "Heroes and Villains" and was featured on the Creed 3 soundtrack single "Culture".