Nuke Reference
About Nuke Reference
The Foundry Nuke VFX Reference is a quick-lookup guide for The Foundry's Nuke, the industry-standard node-based compositing software used in film and television visual effects. It covers the full compositing pipeline: Read/Write nodes for EXR, DPX, and TIFF image sequences; Merge operations (over, plus, multiply, screen); channel manipulation with Shuffle2 and Copy nodes; Premultiply/Unpremultiply alpha handling; and final render output with frame number padding patterns.
The reference spans ten categories:
- Overview: Nuke/NukeX/NukeStudio editions, render execution
- I/O: Read/Write for EXR sequences
- Compositing: Merge blending modes, ChannelMerge, Shuffle2, Premult/Unpremult
- Color Correction: ColorCorrect shadows/midtones/highlights, Grade blackpoint/whitepoint/gamma, HueCorrect per-hue curves, OCIO/ACES color management
- Keying/Roto: IBKGizmo, Keylight chroma keyer, Despill, RotoPaint with Clone/Dodge/Burn brushes
- Tracking: 2D point/planar tracker, CameraTracker for 3D solve with PointCloud output
- Transform: Transform, Reformat, CornerPin, LensDistortion undistort/redistort workflow
- Filter: Blur, Defocus/ZDefocus depth-based bokeh, VectorBlur motion vectors, Denoise/Grain matching
- 3D: Scene/Camera/Card/ScanlineRender, Deep compositing with DeepMerge/DeepHoldout
- Scripting: Expression/TCL knob linking, Python API with nuke.createNode/execute, Gizmo/Group packaging
This reference is designed for VFX compositors, technical directors, and pipeline engineers who need instant access to node parameters, compositing rules (Unpremult before color correction, Premult before Merge), and Python API calls. Every entry includes practical usage notes drawn from real production workflows, from green screen keying pipelines to projection-based set extensions.
Key Features
- Merge node reference with all blending modes (over, plus, multiply, screen, max, min, difference) and mix parameter control
- Alpha pipeline guide covering Premultiply/Unpremultiply rules: Unpremult before color correction, Premult before Merge to avoid edge fringing
- Color correction nodes comparison: ColorCorrect (shadows/midtones/highlights split), Grade (blackpoint/whitepoint/lift/gain/gamma), HueCorrect (per-hue curves)
- Keying workflow reference: IBKGizmo screen color setup, Keylight chroma keyer, Despill removal, and the full pipeline (Key, Despill, Edge, Comp)
- CameraTracker 3D solve workflow: Track Features, Solve camera, Scene generation with Camera node and PointCloud output for set extensions
- Deep compositing guide covering DeepRead, DeepMerge (automatic depth-based compositing), DeepToImage conversion, and DeepHoldout masking for volumetrics
- LensDistortion workflow: Undistort before compositing, Redistort after, with k1/k2 radial distortion parameters for camera tracking alignment
- Python scripting API reference: nuke.createNode, setValue, selectedNode, allNodes, execute for batch rendering, and Gizmo/Group custom tool packaging
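On premultiplied float pixels, the core Merge blend modes reduce to simple per-channel arithmetic. The sketch below shows that math in plain Python (a simplified single-channel model, not Nuke's implementation; `a` is the A input's alpha):

```python
# Per-channel Merge math on premultiplied float values.
# A is the foreground channel value, B the background, a the A input's alpha.

def over(A: float, B: float, a: float) -> float:
    """Porter-Duff over: A on top, B shows through A's transparency."""
    return A + B * (1.0 - a)

def plus(A: float, B: float) -> float:
    return A + B          # additive; values can exceed 1.0 in float

def multiply(A: float, B: float) -> float:
    return A * B          # darkens; common for shadows and mattes

def screen(A: float, B: float) -> float:
    return A + B - A * B  # inverse multiply; brightens without clipping at 1.0

def mix(result: float, B: float, mix_amount: float) -> float:
    """The Merge node's mix knob dissolves between B and the blended result."""
    return B + (result - B) * mix_amount

# A half-transparent white foreground over a mid-grey background:
print(over(0.5, 0.18, 0.5))  # 0.59
```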
Frequently Asked Questions
What Nuke editions does this reference cover?
This reference covers nodes and workflows common to all Nuke editions: Nuke (base compositing), NukeX (adds CameraTracker, Primatte keyer, RayRender, and advanced LensDistortion), and NukeStudio (adds editorial timeline). NukeX-exclusive features like CameraTracker and RayRender are clearly marked in the entries.
What is the correct Premultiply/Unpremultiply order?
The standard rule is: Unpremultiply before any color correction (Grade, ColorCorrect, etc.) to avoid darkening edges, then Premultiply after color correction and before Merge operations. Most CG renders output premultiplied images. Violating this order causes visible edge fringing and black halos around alpha edges.
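A small numeric illustration of why the order matters, using a hypothetical half-covered edge pixel and gamma as the example correction:

```python
# A half-covered edge pixel whose true surface color is 1.0 is stored
# premultiplied as 0.5 (alpha 0.5).
alpha = 0.5
premult_value = 1.0 * alpha  # 0.5

def gamma_correct(v: float, gamma: float) -> float:
    return v ** (1.0 / gamma)

# Wrong: grade the premultiplied value directly (gamma 2.0 brightens).
wrong = gamma_correct(premult_value, 2.0)        # ~0.707: edge brightens

# Right: Unpremult, grade, then Premult again.
unpremult = premult_value / alpha                # back to the true color, 1.0
right = gamma_correct(unpremult, 2.0) * alpha    # 1.0 ** 0.5 * 0.5 = 0.5

print(wrong, right)  # the difference is the visible edge fringe
```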
How do I set up a green screen keying pipeline?
The standard pipeline is: Key (IBKGizmo or Keylight to generate the matte by selecting the screen color), then Despill (remove green/blue color spill from the foreground), then Edge (refine matte edges with erode/blur), then Comp (Merge the despilled foreground over the background using the refined matte). For difficult keys, combine multiple keyers and use RotoPaint for manual cleanup.
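The Despill step can use several formulas; one common choice, the average limit (green clamped to the mean of red and blue), can be sketched as:

```python
def despill_green_average(r: float, g: float, b: float):
    """Average-limit green despill: clamp green to the mean of red and blue.
    Pixels with no spill (green already below the limit) pass through unchanged."""
    return (r, min(g, (r + b) / 2.0), b)

# Spill-contaminated foreground pixel: excess green is pulled down.
print(despill_green_average(0.2, 0.9, 0.3))  # (0.2, 0.25, 0.3)
# Clean pixel is untouched.
print(despill_green_average(0.5, 0.3, 0.4))  # (0.5, 0.3, 0.4)
```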
What is Deep compositing and when should I use it?
Deep compositing stores multiple per-pixel depth samples rather than a single flat RGBA value. DeepMerge automatically handles occlusion ordering based on depth, eliminating manual holdout mattes. It is especially useful for volumetric elements (smoke, fog, atmospheric effects) and for shots where CG elements must intersect naturally. The workflow is: DeepRead, DeepMerge multiple elements, then DeepToImage to flatten for final output.
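The depth ordering that DeepMerge and DeepToImage perform can be sketched as front-to-back over compositing of per-pixel samples (a simplified single-channel model; real deep samples carry zfront/zback and full RGBA):

```python
def flatten_deep_samples(samples):
    """Flatten (depth, premult_value, alpha) samples into one value + alpha.
    Samples are sorted nearest-first, then over-composited front to back."""
    value, alpha = 0.0, 0.0
    for _, sample_value, sample_alpha in sorted(samples, key=lambda s: s[0]):
        value += sample_value * (1.0 - alpha)  # shows through remaining transparency
        alpha += sample_alpha * (1.0 - alpha)
    return value, alpha

# A half-transparent near sample in front of an opaque far sample:
samples = [(10.0, 1.0, 1.0), (2.0, 0.25, 0.5)]  # deliberately unsorted
print(flatten_deep_samples(samples))  # (0.75, 1.0)
```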
How does CameraTracker work for 3D compositing?
CameraTracker (NukeX) extracts 3D camera motion from footage in three steps: 1) Track Features automatically detects and tracks feature points across frames, 2) Solve computes the 3D camera path and focal length, 3) Scene generation creates a Camera node and PointCloud. You can then project textures onto Card nodes using ScanlineRender to create set extensions. Always apply LensDistortion Undistort before tracking for accuracy.
How do I manage color with OCIO in Nuke?
Nuke integrates OpenColorIO (OCIO) for color management. Set the OCIO config file (e.g., aces_1.3) in Project Settings > Color Management. Assign source colorspace on Read nodes, output colorspace on Write nodes, and use the Viewer Process for monitor display transform. The OCIOColorSpace node converts between color spaces within the node graph. ACES workflow is the current industry standard for maintaining color fidelity across departments.
What is the difference between Blur, Defocus, and ZDefocus?
Blur applies a simple Gaussian blur by pixel radius. Defocus simulates lens bokeh with controllable shape and aspect ratio (useful for anamorphic looks). ZDefocus uses a depth channel input to apply variable blur based on distance from the focal plane, creating realistic depth-of-field effects with proper bokeh shapes (circular, polygonal). Use ZDefocus for compositing CG elements that need to match live-action camera DOF.
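The contrast can be sketched with a simplified depth-driven model: Blur's radius is constant across the frame, while a ZDefocus-style radius is zero at the focal plane and grows with distance from it (the formula and values here are illustrative, not Nuke's internal math):

```python
def zdefocus_radius(depth: float, focus_plane: float, size: float) -> float:
    """Simplified depth-driven blur radius: zero at the focal plane,
    growing with normalized distance from it."""
    if depth <= 0.0:
        return 0.0
    return size * abs(depth - focus_plane) / depth

# Pixels at the focal plane stay sharp; near and far pixels blur progressively.
for depth in (5.0, 10.0, 40.0):
    print(depth, zdefocus_radius(depth, 10.0, 20.0))
```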
How do I automate Nuke with Python scripting?
Nuke provides a full Python API: nuke.createNode("Grade") creates nodes, node["multiply"].setValue(1.5) sets parameters, nuke.selectedNode() accesses the selection, nuke.allNodes("Read") finds all Read nodes, and nuke.execute(writeNode, 1, 100) renders frames 1-100. Callbacks like nuke.addOnCreate and nuke.addKnobChanged enable event-driven automation. Package reusable tools as Gizmos (.gizmo files in NUKE_PATH) for team distribution.
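Those calls can be combined into a minimal batch-render sketch, meant to be run inside Nuke's Script Editor or via "nuke -t script.py" (the nuke module ships with the application and is unavailable outside it; the Write node name and knob values below are hypothetical):

```python
# Minimal batch sketch for Nuke's Script Editor or "nuke -t script.py".
import nuke

# Create a Grade node and set one of its knobs.
grade = nuke.createNode("Grade")
grade["multiply"].setValue(1.5)

# Inspect every Read node's source path.
for read in nuke.allNodes("Read"):
    print(read["file"].value())

# Render frames 1-100 through a Write node (assumed to be named Write1).
write = nuke.toNode("Write1")
if write is not None:
    nuke.execute(write, 1, 100)
```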