
ComfyUI in Nuke | TD Meetup 10

What if we could create AI images directly in Nuke using ComfyUI?


Since the release of Stable Diffusion in 2022 and the arrival of Midjourney and ChatGPT, AI-generated images have become the hot new trend of the past few years. We've seen an increase in AI-generated images being used as thumbnails, as content and even submitted as professional work. But how useful are they really, especially in the far more demanding professional fields of visual effects, animation and games?


Can ComfyUI and AI improve our compositing workflow?

Imagine a fast, efficient way to create custom images without ever leaving Nuke or our familiar workflow. Did you know that you can use ComfyUI inside Nuke? With the power of Python for Nuke and friendship, a few Comp TDs set out to do just that: implement a ComfyUI workflow directly in Nuke.


Our guest Tim Riopelle, Senior Technical Artist at Unity, explains how he did it all, from creating previs material for his personal projects to building assets for a video game. We also explore Python tools that ease the transition to ComfyUI for Nuke artists.


Let's take a look into the promising world of ComfyUI in Nuke:




ComfyUI

ComfyUI is an application that lets us design and run advanced Stable Diffusion pipelines as node graphs to generate AI images.


The node-network approach of ComfyUI allows for maximum flexibility and freedom to create custom solutions for all kinds of problems - which is one of the many reasons why it works so well as a Nuke implementation. Before exploring ComfyUI in Nuke, let's get familiar with the main nodes that are part of our Nuke implementation:


ComfyUI Nodes
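
Under the hood, a ComfyUI graph is just JSON that the ComfyUI server executes over a local HTTP API, which is a big part of what makes a Nuke integration possible. As a rough idea, here is a minimal Python sketch (not the integration discussed below) that queues a workflow exported from ComfyUI in API format; the server address is ComfyUI's default, and the prompt node id is a placeholder that depends on your graph:

```python
# Minimal sketch: queue a saved ComfyUI workflow over its local HTTP API.
# Assumes ComfyUI is running on its default address (127.0.0.1:8188) and that
# workflow_api.json was exported from ComfyUI via "Save (API Format)".
import json
import urllib.request

COMFY_URL = "http://127.0.0.1:8188/prompt"  # default ComfyUI server endpoint

def queue_workflow(path="workflow_api.json", prompt_text="a misty forest at dawn"):
    with open(path) as f:
        workflow = json.load(f)

    # Hypothetical node id "6" for the positive CLIPTextEncode node; the real id
    # depends on how the graph was built.
    workflow["6"]["inputs"]["text"] = prompt_text

    payload = json.dumps({"prompt": workflow}).encode("utf-8")
    request = urllib.request.Request(COMFY_URL, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # contains the prompt_id used to track the job
```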

ComfyUI x Nuke

Creating custom imagery inside the comp feels like the natural next step in this evolution: connecting ComfyUI to Nuke for a seamless compositing experience. ComfyUI combined with Stable Diffusion provides a dynamic way to create content faster, using AI for backgrounds, textures and concept art and saving time on manual tasks. It also boosts creativity, allowing us to quickly experiment with new ideas and visual styles.


Francisco Contreras was one of the first to implement these ComfyUI nodes in Nuke, allowing compositors to build the same node networks directly in their comp - which is one of the many reasons why it's so important to learn the basics of native ComfyUI networks first. This lets us create unique elements like sky replacements or backgrounds directly in our compositing pipeline and test different AI models without switching software.


ComfyUI x Nuke Network

Technical Directors get to take that idea and simplify it into an easy-to-use gizmo to share with their team:


ComfyUI Nuke Gizmo
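
As a rough illustration of that wrapping step (a sketch, not the actual gizmo shown above), here is how a TD might collapse a selected node network into a Group and expose a couple of knobs with Nuke's Python API; the internal node and knob names are placeholders:

```python
# Minimal sketch (Nuke Python): collapse a selected ComfyUI node network into a
# Group and expose a few artist-facing knobs. The internal node and knob names
# are placeholders for whatever the actual network uses.
import nuke

def build_comfy_group():
    # Collapse the currently selected nodes into a single Group node.
    group = nuke.collapseToGroup()
    group.setName("ComfyUI_Generate")

    # Expose internal knobs on the Group so artists never have to open the network.
    tab = nuke.Tab_Knob("comfy", "ComfyUI")
    prompt = nuke.Link_Knob("prompt", "Prompt")
    prompt.setLink("CLIPTextEncode1.text")  # placeholder internal node
    steps = nuke.Link_Knob("steps", "Steps")
    steps.setLink("KSampler1.steps")        # placeholder internal node

    for knob in (tab, prompt, steps):
        group.addKnob(knob)
    return group
```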

Examples

Some examples of how ComfyUI can be used directly in Nuke:


Storyboarding

A storyboarding workflow in Nuke uses a LoRA model to keep the style of the results consistent. This allowed Tim to previsualize his short film within three days.


Equirectangular image generator 

With the help of ComfyUI, we can generate a 360-degree equirectangular image that can be used as an HDRI to simulate a specific background or lighting setup.


From a 3D grey model to a rendered result.

Nuke Scanline to image

Using a simple grey 3D model, we're able to shade, light and render it, complete with an alpha pass, using ControlNets that work with the existing pose, position and depth.
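
As a rough idea of what the Nuke side of such a setup could look like (an assumption, not Tim's exact script), here is a minimal Python sketch that writes out the depth pass of a rendered grey model for a depth ControlNet; paths and layer names are placeholders:

```python
# Minimal sketch (Nuke Python): export the depth pass of a rendered grey model
# as an image that a depth ControlNet can read. Paths and layer names are
# assumptions; depth usually needs normalizing (e.g. a Grade) before writing
# to an 8-bit format.
import nuke

def export_depth_for_controlnet(render_path="/tmp/grey_model.####.exr",
                                out_path="/tmp/controlnet_depth.####.png"):
    read = nuke.nodes.Read(file=render_path)

    # Route the depth layer into rgba so it can be written as a plain image.
    shuffle = nuke.nodes.Shuffle(inputs=[read])
    shuffle["in"].setValue("depth")

    write = nuke.nodes.Write(inputs=[shuffle], file=out_path)
    write["channels"].setValue("rgb")
    nuke.execute(write, 1, 1)  # render frame 1 only as a quick test
```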



ComfyUI x Nuke: Production

Let's look at some of the key elements that make this workflow possible for production.


Making Comfy Work for Production

  • Hardware: Powerful GPUs, CUDA support, and dedicated servers are essential for efficient AI rendering.

  • Software: Python scripting, Nuke, ComfyUI, and necessary plugins must be integrated for a smooth workflow.

  • Storage: High-speed storage is critical for managing large image sequences and backups in VFX pipelines.


Workflow Integration

  • Asset Management: Integration with tools like Deadline for batch processing and Shotgun for tracking is key.

  • Version Control: A robust version control system is required to store and compare AI-generated assets.

  • Pipeline Compatibility: ComfyUI output must integrate seamlessly with Nuke, using industry-standard formats and color spaces (a minimal Write setup is sketched below).
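
For that last point, a minimal Nuke Python sketch might look like the following; the colorspace name assumes an ACES OCIO config and the output path is a placeholder:

```python
# Minimal sketch (Nuke Python): a Write node configured so ComfyUI output lands
# back in the pipeline as EXRs in a working colorspace. The colorspace name
# assumes an ACES OCIO config; substitute whatever the show uses.
import nuke

write = nuke.nodes.Write(file="/tmp/comfy_output/element.####.exr")  # placeholder path
write["file_type"].setValue("exr")
write["channels"].setValue("rgba")
write["colorspace"].setValue("ACES - ACEScg")  # assumption: ACES OCIO config is loaded
```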


Collaboration & Scaling

  • Team Collaboration: ComfyUI workflow files are shareable, and when projects share the same repository it is easy to collaborate and move files through the pipeline.

  • Scaling for Productions: Studios can scale ComfyUI with render farms and shared asset libraries for larger productions. Tasks can be built to run through solutions like Deadline for greater efficiency (a rough submission sketch follows).
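
For the Deadline side of that, a rough submission sketch might look like the following; it uses Deadline's generic CommandLine plugin and the deadlinecommand CLI, and the install path and wrapper script are assumptions for illustration:

```python
# Rough sketch: submit a batch of ComfyUI generations through Deadline's generic
# CommandLine plugin. Paths and the run_workflow.py wrapper are assumptions for
# illustration; only the job/plugin info file format follows Deadline conventions.
import subprocess
import tempfile

def submit_comfy_job(workflow="/jobs/show/comfy/workflow_api.json", frames="1-10"):
    job_info = tempfile.NamedTemporaryFile("w", suffix="_job.txt", delete=False)
    job_info.write(
        "Plugin=CommandLine\n"
        "Name=ComfyUI_batch\n"
        f"Frames={frames}\n"
    )
    job_info.close()

    plugin_info = tempfile.NamedTemporaryFile("w", suffix="_plugin.txt", delete=False)
    plugin_info.write(
        "Executable=/opt/comfyui/venv/bin/python\n"             # assumed install path
        f"Arguments=/opt/comfyui/run_workflow.py {workflow}\n"  # hypothetical wrapper script
    )
    plugin_info.close()

    # deadlinecommand <jobInfoFile> <pluginInfoFile>
    subprocess.run(["deadlinecommand", job_info.name, plugin_info.name], check=True)
```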


Credit: github.com/vinavfx/ComfyUI-for-Nuke

ComfyUI x Python

Python and machine learning are friends. Most machine learning and AI libraries are written in C++ (and CUDA) but controlled from Python. ML researchers love Python because it's an approachable scripting language that lets them wrap their complex models in easy-to-use APIs.
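
To make that concrete, here is a tiny PyTorch example (PyTorch being the library ComfyUI itself is built on): the Python calls are thin wrappers that dispatch to compiled C++/CUDA kernels.

```python
# Tiny illustration: PyTorch (the library ComfyUI builds on) is driven from Python,
# but each call below dispatches to compiled C++/CUDA kernels under the hood.
import torch

noise = torch.randn(1, 4, 64, 64)                              # latent-sized random tensor
pooled = torch.nn.functional.avg_pool2d(noise, kernel_size=2)  # runs in native code
print(pooled.shape)                                            # torch.Size([1, 4, 32, 32])
```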


ComfyUI is written in Python.

Python allows artists to simplify and automate certain tasks, while TDs can write custom tools and gizmos that condense complex workflows, like the ComfyUI network above, into simple Nuke nodes and knobs.

Join our next Python for Nuke cohort and learn professional Python to automate your own workflow.



Summary

While the battle over whether AI is "real art" and concerns over copyright still rage, our focus is on how machine learning and AI can make our professional lives better and easier. A well-implemented ComfyUI pipeline has the potential to at least simplify the exchange of ideas by creating concrete visuals, and potentially help streamline some essential but unglamorous tasks.


Our recommendation is to give native ComfyUI a try, and if it catches your interest, see whether a ComfyUI x Nuke setup could have a lasting impact on your compositing workflow as well.


Thank you for reading,

Vish Patel & Alex


