The application of visual effects in modern cinema is a lengthy and painstaking process, with different departments working in unison at every stage of the movie's production. When the job is this big, a defined workflow helps ensure that all the tasks are organized and completed on time.
If you are new to the VFX industry, then you have probably heard the term VFX pipeline. The term sounds like a computer application, but it is, in fact, a work process.
What is a VFX Pipeline?
A VFX pipeline is a process that breaks down the steps of a workflow into logical and efficient segments.
Making a movie, particularly a big-budget one, requires many people working together as a team. In a traditional office setting, the concept of a VFX pipeline is similar to a big project being handled by different departments or offices. By breaking the workflow down into manageable segments, all the tasks that come into play can be completed on time and within budget. Most projects involve more than one pipeline, and no two pipeline workflows are the same.
Breaking Down the VFX Pipeline into Steps
Research and Development
This is the step where the technical approach of a film is planned and decided. This stage involves much discussion about the software and techniques that will be used for the movie. Initial concepts and ideas are pitched, skeleton teams are assembled, and early pre-visualizations are created for presentation purposes.
For VFX-heavy films, artists, programmers, and even scientists and mathematicians are brought in to discuss which special effects programs (off-the-shelf or custom-built software) should be used to bring the concept of the film to life.
Because concepts and directions will evolve throughout the making of the movie and the material will be worked and reworked, this stage merges with other segments of the pipeline. It remains ongoing throughout most of a project’s life.
Storyboarding and Animatics
At this stage, a team of artists will create visual representations of the actions of the film. Everything from the character motions to the storyline will be analyzed to determine how to execute every shot.
The pre-visualization stage is a process that converts a storyboard or script into low-poly models to play out the movie's scenes. At this stage, the director gets an idea of how the scenes will be shot. Live digital environments are also used as references during the shooting process. Camera angles for complex scenes, the shoot locations, and the composition of the scenes are planned during this stage to save time, money, and energy when on set.
Production
This is the stage where the actual shooting of the film takes place. At this point, the source material is still raw, but it is always evolving. As filming begins, the visual effects team works with the production crew to deliver usable content.
Several VFX-related activities are done on set and back at the studio while the film is still being shot. Because different departments have different specialties, communication within teams should always be open so the content remains true to the concept. A typical visual effects production pipeline comprises the following segments:
3D Modeling
3D modeling, one of the most labor-intensive aspects of CG effects, turns concept art into a digital subject. Because this technique is meant to create things that are neither practical nor cost-effective to have on set, the job involves a variety of digital applications and animation techniques. The process involves creating model props, environments, buildings, weapons, vehicles – basically any object that's needed to execute the director's vision. 3D modeling is conducted throughout all three stages of production.
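At the data level, a 3D model is essentially a list of vertex positions plus faces that index into them. The sketch below is a hypothetical minimal example in Python; real production meshes also carry UVs, normals, and far more geometry:

```python
import numpy as np

# A 3D model at its simplest: vertex positions plus faces that index into them.
# (A hypothetical minimal example, not any package's file format.)
vertices = np.array([
    [0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],   # bottom of a unit cube
    [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1],   # top of the cube
], dtype=float)
faces = np.array([
    [0, 1, 2, 3], [4, 5, 6, 7],   # bottom, top
    [0, 1, 5, 4], [2, 3, 7, 6],   # front, back
    [1, 2, 6, 5], [0, 3, 7, 4],   # right, left
])
print(len(vertices), "vertices,", len(faces), "quad faces")
```

Everything a modeler sculpts ultimately boils down to structures like this, which is why the same asset can move between modeling, rigging, and rendering tools.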
A good example of 3D modeling done right is Andy Serkis's many motion-capture roles (LOTR's Gollum, Planet of the Apes' Caesar, and Star Wars' Supreme Leader Snoke, to name a few). These characters were created with the help of a motion-capture suit. The 3D modelers build the character the actor is playing; the computer-generated character is then rigged and animated using a combination of VFX software, including Autodesk Maya and Pixologic ZBrush. To create hyper-realistic results, 3D modelers use hundreds of reference photos and 3D scans.
Matte Painting
The bulk of a movie's props, environments, buildings, weapons, vehicles, etc., are created as 3D models, but the backgrounds are generated using matte painting. Matte paintings are hardly new; they are one of the earliest VFX techniques used in filmmaking. The technique had its roots in photography, evolved to painted glass panels, and eventually to digitally rendered environments. Matte paintings are used to create a variety of lifelike landscapes and give the illusion of an environment that is not present on set.
Back in the day, two-dimensional images were used to recreate movie landscapes, but thanks to technology, it is now possible to create entire 3D sets and backgrounds digitally. Digital cameras are also used to add movement to the background. Matte paintings also serve as reference materials for digital artworks.
Principal and Reference Photography
During the actual shooting of the film, the VFX team will be on set to take pictures of everything – props, environments, backgrounds, actors, etc. These photos are used to add textures to 3D models, and they serve the VFX team as references to enhance the realism of a scene, with close attention paid to how certain attributes of an object or character behave in real life. The more detailed the reference photos, the more realistic the visual effects will be. Often, VFX teams are given their own copy of the film scan once shooting concludes, for reference and tracking.
Post-Production
In post-production, all the elements of the film – video footage, special effects, computer-generated effects, music, sound, editing – are pulled together and made into a full-length movie. Because the majority of the VFX work is done during post-production, this stage takes up the largest part of a project's timeline.
As you can imagine, bringing all these elements of a movie together is a lot of work so it’s common for post-production to take months, even years. As far as visual effects go, post-production involves the following:
Rigging and Animating
A rigging team builds a digital skeleton of a movie character, or a system of controls that animators use to animate an object or character. The process involves the addition of bones, tissues, and motions to create natural movement. The rigging team uses a variety of tools to calculate skin weights, while the animators implement the results to breathe life into a character, prop, or object. Often, motion capture cameras are used to collect data, which animators use to achieve the final look.
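The skeleton-and-skin-weights idea can be sketched with linear blend skinning, the standard deformation technique behind most character rigs: each vertex follows a weighted blend of its bones' transforms. A simplified NumPy illustration, not the API of any particular rigging package:

```python
import numpy as np

def linear_blend_skinning(rest_positions, bone_transforms, weights):
    """Deform mesh vertices by blending bone transforms with skin weights.

    rest_positions:  (V, 3) vertex positions in the rest pose
    bone_transforms: (B, 4, 4) world transform of each bone
    weights:         (V, B) skin weights; each row sums to 1
    """
    V = rest_positions.shape[0]
    # Homogeneous coordinates so the 4x4 transforms apply directly.
    homo = np.hstack([rest_positions, np.ones((V, 1))])          # (V, 4)
    # Transform every vertex by every bone: result shape (B, V, 4).
    per_bone = np.einsum('bij,vj->bvi', bone_transforms, homo)
    # Blend the per-bone results using the skin weights: shape (V, 4).
    blended = np.einsum('vb,bvi->vi', weights, per_bone)
    return blended[:, :3]

# Two bones: an identity bone and one translated +1 along x.
bones = np.stack([np.eye(4), np.eye(4)])
bones[1, 0, 3] = 1.0
verts = np.array([[0.0, 0.0, 0.0]])
w = np.array([[0.5, 0.5]])   # a vertex weighted equally to both bones
print(linear_blend_skinning(verts, bones, w))   # moves halfway along x
```

The "skin weights" the rigging team paints are exactly the `weights` matrix here; the animators then drive `bone_transforms` frame by frame.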
FX and Simulation
The FX team is tasked with adding simulated elements to a film. The simulated elements must coexist with all the other objects and characters in the film in the most organic way possible to achieve realistic results. The FX team often works alongside the animators to ensure the natural movement of colliding FX elements, such as explosions, destruction, fire, smoke, particle sims, and so on. Polishing is also applied over existing scenes to enhance the visuals.
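At its core, a particle sim is just positions and velocities integrated forward one frame at a time. A toy sketch (explicit Euler integration with a made-up ground bounce; production FX tools are vastly more sophisticated):

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def step_particles(positions, velocities, dt):
    """Advance a particle system one frame with explicit Euler integration."""
    velocities = velocities + GRAVITY * dt
    positions = positions + velocities * dt
    # Crude ground collision: clamp to y = 0 and damp the bounce.
    below = positions[:, 1] < 0.0
    positions[below, 1] = 0.0
    velocities[below, 1] *= -0.4   # lose energy on impact
    return positions, velocities

# Simulate a small burst of 100 particles for one second at 24 fps.
rng = np.random.default_rng(0)
pos = np.zeros((100, 3))
vel = rng.normal([0.0, 5.0, 0.0], 1.0, size=(100, 3))
for _ in range(24):
    pos, vel = step_particles(pos, vel, 1.0 / 24.0)
```

Debris, smoke, and fluid sims all build on this same integrate-and-collide loop, just with far richer forces and inter-particle interactions.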
Texturing
This process involves adding surface color and texture to the 3D models, making them realistic and nearly complete. For example, a metallic sheen is added to metal objects to mimic their real-life attributes. Human subjects are given their skin, with detailed characteristics and texture, for hyper-realistic results.
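Conceptually, applying a texture means looking a color up in an image using the model's UV coordinates. A minimal nearest-neighbour lookup sketch (a hypothetical helper for illustration, not any package's API):

```python
import numpy as np

def sample_texture(texture, uv):
    """Nearest-neighbour texture lookup: map a UV coordinate in [0, 1]^2
    to a texel in the image, the basic step of applying a texture map."""
    h, w = texture.shape[:2]
    u, v = np.clip(uv, 0.0, 1.0)
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y, x]

# A 2x2 checkerboard texture of dark and light texels.
tex = np.array([[[0.1, 0.1, 0.1], [0.9, 0.9, 0.9]],
                [[0.9, 0.9, 0.9], [0.1, 0.1, 0.1]]])
print(sample_texture(tex, (0.1, 0.1)))   # top-left texel: dark
print(sample_texture(tex, (0.9, 0.1)))   # top-right texel: light
```

Production texturing layers many such maps (color, roughness, bump, and so on) and filters the lookups, but the UV-to-texel mapping is the same idea.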
Lighting and Rendering
Once the objects and characters are animated and the effects are applied, the 3D elements are added to their respective scenes. Proper lighting is key to making a computer-generated scene look realistic. Poor lighting, on the other hand, can reveal the fakeness of the elements and ruin the final result.
Proper lighting is applied throughout the 3D scene. The light, color, and intensity of the original plate are matched and enhanced, and shadows are lined up at this point. Finally, the sequence of frames is rendered out from the needed camera angles. When that's done, the results are handed to the compositor, who brings all the VFX elements together.
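The basic idea of lighting a CG surface can be illustrated with the classic Lambertian (diffuse) shading model, where brightness falls off with the cosine of the angle between the surface normal and the light. A simplified sketch, not a production renderer:

```python
import numpy as np

def lambert_shade(normal, light_dir, light_color, albedo):
    """Diffuse (Lambertian) shading: brightness scales with the cosine of
    the angle between the surface normal and the light direction."""
    n = np.asarray(normal, dtype=float)
    l = np.asarray(light_dir, dtype=float)
    n = n / np.linalg.norm(n)
    l = l / np.linalg.norm(l)
    intensity = max(np.dot(n, l), 0.0)   # a back-facing light contributes nothing
    return albedo * light_color * intensity

albedo = np.array([0.8, 0.2, 0.2])   # a reddish surface
white = np.array([1.0, 1.0, 1.0])
# A surface facing straight up, lit from overhead vs. at a 45-degree angle.
print(lambert_shade([0, 1, 0], [0, 1, 0], white, albedo))   # full brightness
print(lambert_shade([0, 1, 0], [1, 1, 0], white, albedo))   # dimmer (cos 45 deg)
```

Matching a render to the plate largely comes down to feeding models like this (and their far richer descendants) the same light directions, colors, and intensities that were present on set.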
Compositing
The final step in post-production is compositing. This process involves taking all the elements of the film and layering them on top of one another. At this point, color correction, masking, and other refinements are applied to achieve the final result. Computer-generated characters are added to a live-action shot, an explosion is overlaid on a building, a simulated storm is layered over a shot of the ocean; all these effects are placed into the right scene using specialized compositing applications. It is the compositor's job to make sure that real-life objects and characters engage with computer-generated effects and that everything looks seamless and realistic.
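Layering elements on top of one another ultimately comes down to the classic "over" operator from Porter and Duff's compositing algebra, which compositing packages implement in some form. A minimal single-pixel sketch, assuming premultiplied alpha (the usual convention in compositing):

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb, bg_alpha):
    """Porter-Duff 'over': layer a foreground element on a background plate.

    Assumes premultiplied alpha, i.e. each RGB value has already been
    multiplied by its own alpha.
    """
    out_rgb = fg_rgb + bg_rgb * (1.0 - fg_alpha)
    out_alpha = fg_alpha + bg_alpha * (1.0 - fg_alpha)
    return out_rgb, out_alpha

# A half-transparent red "explosion" pixel over an opaque blue "sky" pixel.
fg = np.array([0.5, 0.0, 0.0])   # premultiplied: red * alpha of 0.5
bg = np.array([0.0, 0.0, 1.0])
rgb, a = over(fg, 0.5, bg, 1.0)
print(rgb, a)   # red mixed with half the sky's blue; fully opaque result
```

A whole shot is simply this operation applied per pixel, per layer, from the bottom of the layer stack to the top.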