How much can anyone accomplish with pen and ink alone? Quite a lot, actually, if the rich legacy of animated feature films is any indication. But time and technology were bound to move forward, opening up the creative toolbox in previously unimagined ways. The first toe-dip into the waters of computer-generated animation allowed artists to take visual risks and solve a rainbow of creative challenges. By the 1990s, we watched Aladdin take a ride on a digitally enhanced carpet, the Beast and Belle share a dance in a now iconic CG ballroom, and The Prince of Egypt’s Moses part a CG Red Sea.
These initial milestones opened new windows that resulted in fresh but still timeless stories. Here, Animation Guild members take a look back, exploring what traditional and CG artists have made possible with computers on some of your favorite films.

Tiny Titans
The heroes were teeny insects, but there was nothing remotely small about DreamWorks Animation’s early foray into the world of CG films. Released in 1998, Antz was the studio’s first feature, marking a bold step into a new creative frontier. Completely computer animated, it tracks Z, an ambitious if vaguely neurotic ant who breaks out of his rut and bucks the rigid system of his colony.


Pablo Valle, who worked as a Visual Effects Lighter on Antz, remembers that the film was chosen in part for its supposed simplicity, a way to work within the technological constraints of the time. The thinking back then was that only robots and insects, with their smooth surfaces, were achievable. “Robots had been done before, so let’s do insects,” he says.
But the process turned out not to be so simple. The movie unfolded from the perspective of the ants, and their world required the visual effects department to create more than 700 effects shots, the film’s pyrotechnics and stunts, covering everything from a beam of light reflecting off a magnifying glass to elaborate water simulations. Who knew that a drop of water or a gum-festooned sneaker sole would make for such a compelling visual experience? In fact, the film’s Directors Eric Darnell and Tim Johnson did—and well before the technology had advanced to a level that would make this type of movie anything but revolutionary. With every effect, they and the rest of the crew were pushing the limits of what was possible.
“One of the biggest contributions from Antz in terms of technology and its long-lasting impact in the industry was actually its pipeline.”
-Pablo Valle, Visual Effects Lighter, Antz
Because an established roadmap for this magnitude of work didn’t exist, a dedicated production pipeline had to be created. “One of the biggest contributions from Antz in terms of technology and its long-lasting impact in the industry was actually its pipeline,” says Valle. “The idea of having dozens, perhaps hundreds of artists working in parallel, in multiple departments, all contributing to a single film was truly revolutionary.” He notes that he continues to feel the impact today. While many parts of the DreamWorks pipeline have evolved over the years, “how departments interact, production tracking, overall workflow, etc. have remained mostly unchanged since the days of Antz.”
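Valle’s point is easiest to appreciate in miniature. The sketch below models shots flowing through ordered departments with simple status tracking; the department names and data model are illustrative stand-ins, not DreamWorks’ actual system.

```python
from dataclasses import dataclass, field

# Hypothetical stages, in order; a real pipeline has many more.
DEPARTMENTS = ["layout", "animation", "effects", "lighting", "compositing"]

@dataclass
class Shot:
    """One shot of the film, with production tracking of finished stages."""
    name: str
    completed: list = field(default_factory=list)

    def next_department(self):
        """The first department whose work this shot still needs."""
        for dept in DEPARTMENTS:
            if dept not in self.completed:
                return dept
        return None  # shot is final

# Hundreds of artists can work in parallel because each shot exposes
# exactly which department it is waiting on.
shots = [Shot("seq010_sh040"), Shot("seq010_sh050", completed=["layout"])]
for shot in shots:
    print(shot.name, "->", shot.next_department())
```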
It was a moment of invention as much as storytelling—opening the door to more complex characters and immersive environments.

Range of Motion
In the first decade of CG-animated filmmaking, advancements in character animation enhanced the way audiences connected with the characters. Studios began developing more sophisticated tools that brought greater realism to facial expressions and body movement.

The path to this kind of realism originated for Disney in Dinosaur (2000), which was a blend of CG with live-action backgrounds. During the pre-production phase, “a muscle and skin simulation system was written in-house,” says Visual Effects Supervisor Steve Goldberg. The software used physics models to simulate the behavior of muscles and skin in a 3D environment. “The muscles would be laid out on top of the character’s rig ‘skeleton’ and would move in accordance with how the rig was animated by the character animators,” he says. “A skin simulation would then be run based on the movement of the underlying muscles. This could be tuned to allow for taut or loose skin or simulated fat layers—all to add a degree of realism and believability.”
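Goldberg’s description maps onto a familiar simulation pattern. What follows is a minimal sketch of that idea, not Disney’s in-house software: skin points chase targets driven by the underlying muscles, with a stiffness parameter tuning taut versus loose skin.

```python
import numpy as np

def step_skin(skin_pos, skin_vel, muscle_target, stiffness=40.0,
              damping=4.0, dt=1.0 / 24.0):
    """Advance a skin point one frame as a damped spring toward the muscle."""
    force = stiffness * (muscle_target - skin_pos) - damping * skin_vel
    skin_vel = skin_vel + force * dt   # unit mass per point
    skin_pos = skin_pos + skin_vel * dt
    return skin_pos, skin_vel

# Low stiffness lags behind the muscle (loose skin, simulated fat);
# high stiffness tracks it closely (taut skin).
pos, vel = np.zeros(3), np.zeros(3)
target = np.array([1.0, 0.0, 0.0])     # muscle position moved by the rig
for _ in range(24):                    # one second at 24 fps
    pos, vel = step_skin(pos, vel, target)
print(pos)                             # skin point settling toward the muscle
```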
By the time films like DreamWorks’ Shrek (2001) and Shrek 2 (2004) came along, technology had advanced significantly, and they showcased detailed muscle simulations and skin shaders that allowed for more nuanced performances. But it took more than advanced technology to elevate a sense of realism.

Disney’s Tangled (2010) contributed to the evolution of more lifelike characters by blending traditional animation principles with evolving CG technologies—resulting in characters that conveyed nuanced emotional depth. Amy Smeed, one of the film’s Animators, says that Glen Keane, Animation Supervisor for Rapunzel, sat in on dailies and physically drew over the CG renderings to help the team think about the principles of animation that 2D animators had long been familiar with.
“In CG, you can take a rig, move it around, and it doesn’t mean anything,” Smeed says. “So they were really getting us to think about: ‘What are you saying in this one scene?’ And really having us strip out the excess movements so you could get to the acting and performances of the character.”
Two of Disney’s next films, Frozen (2013) and Big Hero 6 (2014), carried those character advancements forward. Smeed notes that the latter deals in a significant way with grief. “When you have subtle acting performances, it is so important to get that acting correct so audiences are really feeling for [main character] Hiro,” she says. “If we over-animate that, the feeling isn’t going to be right for the story.”

The Crowd Goes Wild
In traditional animation, most crowd scenes were tackled by a few artists hand-animating dozens of characters, which was both time-consuming and creatively limiting. Introducing crowds was a complex challenge in CG animation, but Antz addressed this issue in a foundational way. The effects team pioneered large-scale digital crowds, featuring tens of thousands of independently moving ants, thanks to a proprietary crowd simulation system called MOB. This was a novel rule-based particle-style system, capable of things like blending motion cycles and triggering head turns to avoid uniform behavior.
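MOB itself was proprietary, but the rule-based flavor it describes can be sketched in a few lines. In this toy version (an illustration, not the Antz code), every agent plays a shared walk cycle at its own phase and speed, with a rare rule that triggers a head turn so the colony never moves in lockstep.

```python
import random

class Agent:
    def __init__(self, seed):
        rng = random.Random(seed)
        self.phase = rng.random()            # where in the cycle it starts
        self.rate = rng.uniform(0.9, 1.1)    # slight speed variation
        self.head_turn = 0.0                 # degrees off forward
        self.rng = rng

    def step(self, dt):
        # Advance through the shared motion cycle at this agent's own rate.
        self.phase = (self.phase + self.rate * dt) % 1.0
        if self.rng.random() < 0.01:         # rare rule: glance around
            self.head_turn = self.rng.uniform(-45.0, 45.0)
        else:
            self.head_turn *= 0.9            # ease back to neutral

# Tens of thousands of independently moving agents from one shared cycle.
crowd = [Agent(seed=i) for i in range(10_000)]
for agent in crowd:
    agent.step(dt=1.0 / 24.0)
```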
Lighting Technical Director Yasser Hamed dealt with different kinds of motion issues on Tangled, one of the first films he worked on for Disney. He noticed that the animation department constructing the dance scene was creating up to 30 characters by hand. “It was exhausting,” he says. “When Wreck-It Ralph (2012) came on board, it was clear from the story that we were going to have a lot of characters, [and] we can’t hand-animate all those characters.”


The film featured complex crowds at Game Central Station and the Sugar Rush races, not to mention a couple hundred thousand flying Cy-Bugs. This meant a new approach was needed. Hamed, now the Crowds Technical Director, collaborated with the Effects Animation and Production Technology departments to add an animation cycle layer, one offering more sophisticated behaviors, on top of the particle system. Simplified rigs could accommodate head turns, eye gestures, spine deformation, and other features to distinguish characters from each other.
“At the time, it was still very limited because the computing power wasn’t strong enough to do more than maybe five or six joints at a time,” Hamed recalls. “But for every show after that, as the computing power increased, we were able to push it even further and add more layers of animation.”
What was originally a small operation at Disney turned into the Crowds Animation department, paving the way for technical solutions tackling exponentially bigger shots. Hamed points to Strange World (2022), which features 56 million crowd elements, including millions of flowing blood cell-inspired creatures that formed a bloodstream characters could walk on. “That was our biggest shot ever,” Hamed says. “It was a big change from animating [a few dozen] dancers on Tangled.”

Turn up the Volume
From the moment he entered the zeitgeist in Shrek 2 to his inevitable breakout headlining his own movies, that swaggering cat Puss may have had his head in the clouds. But by the time filmmakers at DreamWorks were looking to actually send him skyward in a key sequence of Puss in Boots (2011), they needed to make sure those clouds looked spectacular in the fairytale world they were creating.


The film heralded the first cinematic use of Volume Data Blocks (VDB), a game-changing technology that would redefine how volumetric effects like smoke, fire, and water were handled on screen—and, of course, clouds. As Visual Effects Supervisor Jeff Budsberg explains, around 2009, traditional simulation struggled to process the enormous volume of data required to render realistic clouds. With Puss in Boots in the pipeline, the filmmakers faced a challenge.
“A cloud doesn’t really fit in a box. There’s a lot of wasted space,” says Budsberg, explaining that Puss had to interact with cloud environments, with “really demanding cloudscapes that were kind of art directed… The resolution required for this was astronomical, and it would not have been possible with the historical nature of storing data.”
Computational Scientist Ken Museth, formerly DreamWorks’ Senior Principal Engineer and Director of Research and Development, helmed the design and implementation of the VDB library in 2010 and 2011, ultimately winning two Motion Picture Academy Scientific and Technical Awards for the technology along with his team. Budsberg says its goal was to solve difficult production issues artists were running into in the effects department, such as 3D volumetric clouds and fast particle-liquid surfacing. Museth strongly urged sharing the innovation with the industry at large as an open source project, and it was initially released to the public in 2012. This led to the widespread adoption of OpenVDB and its integration into software like Maya and Houdini.
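The storage insight Budsberg describes, skipping the empty space in the box, can be shown with a drastically simplified sparse grid. Real OpenVDB uses a shallow B+tree-like hierarchy with fast cached accessors (see openvdb.org); this toy keeps only small dense tiles where the cloud actually has density.

```python
import numpy as np

TILE = 8  # voxels per tile edge (8 x 8 x 8 tiles)

class SparseVolume:
    """Store dense tiles only where values exist; empty space costs nothing."""
    def __init__(self, background=0.0):
        self.background = background
        self.tiles = {}  # tile coordinate -> dense 8^3 array

    def set(self, i, j, k, value):
        key = (i // TILE, j // TILE, k // TILE)
        tile = self.tiles.setdefault(
            key, np.full((TILE, TILE, TILE), self.background))
        tile[i % TILE, j % TILE, k % TILE] = value

    def get(self, i, j, k):
        tile = self.tiles.get((i // TILE, j // TILE, k // TILE))
        if tile is None:
            return self.background   # untouched space stays implicit
        return tile[i % TILE, j % TILE, k % TILE]

vol = SparseVolume()
vol.set(1000, 2000, 3000, 0.7)       # a wisp of cloud far from the origin
print(vol.get(1000, 2000, 3000))     # 0.7
print(len(vol.tiles))                # 1 tile allocated, not a giant box
```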



The next big leap for DreamWorks came with The Croods (2013), which used OpenVDB for a broad range of effects: along with fire and smoke, there was plenty of destruction and, notably, a 1.5 billion voxel pyroclastic flow. That sequence became a technical benchmark at the studio. A process that once would have taken painstaking hours could now be achieved efficiently, opening paths for working with elements in new ways that contributed to compelling storytelling.

Controlling the Elements
It’s not as though Disney had never put cold weather into its previous films, but for the fully CG version of Hans Christian Andersen’s The Snow Queen, more than a few flurries and snowballs would be necessary.
“Until [Frozen], most snow was done with a character sinking into a surface a bit and then kicking up some little snowy chunks,” says the film’s Head of Effects Animation Marlon West. “But for scenes where people were buried deep in the snow, the technique that had served our studios [in the past] just wasn’t going to cut it.”




Using the Material Point Method (a hybrid technique that transfers information between particles and a grid to mimic materials that don’t conform to a single state, such as honey, lava, or snow), new simulation tools like Disney’s Matterhorn allowed the effects animators to capture the interaction of characters with snow more realistically. The software worked so well that the team tackled even more complicated shots, like Anna pelting the snow monster. “It felt really great to have this kind of big technical jump that allowed us to make all this beautiful imagery,” West says.
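For the technically curious, here is a skeletal sketch of the particle-to-grid-and-back transfers at the heart of the Material Point Method. The material response is left as a stub; the published snow model behind Frozen (Stomakhin et al., 2013) fills that stub with an elasto-plastic constitutive model, so treat this as scaffolding, not the studio’s solver.

```python
import numpy as np

N, DX = 64, 1.0 / 64                              # grid resolution, spacing
positions = np.random.rand(500, 2) * 0.3 + 0.35   # a blob of particles
velocities = np.zeros((500, 2))
MASS = 1.0

def step(dt=1e-3, gravity=(0.0, -9.8)):
    grid_m = np.zeros((N, N))
    grid_v = np.zeros((N, N, 2))
    cells = (positions / DX).astype(int)

    # P2G: splat particle mass and momentum to the nearest cell
    # (production MPM uses smooth B-spline weights over nearby cells).
    for p, (i, j) in enumerate(cells):
        grid_m[i, j] += MASS
        grid_v[i, j] += MASS * velocities[p]

    # Grid update: a snow solver would add internal stress forces here.
    occupied = grid_m > 0
    grid_v[occupied] /= grid_m[occupied, None]
    grid_v[occupied] += np.array(gravity) * dt

    # G2P: gather velocities back to particles and advect them.
    for p, (i, j) in enumerate(cells):
        velocities[p] = grid_v[i, j]
    positions[:] += velocities * dt

for _ in range(100):
    step()
```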
As OpenVDB dramatically improved rendering speeds, it fed into software used for more dynamic lighting and visual effects. In Moana (2016), the bulk of the movie takes place on the ocean, which serves as both an environment and a character, demanding numerous effects shots. “One of the key elements was to caricature the water so that it’s an exaggerated version of what a viewer may be familiar with,” says the film’s Director of Cinematography, Lighting Adolph Lusinsky. “Our memory tends to remember things in a way that’s grander than what it actually was—a way that is romanticized.” Achieving this required various tricks, such as moving the reflections off the water as it came closer to the camera. His team also used technology to play up the jewel-like caustics—the bright, fractal light pattern seen at the bottom of a pool or on the side of a boat.


Light It Up
Lighting has played a crucial role in the trajectory of CG-animated films. The introduction of Global Illumination revolutionized how scenes were lit, combining indirect and direct lighting to simulate real-world behavior more accurately. Ray tracing took it one step further, following the paths of individual light rays to simulate highly realistic reflections, shadows, and color.
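Both techniques are strategies for approximating the same physical statement, the rendering equation (Kajiya, 1986): the light leaving a surface point is its own emission plus all arriving light, weighted by the surface’s reflectance.

```latex
L_o(x, \omega_o) \;=\; L_e(x, \omega_o)
  \;+\; \int_{\Omega} f_r(x, \omega_i, \omega_o)\,
        L_i(x, \omega_i)\,(\omega_i \cdot n)\;\mathrm{d}\omega_i
```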


DreamWorks’ Visual Effects Supervisor Betsy Nofsinger witnessed massive growth in lighting technology between Kung Fu Panda (2008) and Kung Fu Panda 4 (2024). She says that in key areas, the pipeline for Kung Fu Panda 4 bears almost no resemblance to the one used on the first film. “The quality of the light on the first film had bounce (indirect) lighting made by the artists as baked maps,” she explains. These maps were cached and referenced in subsequent shading passes. “For environments,” she adds, “we often shared maps for a whole sequence and even painted them to make some last fixes in the hundreds of maps and avoid a rerender. That meant it was difficult to customize environment lighting shot by shot.”
Kung Fu Panda 4, in contrast, was lit and rendered with MoonRay, the studio’s production renderer, first used on How to Train Your Dragon: The Hidden World (2019). Since the first Kung Fu Panda, “we had 16 years’ worth of advancement into raytracing to achieve indirect lighting workflows that are more intuitive to the artist … and more sophisticated in the final result,” Nofsinger says. “Many bounces of indirect influence are possible, and material shaders are able to respond to light either very physically or very non-photorealistically depending on the art direction. The richness and variety of what can be achieved is so advanced compared to the earlier workflows.”
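A back-of-the-envelope calculation shows why bounce count matters. With perfect interreflection, each added bounce contributes another power of the surface albedo, so bright materials keep gaining visible energy well past the first few bounces. The numbers below are purely illustrative, not from either film.

```python
def total_light(direct=1.0, albedo=0.8, bounces=4):
    """Light energy after N indirect bounces off a surface with this albedo."""
    return sum(direct * albedo**n for n in range(bounces + 1))

for bounces in (0, 1, 2, 4, 8, 16):
    print(bounces, round(total_light(bounces=bounces), 3))
# With albedo 0.8, the image is still visibly brightening at 8 bounces,
# which is why single-bounce baked maps look comparatively flat.
```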

Who Does Your Hare?
Hair should be easy. But hair is never easy. Not when you have to deal with factors like curls or kinks, how it picks up the light, or how it behaves in extreme wind or underwater. Or, as in the case of Tangled, when the main mane—belonging to Rapunzel, naturally—measures 70 feet long and must be picked up and carried, frequently during action scenes or chases.
“That was an insane science project,” says Goldberg, the Visual Effects Supervisor on the film, explaining that Tangled was to be released in November, and the team didn’t have a working hair shot until March of that year. “We were sweating it a bit,” he says.
Thanks to Disney’s Dynamic Wires hair simulation system, Rapunzel’s hair was tamed. “As far as controlling the hair and that level of simulation, Tangled was our big breakthrough,” Goldberg says. “[Hair Simulation Developer] Kelly Ward came up with some amazing tools that allowed us to be able to get that level of control.”


Ward led the team in developing Dynamic Wires, an extension of her Ph.D. work; she would go on to win a Motion Picture Academy Scientific and Technical Award for the program. The film’s Technical Supervisor Mark Hammel explains: “Rigging artists would embed the [Dynamic Wires] simulation engine into hair rigs with artist-level controls.” This allowed them to drive the behavior of the hair simulation, such as its stiffness, and control its relationship to objects in the environment. “Technical Animation Artists would be given guidance by the directors about how they would want the hair to move in any particular shot, and [the artists] would set up the conditions for the simulation to run,” says Hammel. “[This] would produce the realistic frame-to-frame motion of the hair, ready to be rendered.”
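Dynamic Wires is proprietary, but the workflow Hammel describes, a simulation engine with artist-facing controls embedded in the rig, can be sketched with a standard position-based strand: Verlet integration under gravity plus a stiffness-weighted constraint pass. This is a generic illustration, not Disney’s engine.

```python
import numpy as np

def simulate_strand(points, prev, rest_len, stiffness=0.9,
                    iterations=10, dt=1.0 / 24.0, gravity=-9.8):
    """One frame of a pinned hair strand (2D for brevity)."""
    velocity = points - prev            # implied velocity from last frame
    prev = points.copy()
    points = points + velocity          # Verlet step
    points[:, 1] += gravity * dt * dt   # external force: gravity
    points[0] = prev[0]                 # root stays pinned to the scalp

    # Constraint pass: higher stiffness (an artist-level control)
    # snaps segments back to rest length more aggressively.
    for _ in range(iterations):
        for i in range(len(points) - 1):
            seg = points[i + 1] - points[i]
            length = np.linalg.norm(seg) + 1e-9
            correction = (length - rest_len) / length * seg
            points[i + 1] -= stiffness * correction
    return points, prev

n = 20
strand = np.column_stack([np.zeros(n), -np.arange(n) * 0.1])
prev = strand.copy()
for _ in range(48):                     # two seconds at 24 fps
    strand, prev = simulate_strand(strand, prev, rest_len=0.1)
```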


On the more beastly side of things, with more than 60 species of animals set to populate the world of Zootopia (2016), Disney had to figure out how to depict fur in ways it had never done before. This was a far cry from the studio’s first all-CG film, Chicken Little (2005). “[Back then], many tools were developed to help ease the transition from hand-drawn to CG animation for the artists who were learning a different way of creating an animated film,” says Goldberg, who was Visual Effects Supervisor on that film as well. “Traditional Background Painters were trained to become Lighting and Look Development Artists, Character Animators and Effects Animators were trained to work with 3D software rather than pencil and paper, and a number of Cleanup Artists were trained in hair and fur simulation tools.”
The film was a milestone, with a titular character that featured 76,000 individual feathers. Its proprietary toolsets and early in-house Maya plug-in tools laid the foundation for—just a decade later—Zootopia’s rabbit and fox protagonists, Judy Hopps and Nick Wilde, boasting about 2.5 million hair strands each. This magic was achieved via a fur shader within the rendering system, along with a modern version of iGroom to control the look and movement of fur in intricate ways.

Turning the Page
“Make something I’ve never seen before,” recalls Justin K. Thompson, the Production Designer on Sony Pictures Animation’s Spider-Man: Into the Spider-Verse (2018). “That’s all anybody ever said to me.” And that’s exactly what the film’s team achieved—setting a bold course for future animated films by the way it integrated hand-drawn 2D techniques with 3D rendering technology.


2D and 3D had met in animated films before, but Spider-Verse used innovative image-processing techniques developed specifically for the film. While most CG movies by this time were focused on more photorealistic depictions, Spider-Verse went its own way. To achieve a blend of different comic stylings—’30s pulp, Mad magazine, manga, and various eras from the Marvel pantheon—the filmmakers revised CG images via an overlay of hand-drawn art, using software that allowed artists to introduce linework, hatching, motion blurs, and comic-style halftone textures directly into a 3D space.
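The halftone element of that toolkit is the easiest to sketch. The snippet below, a generic illustration rather than Sony’s pipeline, turns a grayscale tone into a comic-style dot screen: the darker the tone, the bigger the dot.

```python
import numpy as np

def halftone(gray, cell=8):
    """gray: 2D array in [0, 1]. Returns a binary dot-screen image."""
    h, w = gray.shape
    out = np.zeros((h, w))
    yy, xx = np.mgrid[0:h, 0:w]
    # Distance from each pixel to the center of its halftone cell.
    cy = (yy // cell) * cell + cell / 2
    cx = (xx // cell) * cell + cell / 2
    dist = np.hypot(yy - cy, xx - cx)
    # Darker tone -> bigger dot: radius grows with (1 - brightness).
    radius = (1.0 - gray) * cell * 0.7
    out[dist < radius] = 1.0
    return out

# A horizontal brightness ramp becomes a graded field of dots.
ramp = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))
dots = halftone(ramp)
```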
The resulting collision of 2D and 3D was every bit as momentous as the collider built by the film’s antagonist, Kingpin, to open portals to alternate universes. In the film’s climactic scene, as the main characters swirl around in a jaw-dropping final battle, the filmmakers tested how much the audience’s senses could take.
“[But] the real trick of all these stylized films is not letting the stylization overpower the story and the emotion you’re trying to put into it. Whenever [that happened], we would pull back,” Thompson says. “[What] I think we learned, between Into the Spider-Verse and Across the Spider-Verse [2023], was that the audience’s tolerance was way higher than we thought.”

Back to the Drawing Board
For many CG artists, Spider-Man: Into the Spider-Verse was a pivotal moment. “In some ways, it was a validation of something artists had been saying for decades,” says Jason Mayer, Head of Effects at DreamWorks. “Why can’t we make the movies look like the artwork?” For years, this desire was held back by the industry’s pursuit of visuals that were as realistic as possible. But Spider-Verse proved that a stylized, hand-crafted look could harness technology to elevate the artistry of animated storytelling.
“In some ways Spider-Verse was a validation of something artists had been saying for decades… Why can’t we make the movies look like the artwork?”
-Jason Mayer, Head of Effects, Kung Fu Panda 4
At DreamWorks, this sparked a creative renaissance. Artists began exploring how to bring the energy of illustration, comics, and traditional 2D animation into their CG pipeline. “The Boss Baby: Family Business (2021) had stylized fantasy flashbacks, [but] to do an entire movie that way—that looks like the hand of the artist is visible in almost every frame—that’s where it blew me away,” Mayer says. Films like Puss in Boots: The Last Wish (2022), The Bad Guys (2022), and The Wild Robot (2024) explored highly stylized worlds where light, texture, and motion were less about real-world mimicry and more about imagination and stories. But, as with each leap, it required overcoming technical challenges.
“There are laws of physics that describe how light moves and interacts with surfaces,” says Budsberg. Renderers and shaders were based on physics in the real world, but for The Last Wish, fairy tale characters required more painterly details. This meant making things like hair and fur resemble something out of a storybook, and so the team needed to develop filters that allowed the artist’s hand to come through. “We called them painterly filters—to kind of abstract a lot of the image in a way that was akin to how a 2D illustrator might do it … [Essentially], we spent a lot of effort breaking the physicality of the renderer.”
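DreamWorks’ painterly filters are proprietary, but a classic filter in the same spirit is the Kuwahara filter, which flattens detail into brush-like patches while keeping edges crisp. The sketch below is that textbook technique, not the studio’s implementation.

```python
import numpy as np

def kuwahara(img, radius=3):
    """Painterly abstraction: each pixel takes the mean of the
    least-varying of its four corner neighborhoods."""
    h, w = img.shape
    out = np.copy(img)
    padded = np.pad(img, radius, mode="edge")
    for y in range(h):
        for x in range(w):
            py, px = y + radius, x + radius
            quads = [  # four overlapping quadrants around the pixel
                padded[py - radius:py + 1, px - radius:px + 1],
                padded[py - radius:py + 1, px:px + radius + 1],
                padded[py:py + radius + 1, px - radius:px + 1],
                padded[py:py + radius + 1, px:px + radius + 1],
            ]
            best = min(quads, key=np.var)  # smoothest quadrant wins
            out[y, x] = best.mean()
    return out

noisy = np.random.rand(32, 32)
painted = kuwahara(noisy)   # flatter, patchier "brushstroke" look
```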


The Bad Guys, in turn, was based on a graphic novel—not an artform to include every blade of grass or strand of Mr. Wolf’s fur. The film’s Head of Effects, Steve Wood, says: “The FX for Bad Guys were very stylized and had a deep rooting in 2D FX … My team were all amazing FX animators with a keen eye for motion. However, I also knew many were new to the 2D FX design stylings of the Bad Guys’ universe.” He explains that he guided them with a “design first philosophy,” knowing that “once an artist had a working design, I knew the motion evolution would be easier.” The overall goal was to incorporate 2D FX where it made sense; the 3D elements needed to fit the style as well, all while ensuring that the audience couldn’t discern which techniques were used for any given shot.
For The Wild Robot, the island that serves as the film’s setting is a painter’s rendering, down to the tools that allowed animators to draw plants in 3D rather than build them. And among the wild animals on this island, the sleek robot Roz doesn’t fit… until she does. “She’s this precise machine thing in this very loose painterly world,” says Budsberg. To achieve this, they developed new materials that would respond to light and allow for feathered transparency. “You’re revealing texture through light, the brush, transparency of assets,” he says. “[We] used all the tricks that we had from Bad Guys and Puss in Boots and pushed them even further… The compositing tools [on The Wild Robot] were the culmination of these three shows—learned advancements across many years.”



In the end, animation’s CG journey has been about so much more than advancing technology. It’s been about embracing innovations while staying true to the heart of the medium. Looking back to his early days as an Animator on movies like The Prince of Egypt, Jakob Jensen recalls that when he got into the industry, his dream was to have a career drawing—using pencils—“like the old guys I admired. Then suddenly came CG, and I must admit I fought it.” Most recently serving as Head of Character Animation on The Wild Robot, he says that over the years he recalibrated his view of why he loves the craft, realizing that only the tools have changed and that, ultimately, animation will always be about creating memorable characters and performances.
Antz, Puss in Boots, Croods, Kung Fu Panda 4, The Boss Baby: Family Business, and The Wild Robot images courtesy of DreamWorks Animation.
Big Hero 6, Wreck-It Ralph, Strange World, Frozen, Moana, Tangled, and Zootopia images courtesy of Walt Disney Animation.
Spider-Man: Into the Spider-Verse images courtesy of © 2023 Sony Pictures Animation Inc. All Rights Reserved. | MARVEL and all related character names: © & ™ 2023 MARVEL.



