To understand the computer animation industry's importance, scale, and ubiquity, you can look at the list of highest-grossing films of all time.
There are numerous fully computer-animated movies on the list—Frozen II (2019), Frozen (2013), The Lion King (2019), The Super Mario Bros. Movie (2023), and more. And you’d be hard-pressed to find a film on the list that doesn’t involve computer-generated imagery (CGI) at all.
It’s hard to imagine a time before computer animation. Without it, many modern films would be unrecognizable, if they could be made at all. The video game industry would be a very different place, too.
Defining computer animation
First, we need to define computer animation. This general term refers to the art of creating animated images using computers and digital tools.
It follows the same basic principle as other kinds of animation, such as stop-motion animation and illustrated 2D animation: tiny changes are made on a frame-by-frame basis to create the illusion of movement.
In computer-assisted animation, an image is displayed on a computer monitor before being rapidly replaced by a new, almost identical image that’s just a tiny step forward.
This process takes place at a rate of 24, 25, or 30 frames per second (fps), and as a result, the human eye perceives it as continuous movement.
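To make this concrete, here’s a minimal Python sketch of the idea: at 24 fps, each frame is on screen for roughly a 24th of a second, and an object only has to move a tiny amount per frame for playback to read as smooth motion. The speed and duration values are illustrative, not taken from any animation package.

```python
# A minimal sketch of frame-by-frame motion, assuming a 24 fps clip.
# The speed and duration values are illustrative only.

FPS = 24                 # frames per second
FRAME_TIME = 1 / FPS     # each frame is on screen for ~41.7 ms

def positions(start_x: float, speed: float, seconds: float) -> list[float]:
    """Return the object's x position for every frame of the clip."""
    total_frames = int(seconds * FPS)
    # Each successive frame moves the object by a tiny, nearly
    # imperceptible step; played back quickly, it reads as smooth motion.
    return [start_x + speed * frame * FRAME_TIME for frame in range(total_frames)]

frames = positions(start_x=0.0, speed=100.0, seconds=2.0)
print(len(frames), "frames; first steps:", [round(x, 2) for x in frames[:4]])
# 48 frames; first steps: [0.0, 4.17, 8.33, 12.5]
```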
2D animation vs 3D animation
Computer-assisted animation works differently depending on whether it is 3D or 2D. In 3D animation, objects are digitally modeled and placed on conceptual frameworks, or ‘skeletons.’
In 2D animation, separate objects and transparent layers are used, with or without a virtual skeleton.
Whether it is 2D or 3D, the technique by which the animation is produced is the same. Take the example of a figure animation: the animator positions the character’s limbs and facial expressions on keyframes.
The computer then automatically calculates the differences in appearance between the keyframes and fills in the gaps. This in-betweening process is known as tweening or morphing.
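As a rough illustration, here’s a minimal Python sketch of linear tweening between keyframes. Real animation software layers easing curves, splines, and multi-channel rigs on top of this, so treat it as the core idea only; the arm_angle values are made up for the example.

```python
# Minimal sketch of tweening: the animator sets values on keyframes,
# and the computer fills in the frames between them by interpolation.

def tween(keyframes: dict[int, float], frame: int) -> float:
    """Linearly interpolate a value (e.g., an arm's rotation angle)
    at `frame` from the surrounding keyframes."""
    frames = sorted(keyframes)
    if frame <= frames[0]:
        return keyframes[frames[0]]
    if frame >= frames[-1]:
        return keyframes[frames[-1]]
    # Find the pair of keyframes bracketing the requested frame.
    for left, right in zip(frames, frames[1:]):
        if left <= frame <= right:
            t = (frame - left) / (right - left)   # 0.0 .. 1.0 between keys
            return keyframes[left] + t * (keyframes[right] - keyframes[left])

# Keyframes: arm angle is 0 degrees at frame 0, 90 degrees at frame 12.
arm_angle = {0: 0.0, 12: 90.0}
print([round(tween(arm_angle, f), 1) for f in range(0, 13, 3)])
# [0.0, 22.5, 45.0, 67.5, 90.0]
```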
The final stage in the computer animation process is rendering. For 3D animations, rendering takes place after all the modeling and animation are complete. For 2D animations, the illustrated keyframes themselves form the rendered output, with tweened frames rendered as needed.
Rather than painstakingly painting every single frame or moving 3D figures step-by-step, computing power can be used to achieve both much more quickly, which is ideal for animation studios.
This has led computer animation to be regarded as the digital successor to stop-motion and hand-drawn animation, and it's now far and away the most popular animation technique.
The beginning of computer animation
There were some experiments in computer graphics animation during the 1940s and 1950s, the most notable coming from John Whitney in 1958, when he created what is credited as the first-ever computer animation.
Collaborating with legendary graphic designer Saul Bass, Whitney used an analog computer to create the title sequence for Alfred Hitchcock’s Vertigo. Whitney is now considered one of the fathers of computer animation and has earned a spot in animation history.
The real acceleration of modern computer animation came with the advent of digital computers in the 1960s.
Universities such as Ohio State University and the University of Utah established departments to support computer animation, and other institutions, like the National Film Board of Canada, began experimenting with the new discipline. Many shared the goal of building formal programs for teaching computer animation.
One early landmark came from Bell Labs, where Edward E. Zajac created one of the first digital computer animations. It had the catchy title A Two-Gyro Gravity Gradient Attitude Control System and showed a box with edge lines representing a satellite orbiting a sphere representing the Earth.
In 1973, computer animation made it out of the lab and onto the big screen when digital image processing was used in a feature film for the first time.
Director Michael Crichton enlisted John Whitney Jr. (son of John Whitney) and Gary Demos of Information International, Inc. for some of the scenes in Westworld. They digitally processed the footage to appear pixelated, reflecting the android’s point of view in the movie.
Soon after, wireframes appeared in films such as George Lucas’ Star Wars and Ridley Scott’s Alien. Around this time, George Lucas was very interested in pursuing advances in CGI.
In 1979, he took some top talent from the highly respected Computer Graphics Laboratory at the New York Institute of Technology and set up his special effects division.
This division was later spun off as an independent company with funding from Steve Jobs. And what was this new company called? Pixar.
Technological advances continued throughout the 1970s and early 1980s, including the introduction of the framebuffer. By the 1980s, this new technology was pushing digital animation into new places.
Computer animation in the 1980s
Advances in computer power combined with an increase in affordability and new developments in commercial software meant that throughout the 1980s, the quality and prevalence of computer animation and computer-generated imaging kept increasing.
This was the era in which solid 3D CGI was improved and developed to the point where it could be used in a movie for the first time.
Walt Disney’s Tron, released in 1982, is now regarded as a real milestone in the movie industry. Its use of solid 3D CGI was a first for a feature film and represented a giant step forward.
The film’s vehicles and digital terrains were all produced with CGI and showed what could be achieved with the technology. From here on, we see CGI used in more and more movies, right up to the present day.
Morphing, or tweening, also improved dramatically in the 1980s. Up until this point, morphing had mainly been used with vector animations, but by the early 1980s, the technology enabled morphing between photographic images of human actors to create photorealistic animation.
The first public example of this in action came from the New York Institute of Technology in 1982 when, at a conference, Tom Brigham presented a video sequence of a woman morphing into a lynx.
By 1988, morphing had made its way onto the big screen in Ron Howard’s movie Willow. It was also used to great effect in Terminator 2: Judgment Day in 1991. The technique probably reached the peak of its popularity when Michael Jackson used it in his music video for Black or White.
The video premiered simultaneously in 27 countries to reach an audience of 500 million people and brought morphing to the forefront of public consciousness. Computer animation was heading for the big time.
The growth of computer animation
The 1990s was the decade in which computer animation started to take over and become a significant part of the film and TV industry.
The CGI and morphing used in Terminator 2: Judgment Day were regarded as the most substantial use of CGI in a movie since Tron in 1982. The 1990s also saw the first 2D animated movies produced using only the Computer Animation Production System (CAPS).
In 1990, Disney released The Rescuers Down Under, created entirely with CAPS. Disney followed up with Beauty and the Beast in 1991, which was also made with CAPS but took the system further, incorporating 3D CGI effects.
The movie was a huge box office success and became the first animated film nominated for an Oscar for Best Picture.
Throughout the decade, CGI appeared in movies and TV shows such as Jurassic Park, Babylon 5, and The Lion King. In 1995, another huge milestone was reached: Disney-Pixar released the first fully computer-animated feature film, Toy Story.
Introducing motion capture
Another big development in computer animation in the 1990s came from improvements in motion capture, or mocap. In short, motion capture records the movements of people or objects.
For human motion capture, a performer wears a set of markers placed near each joint, and the system tracks the markers’ movement to reconstruct the performance.
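Conceptually, a mocap take is just a list of per-frame marker positions that gets mapped onto a character rig. Here’s a tiny hypothetical Python sketch of that idea; the data layout, names, and the naive position-scaling "retarget" step are placeholders, not how any real mocap system works internally.

```python
# A hypothetical sketch of the mocap idea: each captured frame stores the
# 3D position of one marker per joint, and those positions are mapped onto
# the matching joints of a digital character.

from dataclasses import dataclass

Vec3 = tuple[float, float, float]

@dataclass
class CapturedFrame:
    frame: int
    markers: dict[str, Vec3]  # marker name -> (x, y, z) position

def retarget(take: list[CapturedFrame], scale: float = 1.0) -> list[dict[str, Vec3]]:
    """Map marker positions onto a character rig, frame by frame."""
    poses = []
    for snap in take:
        # A real pipeline would solve joint rotations here; this sketch
        # simply scales marker positions onto the character's skeleton.
        poses.append({joint: (x * scale, y * scale, z * scale)
                      for joint, (x, y, z) in snap.markers.items()})
    return poses

take = [CapturedFrame(0, {"left_elbow": (0.30, 1.20, 0.00)}),
        CapturedFrame(1, {"left_elbow": (0.31, 1.19, 0.02)})]
pose = retarget(take, scale=1.5)[1]
print({joint: tuple(round(v, 3) for v in p) for joint, p in pose.items()})
# {'left_elbow': (0.465, 1.785, 0.03)}
```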
Motion capture’s initial application was as a biomechanics research tool. It was first used for commercial purposes in video game production in the late 1980s before being adopted by the film industry in the late 1990s.
A notable example from this time was the use of motion capture to create the Jar Jar Binks character in Star Wars: Episode I—The Phantom Menace (1999). Many people strongly disliked the character, so perhaps it wasn’t the best showcase for the new computer animation technique.
One of the most significant breakthroughs in motion capture was Andy Serkis’ performance as Gollum in The Lord of the Rings: The Two Towers (2002), the first feature film to use a real-time motion capture system driven by a human actor. The technique translated the nuances of Serkis’ performance into Gollum’s facial animation, giving the CGI character a genuinely human quality.
Computer animation today
Nowadays, computer animation and computer-generated images are the absolute norm in the television, film, and video game industries. Incremental technological improvements have increased CGI's capabilities while making it more accessible.
Many home computers can now produce computer animations that would have previously required giant dedicated rendering machines. This has created new opportunities for individuals and companies to experiment with animation.
The intersection of computer-generated animation and traditional animation techniques has significantly shaped the entertainment industry. Companies like Walt Disney, DreamWorks Animation, and Pixar Animation Studios have mastered blending these methodologies, creating films rich in visual effects and emotional depth.
This fusion has been further enhanced by advances in affective computing, which allow for more nuanced digital characters capable of complex motion and realistic animation.
This expansion has required compatibility across multiple file formats and industry-standard software.
While file size used to be a constraining factor, advances in visualization technology have made it easier to maintain full image quality.
Graphical user interfaces in animation software have also evolved, becoming more user-friendly while offering a wider range of technical controls, from tweaking the frame rate to making intricate facial motion adjustments.
The animated series and movies produced today are nothing short of dynamic images brought to life, with motion techniques ranging from the simple to the sublime.
Whether it's the realistic ambiance created by Industrial Light & Magic or the heartwarming characters from Pixar and Walt Disney, the industry has made great strides in defining what animation can achieve.
The blend of traditional and modern—whether a simple 2D animation or a complex 3D animation—highlights how far the animation world has come and sets the stage for future innovations.
CGI and computer animation have become so popular because they make almost anything possible—the only limit is your imagination.
If you want to learn how to animate and create digital characters, the barrier to entry has never been lower, and the potential has never been greater. You no longer need an animation degree to get started in the field. You can start right now.
Thousands of graphic designers and marketers use the innovative combination of Linearity Curve and Linearity Move to create captivating animations. Thanks to our powerful tools and professional design templates, your creativity is limitless.
Ready to try new and innovative animation design software? Get started for free below, or check out our premium pricing for professionals and teams.
Frequently asked questions
When was computer animation invented?
Digital computer animation emerged in the early 1960s. One of the first known examples was a short film created by Edward E. Zajac in 1961, which simulated the motion of a satellite orbiting Earth. This was followed by other early experiments, including work by Ivan Sutherland, who developed the groundbreaking program Sketchpad in 1963, allowing users to create and manipulate objects interactively on a computer screen.
John Whitney Sr. was a pioneering figure in the development of computer animation. His work in the field dates back to the 1940s and 1950s, when he began experimenting with motion graphics using mechanical systems, such as modified anti-aircraft analog computers.
One of his most significant contributions was using such a machine to create the opening title sequence for Alfred Hitchcock’s Vertigo in 1958.
By the 1960s, Whitney had transitioned to digital methods.
Who invented computer animation?
There are a number of computer animation pioneers. Two significant contributors are Ivan Sutherland and Ken Knowlton. Sutherland developed the Sketchpad software in 1963, one of the first graphical user interfaces, which greatly influenced how computers could create and manipulate images directly on a screen.
Knowlton developed the BEFLIX (Bell Flicks) animation software, which was among the first programming languages designed for producing computer animations.
How does computer animation work?
Computer animation creates a sequence of images or frames and displays them rapidly to create the illusion of motion. This process involves modeling, rigging, texturing, animating, and rendering objects or characters using specialized software.
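If it helps to picture the pipeline, here’s an illustrative Python stub that chains those stages together. Every function is a placeholder standing in for work real animation software performs; none of it reflects an actual package’s API.

```python
# Illustrative stub of the pipeline stages named above; all placeholders.

def model(name):             # modeling: define the object's geometry
    return {"name": name, "mesh": "placeholder geometry"}

def rig(obj):                # rigging: attach a control skeleton
    obj["skeleton"] = ["root", "spine", "left_arm", "right_arm"]
    return obj

def texture(obj):            # texturing: apply surface colors and materials
    obj["material"] = "placeholder material"
    return obj

def animate(obj, frames):    # animating: pose the rig on keyframes
    obj["keyframes"] = {0: "pose A", frames - 1: "pose B"}
    return obj

def render(obj, frames):     # rendering: compute the final image sequence
    return [f"frame {i}: {obj['name']}" for i in range(frames)]

character = animate(texture(rig(model("hero"))), frames=24)
print(render(character, frames=24)[:2])
# ['frame 0: hero', 'frame 1: hero']
```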
What was the first Disney full-length film that used computer-generated animation?
The first full-length film under the Disney banner to use extensive computer-generated animation was Toy Story, released in 1995. Produced in collaboration with Pixar Animation Studios, it was the first feature film made entirely with computer animation and marked a significant milestone in the history of animated films.
While Toy Story was a collaboration with Pixar, the first entirely Disney-produced film that heavily utilized computer-generated animation was Dinosaur, released in 2000. This film featured advanced CGI for the characters set against live-action backgrounds, showcasing Disney's expanding capabilities in computer animation.
How can I make animations on my computer?
There are various software tools available for creating animations on computers. One popular option is Linearity Move. Check out Linearity Move’s video tutorials to better understand how this software works.
What computers work best for animation?
Advanced animation software often requires computers with powerful processors, ample RAM, and dedicated graphics cards. Linearity Move can be used on iPhones, iPads, MacBooks, or desktop iMacs, providing a versatile and accessible platform for animation creation.
Benjamin Barnhart
Ben is the Marketing Manager at Linearity in Berlin, with extensive experience in content writing. He blends his passion for animation and history to develop impactful marketing strategies.