The filmmaking industry has been subject not only to inventions and innovations over the last 50 years, but also to a complete reworking of the way films are made. Long gone are the days when visionaries crafted reality into fantasy through camerawork alone; with technological advancements, filmmakers can now create entire galaxies with computers. Below are some of the most important advancements in the field of filmmaking.
One of the most fundamental additions to the field of cinema is the invention of the Steadicam. First used in 1976 on the film “Bound for Glory,” the Steadicam allows for more precise camera movement in tight areas. Before it was invented, directors had to compromise, either filming shots completely static or enduring a noticeable shake. The smoothness of modern tracking shots is due almost entirely to the Steadicam.
While many are familiar with the trend of nauseating 3D gimmicks in modern weekend blockbusters, 3D actually dates back to the 1920s and has risen to prominence several times in cinema history. The waves of 3D films throughout the decades follow a pattern of 3D becoming popular and then dying out. For instance, one wave occurred in the 1980s with “Jaws 3-D.” A more recent example is 2009’s “Avatar,” which melded 3D with stunning visuals to become the highest-grossing film of all time, kickstarting a new wave of 3D films.
Filmmakers have always strived to deliver the best quality possible in their films, and the IMAX format offers the maximum quality cameras currently allow. While most movies are shot in the standard 35mm format, IMAX is shot on 70mm film, allowing greater image clarity and richer color. When the film negatives are compared side by side, the IMAX frame is noticeably larger. Director Christopher Nolan is known for shooting long stretches of his films in IMAX; those who look closely at “The Dark Knight” may even notice the aspect ratio change when the footage switches from 35mm to 70mm.
Before the era of previsualization, filmmakers relied primarily on storyboards to plan their shots before filming. While storyboards are certainly still used, previsualization produces a low-quality, computer-generated animatic of a scene, allowing shots to be planned in greater detail. It is often used for shots that are heavy on CGI and green screen. One of the first films to plan shots with previsualization was 1989’s “Star Trek V: The Final Frontier”; now, most major blockbusters use it.
Perhaps one of the most subtle, yet drastic, changes to the field of cinema was the invention of the digital camera. First introduced in the 1980s, digital cameras began to replace film cameras because they were more cost-effective and allowed directors to shoot more efficiently. Some claim, however, that digital camera quality doesn’t compare to the standard set by actual film. Most audiences can’t tell the difference, but movies shot on film tend to be slightly crisper than those shot digitally. That being said, the ease of digital filmmaking has swayed most filmmakers.
By far the most influential change to the movie industry was the introduction of computer-generated imagery (CGI). Filmmakers have always used effects to create the impossible, whether matte paintings or optical illusions, but CGI allows them to do so with the click of a button. First introduced in the early 1970s, but not taking off in the mainstream until 1991 with “Terminator 2: Judgment Day,” CGI has become a medium used in almost every major film, whether audiences notice it or not. Even period dramas use CGI to remove unwanted signs of 21st-century life. While some complain about the overuse of CGI in blockbuster action movies, they often overlook the fact that CGI, when used sparingly, can seamlessly blend reality with the impossible.