Over the past few years, some observers have voiced concerns about a possible decline of the art of cinema, which has captivated the world and commanded a wide audience for over a century. These concerns arose with the spread of video recording, which evolved into digital systems by the end of the last century.
The most critical aspect of this concern is the anticipated decline in cinema attendance, as home screens have come to approximate the cinematic experience and television sets have become well suited to such presentations.
There have been, and continue to be, persistent attempts by filmmakers to draw audiences back to cinema halls, beginning with CinemaScope, which was projected on wider, more expansive screens to offer a distinct and more impressive viewing experience. However, this experiment was not sufficiently successful, as viewers had grown accustomed to traditional screen proportions. Subsequently, 3D films emerged, featuring immersive imagery that brought viewers closer to the action, as if positioned at its very center. Nevertheless, even these techniques proved unsuitable for all types of films, much like immersive sound systems.
For decades, since its inception in 1895, the film industry has been closely associated with cinema cameras and celluloid film in various formats, alongside manual editing techniques using the Moviola and a range of traditional methods for visual tricks and effects.
Throughout the history of cinema, cameras have undergone notable developments; however, these changes were not radical innovations. In the late 1920s, a shift occurred in capture speed from 16 frames per second to 24 frames per second, bringing cinematic time closer to natural motion and synchronizing it with spoken sound.
In the 1980s, digital systems emerged, offering images with greater precision, clarity, and color accuracy. Digital cameras became capable of capturing extensive visual information and minute details of reality, achieving clear contrast and dimensionality, as each pixel within the frame recorded digital data with exceptional precision.
Subsequent advancements occurred in storage and display technologies, and at the beginning of the new century, digital cameras evolved further, with digital imagery increasingly rivaling traditional cinematic imagery.
Digital visual effects software has also advanced significantly, allowing extensive image manipulation, including control over actors' ages and physical features. This includes adapting appearances to suit the nature of on-screen characters or the various life stages depicted within a film. Moreover, such software enables the addition or removal of elements (scars, freckles, tattoos, cosmetic alterations, and disfigurements) according to the dramatic requirements.
This process is achieved through comprehensive facial and bodily scanning, enabling highly accurate modifications. This was evident, for example, in the 2019 film (The Irishman), directed by Martin Scorsese.
In the film, veteran actors such as Robert De Niro and Al Pacino appeared at an advanced age, while digital imagery restored them to their younger years on screen. It cannot yet be said that this technique has reached its full potential, as these technologies continue to astonish with ongoing advancements daily, even hourly.
With innovations in digital production technology, post-production stages such as editing and mixing, which once occurred only after the completion of filming, can now be performed automatically, generating a complete film file at the moment footage is captured.
The scope of the digital revolution in cinema has not been limited to technological dimensions alone; it has extended decisively into artistic and creative practices. Digital filmmaking technology has contributed to a deeper understanding of cinematic realism by facilitating the capture of real-life scenes through small, portable cameras.
Indeed, most American and non-American films are now shot digitally because of this convenience, and the use of film cameras has become the exception rather than the norm. Yet exceptions persist: Linus Sandgren, the cinematographer of the Oscar-winning film (La La Land, 2016), explained that the last thing director Damien Chazelle wanted was for the film to look merely natural; his first concern, discussed with his cinematographer, was how to give the film an aesthetic reminiscent of the classic musicals of the 1950s and 1960s. The choice was clear: shooting with a film camera.
In 2017, three films nominated for the Academy Award for Best Cinematography were all shot using traditional film cameras: (Silence), directed by Martin Scorsese; (Fences), directed by Denzel Washington; and (La La Land), directed by Damien Chazelle.
That same year, more than twenty films were produced in the United States using film technology. The decision to shoot on film was driven by purely artistic considerations, such as film’s capacity to convey facial expressions with heightened sensitivity and to render colors with a sharpness unmatched by digital technology. In addition, certain lenses retain advantages in film cinematography that digital formats do not replicate.
Conversely, in documentary filmmaking and docudramas, digital techniques have contributed substantially, both in terms of capturing live-action footage with greater mobility using lightweight cameras in populated or rugged locations, and in the limitless storage and retrieval of archival materials. Furthermore, digital methods allow for image correction and the seamless integration of footage produced using different techniques and across different historical periods.
Due to its compact size, affordability, portability, and adaptability to varied environments, the digital camera is particularly well suited to such filming. Its ability to record and store high-quality images across expansive storage capacities surpasses that of other camera types. These combined factors make digital cameras a decisive element in contemporary cinematic competitiveness.
In its most basic definition, cinematic realism, as articulated by André Bazin, refers to the “complete and total evocation of reality,” encompassing the space between lived reality and its filmed representation.
Through the use of digital cameras, filmmakers are able to access difficult terrain and capture footage discreetly in high-risk locations, without the dangers associated with large, heavy cinema cameras. In many instances, this has fostered the emergence of the integrated artist: the director who simultaneously undertakes filming and editing.
Given the current state of digital imagery (its color depth, resolution approaching that of film, visual smoothness, and depth of blur), it has become increasingly difficult for non-specialists, and even some specialists, to distinguish it from traditional cinematic imagery.
As a result, digital cameras have largely replaced film cameras in contemporary production, compelling cinemas to adapt their projection systems accordingly. This shift has enabled both public acceptance and the commercial revival necessary for the industry’s sustainability.
The widespread success of digitally shot films underscores the necessity of adapting to technological advancement; resistance to artistic and technological change has historically resulted in marginalization. Digital cinema has thus become an undeniable reality, at least until the emergence of new technologies offering even greater potential for achieving cinematic imagery under enhanced conditions and broader possibilities.
