The relationship between ancient theatre and video games is stronger than you may think — technology has always enhanced our ability to tell stories.
The main form of human entertainment was, for the longest time, theatre. Ancient Greek and later post-Renaissance theatre was a relatively static affair: actors, sets, and dialogue. The medium’s advances were more often narrative than technological, as with Sophocles’ introduction of the third actor, which enabled three-person dialogues. Unimpressive today, his innovation was considered genius in its time. Shakespeare, in contrast, later championed the English of his day as beautiful, flexible, and idiomatic, despite it being largely a Frankenstein-like mash-up of Anglo-Saxon, Old Norse and Norman grammar, syntax and vocabulary. Shakespeare’s genius lay in honing previously established narrative techniques rather than innovating technologically.
Yet it was the Industrial Revolution that truly transformed the medium. It brought impressive technological advancements, such as Dion Boucicault’s introduction of theatre fireproofing, which enabled the use of on-stage flames; panoramata, which created illusions of movement on stage (and convenient ways of showing a change of scene); and more obviously, modern stage lighting, for which Thomas Edison’s incandescent lamp proved essential. Thanks to technology, theatre evolved rapidly, until finally developing into another medium entirely: film.
It was photographer Eadweard Muybridge and entrepreneur Leland Stanford’s novel experiment in 1878 that had the largest effect on theatre’s transition to film. Attempting to determine whether a galloping horse’s feet are at any point all simultaneously off the ground, the pair set up 24 cameras at Stanford’s racecourse, rigged to trigger via an elaborate series of attached strings. As the galloping horse broke each string in turn, the cameras fired one after another, and the resulting photographs showed whether a horse’s gait did involve all hooves leaving the ground at once. The experiment demonstrated as much – yet more importantly, Muybridge later projected these images onto a screen in succession as part of a presentation at the California School of Fine Arts in 1880, and the first motion picture was born.
Once filmmakers refined motion pictures with inventions such as movie cameras (including Edison’s early kinetoscope), theatrical performances were translated to the big screen — albeit slowly — and motion pictures began to usurp live theatre’s position in popular entertainment. Fundamentally, film (as with theatre before it) is an illusion predicated on technology. Technology helps us suspend our disbelief so that narratives can take hold, as it did for Sophocles in Ancient Greece and Shakespeare in the Renaissance. In time, this was aided by progressively more advanced techniques and developments. As theatre was aided by fireproofing and lighting, motion pictures benefited from more sophisticated modes of production: in cinematography, technique, direction, editing and more.
Film’s strength lay in the malleability its technology afforded: its stories could shift scene easily, and its audio could be compressed and made clearly audible, where theatre before it had struggled. The ability to create drama as an isolated, carefully crafted product — produced and edited to the point of (directorial) perfection, then marketed and sold — set it apart from theatre and remains one of the medium’s defining strengths. The technology behind theatre is assistive rather than innovative, its spontaneous nature always lending itself to degrees of error.
Motion pictures increasingly emulated theatre’s depiction of drama through the eras of silent film; sound film, such as the talkies of Classic Hollywood; and on to New Hollywood and the modern 'blockbuster.' And this isn’t even accounting for non-European and western European traditions of film, which have their own unique evolution and consequent character. The realism of theatre was slowly usurped by film’s acceptance of (and fundamental entrenchment in) technology; film reached a point where it portrayed believable human experience and stories better than theatre could.
So cinema’s capacity to suspend our disbelief in onscreen events has always been nested in its readiness to innovate. Whether through computer-generated imagery (CGI) that creates realistic science fiction settings, or the serialism seen in film-on-television, cinema carries theatre’s torch in portraying human drama. Today, its pacing, content and techniques have largely been formalised, with films — especially in the case of the blockbuster — becoming predictable. Setting aside 3D and 4D’s general failure to win popular cinematic acceptance, the technology behind the production of film has essentially stayed the same since the introduction of CGI.
It’s interesting, then, to see video games — while fully embracing the benefits CGI affords — so often hearken back to the immediacy and interactivity of theatre. Where observers are detached non-agents in cinema, video games put one in the inverse position: players are active members of the experience, affecting what happens on 'screen' — as was the case with the more interactive theatre performances. In theatre, the actors of the drama can respond to viewers; video games go further, immersing participants as actors themselves rather than passive observers — as if on stage or in a film. Not only can gamers play a first-hand role in the events taking place, but more recent games allow players to design their own characters; they can place a likeness of themselves into the story, enhancing their role within the narrative.
With the increasing complexity of computers, video game developers create simulated worlds, dramas and narratives (as in RPGs, dating sims, and ‘adventure’ games, respectively) for participants to live out, rather than absorb — as in theatre and film. While initially primitive, video games have evolved to a point of near photo-realism, with interactive environments for players to explore and create. In essence, video games cinematize theatre’s (fledgling) grasp of agency, much as cinema developed upon theatre.
But where will technology take video games from here? Video game developers and technicians are already toying with virtual and augmented reality (dubbed VR and AR, respectively). These technologies strive for a hyper-immersion yet unseen in video games — much akin to film’s immersive leaps in cinematography and editing. Often taking the form of goggles, VR/AR technology aims to remove the distractions between the player and the game, deeming the 10-foot distance between the gamer and their television ‘too far’. Rather than merely participating in narratives, gamers will now feel the experience — similar to film’s original goal of removing the distractions that come with theatre’s spontaneity.
But what will this mean for gaming? Video game developers have been pushing the boundaries of immersion for years; VR and AR gaming could be the logical result of that push. Rather than the player being conscious that they are playing a video game, VR/AR lets players control in-game avatars with their own movements, dissolving the barrier of disbelief — a barrier that theatre (and film after it) has been attempting to overcome for so long. By placing audiences truly within the drama, these narratives, dramas and comedies can be experienced, rather than observed, to a degree unseen in theatrical history.
It remains to be seen whether VR and AR will prove formative for a future medium (as theatre and film were in the past), but given video games’ constant push for hyper-immersion, it certainly seems a logical progression. And as Pixar’s John Lasseter observed, art and technology are cyclical: inevitably ‘the art challenges the technology, and the technology inspires the art.’
Edited by Jessica Herrington and Sara Nyhuis