Summary: A new movie adapts and changes its story based on the viewer’s emotional response.
Source: The Conversation
Most films offer exactly the same viewing experience. You sit down, the film starts, the plot unfolds and you follow what’s happening on screen until the story concludes. It’s a linear experience. My new film, Before We Disappear – about a pair of climate activists who seek revenge on corporate perpetrators of global warming – seeks to alter that viewing experience.
What makes my film different is that it adapts the story to fit the viewer’s emotional response. Through the use of a computer camera and software, the film effectively watches the audience as they view footage of climate disasters. Viewers are implicitly asked to choose a side.
I chose to use this technology to make a film about the climate crisis to get people to really think about what they are willing to sacrifice for a survivable future.
Storytelling has always been interactive: traditional oral storytellers would interact and respond to their listeners. For almost a century, film directors have been experimenting with interactivity – the past decade has seen an explosion of interactive content.
Streaming services give viewers the opportunity to choose their own adventure. However, letting the viewer control the action has long posed a challenge: it’s at odds with narrative immersion, where the viewer is drawn into the world created by the story.
One of the most prominent recent experiments in interactive film, Netflix’s Bandersnatch, clearly illustrates this. Here the action stops to ask the user what to do next – breaking the flow of the story and actively involving the viewer. Solving this issue of breaking the immersive experience remains a key question for artists exploring interactive film.
Other filmmakers have experimented with brain-computer interfaces (BCIs), which read viewers’ brain activity as they watch. Using this data from the brain, audiences create a non-conscious edit of the film in real time – reinforcing the films’ respective stories of science-fiction dystopia and a wandering, daydreaming mind.
However, BCIs require specialised equipment. For Before We Disappear, I wanted to use a technology more readily available to audiences, one that could allow films to be shared over the internet.
Controlling the narrative
Before We Disappear uses an ordinary computer camera to read emotional cues and instruct the real-time edit of the film. To make this work, we needed a good understanding of how people react to films.
We ran several studies exploring the emotions filmmakers intend to evoke and how viewers visually express emotion when watching. Using computer vision and machine learning techniques from our partner BlueSkeye AI, we analysed viewers’ facial emotions and reactions to film clips, and developed several algorithms that use that data to control a narrative.
While we observed that audiences tend not to emote extensively when watching a film, BlueSkeye’s face and emotion analysis tools are sensitive enough to pick up the small variations and emotional cues needed to adapt the film to viewer reactions.
The analysis software measures facial muscle movement along with the strength of emotional arousal – essentially how emotional a viewer feels in a particular moment. The software also evaluates the positivity or negativity of the emotion – something we call “valence”.
We are experimenting with various algorithms in which this arousal and valence data contributes to real-time edit decisions, causing the story to reconfigure itself. The first scene acts as a baseline against which the next scene is measured. Depending on the response, the narrative becomes one of around 500 possible edits. In Before We Disappear, I use a non-linear narrative which offers the audience different endings and emotional journeys.
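To make the idea concrete, here is a minimal sketch of how per-scene arousal and valence readings might drive a branching decision against a baseline. This is an illustrative assumption, not the film’s actual algorithm: the thresholds, branch names and two-reading comparison are all invented for the example.

```python
# Illustrative sketch only - thresholds and branch names are hypothetical,
# not the algorithm used in Before We Disappear.
from dataclasses import dataclass


@dataclass
class EmotionReading:
    arousal: float  # 0.0 (calm) .. 1.0 (highly aroused)
    valence: float  # -1.0 (negative) .. +1.0 (positive)


def choose_next_scene(baseline: EmotionReading,
                      current: EmotionReading) -> str:
    """Pick the next branch by comparing the current reading to a
    baseline captured while the viewer watched the opening scene."""
    arousal_shift = current.arousal - baseline.arousal
    if arousal_shift > 0.2:  # viewer noticeably more stirred up
        # Negative feelings steer to one branch, positive to another.
        return "confrontation" if current.valence < 0 else "rally"
    if arousal_shift < -0.2:  # viewer disengaging
        return "escalation"   # raise the stakes
    return "reflection"       # steady response: quieter branch


baseline = EmotionReading(arousal=0.3, valence=0.1)
print(choose_next_scene(baseline, EmotionReading(arousal=0.6, valence=-0.4)))
```

Chaining such a decision after every scene is what multiplies a handful of branch points into hundreds of possible edits of the finished film.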
I see interactive technology as a way of expanding the filmmaker’s toolkit – allowing a film to adapt to an individual viewer, and challenging the director’s traditional control over the story by distributing some of it to the audience.
However, emotional responses could be misused or have unforeseen consequences. It is not hard to imagine an online system showing only content eliciting positive emotions from the user. This could be used to create an echo chamber – where people only see content that matches the preferences they already have.
Our research aims to generate conversation about how users’ emotion data can be used responsibly and with informed consent, while allowing users to control their own personal information. In our system, the data is analysed on the user’s device, rather than, say, in the cloud.
Big business, big responsibility
Non-conscious interaction is big business. Platforms such as TikTok and YouTube use analysis of users’ past interactions on the platforms to influence the new content they see there. Users are not always aware of what personal information is being created or stored, nor can they influence what algorithms will present to them next.
It’s important to create a system where audiences’ data is not stored. Video of the viewer and facial expression data should not be uploaded or analysed anywhere but on the player device. We plan to release the film as an interactive app designed with the potential abuse of user data in mind, safeguarding any personal data on the device used to watch it.
Adaptive films offer an alternative to traditional “choose-your-own-adventure” storytelling. When the story changes based on the audience’s unconscious responses rather than intentional interaction, their focus can be kept on the story.
This means they can enjoy a more personalised experience of the film. It turns out the old traditions of storytelling may still have much to teach us in the 21st century.