Hinton Magazine

Q&A: Autopilot

Autopilot is a play that explores the human right to make mistakes, through the vehicle of a modern love story and many of the insecurities that come with it. The work of Olivier-nominated, IdeasTap Underbelly Award-winning and two-time UK national poetry slam champion writer Ben Norris, Autopilot is directed by Sean Linnen, a former Associate Director at Paines Plough and Resident Assistant Director at the Donmar Warehouse.

We asked Ben Norris to share with us some thoughts about the play.

How does your background in slam influence your playwriting?

If anything, I've actually had to work against my poetry and spoken-word background when writing theatre! Poetry often centres on feelings, pausing to explore a single moment or experience; even in the most active poems, there is a necessary stillness, an interiority, and - more often than not - a singular voice. Whereas the best drama comes from action and conflict, from multiple points of view, and from people not exploring feelings and not telling you all about them! Obviously, there are exceptions, but this is certainly true in the case of Autopilot. I'm forever having to fight the urge to write huge poetic monologues that, although they might have merit sometimes (it's not for me to say), don't help the drama move along. Each edit of Autopilot was a process of making the language less and less poetic so that the play itself could become more and more so. The poetry, if you will, happens in between what is said.

Would you ever trust a car running on autopilot?

This is the big question! The question that was the seed for writing this play. In theory, yes, because by the time they are rolled out more widely, they will be statistically far safer than a human-operated car. But there is something unnerving about it, isn't there? Even if we're in greater danger, I think we like to be in control of the danger.

Do you think we prefer human mistakes to artificial intelligence ones?

I certainly think it makes them easier to deal with, because there is someone to blame. We can investigate why a particular mistake happened, whether through lack of skill or loss of concentration or whatever, and it gives us clearer recourse for justice. When a machine has made an error, it's harder to know how to deal with that, legally and emotionally speaking. Who is to blame then? Where do we direct our anger or our hurt? If the person who owns or made the machine gets to keep the profit generated by the machine, then surely they also have to shoulder responsibility for anything the machine does wrong. But it's a very slippery question, and that's why I was interested in making a drama about it. We are all the products of learned behaviour, so does that mean individuals are always to blame for everything, or society at large?

The play is about a love story. Do you think machines will ever fall in love?

That's a good question. One of my favourite films is Her, in which Joaquin Phoenix's character falls in love with an AI personal assistant called Samantha, voiced by Scarlett Johansson, and the two have a relationship. But does she ever really love him back, or is she just doing the best possible job of facilitating his needs, taking the role of personal assistant to its limits? She seems to have human traits, like a sense of humour and desire, but they will always be derived from human behaviour. It will always be a machine doing an impression of humanity. So, I think machines may one day be able to give the impression that they are in love, with humans or with each other, but they won't actually be feeling anything. It will still all be 1s and 0s. (Though if I'm wrong and one day a machine is reading this back and feeling hurt at my lack of faith in their emotional capacity, then I'm sorry. I love you too. Please don't leave me.)

Did you use any artificial intelligence helpers to write the play?

Not the play itself, but at one point one of the characters 'writes' a poem for the other, which turns out to have been written by a bot... and it was actually written by a bot.

Do you think human relationships can be better understood by following the problems that arise during the development of AI?

I think we can learn a lot from the way that machines learn, that's for sure. Self-driving cars, for example, not only learn from their own past mistakes, continually improving their decision-making and deepening their understanding of the world around them, but also feed all of their experiences into a shared neural network, so they can learn from each other's mistakes too. At its best, it's a kind of experiential commons, which enriches the whole 'population', not just an 'individual'. (Although of course in reality most companies will keep their designs and discoveries private, their neural networks exclusive to their own brand of car, because: capitalism.)

Autopilot, Pleasance Courtyard (Forth), 1.35pm, 3-29 August (not 15)
