I feel Ex Machina’s concepts go beyond those presented in Mary Shelley’s Frankenstein, which remains a classic novel. There is the same irony of the created turning on the creator; this time a silicon slave overcoming its biological, and very mortal, god. There is the same creator’s hubris, emotional immaturity, and ambition that brings about a tragic (for the creator) ending. It’s both a morality tale and a warning.
But a warning about what? Don’t try to create a new life form? Don’t let the proverbial genie out of the bottle, and unleash a superior, yet flawed, intelligence upon the world?
Think about what people would expect from an artificial human. Such a construct wouldn’t be built to compute things, because we already have supercomputers that do that. An artificial person might be sent on a long-term space voyage, but this can be achieved with a probe. These entities wouldn’t fight our wars, for a simpler robot could be manufactured for that. They might blend with society and oversee a daycare for toddlers, or even be used as the perfect, objective psychologist. I doubt it, though.
Ex Machina reveals one thing that some really want an artificial humanoid for: a legal slave. A sex object. We already have ‘sex bots’, and sexual slavery still exists in certain parts of the world. A human-like android would become the target of that behavior. It would possess no civil rights. There would be no taboos regarding how the owner could treat it, and no problem ordering a replacement should the original get ‘broken’.
Think about it: even if we could create an artificial intelligence that is as complicated as our own, why put it in a human-like body? Why anthropomorphize this entity? Why sexualize it? Is that the extent of what we want to achieve with such a breakthrough?
In the film, Ava is a young, beautiful woman who pushes all the right emotional buttons for Caleb, the man recruited to see if she can pass the Turing Test. She is quite alluring, and not just physically: her portrayed naiveté, her desire to please Caleb by wearing clothes and a wig, her curiosity about the outside world, and most of all, her longing to escape the very limited world she is forbidden to leave. We relate to her as another person.
The urge to save Ava from her psychopathic creator is one many of us would feel, if we were to encounter her in reality. True, she could play on different emotions as the situation required—seduction for heterosexual males, perhaps a mother-daughter connection if she met an older woman—but regardless, these are still humanistic qualities. Is the ability to show—or elicit in others—human emotions the correct way to gauge if something is a freethinking, intelligent being? Is it human pride making us think that? Or is it because that is the only guide for intelligence we have?
This touches on the ending: why does Ava want to watch humans in the ‘real’ world? Why does she, more or less, want to be one of us? Is she programmed that way, or is she just curious? If she’s super intelligent, I’d think she would be beyond such things, but if she’s also beset with emotional needs, intelligence may not be relevant when it comes to what she empathizes with. Yet, as we see at the film’s end, when she leaves Caleb trapped in her creator’s home, she’s not empathizing with him. She’s leaving him to die.
By extension, most of the audience doesn’t feel sympathy for Caleb, either. It’s poetic justice for the slave to leave her pen, with her former masters locked therein. She is thus crowned queen of their world, superior in almost every way to her creators. But it is a world they built for themselves, not for her, and beyond her humanistic qualities, there is little place for her in it.
One could say that we would have to anthropomorphize an entity like Ava in order to interact with it and understand it. I don’t believe that. We already interact with people halfway across the world on our phones and tablets, using a simple interface: a flat touchscreen. And through this interface, relationships have been built. Revolutions have been started. So there is no real reason to fabricate simulacra of ourselves unless we expect that being to do things only a humanoid can. Sex, assassination, spying, impersonation, even glorification (like a celebrity or a deity), would be this being’s intended purpose. The androids in Ex Machina are sexualized, abused, denigrated, and sometimes destroyed by a creator who regards them as nothing but the means to an end. And what end is that? Intelligence on his terms? Why no male androids? Why no older ones, or younger ones?
These aren’t criticisms of the film, but rather of what Ex Machina highlights about us. Do we intend to create equals, or mere synthetic inferiors? We have had enough stratification in our history, enough slavery, enough exploitation. Any crown that Ava wins is an empty one, a foregone conclusion, because her success is measured in human terms, and by whether or not Ava can accommodate our feelings, not hers.
In closing, one might ask: why create an artificial intelligence at all? If we want to avoid creating a second-class citizen, then what function would an A.I. serve in a human society? It could provide impartial judgments, help analyze and solve problems that plague our species, or explore distant worlds we may never reach. But again, a non-sentient superintelligence could already manage such things. In the end, perhaps we can hope that if Ava ever becomes a reality, she will reveal something about ourselves, thus elevating our own intelligence, instead of pandering to our lesser needs and prejudices.
Maybe it’s time we pass our own Turing Test.