Westworld and the Harm Principle
HBO’s Westworld is a new sci-fi series currently on its fourth episode. Based on the 1973 film by Michael Crichton, the show presents a world where the advancement of technology leads to the invention of a “theme park” populated by androids. The theme of the park is ironically retrospective: The Wild West.
Human tourists are transported to the park, where they interact with robot-people programmed to behave and “think” as if they were living during the era of American westward expansion. The tourist experience is depicted as combining the immoral indulgence of Grand Theft Auto with the palpability of a camping trip.
Westworld is designed to be the perfect blend of degeneracy, combining frontier-style excess with modern attention spans. And the results are straight up hideous: violence for sport mixed with robotic brothels and other stuff even HBO probably hesitated to air. Yet the immorality of Westworld is contained and, technically, causes no direct harm to people. By analogy, video games can be distasteful, but they are certainly not criminal. Why should that change with the invention of a game that allows for first-person indulgence without the buffer of a screen?
And it is not so farfetched. Just think about how realistic interactive gaming has become: within a matter of years, sports games went from being no more detailed than foosball to being detailed enough to show individual drops of sweat. It might not be long before there really is some anachronistic theme park in Silicon Valley where you can spend your long weekends as if you were going fishing. That time will likely come before we have had the chance to evaluate whether we really want such a thing to exist, or even whether there are any grounds to prohibit it.
Personally, I do not think an argument based purely on morality would be successful. The existence of slaughterhouses proves that consumer demand is valued more than the immorality inherent in a product, even when that product does cause harm in some circumstances. One way to justify a prohibition is to argue that the tourists will bring the moral corruption back with them to the real world.
That is an appealing option, but moral corruption is not something a customs officer can catch at the border. At best, any proof of harm coming back would arrive in hindsight, like the research on the long-term effects of smoking cigarettes, and by the time the harm is proven, the practice is already intertwined with our society.
Maybe we could bite the bullet and argue that such a theme park should be banned because the androids are treated unfairly. This would require an argument that the rights of robots do in fact exist – one of the major themes that Westworld explores.
Now, in 2016 it is generally accepted that Hologram 2Pac is not ready to go on tour and that you can drop your phone without feeling bad about it. But perhaps we will soon live in a world where innovation gives birth to something more: 3DPac will be able to write his own songs, and Siri will be able to meaningfully accept an apology.
This idea of attributing humanity to artificial intelligence is not new, but in recent years the exponential progress in technology has both intensified the sci-fi experience and made its questions hit a lot closer to home. We have seen the Turing test passed in films like I, Robot, Her, and Ex Machina, but Westworld has the advantage of being a series, which allows viewers to become more attached and empathetic to the android characters with each episode.
If you are skeptical, I guarantee that Westworld will provide you with a vivid case for why we should look at the line separating people from machines as a dotted one.
Shanil Patel (1L) is Staff Writer for Juris Diction.