Fear and Loathing in Virtual Reality

Jeffrey Pomerantz
Apr 2, 2019

The filmmaker Chris Milk gave a now-famous TED Talk titled "How virtual reality can create the ultimate empathy machine." In this talk, he argues that "film is this incredible medium that allows us to feel empathy for people that are very different than us and worlds completely foreign from our own"… a statement that anyone who has ever cried at a movie would probably agree with. And if film is capable of producing such empathy, VR is capable of producing more. Milk concludes by saying that VR "can change people's perception of each other." A great deal of VR development is based on this premise, including such projects as Project Empathy and The Machine to Be Another.

As with any claim this bold, however, there has inevitably been backlash. Authors have written that it is ridiculous and even delusional to expect VR to produce empathy, and worse, that VR is an exercise in voyeurism and appropriates other people’s experiences. At the very least, VR is a new medium, developers and directors are still learning how to tell stories in it, and it is therefore inappropriate to use the frames of older media to conceptualize what VR is capable of.

Still, Milk is correct about at least one thing: in VR "you feel present in the world that you're inside and you feel present with the people that you're inside of it with." VR produces physical reactions similar, and sometimes identical, to those produced by comparable experiences outside a simulation: people in VR may experience vertigo, adrenaline spikes, and motion sickness. This is in part why VR is effective for exposure therapy for phobias: being shown a simulation of, for example, spiders, while practicing relaxation techniques, is as effective at reducing the physical reaction to one's phobia as being shown real spiders. Even if VR cannot produce empathy, it can certainly produce physical reactions.

Exposure therapy is a well-established and empirically sound methodology for treating anxiety disorders. If it is possible to use VR as a mechanism for fear extinction, then surely it is also possible to use VR as a mechanism for fear creation.

It is probably only a matter of time until someone creates a VR simulation intended not to generate empathy, but the opposite. Clouds Over Sidra was created by Chris Milk and Gabo Arora in collaboration with the United Nations, and is the story of a 12-year-old Syrian girl living in the Za'atari Refugee Camp in Jordan. The film has been used to advance the UN's advocacy for the Syrian crisis, and to showcase the need to support children's education in crisis situations. Carne y Arena is a VR installation created by director Alejandro G. Iñárritu; it enables users to experience "a fragment" of Mexican and Central American refugees' experiences at the U.S.-Mexico border. These simulations were developed specifically to foster empathy for refugees. What will the VR community do when a white supremacist VR developer creates an anti-Clouds Over Sidra, an anti-Carne y Arena, a simulation designed not to generate empathy for refugees and immigrants, but fear and loathing of them?

If such a simulation doesn't already exist, that is. I would bet good money that it does; I'm just unwilling to spend time on the parts of the internet where I would have to go looking for it.

One obvious response to the question of “What will the VR community do when…” is to pre-empt it by suggesting that the VR community needs a code of ethics for building simulations. The Association for Computing Machinery already has a Code of Ethics, the very first principle of which is that a computing professional should “contribute to society and to human well-being,” and that a professional should “take action not to discriminate.” Both the National Education Association and the American Association of University Professors have codes of ethics. Surely the development of VR simulations is covered by these existing codes?

Perhaps. But none of these codes of ethics has enforcement teeth, unlike those of, for example, the American Medical Association or the American Bar Association. Someone who violates the AMA or ABA code of ethics can have their license suspended or be disbarred. A software developer who violates the ACM Code of Ethics or an instructor who violates the AAUP's Statement on Professional Ethics may lose their current job, but no mechanism exists to prevent them from practicing their chosen profession in the future.

But that hardly matters. Regardless of whether you think it's a good idea for software development or teaching to have such a permanent removal mechanism, it's a moot point in the context of VR. There are a great many independent game developers out there, and it is likewise perfectly possible for VR simulations to be developed independently. Not only is there no equivalent of the Bar Association that a software developer can be kicked out of, an indie developer doesn't necessarily even have a corporate software job to lose as a consequence of a professional ethics violation.

What is to be done? There would seem to be no way to prevent ethically questionable or downright offensive VR simulations from being created. There would seem not to be even the possibility of imposing consequences on those who create such things.

While it may not be possible to impose professional consequences, however, it is possible to impose social consequences. Apple, Facebook, Google, and Spotify removed conspiracy theorist Alex Jones' podcasts from their various platforms for violating their hate speech policies. These companies, like all corporations, have terms of service, and may therefore decide that someone has violated them. While removing these podcasts does not entirely stop them from being disseminated, it certainly raises roadblocks to their dissemination, limiting their reach and hopefully also their impact.

Similarly, VR simulations, like most software, are generally distributed via platforms, and the organizations behind those platforms also have terms of service. GitHub is the largest hosting platform for source code in the world, and hosts software development kits for VR, if not actual simulations, while sites such as Sketchfab and Google Poly host 3D models and entire environments that may be used in VR simulations. GitHub has a set of Acceptable Use policies in its terms of service, which includes the agreement not to "upload, post, host, or transmit any content that… is discriminatory or abusive toward any individual or group." Sketchfab's terms of service similarly state that users will not "transmit any material or content that is pornographic, threatening, harassing, libelous, hate-oriented, harmful, defamatory, racist, xenophobic, or illegal."

A VR simulation such as the anti-Clouds Over Sidra hypothesized above would violate GitHub's and Sketchfab's terms of service, and could not be hosted there. Other hosting platforms surely have similar terms of service in place. Increasing the friction of disseminating ethically questionable or downright offensive VR simulations (or any software) reduces their availability, limiting their reach and impact. Though if our hypothetical white supremacist VR developer wants to purchase and administer his own servers, and disseminate his own content from those servers, there is probably little that could be done about that.

In this, VR simulations are no different from any other software, or indeed any information product. With the advent of the web and relatively inexpensive computing, anyone and everyone has the means of production. It is not possible to prevent the production of any information product. Rather, the burden falls on the user: Caveat emptor.

The information professions came to this realization a long time ago. The American Library Association’s Information Literacy Competency Standards frames the critical evaluation of information and its sources as essential in “the contemporary environment of rapid technological change.” Media studies scholars have articulated the importance of media literacy, and a whole suite of new skills that should not only be taught in the classroom, but acquired by everyone in order to be a full and active participant in modern culture; central to these new skills is Judgment, “the ability to evaluate the reliability and credibility of different information sources.”

On the one hand, a VR simulation is, like any piece of software, a piece of media content. On the other hand, VR is an experience, and therefore so much more than simply an information source. Emory Craig has spoken in many venues about digital literacy, and asks “What is literacy when we no longer analyze data on the homeless but stand next to them? … What is literacy when we no longer read a study on the cattle industry but stand inside a slaughterhouse… and gaze at the blood flowing past our shoes?” Evaluation is an important piece of both information literacy and media literacy, and evaluation is even more important for digital literacy: it’s one thing to critically evaluate an information source like a book or a video or other discrete piece of our experience, but we’re not used to thinking critically about the totality of our experience. Seeing is believing, right?

Not any more. Of course, no one is going to be put into a VR simulation without their knowledge. This isn’t The Matrix. And of course simulations aren’t that realistic yet anyway. But VR feels real enough when you’re in it. We’re not used to walking down the street and having to think about the assumptions and systematic biases that are built into what we’re experiencing. But that’s exactly what we have to do in a VR simulation, because that simulation was built by someone, and that street and the experiences you have on it tell a story. Yes, everyone is the hero of their own story, but that expression isn’t usually meant quite so literally.

Of course, some of us more than others are used to having to think about the assumptions and systematic biases built into our everyday experiences. I write this as a fully-abled neurotypical cis straight white man. Of course I’m not used to thinking about the assumptions and systematic biases built into the world I inhabit; all those assumptions and systematic biases favor people like me, and thus become largely invisible to people like me. Others are not so socially favored, and therefore, I would assume, have spent more time thinking about the systematic biases that are built into the worlds they inhabit.

Even the most ardent social constructionist, however, is only going to be used to thinking about the assumptions and systematic biases that exist in the physical world. The assumptions and systematic biases that exist in a simulation, on the other hand, may be subtly or extremely different from those in the physical world. No matter how woke one is in the physical world, it will take practice to develop the skills to critically evaluate simulations.

Which brings us back to digital literacy. The skills required for the critical evaluation of simulations are useful not only in that context, however, but also in the context of the physical world. Instead of digital literacy, perhaps we need a new term for this set of skills. Social literacy? Virtuality literacy? Reality literacy?

Or perhaps this is just good old-fashioned critical thinking.

The set of critical thinking skills — digital literacy skills — necessary to evaluate simulations is, if not identical to, then at least similar to the set of critical thinking skills necessary to identify the assumptions and systematic biases in the physical world. One of the most important pedagogical uses of VR is for learners to gain practice, through repetition, in skills that are challenging to practice in the physical world. And while the physical world certainly provides plenty of opportunities to identify systematic biases, doing so can still be a challenge. Practice is helpful. Providing learners with opportunities to practice this type of critical thinking and critical evaluation could be one of the most valuable uses of VR.

VR may not be the ultimate empathy machine. But it may be able to provide practice in identifying systematic bias, power structures, and the causes and motivations that led to those being in place. A simulation of a refugee’s experience in a camp may or may not be an appropriation of another individual’s experiences. But if that experience leads the user to a critical re-evaluation of the social forces that led to the existence of that camp in the first place, perhaps it is worth it? Using VR as a machine to be another may or may not be voyeurism; a simulation designed to explore power dynamics could provide the user with the experience of having or lacking various types of agency.

VR may or may not be the ultimate empathy machine. It may, however, be one mechanism by which we can move towards a world in which we don’t need an empathy machine in the first place.



Jeffrey Pomerantz

Information scientist & professor. Founder of Proximal, maker of educational VR. Author of the MIT Press books Metadata and the forthcoming Standards.