If you haven’t seen Black Mirror, well, I’m not sure why you’re here. If you have, you know it is arguably one of the most important and thought-provoking shows of our era. 25YL is proud to feature analyses of each and every episode. Here, Caemeron Crain digs into Black Mirror S2E4, “White Christmas.”
“White Christmas” deploys an interesting structure. It is composed of three disparate stories, which are then tied together into a fourth, overarching story in a way that amounts to something of a twist. It works seamlessly: each story provides information about the technology relevant to the whole, and the story-within-a-story structure resonates with the technological aspect that is ultimately central. Plus, it is a trinity: three-in-one. Merry Christmas!
There are three technological aspects of key importance to the episode: 1) Zed-eyes, which would seem to connect one’s experience of “real life” to the internet in a thoroughgoing way; 2) “cookies,” which copy all aspects of the self, to be downloaded into egg-shaped interfaces; and 3) the ability to block other people, such that they become blurry and garbled, which, fair enough, is really just an aspect of the zed-eyes.
Certain connections can be drawn between these pieces of tech, and what Black Mirror has presented prior to this, if one wants to ponder the question as to whether all of the episodes exist in the same universe, at different points in time: e.g., the grain technology in “The Entire History of You” or the AI in “Be Right Back”. Are we perhaps seeing a future further down the line from those episodes, or would the zed-eye technology have to preexist the grains? Is it perhaps also involved in the screens knowing whether Bing was watching in “Fifteen Million Merits”?
The initial framing of “White Christmas” presents two men—Matthew and Joe—in some remote cabin for who knows what purpose. It is Christmas, and Matthew is cooking. He suggests that he and Joe have some conversation, since they have been in this place for years and barely spoken, but Joe is not terribly interested in talking. So Matthew gets things going with a story of his own.
This involves him employing the zed-eye technology to see through the eyes of another man in order to give him advice on picking up women. That may seem fairly creepy in and of itself, but at least the guy he is trying to help in this story is a likable young chap. We come to discover, though, that there is a whole group of others watching. This is some kind of association of voyeuristic perverts, who all watch as each takes his turn being the man out in the world.
Matthew lies to Joe about this: there is a disjunction between the words we hear Jon Hamm saying and what we see onscreen. He says that it was just him, yet we are shown the group of others. He says he stopped watching once Harry was invited back to Jennifer’s apartment, and yet we see that this was not the case. This tension only adds to the power of what we are shown, and strengthens the notion that we are seeing what actually happened, as opposed to a mere visual representation of the story being told.
And it all culminates in what might be the most disturbing scene in all of Black Mirror, though it is also one of the most powerful and artistic. Matthew told Harry to empathize with Jennifer’s negative “outsider” vibe, and with comments they took to be about quitting her job. When she came back from the bathroom, she also saw Harry talking to the others who were watching, complaining about their presence, and took him to be talking to voices in his head.
So, they return to hers, not for sex, but suicide, and the scene is well and truly haunting, as she feeds him a glass of poison.
Matthew tells his group of perverts to delete and destroy everything, and sets about that task himself. Of course, he steps on a squeaky kids’ toy along the way, and gets caught by his wife. This causes her to block him, which is a harbinger of things to come.
Back at the remote outpost, Joe is still reluctant to talk about anything, so Matthew begins another story; this time about his job. We are introduced to a young woman named Greta, who is particular about her toast, as she prepares for some kind of medical procedure. It turns out that this is the extraction of her “cookie”—a full copy of her mind created through an implant. It is now being extracted in order to create a home assistant for her. (The reference to internet cookies seems obvious enough to not require further comment.)
Matthew’s job is to set up this assistant. The idea is clear: it is like a next-generation Alexa (which was released by Amazon about a month before this episode aired, though Siri had been around for several years). What better smart home than one run by a digital copy of yourself? It will know just how you like your toast, and presumably won’t creepily laugh for no reason! A recent episode of The X-Files explored ways in which such technology can go awry, but surely this would be avoided if it were a copy of oneself at the helm.
The issue is that not-Greta believes herself to be Greta. The technology is presented as perfected: this is truly a full copy of the person in question, and the copy even seems to possess autonomy. She at first refuses to go along, and ultimately ends up begging for something to do.
The question at hand, then, is about the moral status of such a copy, if we presume it to be possible. What exactly determines what beings count morally, anyway? Most everyone would agree that inanimate objects do not—I can perhaps harm my desk, but it doesn’t make any sense to talk about wronging it, or respecting it—and that human beings do, but what is it about us that gives us a moral status?
An appeal to biology seems to be the wrong kind of answer. After all, no one who has seen Star Trek would deny that it matters, morally, what one does with regard to Vulcans and the like. So it must be something about being human that is the deciding factor. This could be sentience, or the capacity to experience pleasure and pain. Or, perhaps it is autonomy; the fact that we can determine our own behavior, and are thus free.
On either of those options, the cookie not-Greta would seem to have a moral status. Not only can she decide what to do with herself, but she also apparently suffers when Matthew subjects her to periods of solitude (the egg device allows for a kind of time dilation—several weeks, and then months, pass for her, all while Matthew eats the same piece of toast). And yet, the whole idea of a program like this depends on the thought that the digital copy of the person does not count morally; that we can treat such a copy however we please, because it isn’t real.
Indeed, as Matthew finishes his story, Joe says it amounts to slavery, but Matthew’s point that most people would be fine with it seems closer to the truth; in terms of prevailing sentiments, if not moral principles, at least. It is slavery if the digital not-Greta has a moral status, but most would say she doesn’t simply because she is code.
1) Can an artificial intelligence have a moral status, or is the mere fact of it being artificial enough to rule it out of the moral story?
After all of this, Joe finally opens up and tells his story. This pertains to his relationship with Beth. We see the two visit her father at Christmas time, and out for some karaoke (Beth sings “Anyone Who Knows What Love Is”—marking the song’s second appearance on the show). We cut to Joe and Beth hosting a dinner with another couple—Gita and Tim—who have recently become engaged.
After they leave, Beth sullenly finishes her wine and heads off to bed, while Joe proceeds with tidying up. He discovers a (positive) pregnancy test in the trash when the bag rips on him, and he then confronts Beth about it.
Joe is happy about the prospect of a child, while Beth says she isn’t having it. Joe says some nasty things as they argue (that I would suggest no man should ever say in such a situation), to the point where she blocks him…forever.
This technology works in both directions: he now appears from her perspective as a kind of fuzzy white blob whose speech cannot be understood, and vice versa. Clearly the show is playing with the notion of blocking someone on social media and the like, but taking it a step further by making it pervade everyday life. We further learn that the courts have given the block legal backing, so that it is ultimately illegal for him to come too close to her (enforced by GPS).
2) What are the ethics of blocking someone? Does the technology here portrayed create new and distinct issues; or does it merely exemplify already existing issues by pushing the idea to the extreme?
Beth’s image is blocked in Joe’s old photographs, and his attempts to reach her come to nothing. He writes letters and sends them to her father’s house. And, as much as he said things he ought not to have said during that argument about the pregnancy, it is hard not to feel sympathy, or even empathy, for Joe during this stretch of his story. His girlfriend left him without explanation, and stymied any attempt he might make to apologize, or even find closure. Black Mirror may present an extreme version of this experience, but the frustration and despair involved in trying to make amends with someone who has simply decided to cut one off is not something that depends on science fiction in the slightest.
And then, one day, Joe sees her out in the world and discovers that she decided to keep the baby. Her blurry shape makes this clear. So, he starts more or less stalking her at Christmas time, when he knows she’ll be visiting her dad. He sets himself up in the woods, just to get glimpses of the blurry shape of the child, since the blocking program extends to progeny as well.
Years pass, and he determines that the child is a girl. He leaves some presents for her, anonymously, and tells Matthew that, as odd as it sounds, something was better than nothing. Until Beth dies—and the block is thus removed.
Joe goes again to Beth’s dad’s house at Christmas to see his daughter for the first time, but it turns out that she isn’t his daughter after all. The girl is distinctly Asian, which implies that Tim was the father, and which would go a long way toward explaining Beth’s behavior.
It would seem that Joe could not process this. He confronts Beth’s father and demands to know where his daughter is. But he doesn’t have a daughter. Gordon tells him as much, and also that he threw out those letters Joe sent to Beth before Beth saw them. And, so, in a moment of rage and despair, Joe hits Gordon on the head with the snow globe he had brought for “his daughter”—and Gordon dies.
At this point, we cut back to the outpost, and Joe recognizes the clock on the wall as being the same as the one in Gordon’s house. Matthew prods him to tell him what happened next. Joe offers an account of how he left the house and drank on the streets, but it is the fate of the young girl (May) that Matthew is truly interested in. Apparently, after a couple of days, she left the house to seek help, and died out in the cold. Her body can now be seen lying beneath a tree from the window of the outpost.
At this point it is revealed that the whole conversation at the remote outpost was actually between Matthew and Joe’s cookie, employing the technology we were introduced to in the earlier story about Greta. Matthew’s whole purpose was to get a confession out of Joe, and he has now succeeded. He expresses his glee, offers a quick apology to Joe—the sincerity of which is hard to determine—and exits the virtual space.
Joe is informed by a detective that he—or, his cookie—has confessed, and Matthew asks them whether he has earned his freedom by procuring said confession. He has, but with a caveat: he will be put on the “register” and thus be blocked…by everyone.
It would seem that this is because of the crimes we saw him commit earlier in the episode, though the door is certainly open to the possibility that he is guilty of more besides. One is put in mind of registries for sex offenders, as Matthew also appears in blurry red from the perspective of others.
3) Is such a program of stigmatization ever warranted? Should a sex offender’s reputation be forever tainted by his past crimes? Does it matter what precisely those crimes were?
Joe’s cookie, on the other hand, is left to wallow in the virtual space of the outpost. One of the police officers puts on Wizzard’s “I Wish It Could Be Christmas Everyday” and cranks the time dilation to 1,000 years a minute. When he tells this to his colleague, asking if she’d rather he turn it off, she says, “No. Leave him on for Christmas.”
We thus hit on the question of the moral status of these duplicates once again, and this scene presents a wrinkle. On the one hand, doing something so inhumane would seem to depend on the thought that one is only doing it to a digital copy. On the other, the only motivation for this action would seem to be schadenfreude. That is, it is only the thought of the suffering of the cookie not-Joe that could be at play in subjecting him to listening to this on repeat for at least 1.5 million years. If anyone can make it through 10 hours of “Call Me Maybe,” I would be surprised. So, again, the long and short of it is that such an act of cruelty could only be motivated by the thought of the suffering of Joe’s cookie, yet at the same time it would seem to depend on not recognizing this being as one that morally counts. Unless anyone thinks that Joe’s deeds would actually call for a virtual eternity of suffering in solitary confinement?
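The “1.5 million years” figure is easy to sanity-check. A minimal back-of-the-envelope sketch, assuming (the episode does not spell this out precisely) that the dial stays at 1,000 simulated years per real minute and that “leave him on for Christmas” means roughly one real day:

```python
# Back-of-the-envelope check of the time dilation in the final scene.
# Assumptions (not stated precisely in the episode): the dial stays at
# 1,000 simulated years per real minute, and the cookie is left running
# for roughly one real day.

YEARS_PER_MINUTE = 1_000
MINUTES_PER_DAY = 24 * 60  # 1,440 real minutes in a day

simulated_years = YEARS_PER_MINUTE * MINUTES_PER_DAY
print(f"{simulated_years:,} simulated years per real day")  # 1,440,000
```

On those assumptions, a single real day already comes to about 1.44 million simulated years, so a little more than a day at the outpost would cover the 1.5 million cited above.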
Matthew’s punishment is almost as severe. Imagine being cut off from everyone else in this way. Do his crimes warrant such a punishment? Could any crimes? To what extent is this freedom? In a way, being blocked by the whole world seems the most extreme form of punishment one can imagine, worse, even, than death. And yet the contrast with the fate of Joe’s cookie complicates this: if the cookie cannot destroy the radio to make the song stop, he most certainly cannot kill himself, whereas Matthew would at least seem to have that way out available to him.
This isn’t the first time Black Mirror has played with the human desire to punish—with the joy we can take in inflicting suffering upon others—and it certainly won’t be the last. The desire to punish is a fertile field for musings on the darker aspects of our souls; and besides exploring the possibilities of near-future technology, it is precisely these that the show aims to hold up to us in the mirror.