Hated in the Nation
I’ve been talking about Black Mirror with my flatmates recently, and it’s made me want to rewatch the episodes. I’m a huge fan; I’ve seen this particular episode at least three times and still don’t grow tired of it. So, first and foremost, I would advise everyone to watch this TV show. It’s fantastic: it shows a futuristic society in which technology is central to our lives, and it raises important ethical and technological questions. So, let’s get to the episode I want to talk about today: Hated in the Nation (season 3, episode 6).
First, let me give you a little recap of the episode; if you haven’t watched it yet, please go do that first or you’ll be spoiled.
The episode is set in a near-future London. It begins with Karin Montgomery being led into a hearing chamber to answer questions about a case in front of a panel. From there, we flash back to May 15 and get our first real look at Karin: she is divorced, lives alone, and has an unhealthy diet. As she eats her evening Pringles, she watches the news, which includes a report about the internet turning on journalist Jo Powers for publishing an article everyone despises, as well as a report about the release of Autonomous Drone Insects (ADIs), robot bees built to compensate for the fact that real bees are extinct.
Karin is summoned to Jo’s house that night after Jo is discovered on the floor of her home with her throat cut. Karin arrives and meets her new tech-savvy partner, Blue. After inspecting the house they leave, and Karin offers to give Blue a ride. During the drive, Blue reveals that she used to work in cyber forensics, but seeing the heinous things people store on their phones pushed her to get out into the real world and do something about it.
Because of her background, Blue assumes Jo’s death is related to her badly received piece and begins going through her mentions on social media. Karin, on the other hand, is not so easily persuaded. Their first stop is the home of the woman who delivered Jo a “F*** you b****” cake, a teacher. She gives them their first lead: #DeathTo, a hashtag people use online when they decide someone is being an a**hole.
The next lead comes from Jo’s autopsy. The coroner discovers one of the ADIs in Jo’s brain: the ADI is what drove her mad with pain and led her to cut her own throat. This, however, is not the only ADI-related death. Doctors find another one inside a rapper who had recently been the target of the internet’s wrath. Someone has taken control of these bees and is using them to kill. Blue begins to piece together the puzzle and realizes it is all part of a Twitter game called the Game of Consequences: whoever receives the most #DeathTo tweets in a day is murdered. Unfortunately, most people are unaware that using the hashtag can have fatal consequences. The episode is clearly an exploration of internet fury and public shaming, pushing us to think about how we behave online and whether we let our digital identities get the best of us.
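Mechanically, the “game” boils down to a very simple daily tally: count the #DeathTo mentions, pick the most-named person. The episode never shows any actual code, so this is just a minimal sketch of that tallying logic, with invented tweets and handles purely for illustration:

```python
from collections import Counter

# Hypothetical tweets for one day; handles and text are invented for illustration.
tweets = [
    "#DeathTo @JoPowers for that awful column",
    "can't believe she wrote that #DeathTo @JoPowers",
    "#DeathTo @TuskMC after that interview",
    "lovely weather today, no outrage here",
]

def daily_target(tweets, tag="#deathto"):
    """Return the @handle mentioned most often alongside the hashtag (or None)."""
    counts = Counter()
    for text in tweets:
        if tag not in text.lower():
            continue
        for token in text.split():
            if token.startswith("@"):
                counts[token.strip(".,!?")] += 1
    return counts.most_common(1)[0][0] if counts else None

print(daily_target(tweets))  # -> "@JoPowers"
```

The unsettling part is precisely how trivial this is: a hashtag count is all it takes to turn diffuse online anger into a single target.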
Blue discovers that a man called Garrett has published a manifesto stating that he is doing this to demonstrate the dangers of extreme public shaming: he wants people to understand what it’s like for their words to have real consequences. After failing to catch Garrett, Karin and Blue search through his files and learn that he has kept a list of everyone who tweeted #DeathTo. Once he has taken control of all of the ADIs, he orders them to kill everyone on that list for having used the hashtag.
The AI nightmare portrayed in this episode raises the question: what are the consequences of weaponizing internet outrage? And how can facial recognition technology be used to exploit the worst of mob behaviour on social media?
The widespread monitoring and surveillance of individuals is a major ethical problem here. The episode shows that the ADI project was approved by the UK government on the condition that the ADIs carry advanced facial recognition technology and that their visual feed be accessible to the government’s national security services. It warns us that countries are on a path to becoming dictatorial surveillance states with complete control over data about us (hi, Edward Snowden), with no assurance that such information will not be exploited by the state or by corporations. It also exposes the overconfidence of organizations that trust their own security systems and shows how easily surveillance technologies can be misused. The implication is that unrestricted government monitoring carries far too many risks to be given free rein, and that these institutions must be held accountable for their actions.
Facial recognition technology is already used by police in a number of places, but it is also notorious for high error rates. COMPAS, for example, is an algorithmic tool used by US courts to predict a defendant’s risk of recidivism. Even without any facial recognition component, COMPAS has been found to be biased against Black defendants, leading to racial and intersectional discrimination. Well-intentioned manufacturers can still build technologies that disproportionately harm people of color and women. The Netflix documentary Coded Bias also documents this problem with AI and the discrimination it leads to.
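To make “biased” a bit more concrete: one common way this kind of discrimination is demonstrated is by comparing error rates across demographic groups, for example how often people who never reoffend are still flagged as high risk. Here is a tiny sketch of that comparison; the records and numbers are invented for illustration and are not real COMPAS data.

```python
def false_positive_rate(records, group):
    """Share of people in `group` who did NOT reoffend but were flagged high-risk."""
    relevant = [r for r in records if r["group"] == group and not r["reoffended"]]
    if not relevant:
        return float("nan")
    return sum(r["flagged_high_risk"] for r in relevant) / len(relevant)

# Hypothetical risk-score outcomes (illustrative only).
records = [
    {"group": "A", "reoffended": False, "flagged_high_risk": True},
    {"group": "A", "reoffended": False, "flagged_high_risk": True},
    {"group": "A", "reoffended": False, "flagged_high_risk": False},
    {"group": "B", "reoffended": False, "flagged_high_risk": False},
    {"group": "B", "reoffended": False, "flagged_high_risk": True},
    {"group": "B", "reoffended": False, "flagged_high_risk": False},
]

for g in ("A", "B"):
    print(g, false_positive_rate(records, g))
# Unequal false positive rates (here ~0.67 vs ~0.33) are the kind of disparity
# reported for recidivism-prediction tools like COMPAS.
```

A tool can look accurate on average and still fail one group far more often than another, which is exactly why aggregate accuracy figures are not enough to call a system fair.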
So this episode is really about helping us think about the drawbacks of AI, rather than just its positive aspects.
Sources:
Brownsword, R., “From Erewhon to AlphaGo: For the Sake of Human Dignity, Should We Destroy the Machines?”, Law, Innovation and Technology, 2017, https://www.tandfonline.com/doi/abs/10.1080/17579961.2017.1303927
Hart, M., “Black Mirror’s ‘Hated in the Nation’ Is a Honeycomb of Mysteries”, Nerdist, 2016, https://nerdist.com/article/black-mirror-recap-hated-in-the-nation-is-a-honeycomb-of-mysteries/
Kwon, S., “Black Mirror’s ‘Hated in the Nation’: Facial Recognition Is a Weapon”, The University of Melbourne, 2021, https://law.unimelb.edu.au/news/caide/black-mirrors-hated-in-the-nation-facial-recognition-is-a-weapon
Nafisa, N., “Construction of Dystopia in Black Mirror: Hated in the Nation”, Passage, 2020, https://ejournal.upi.edu/index.php/psg/article/view/22994