The surveillance society dystopia has attracted more mainstream critical attention than the rest of the genre combined, thanks to a rich array of prescient offerings from Orwell’s 1984 to Cory Doctorow’s Little Brother. Yet, now that we arguably live in a surveillance society, these stories have become less cautionary. People in power now know virtually everything they could want to about all of us, and, for the most part, Westerners are happy to be watched. Now the question is not how to avoid Big Brother, but how to live with him.
Or, as in the case of Joseph Tomaras’s ‘Bonfires in Anacostia’ (Clarkesworld # 95), how our ignorance serves him. Told from the POV of the devices that watch us, ‘Bonfires’ traces how passive surveillance and data gathering are used by an unspecified Agency to make assumptions about the potentially political activities of ordinary people, with fatal results.
The most poignant thing about Tomaras’s story is the liberal, upper-middle-class characters’ cavalier attitude towards technology, surveillance, and power. They know their devices have the capacity to listen to them and they are aware of the legislation that allows the Agency to collect their personal information. These are ordinary, educated Americans who could know better but choose not to; too comfortable to be cautious. Too privileged to fear.
The story is hardly speculative, the only science fictional element being an ongoing civil disturbance in Washington’s Anacostia, where the poor, disempowered citizens burn down the holographically-enhanced structures of their neighbourhood. The holograms feel like a sci-fi coating: totally unnecessary to the story. Without the holograms, this is any number of places in modern-day America. What matters are the fires, the “riots”, and the government’s desire to keep the disturbance under wraps. Far away, on the white side of town, a few flagged words are spoken too close together within their devices’ hearing, and the Agency activates, making assumptions about whose loose lips might be spreading the news that they don’t want spread.
Of course, it is the poor, black kid who dies for it in the end. The story’s comfortable – and white – surveilled characters walk on, unaware that what they have said has cost two people their lives. Tomaras has done a brilliant job of demonstrating why the privileged in our society allow themselves to be watched – because they aren’t the ones who have to pay for it.
Jo Walton takes a different tack in her story ‘Sleeper’ (Tor.com August 12th, 2014). The surveillance in this near-future England is presumed to be complete, but Walton offers a crack through which people could slip.
Matthew, a simulation of a mid-20th century British intellectual, has been put together by a biographer, Essie, in order to answer her questions about his life. Though he is meant to be, essentially, a bonus feature to bundle with the ebook, Essie has more in mind. The near-future is even less socialist than the world Matthew lived in, and between the dry lines of an academic biography, she hopes to slip ideas into the heads of the readers. “They pick up the books for the glamour, and I hope they will see the ideals too,” she tells Matthew.
That alone would be easy enough, but Essie wants more than to germinate ideas – she wants a revolution, and with communications controlled by corporations, she needs a backdoor into people’s minds. Walton does a wonderful job of intertwining cloak-and-dagger espionage and the role of the biographer. Casting an admired celebrity’s life in a particular political light has long been the modus operandi of the biographer, and Walton simply takes what we all know about written histories one step further. History serves the agenda of those who write it, and through the simulation of Matthew – a program written as much by her as the book is – Essie is able to do more than simply suggest ideas – she can make plans.
Essie’s biographical reconstruction of Matthew feels a little like a plot hole: if censorship is as complete as she believes, why would her publisher let her deliver an unedited copy of her simulation? Why would a reader’s conversation with a simulation not be recorded? We can generously call it her society’s oversight – the technology is new, and perhaps they do not understand its potential yet. Essie has created a portrait of a person which suits her agenda, as a biographer does, and people with lips shut tight keep the best secrets of all.
James Gunn has a little more fun with life in the surveillance age in his latest, ‘Patterns’ (Asimov’s September 2014). A short, funny piece, it reminds us that while we might trust those who collect our data, we never know who else might be reading over their shoulder.
Jeremy works for the NSA and his job is to spot patterns. All three stories reviewed here problematize the surveillance society in the same way: as Jo Walton’s Matthew says, “the classic problem of intelligence is collecting everything and not analysing it.” For Walton, this means people can gamble on simply not being noticed. Tomaras proposes an algorithm that flags certain words and links them with intentions – a system which turns out to be fallible. Gunn’s Jeremy is a sort of savant, employed to see the big picture. A person’s individual emails are of no interest to him – trends and patterns are.
Even non-specific data can be problematic in the wrong hands, as Jeremy knows. Big Data on “where they lived and how they communicated, the state of their technology” could be of use to the right person or group. Jeremy, hapless observer, can’t guess what somebody would do with this information, but he knows that “information is power.” Whoever is hacking and sorting their data, they have a lot of it to work with.
Gunn plays the scenario for clever turns of phrase and a tongue-in-cheek finale, but there’s more than a little truth behind the lighthearted telling. Today’s surveillance society is often sold to us as being made “safe” by a blanket of anonymity. They will know our consumer patterns, but not who, exactly, bought what. ‘Patterns’ reads like a perfectly-told joke, but Gunn shows that there’s a lot a person can learn from aggregate data. From how you might lose your job to why someone would sort your data, you can anticipate the future if you just know the pattern. There is no innocuous surveillance.