This is the first installment in a multi-part series entitled Personal Surveillance. To read the other parts once they become available, click here.
George Orwell predicted, among many other things, a massive state surveillance apparatus. He wasn’t wrong; we certainly have that. But I’d submit that it isn’t the greatest threat to the average person’s privacy. There’s an old saying that the only thing protecting citizens from government overreach is government inefficiency, and in this case there’s something to it. Surveillance programs are terrifyingly broad in their reach, but they simply aren’t staffed well enough to parse everything they collect. This may change as algorithms become better at sifting through data, but at the moment, we aren’t efficient enough to have a thought police.
The real danger to privacy isn’t what a bureaucrat is able to pry from an unwilling suspect, but what an onlooker is able to discern from an average person without any special investigative tools or legal duress. The average person is generally more at risk from stalkers than from surveillance. Social media is especially dangerous in this regard, and the recent scandals surrounding Cambridge Analytica et al. are a good example of how it can be turned to nefarious purposes.
Yet despite lofty and varied criticism, I am willing to bet on the overall conclusion of this latest furor: the eventual consensus will be that, while social media may be at fault, its developers are not guilty of intentional malice. Rather, they pursued misaligned incentives and failed to keep up, whether through laziness or through not grasping the complete picture soon enough, with the accelerating pace at which our lives have become digitized.
Because that is the root problem. Facebook and its ilk started as essentially decentralized contact lists and curated galleries, and Twitter and its facsimiles started as essentially open-ended messaging services, but they have evolved into so much more. Life happens on the Internet nowadays.
In harkening back to the halcyon days before the scandal du jour, older people have called attention to the brief period between the widespread adoption of television and the diversification of channels; the days when there were maybe a baker’s dozen of them. In such times, we are told, people were held together by what was on TV. The political issues of the day were chosen by journalists, and public discourse was shaped almost solely by the way those issues were presented on those few channels. Popular culture, we are told, was shaped in much the same way, so that there was always a baseline of commonality.
Whether or not this happened in practice, I cannot say. But I think the claim that those were the halcyon days before all this dividing and subdividing is backwards. On the contrary, I would submit that those halcyon days were the beginning of the current pattern, as people began to adapt to the notion that life is a collective enterprise understood through an expansive network. Perhaps that time was still a honeymoon phase of sorts. Or perhaps the nature of this emerging pattern of interconnectedness is one of constant acceleration, like a planet falling into a black hole, slowly, imperceptibly at first, but always getting faster.
But getting back to the original point: in addition to accelerating fragmentation, we are also seeing accelerated sharing of information, which is constantly being integrated, woven into a more complete mosaic narrative. Given this, it would be foolish to think that we could be part of it without our own information being woven into the whole. Indeed, it would be foolish to think that we could live in a world so defined by interconnectedness and not ourselves be part of the collective.
Life, whether we like it or not, is now digital. Social media, in the broadest sense, is the lens through which current events are now projected onto the world, regardless of whether it was built for that purpose or to withstand it. Participation is compulsory (that is, under compulsion, if not strictly mandatory) for anyone who wants to be a part of modern public life. And to this point, jealous scrutiny of one’s internet presence is far more powerful than merely collecting biographical or contact information, such as looking one up in an old-fashioned directory.
Yet society has not adapted to this power. We have not learned to treat social media interactions with the same dignity with which we treat, for example, conversations between friends in public. We recognize that a person following us and listening in while we were out in public would be committing a gross violation of our privacy, even if it might skirt by the letter of the law*. But trawling back through potentially decades of interactions online is, well… we haven’t really formulated a moral benchmark.
This process is complicated by the legitimate uses of social media as a sort of collective memory. As more and more mental labor is offloaded onto the Internet, the ability to call up some detail from several years ago becomes increasingly important. Take birthdays, for example. Hardly anyone nowadays bothers to commit birthdays to memory, and of the people I know, increasingly few keep private records, opting instead to rely on Facebook notifications to prompt their greetings. And what about remembering other events, like who was at that great party last year, or the exact itinerary of last summer’s road trip?
Human memory fades, even more quickly now that we have machines to consult and no longer have to exercise our own powers of recollection. Trawling through a close friend’s feed to find the picture of the two of you from Turks and Caicos, so that you can get it framed as a present, is a perfectly legitimate, even beneficial, use of their otherwise private, even intimate, data, and one which would hardly be possible if that data were not available and accessible. The modern social system, our friendships, our jobs, our leisure, relies on this accelerating flow of information. To invoke one’s privacy even on a personal level now seems to border on the antisocial.