I’m a skeptic and an intellectual, so I don’t put too much weight on coincidence. But then again, I’m a storyteller, so I love chalking coincidences up as elements of some unseen plot.
Yesterday, my YouTube music playlist brought me to Halsey’s “Gasoline.” Thinking it over, I probably heard this song in passing some time ago, but if I did, I didn’t commit it to memory, because hearing it was like listening to it for the first time. And what a day to stumble across it. The lyrics, if you’ve never heard them, go thusly:
And all the people say
You can’t wake up, this is not a dream
You’re part of a machine, you are not a human being
With your face all made up, living on a screen
Low on self esteem, so you run on gasoline
I think there’s a flaw in my code
These voices won’t leave me alone
Well my heart is gold and my hands are cold
Why did this resonate with me so much today of all days? Because I had just completed an upgrade of my life support systems to new software, which for the first time includes new computer algorithms that allow the cyborg parts of me to act in a semi-autonomous manner instead of relying solely on human input.
It’s a small step, from both a technical and a medical perspective. The algorithm is a simple linear regression model rather than the proper machine learning program most people expect will be necessary for fully autonomous artificial organs. Its only function at the moment is to track biometrics and shut off the delivery of new medication to prevent an overdose, rather than keeping those biometrics in range in general. And it only does this within very narrow limits; it’s not a true fail-safe against overdoses, because the preventative mechanism is narrowly applied and very fallible.
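To make the idea concrete, here is a minimal sketch of how such a predictive shutoff could work. This is purely illustrative: the function names, the falling-glucose scenario, and the thresholds are my own inventions, not anything from the device’s actual firmware. The core idea is just the one described above: fit a linear trend to recent biometric readings, extrapolate a short distance into the future, and suspend delivery if the prediction crosses a safety limit.

```python
# Hypothetical sketch of a predictive delivery cutoff. All names and
# numbers here are invented for illustration, not the real device logic.

def fit_linear_trend(times, values):
    """Ordinary least-squares fit of values ~ slope * time + intercept."""
    n = len(times)
    mean_t = sum(times) / n
    mean_v = sum(values) / n
    cov = sum((t - mean_t) * (v - mean_v) for t, v in zip(times, values))
    var = sum((t - mean_t) ** 2 for t in times)
    slope = cov / var
    intercept = mean_v - slope * mean_t
    return slope, intercept

def should_suspend_delivery(times, values, horizon, floor):
    """Extrapolate the biometric `horizon` minutes past the last reading;
    return True (shut off new medication) if it would fall below `floor`."""
    slope, intercept = fit_linear_trend(times, values)
    predicted = slope * (times[-1] + horizon) + intercept
    return predicted < floor

# Readings falling 120 -> 110 -> 100 over 10-minute intervals extrapolate
# to 70 in another 30 minutes, below a floor of 80, so delivery stops.
print(should_suspend_delivery([0, 10, 20], [120, 110, 100], 30, 80))  # True
```

Note what makes this predictive rather than a dead man’s switch: the shutoff fires while the current reading (100) is still safely above the floor, acting on where the trend is headed rather than where the patient is now.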
But the word prevention is important here, because this isn’t a simple dead man’s switch. The new upgrade is predictive, making decisions based on what it thinks is going to happen, often before the humans clue in (within my first twelve hours on it, this has already happened). In a sense, it is already offloading human cognitive burden and upgrading the human ability to mimic body function. As of yesterday, we are on the slippery slope that leads to cyborgs with superhuman powers.
We’re getting well into sci-fi and cyberpunk territory here, with the door open to all sorts of futurist speculation, but there are more pressing questions that need to be answered sooner rather than later. Take, for instance, the EU General Data Protection Regulation, which (as near as I, an American non-lawyer, can make heads or tails of it) requires companies and people to disclose when they use AI or algorithms to make decisions about EU citizens or their data, and mandates recourse for those who want the decisions reviewed by a human; a nifty idea for ensuring the era of big data remains rooted in human ethics.
But how does it fit if, instead of humans behind algorithms, it’s algorithms behind humans? In a way, all of my decisions are now at least partially based on algorithms, given that the algorithms keep me alive to make decisions at all, and have taken over cognitive functions that would otherwise occupy my time and focus. And I do interact with EU citizens. A very strict reading of the regulation suggests this might be enough for me to fall under its aegis.
And sure, the answer is relatively clear cut today; no EU court is going to rule that all of my actions must be regulated like AI because I’m wearing a medical device. But as the technology becomes more robust, the line is going to get blurrier, and we’re going to need to start treating some hard ethical questions not as science fiction, but as law. What happens when algorithms take over more medical functions? What happens when we start using machines for neurological problems, and there’s no longer a clear line between human and machine in the decision-making process?
I have no doubt that when we get to that point, there will be people who oppose the technology, and want it to be regulated like AI. Some of them will be Westboro Baptist types, but many will be ordinary citizens legitimately concerned about privacy and ethics. How do we build a society so that people who take advantage of these medical breakthroughs aren’t, as in Halsey’s song, derided and ostracized in public? How do we avoid creating another artificial divide and sparking fear between groups?
As usual, I don’t know the answer. Fortunately for us, we don’t need one today. But we will soon. The next software update for my medical device, which will have the algorithms assuming greater functions with finer granularity, is already in clinical trials and expected to launch this time next year. The EU GDPR was first proposed in 2012 and only rolled out this year. The best way to avoid a sci-fi dystopian future is conscious and concerted thought and discussion today.