‘The Surveillance Pandemic’ by General Sir Richard Barrons KCB CBE

The following comment and analysis by General Sir Richard Barrons, Commander Joint Forces Command (2013-2016), now Co-Chairman of Universal Defence & Security Solutions, is also published here by the Global Security Forum, of which General Barrons is an Advisory Board member.

The Surveillance Pandemic

All eyes are focused on the extraordinary effects of the COVID-19 pandemic, so it is easy to forget that many other challenges to our way of life existed before the virus struck and have not gone away. Quite high up the list is the challenge of a new surveillance era, as Digital Age technologies, each one created with the best of intent, may combine to deliver extraordinary capability for a state to understand, control, and dominate every citizen. Some of this technological prowess is entirely benign and necessary – infection tracking by smart phone surveillance, for example – but we may stumble into a fundamental challenge to how we generally choose to lead our lives. There are four contributory pillars to this dilemma: the rapid advance of bioscience around DNA and the human genome; the constantly updated digital autobiography we create in our connected lives; the means to know where we all are, all the time; and, just ahead, the control available through autonomy and robotics in the Internet of Things. The technology to make totalitarianism possible and perhaps irresistible is arriving – a surveillance pandemic – unless we choose otherwise.

First, although the first human genome took $2.7 billion and almost 15 years to sequence, it now costs around $1,000 and within 3-5 years this is expected to fall to $100. Interpreting and correlating the data costs more, but that cost is falling too. The point is that this science is now crossing rapidly from academia into commercial and government exploitation, and that is profoundly significant. As more and more individual genomes are stored in a database (each record is about 150GB) and more correlation is done by AI (causation is a very long way off), it becomes possible to judge and predict (both imperfectly) what each individual’s make-up ‘means’. This insight suggests that in due course each one of us could have our education, healthcare, career choices, and pension needs influenced by what our genome says about our individual make-up. This doesn’t sound a bad thing: if a doctor understood our genome sequence, for example, they could look out for specific ailments and build bespoke treatments.
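To put the scale of such a database in perspective, here is a rough back-of-the-envelope calculation in Python. The ~150GB per record figure comes from the paragraph above; the population size used is an illustrative assumption (roughly that of the UK), not a figure from this article.

```python
# Rough, illustrative estimate of raw storage for a national genome database.
# Assumption (not from the article): a UK-sized population of ~67 million people.
# The ~150 GB per genome record figure is the one quoted above.

population = 67_000_000            # assumed number of records
gb_per_record = 150                # per-record size quoted in the article
total_gb = population * gb_per_record
total_exabytes = total_gb / 1_000_000_000   # 1 exabyte = 10^9 GB (decimal)

print(f"Raw storage: {total_gb:,} GB (~{total_exabytes:.0f} exabytes)")
# -> roughly 10 exabytes of raw sequence data, before any interpretation
```

The numbers are purely illustrative; the point is simply that a whole-population genome database is a state-scale undertaking in storage and correlation, which is part of why the crossover from academia to government and commercial exploitation matters.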

It will also open the door to genetic selection: a couple may, for example, be offered a choice of embryo based on what is inferred from a particular genome (clever/fast/healthy combinations generally winning over dull, shambling and disease-prone?) – though a society composed entirely of marathon-running, brilliant rocket scientists would need a lot of robotic or imported help – someone has to clean the bathrooms. It will also open the door to scientists altering the sequence, perhaps to remove an inherited vulnerability or to insert a desirable trait. All of this raises very difficult ethical considerations about who decides what is permissible and who benefits – wealth will no doubt lead the way, but building a self-reinforcing genetically modified (warrior) elite should not proceed without challenge…

But what if a government insisted that every person had their genome sequenced and stored in a national database? The government would then know the genetic predisposition of every person. That same government may also want to ensure a balanced population by allocating embryos by type. Might a government that knew the genetic origin of every person, and their predisposition, be tempted to discriminate against particular groups thought (by historic correlation) to be objectionable, or simply ‘different’? Might it be tempting to programme into the sequence a predisposition to an early death in response to a particular wavelength of light, and use this once a pension plan had been exhausted? This is the potential dark side of knowing enough about a genome to see off diseases. For the purposes of understanding the potential for a state surveillance pandemic, the day may be coming when the genetic predisposition of every person is in a government database. The government will ‘know what we are like’.

Second, of course, nature is not nurture, and we each turn out to be the people we are through a mix of what we are born with and how we experience and develop. ‘Fortunately’, how we actually turn out is now a matter of easily accessible record: anyone who is connected to the Digital Age leaves a trail, a constantly unfolding autobiography. Our phones, internet use, bank cards, travel tickets and shopping habits tell the truth about who we know, what we say, where we go, what we buy, and what interests us. This is not about what we choose to put in our digital shop-windows, but about what we actually do and say with our lives. The data collected on us every day by the large internet services (Facebook, Amazon, Netflix, Google) and organisations like banks and supermarkets is too much and too complex to be artfully constructed as a lie.

We can see how this truth about each of us is employed in the way companies use AI to dissect our consumption patterns in order to direct goods and news we are interested in to our attention. This pleases us (apparently) and generates fortunes in advertising revenue. It is also used by political parties to find individual voters in key marginals and micro-target messages at them. In China, this digital record is a key component of how a ‘social score’ is kept, which unlocks or inhibits how a person may travel or access certain goods or appointments. This sort of surveillance is not popular in a country like the UK… except that we are very likely to submit willingly to tracking that reveals contact with COVID-19 infection – a good thing, and perhaps seductive? So, for all its many uses and benefits, the combination of data, connectivity and AI that tracks our digital selves means that a government could know both what our genetic predisposition is and how we are actually turning out as citizens.

Third, the government could know where we are almost all the time. Our phones reveal this routinely, as does the use of any electronic card, a car connected to the internet, a smart speaker or smart TV in the home, or indeed the fridge if it is connected to the internet and recognises us. The advent of the Internet of Things (whereby almost every device or thing that can possibly be connected to the internet is connected) means that the most humble machine can theoretically be instructed to report when it encounters us. And, as we are creatures of habit, our movement generally forms a ‘pattern of life’, so an alert can be created if something outside the established norm happens. The ability to predict where we may go next will develop. Of course, this sort of tracking could be turned off or regulated by good individual security, but what if the machine cannot be inhibited, or we are barred from doing so, or it can be done covertly? This might help with, say, the protection of the house-bound elderly prone to a fall, but it will certainly spoil many other currently legitimate activities. We all have our lawful little secrets.
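The ‘pattern of life’ idea mentioned above can be illustrated with a minimal sketch: build a frequency count of where someone has been seen at each hour of the day, then flag sightings that fall outside that established routine. The data, place names and threshold below are invented purely for illustration and are not drawn from any real system.

```python
from collections import Counter

# Toy 'pattern of life' model: count how often each (hour, place) pair has
# been observed, then flag new sightings that fall outside the routine.
# All data, place names and the threshold are illustrative assumptions.

history = [
    (8, "home"), (9, "office"), (13, "cafe"), (18, "gym"), (20, "home"),
] * 50  # stands in for weeks of routine location observations

pattern = Counter(history)
total = sum(pattern.values())

def is_anomalous(hour: int, place: str, threshold: float = 0.01) -> bool:
    """Flag a sighting whose (hour, place) frequency falls below the threshold."""
    return pattern[(hour, place)] / total < threshold

print(is_anomalous(9, "office"))     # False: part of the established routine
print(is_anomalous(3, "warehouse"))  # True: outside the established pattern of life
```

Real systems would of course be far more sophisticated, but the principle is the same: habitual movement is easy to model, and departures from the model are easy to flag.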

If this is not comprehensive enough for a surveillance pandemic, the burgeoning network of cameras, especially public and private CCTV, will generally find us, except in a wilderness – and in a wilderness, if we matter enough, space-based sensors or even commercial UAV technology will fill in the gaps. If a satellite in geostationary or low-earth orbit can soon track a moving car, moving about undetected will become increasingly hard. And for the majority who choose to live in populated urban or rural areas and avail themselves of the advantages of a home, a phone, a bank, a bus or a car, the government could know where we all are pretty much all the time.

We are, therefore, heading for a world where it will be technically possible for a government to understand our genetic predisposition, to see how we are turning out, and to know where we are. What if the government also wants to limit where we go or what we do, and is inclined to take action when it disagrees?

Fourth, ‘fortunately’, the combination of the Internet of Things (IoT) and the developing capabilities of robotics and autonomy will help control and, if necessary, disrupt our passage through life. Built on the IoT, more forms of autonomous machine designed to interdict, detain, punish or worse will follow. For example, if you are identified on a station concourse and known to have a ticket to a destination the government does not favour, it will be straightforward for the law-enforcement equivalent of a robotic lawn mower to seek you out and detain you by clamping. We would object to being arrested by a machine, but the same machine that arrests a known axe-murderer on an outing is a good thing? As huge numbers of swarming (i.e. cooperative) micro-UAVs are built, these could be distributed on charging points across a city – or country – ready to be activated by your presence in order to follow or intercept you. In some cases, perhaps that micro-UAV is armed with explosives (see ‘Slaughterbots’ on YouTube). Or perhaps, more prosaically, your driverless car follows a remote instruction to slam itself unaccountably into a motorway bridge support at 69.9mph?

Some of the foregoing is still science fiction, but not that much and not for that long. The point of this narrative is to illustrate what is going to be technically possible in quite short order, as an inextricable part of the unfolding power of the Digital Age. There are some aspects of this we cannot really change and others we can. We cannot change the fact that this technology will be made, because it results from rapid, talented, unstoppable and expensive innovation around the world, generally led by the civil sector. Genome sequencing, the smart phone and the IoT are all extremely well-intentioned – indeed their creators are generally allergic to security applications, in the West at least. What is neither foreseen by the creators nor easily inhibited by the users is the darker use this technology may be put to, sometimes singly and more particularly in novel combinations.

Regulation has a role to play, yet regulating something that has not existed before is difficult, and there is neither a global consensus on what to regulate nor the means to enforce it. China’s approach to innovation is to ‘let the bullets fly’, to see what is produced and only step in where public or political disquiet appears, and China has an entirely different view of the role of the state and the limits of surveillance from Europe. The measures taken to monitor and control the Uighurs are a clear example. In the West, there is a preference for establishing a clear, definitive regulatory landscape before innovation is let off the lead, but as the innovation is hurtling along anyway this regulation effectively lags by 3-5 years. The debates still underway about regulating the use of personal data or the taxation of the big internet service providers show this.

In any case, the vital questions are political and social more than legal. At the national level, if these technologies are coming along (and they are), what do we think is acceptable and how might we ensure that limits are respected? We can be clear that states will come to different conclusions about both their internal affairs and how they employ technology in their international relations – cooperative, competitive, or conflictual. The capacity for manipulation and harm to be caused in the UK by a foreign power exercising the sort of surveillance described above is surely troubling?

We might make a decent start on tackling a surveillance pandemic by understanding what is going to be possible in order to decide what is permissible. This requires policy makers and civil society to invest in thinking about, and managing, a degree of complexity at a reasonable tempo. Even when it is understood, it will not be possible to limit or prevent everything we might object to all the time (assuming such a consensus exists). We will need to take a ‘risk management’ approach, focusing effort on the aspects that are really important and letting the merely unattractive go by. We will certainly need the will and the means to enforce, no matter how powerful or stateless the targets are, or we will find our laws and values comprehensively outflanked by technology ‘we’ built and use but fail to control – a triumph of human ingenuity over judgement. As with all pandemics, if the prospect of a surveillance pandemic is clear, we have only ourselves to blame if we are not ready when it occurs.

Richard Barrons

May 2020