Safeguarding without snooping

A public-health emergency requires a degree of monitoring of people. All the more reason to be especially vigilant about privacy, argues Nick Dowson.

‘Be quick and go home,’ commanded the robot voice. A four-wheeled, chunky black box covered with police logos and cameras had been interrogating a man on a largely empty Tunis street about his reasons for being outside during Tunisia’s coronavirus lockdown earlier this year.

Anyone approached had to display their ID to an officer controlling the machine remotely.

This robo-cop was one of the quirkier technological responses to coronavirus, some of which have had significant successes.

After South Korea saw a spike in the disease early this May, linked to Seoul nightclubs, officials used credit-card data, GPS tracking and video surveillance as well as records from the clubs themselves to track down more than 45,000 people. Such aggressive contact tracing has been central to keeping a lid on the virus, even without ‘stay at home’ measures.

But as governments and private companies seek to collect broad and intrusive data on citizens, we should also beware of the risks.

As lockdowns spread, contact-tracing apps proliferated, hailed as a way to help stop the virus – even as many countries failed to implement the WHO’s recommended test, trace and isolate approach.

Many were quick to reach for experimental technology in place of tried-and-tested ‘analogue’ solutions, while traditional public-health responses, such as human contact tracers, were sidelined.

Digital surveillance was not limited to contact tracing. Poland required those in quarantine to take daily selfies via a coronavirus app to prove compliance with stay-at-home orders. The UAE and Turkey took similar approaches.

Data’s dark side

Responding to coronavirus clearly requires new measures. But data collection and surveillance also bring massive dangers.

In South Korea the publication of detailed information about cases led to public shaming and even homophobic abuse.

Detailed data about contacts between individuals could also be abused by governments seeking to build up pictures of social movements and crack down on protests.

Private data is lucrative for advertisers. In the case of our medical histories, it is also prized by health insurers who use it to decide how to price premiums – or when to deny cover.

The crisis is normalizing intrusive, Orwellian technologies like facial recognition – as in Moscow, where it was used with public CCTV to identify lockdown breakers. Facial recognition is first and foremost a technology of control for rulers and bosses. Witness, for example, accountancy giant PwC’s move to use it to track staff absences from their home desks. It also tends to reinforce racial and other discrimination.

A coronavirus firewall

Good policies can help, by ensuring the collection of only the minimum amount of data needed for an effective Covid-19 response, with access to it strongly restricted.

Personal data thus collected could, for example, be restricted to local health agencies, with a ban on sharing with police and other government bodies, and deleted once it is no longer needed.

There is a question of trust here, too – advocacy group Access Now has argued that companies with a track record of facilitating human rights abuses should be excluded from data collection programmes.

Access Now also calls for any software used to be open source: its code made publicly accessible, enabling transparency and allowing researchers to verify that data is handled securely.

Where and how our data is stored is also important: if it is being transmitted to a remote server, is the encryption strong enough to prevent unwanted access?
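
One common safeguard is to encrypt sensitive records on the phone itself, so that anything sent over the network, or sitting on an intermediary’s server, is unreadable without the intended recipient’s key. Below is a minimal, hypothetical sketch in Python using the widely used cryptography library; the agency key pair and the report fields are invented for illustration and are not taken from any real app.

```python
# Hypothetical sketch: encrypt a report on the device so that only the
# health agency (the holder of the private key) can ever read it.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# In practice the agency would publish its public key; here we generate a
# throwaway key pair just to make the example self-contained.
agency_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
agency_public_key = agency_private_key.public_key()

report = b'{"case_id": "example-123", "symptom_onset": "2020-06-01"}'

# Encrypted on the phone: the app's servers only ever relay ciphertext.
ciphertext = agency_public_key.encrypt(
    report,
    padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    ),
)

# Only the agency's private key can recover the report.
plaintext = agency_private_key.decrypt(
    ciphertext,
    padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    ),
)
assert plaintext == report
```

In a real deployment larger payloads would use a hybrid scheme (encrypting the data with a symmetric key, and only that key with the agency’s public key), but the principle is the same: the server passes along data it cannot read.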

How policy and technological choices combine can be seen in different approaches to contact-tracing apps.

While some countries looked to store all data in a central server, mobile operating system designers Apple and Google have pushed a more decentralized option. This offers more privacy protection, as contact data stays on each phone rather than being uploaded by default.

But this also raises questions, with the two major companies – previously revealed to be part of US intelligence agencies’ mass-surveillance scheme Prism – effectively dictating to democratic governments what form of contact-tracing apps they can use in their public-health response.
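
To see why the decentralized design is more privacy-preserving, consider this highly simplified sketch of on-device matching. It is not the actual Apple/Google protocol, which uses rotating cryptographic keys, Bluetooth signal strength and much else; all names here are invented. But it shows the key property: the log of identifiers your phone has heard never leaves the device, and only the random keys of users who test positive are ever published.

```python
# Simplified illustration of decentralized exposure matching (not the
# real Apple/Google Exposure Notification protocol; names are invented).
import hashlib
import secrets

def new_daily_key() -> bytes:
    """Each phone generates a random key; it stays on the device unless
    the user tests positive and chooses to publish it."""
    return secrets.token_bytes(16)

def rolling_ids(daily_key: bytes, intervals: int = 144) -> list:
    """Derive short-lived identifiers the phone broadcasts over
    Bluetooth, one per ten-minute interval of the day."""
    return [
        hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
        for i in range(intervals)
    ]

def check_exposure(published_keys: list, heard_ids: set) -> bool:
    """Matching happens on the phone itself: the locally stored log of
    heard identifiers is compared against the published keys, and no
    contact list is ever uploaded."""
    return any(
        rid in heard_ids
        for key in published_keys
        for rid in rolling_ids(key)
    )

# Example: phone B heard one of phone A's broadcasts; A later tests positive.
key_a = new_daily_key()
heard_by_b = {rolling_ids(key_a)[42]}       # logged locally on B
print(check_exposure([key_a], heard_by_b))  # True - B is notified on-device
```

A centralized design, by contrast, uploads the heard-identifiers log to a government-run server, which can then reconstruct who met whom.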

Hacking the system

Where safeguards are not in place there is another option: subversion.

When the Indian government made the Aarogya Setu contact-tracing app effectively compulsory (in a country where many do not own smartphones), one Bangalore software engineer hacked it so that it would still display the required green badge even after its tracking functions had been removed.

Recognizing the tension here between privacy and public interest, and not wanting to undermine efforts to isolate the virus, he shared it only with close friends.

Artists have experimented with make-up and fashion accessories that disrupt facial recognition. And for protesters in the UK, newly widespread mask-wearing has effectively suspended previous police efforts to ban face coverings as part of protest surveillance.

In many ways the pandemic is simply turbo-charging the already-expanding state-corporate data collection complex.

From Gmail to Facebook, major companies already hold data giving detailed insights into our lives and relationships; smartphones add location history. Now they want to cash in on expanded online learning and healthcare.

Using open-source programs – think Firefox, LibreOffice, Linux – or online services run by trustworthy non-corporate organizations (Wikimedia, Riseup) can help: they offer transparency about what they do with our data and often stronger privacy protections.

However, getting away from social technologies like Facebook is a taller order for many.

Overall, we need to bring vital digital infrastructure under democratic control – decentralizing it where possible, ensuring transparency over how data is used, and looking to both regulation and forms of public and community ownership.

The pandemic has demonstrated that our digital infrastructure, vital this year for communication, work and organizing under lockdown, is too important to be left in the hands of billionaires.

Nor should we let it become a tool for authoritarian leaders to centralize the monitoring of populations in ways that go beyond Orwell’s wildest nightmares: tracking devices not just in every room but in every pocket.