Surveilling encounters

Here’s Carol Yin detailing how her movements have been tracked across China since the lockdown came into force. Upon entering a train station, she has had to share her location data from the past few weeks. When booking a taxi, she needs to scan a QR code generated by WeChat or Alipay to “check in”. The same applies to taking public transport or accessing any building. The tracking is done via a combination of QR codes and location data from the phone providers.

The code China assigns to each citizen — red, yellow or green — reflects their contagion risk.

Israel is tapping into cellphone data, nothing fancy.

Taiwan set up an ‘electronic fence’: your phone determines whether you are respecting the boundaries of the quarantine or not. Authorities are alerted if you switch it off or as soon as you leave the designated space.

In reality China’s system is far more confusing and less centralised than you might have read. There are at least four competing health codes generated by different entities (city, province, community, and app codes). Each follows its own rules. You might never find out why you were assigned the code you got.

South Korea is throwing in the mix a little bit of everything: CCTV surveillance, bank transaction logs, mobile phone usage. Big data! Hurray!

Here’s a website with data released by the Ministry of Health of Singapore. You can see every known infection case, down to every movement and every connection a case had. It’s alright because it’s anonymised. Sure.

Hong Kong is slapping wristbands on travellers arriving at its airport. The wristband connects to a smartphone app, StayAtHomeSafe. It generates a unique fingerprint of your house by looking at the signals emitted by the devices around you — nearby WiFi, your WiFi, Bluetooth and cellular. “As you walk around the home, the algorithm on the app will sample the signals of the home.”
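If you’re curious what a signal-based “home fingerprint” might look like, here’s a toy sketch in Python. It is pure illustration: the identifiers, the overlap threshold and the whole approach are my own assumptions, not how StayAtHomeSafe actually works.

```python
# Toy sketch of a signal-based "home fingerprint" (not the real app's algorithm).
# A scan is just the set of identifiers the phone can see: WiFi BSSIDs,
# Bluetooth MACs, cell tower IDs.

def fingerprint(scan: set[str]) -> set[str]:
    """The baseline fingerprint is simply the set of signals seen at enrolment."""
    return set(scan)

def still_at_home(baseline: set[str], current_scan: set[str], threshold: float = 0.5) -> bool:
    """Consider the phone 'at home' if enough of the baseline signals are still visible."""
    if not baseline:
        return False
    overlap = len(baseline & current_scan) / len(baseline)
    return overlap >= threshold

# Example with made-up identifiers:
home = fingerprint({"wifi:aa:bb:cc", "wifi:11:22:33", "bt:fridge", "cell:4521"})
print(still_at_home(home, {"wifi:aa:bb:cc", "bt:fridge", "cell:4521"}))  # True
print(still_at_home(home, {"wifi:coffee-shop", "cell:9999"}))            # False -> flag to authorities
```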

Palantir is doing well. “The software company is in discussions with authorities in France, Germany, Austria and Switzerland.”

Singapore’s solution to contact tracing is an app called TraceTogether. The app creates a temporary ID by encrypting a user ID with a public key owned by the Ministry of Health, and then broadcasts the temporary ID over Bluetooth. The Ministry of Health acts as a trusted third party (it can decrypt those IDs) and promises it will only use the information for COVID-19 related purposes.
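As I read that description, the scheme boils down to “encrypt something only the Ministry can decrypt, and broadcast the ciphertext”. Here’s a rough sketch of that idea; the key size, payload format and function names are mine, and the real app’s protocol differs in its details.

```python
# Rough sketch of the temporary-ID idea: phones only ever see the public key,
# so the broadcast IDs are opaque to everyone except the Ministry of Health.
import json, time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# In reality the public key would ship with the app; here we generate a throwaway pair.
moh_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
moh_public_key = moh_private_key.public_key()

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def make_temp_id(user_id: str, validity_seconds: int = 900) -> bytes:
    """Encrypt (user_id, validity window) with the MoH public key."""
    payload = json.dumps({
        "uid": user_id,
        "start": int(time.time()),
        "end": int(time.time()) + validity_seconds,
    }).encode()
    return moh_public_key.encrypt(payload, OAEP)

# The phone broadcasts `temp_id` over Bluetooth; nearby phones log it, but to them
# it is just opaque bytes. Only the trusted third party can reverse it:
temp_id = make_temp_id("user-12345")
print(json.loads(moh_private_key.decrypt(temp_id, OAEP)))
```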

You’ll have noticed how some of these solutions are trying to do two things at once:

  • Help citizens with contact tracing
  • Help authorities check whether the population is complying with the lockdown

Let’s set the latter aside (I hope we won’t need or want to surveil). Here in the west we’ve got plenty of tools to self-diagnose our risk, yet we’re missing a widely adopted system for contact tracing. If we want to go back to normality (where normality here simply means: going outside the house), it seems likely that we’ll need some form of digital surveillance. Emphasis on likely: I am not in a position to weigh in on the efficacy of contact tracing — I know nothing — all I can say is that it seems to be a valuable tool if paired with other, non-technical measures.

That said, I worry that we’re going to do what we usually do when in panic mode: introduce purportedly temporary surveillance that ends up staying. We might adopt despotic tech, willingly, because it makes us feel safe without having evidence of any actual benefit. As before, we need to balance our need for security with some level of freedom.

It seems that we need:

  • A privacy-preserving system to track encounters. Using Bluetooth Low Energy (BLE) to detect nearby devices (= humans) seems to make the most sense to me. There are doubts about whether location tracking — done via GPS or phone carriers — can offer a meaningful contribution to defeating the virus. We’re talking about maintaining a 2-meter distance here: GPS accuracy is around 5 meters. We don’t need to know the coordinates, but rather the proximity to other devices. Proximity tracking seems to matter more.
  • If location is important (e.g. we want to notify everyone who has recently been in a listed hotspot, be it the tube, a public park, or elsewhere), guess what: retailers have been surveilling you for a while. You could use beacons in public spaces and WiFi signals to let each smartphone log its visits locally. The smartphone could then check its recorded path against a hotspot database. No information needs to leave the device (this is MIT’s PrivateKit). There’s a toy sketch of this keep-it-on-the-device idea after this list.
  • We probably don’t want to share our location data with third parties unless we become infected. We want to collect it locally until it makes sense to share (part of) it. Existing health apps (in the UK: the NHS app, or third parties that work with them such as Babylon) could gain access to this data in the same way they request access to the health database
  • A system to alert every user who came into close range with a confirmed case for an extended period
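Here’s that toy sketch of the keep-it-on-the-device idea from the first two points: log encounters and visited beacons locally, and only compare them against published data on the phone itself. Every identifier, timestamp and data structure below is invented for illustration; this is not PrivateKit’s code.

```python
# Toy sketch of on-device logging: proximity encounters and visited places never
# leave the phone; the phone only compares them against published data locally.
from dataclasses import dataclass, field

@dataclass
class LocalLog:
    encounters: list[tuple[str, float]] = field(default_factory=list)  # (anonymous_id, timestamp)
    places: list[tuple[str, float]] = field(default_factory=list)      # (beacon_or_wifi_id, timestamp)

    def log_ble_encounter(self, anonymous_id: str, timestamp: float) -> None:
        """Record a nearby device seen over Bluetooth Low Energy."""
        self.encounters.append((anonymous_id, timestamp))

    def log_place(self, beacon_id: str, timestamp: float) -> None:
        """Record a beacon/WiFi identifier seen in a public space."""
        self.places.append((beacon_id, timestamp))

    def check_against_hotspots(self, hotspots: dict[str, tuple[float, float]]) -> list[str]:
        """Compare the locally stored path against a published hotspot list.
        `hotspots` maps a beacon id to the (start, end) window in which it was risky.
        Nothing is uploaded: the comparison happens on the device."""
        hits = []
        for beacon_id, ts in self.places:
            if beacon_id in hotspots:
                start, end = hotspots[beacon_id]
                if start <= ts <= end:
                    hits.append(beacon_id)
        return hits

log = LocalLog()
log.log_place("beacon:oxford-circus", timestamp=1_585_000_000)
print(log.check_against_hotspots({"beacon:oxford-circus": (1_584_990_000, 1_585_010_000)}))
```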

Governments and health authorities should explain in detail what data they’re using and why. Most governments’ apps are asking for name, sex, birth year, residence, travel history and a plethora of other unnecessary information. If this system ends up determining one’s ability to roam freely, you’ll want to know why you can’t leave the house.

Ideally, we wouldn’t get an app. This should be something baked into the OS. Google and Apple should provide privacy settings for contact tracing: that would give us a universal system to collect this kind of data locally and securely. Besides, the utility of such a system is nil unless everyone uses it. A pandemic is global: there needs to be a global way of dealing with it.

It is possible to build a system for contact tracing that is also privacy-preserving. Apple does something similar, albeit for other purposes. And there’s already a proposed protocol, the PEPP-PT:

  • Assign a unique and anonymous ID to every device
  • When two devices come in close contact for an extended period of time, exchange and log the IDs
  • When someone is diagnosed with the virus, alert all the logged IDs
  • Then and only then: ask the affected IDs, via an app, to self-monitor continuously, and if they report symptoms get them tested (ideally even if they don’t)
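As a toy model of that flow, the sketch below generates anonymous IDs, logs mutual contacts and flags the logged IDs when one device reports a diagnosis. Names and data structures are invented; the real protocol also rotates IDs, verifies positive reports and runs over Bluetooth.

```python
# Minimal toy model of the exchange-and-alert flow described above.
import secrets

class Device:
    def __init__(self):
        self.anonymous_id = secrets.token_hex(16)  # unique, meaningless identifier
        self.contact_log: set[str] = set()         # IDs seen nearby for long enough
        self.alerted = False

    def record_contact(self, other: "Device") -> None:
        """Called when two devices have been in close range for an extended period."""
        self.contact_log.add(other.anonymous_id)
        other.contact_log.add(self.anonymous_id)

def alert_contacts(diagnosed: Device, all_devices: list[Device]) -> None:
    """When someone tests positive, notify every logged ID. The health authority
    only ever learns 'these IDs should get tested', nothing else."""
    for device in all_devices:
        if device.anonymous_id in diagnosed.contact_log:
            device.alerted = True  # in practice: push a notification to self-monitor / get tested

alice, bob, carol = Device(), Device(), Device()
alice.record_contact(bob)          # Alice and Bob sat next to each other on the tube
alert_contacts(alice, [bob, carol])
print(bob.alerted, carol.alerted)  # True False
```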

You’ll notice that there is no leak of data to the government under this scenario. All the government knows is that an ID needs to be tested.

Especially if the problem is here to stay for a while, we need a solution that doesn’t permanently compromise our freedom. We also need something that all of us can use and trust, independently of the country we inhabit.

A lot of the tools above — like tracking GPS movements — seem unnecessary. Let’s not scramble together a solution by throwing random data into the mix. An app is not going to save us. All of this is going to be pointless if the more essential pieces of the puzzle (like testing) are not there.

Above all, don’t demand surveillance, because no one is going to turn it off when this is over.

The EFF and McSweeney’s have teamed up to produce The End of Trust, an issue of McSweeney’s dedicated to technology, privacy, and surveillance. It can be read online for free.

Arvind Narayanan:

I will focus the rest of my talk on this third category [predicting social outcomes], where there’s a lot of snake oil.

I already showed you tools that claim to predict job suitability. Similarly, bail decisions are being made based on an algorithmic prediction of recidivism. People are being turned away at the border based on an algorithm that analyzed their social media posts and predicted a terrorist risk. […]

Compared to manual scoring rules, the use of AI for prediction has many drawbacks. Perhaps the most significant is the lack of explainability. Instead of points on a driver’s license, imagine a system in which every time you get pulled over, the police officer enters your data into a computer. Most times you get to go free, but at some point the black box system tells you you’re no longer allowed to drive.

A BBC reporter decided to test the efficiency of the Chinese government’s surveillance infrastructure: after getting himself registered in the database (which contains every citizen), he went for a stroll around Guiyang, waiting for the police to identify him sooner or later through the cameras scattered across the city. There are 170 million CCTV cameras in China; the government plans to install another 400 million over the next three years.

It took less than seven minutes to find him.

An application that gains access to an iPhone’s photo library can, by analysing the geolocation data of the photos, indirectly learn a user’s movements over the past several years.

Felix Krause demonstrated this with a very simple application:

Does your iOS app have access to the user’s image library? Do you want to know your user’s movements over the last several years, including what cities they’ve visited, which iPhones they’ve owned and how they travel? Do you want all of that data in less than a second? Then this project is for you!
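The underlying trick is mundane: photos carry EXIF metadata, and EXIF metadata carries GPS coordinates. Here’s a minimal sketch of the extraction step, with a placeholder file path; Krause’s tool goes much further, clustering photos into trips and timelines.

```python
# Minimal sketch: an app with photo-library access can read each photo's EXIF
# metadata and recover the GPS coordinates embedded by the camera.
from PIL import Image
from PIL.ExifTags import GPSTAGS

def gps_from_photo(path: str):
    """Return (latitude, longitude) from a photo's EXIF data, or None if absent."""
    exif = Image.open(path).getexif()
    gps_ifd = exif.get_ifd(0x8825)  # 0x8825 is the GPSInfo IFD
    if not gps_ifd:
        return None
    gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}

    def to_degrees(dms, ref):
        deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -deg if ref in ("S", "W") else deg

    return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
            to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

# Placeholder path; iterate over the whole library and sort by timestamp
# and you've reconstructed someone's travel history.
print(gps_from_photo("IMG_0001.jpg"))
```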

Any European citizen can request a report of the data a service has collected about them. Judith Duportail, of the Guardian, did it with Tinder:

Some 800 pages came back containing information such as my Facebook “likes”, my photos from Instagram (even after I deleted the associated account), my education, the age-rank of men I was interested in, how many times I connected, when and where every online conversation with every single one of my matches happened … the list goes on.

Maciej Ceglowski, on Hacker News, worries about a consequence Face ID could have for privacy: how it might normalise the idea of a phone scanning our face every second while we use it.

Just as none of us worries any more about a phone that knows where we are at every moment, perhaps one day we will no longer be troubled by a phone that constantly watches us. Apple is very careful about its users’ privacy, but other companies, whose business model is based on advertising, might be tempted to exploit the sensor to get an even more detailed picture of our reactions and behaviour:

When you combine this with business models that rely not just on advertising, but on promises to investors around novelty in advertising, and machine learning that has proven extremely effective at provoking user engagement, what you end up with is a mobile sensor that can read second-by-second facial expressions and adjust what is being shown in real time with great sophistication. All that’s required is for a company to close the loop between facial sensor and server.

Roomba, the little robot that cleans your house, collects not only dust but also quite a bit of interesting data about the house itself, mapping it and building a picture of how it is laid out thanks to the laser sensors it uses to avoid obstacles.

The company that makes it has realised that while dust isn’t profitable, this data could prove very useful in the development of ‘smart’ home devices, and it is thinking about selling it to third parties:

That data is of the spatial variety: the dimensions of a room as well as distances between sofas, tables, lamps and other home furnishings. To a tech industry eager to push “smart” homes controlled by a variety of Internet-enabled devices, that space is the next frontier. […]

With regularly updated maps, Hoffman said, sound systems could match home acoustics, air conditioners could schedule airflow by room and smart lighting could adjust according to the position of windows and time of day.

Shanghai has launched a disturbing app that automatically assigns each citizen a behaviour score, calculated from the aggregated data collected by the government:

Here’s how the app works: You sign up using your national ID number. The app uses facial recognition software to locate troves of your personal data collected by the government, and 24 hours later, you’re given one of three “public credit” scores — very good, good, or bad.

Shao says Honest Shanghai draws on up to 3,000 items of information collected from nearly 100 government entities to determine an individual’s public credit score.

Something like this is dangerous, especially if the algorithms making these decisions remain opaque to the citizen.

Wikileaks has released more than 8,000 documents from the CIA describing the agency’s ability to access and take control of the microphones and cameras of smartphones, smart TVs, computers, etc. without their users noticing.

The Washington Post writes:

In the case of a tool called “Weeping Angel” for attacking Samsung SmartTVs, Wikileaks wrote, “After infestation, Weeping Angel places the target TV in a ‘Fake-Off’ mode, so that the owner falsely believes the TV is off when it is on. In ‘Fake-Off’ mode the TV operates as a bug, recording conversations in the room and sending them over the Internet to a covert CIA server.”

Since this malware targets the operating system/device rather than the application the user is running, the CIA has essentially been able to bypass the security of Signal, WhatsApp, Weibo and Telegram.

This type of attack is different in nature from the ones previously revealed by Snowden: those were about mass surveillance, not tied to a specific person or device, while these focus on targeted devices.

In a sad way, and only to a small extent, the leak might not be entirely bad news: it shows that end-to-end encryption works, and that it is pushing the CIA towards more targeted attacks against specific people, rather than indiscriminately collecting citizens’ data.

Signal, an application for sending encrypted messages and making encrypted calls, is being blocked in Egypt and the United Arab Emirates.

The app’s team has found a way to get around the block, so that messages are sent and received anyway:

Signal’s new anti-censorship feature uses a trick called “domain fronting,” Marlinspike explains. A country like Egypt, with only a few small internet service providers tightly controlled by the government, can block any direct request to a service on its blacklist. But clever services can circumvent that censorship by hiding their traffic inside of encrypted connections to a major internet service, like the content delivery networks (CDNs) that host content closer to users to speed up their online experience — or in Signal’s case, Google’s App Engine platform, designed to host apps on Google’s servers.

“Now when people in Egypt or the United Arab Emirates send a Signal message, it’ll look identical to something like a Google search,” Marlinspike says. “The idea is that using Signal will look like using Google; if you want to block Signal you’ll have to block Google.”

The trick works because Google’s App Engine allows developers to redirect traffic from Google.com to their own domain. Google’s use of TLS encryption means that contents of the traffic, including that redirect request, are hidden, and the internet service provider can see only that someone has connected to Google.com. That essentially turns Google into a proxy for Signal, bouncing its traffic and fooling the censors.

(Via Bruce Schneier)
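To make the trick a bit more concrete, here’s a bare-bones illustration of domain fronting: the TLS connection (and so the name the censor sees) goes to the front domain, while the encrypted Host header names the real destination. The hostnames below are placeholders, not Signal’s actual endpoints, and several providers have since disabled the technique.

```python
# Illustrative sketch of domain fronting (hostnames are made up; this shows the
# general technique, not Signal's real setup).
import http.client

FRONT_DOMAIN = "www.google.com"              # what the censor sees in DNS and the TLS SNI
HIDDEN_SERVICE = "example-app.appspot.com"   # the real destination, only visible inside TLS

conn = http.client.HTTPSConnection(FRONT_DOMAIN)            # TCP + TLS handshake with the front
conn.request("GET", "/", headers={"Host": HIDDEN_SERVICE})   # encrypted Host header picks the backend
response = conn.getresponse()
print(response.status)
```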

The Electronic Frontier Foundation has bought a page in Wired to urge tech companies to get ready for Trump

If you want to give yourself a geeky present for Christmas, sign up for a monthly donation to the EFF. I did it just a year ago, $10 a month: they even send you a lovely t-shirt to thank you.

(via Boing Boing)

The United Kingdom will keep its citizens’ browsing history for 12 months

How nice, we’re all finally a little safer (apart from those few bastards who have things to hide from us). Things are blowing up everywhere these days, so it is only right, in our anguish, to put our trust in strong men and pass laws that gradually strip citizens of their freedom while having no effect whatsoever on national security.

The Guardian:

The new surveillance law requires web and phone companies to store everyone’s web browsing histories for 12 months and give the police, security services and official agencies unprecedented access to the data.

It also provides the security services and police with new powers to hack into computers and phones and to collect communications data in bulk. The law requires judges to sign off police requests to view journalists’ call and web records, but the measure has been described as “a death sentence for investigative journalism” in the UK.

Electronic Frontier Foundation:

If Mr. Trump carries out these plans, they will likely be accompanied by unprecedented demands on tech companies to hand over private data on people who use their services. This includes the conversations, thoughts, experiences, locations, photos, and more that people have entrusted platforms and service providers with. Any of these might be turned against users under a hostile administration.

From allowing anonymous access to a site (without forcing people to use their first and last name when registering), to deleting the data collected in the background while browsing, if it really has to be collected in the first place.

Maciej Cegłowski gave similar suggestions during one of his talks.

The Intercept has put together a list of small steps you can take to make your data a bit safer, given the recent results and what lies ahead.

If you want a more comprehensive guide, the one put together by the Electronic Frontier Foundation is the one for you.