Facial recognition per se isn’t actually that controversial. The smartphone in your pocket can recognise a face when you point it at a scene. It’s why your selfies are in focus.
When people say they’re worried about facial recognition, what they’re usually scared of is two things: facial profiling and facial matching, as I found out when I spent a year following the technology’s rapid expansion across the UK for BBC News.
Facial profiling means discerning things like age, gender and even mood. It's already being done in an advertising context, but as I discovered, it's now being marketed for the deeply controversial use of ethnicity detection.
Facial matching, however, is the real issue. Put simply, facial recognition systems reduce photos of your face to a string of numbers, which uniquely identifies you. Just as your NHS number links you to a stash of health data, and your bank account number links to records of your accounts, your face ID can link you to a store of your information.
Think of all the places your face appears, and the data it's linked to: your social media (which contains your friends, family, and possibly your home address), your work profile (which contains your colleagues, place of work, and CV), and every CCTV camera you've walked past (which contains your movements, locations, and possibly your shopping history).
Your face ID is potentially a golden thread linking all these stores of data.
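The matching step described above can be sketched in a few lines of code. This is purely illustrative: real systems use a neural network to turn a photo into an "embedding" vector, whereas here `embed_face` is a made-up stand-in that just produces a stable vector per person. The names, threshold, and 128-dimensional size are assumptions, not any vendor's actual system.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_face(person_id: int, noise: float = 0.05) -> np.ndarray:
    """Hypothetical stand-in for a face-embedding model: different
    photos of the same person yield nearly identical vectors."""
    base = np.random.default_rng(person_id).normal(size=128)
    v = base + rng.normal(scale=noise, size=128)  # photo-to-photo variation
    return v / np.linalg.norm(v)  # unit-normalise the vector

# A watch-list is just a table of names against stored face vectors.
watchlist = {name: embed_face(i) for i, name in enumerate(["alice", "bob"])}

def match(probe: np.ndarray, threshold: float = 0.9):
    """Return the closest watch-list entry if its similarity
    to the probe face clears the threshold; otherwise no match."""
    best = max(watchlist, key=lambda name: watchlist[name] @ probe)
    score = watchlist[best] @ probe  # cosine similarity of unit vectors
    return best if score >= threshold else None

print(match(embed_face(0)))  # a fresh photo of person 0 matches "alice"
print(match(embed_face(7)))  # an unknown face matches no one: None
```

The point of the sketch is the "string of numbers" idea: once a face is a vector, matching it against a database of any size is a fast numerical comparison, which is what makes scanning a crowd against a large watch-list feasible.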
One problem: unlike your NHS number or bank account number, your face is public and you can't hide it from facial recognition. If you try to, here's what can happen:
This man hid his face from the cameras during a Metropolitan Police trial of facial recognition in Romford. He was stopped; an argument ensued and he was given a £90 fine for disorderly behaviour.
The police are deeply interested in this technology. They argue that it’s nothing new: officers have always known the “usual suspects” on their beat.
What this argument misses is the meteoric growth in the speed and scale of facial recognition tech. It’s no longer about an officer looking for a few suspects among a crowd. Modern software can scan hundreds of thousands of faces and compare them to huge watch-lists, instantly.
Police trials of facial recognition have been closely scrutinised. Private industry’s use less so. Facewatch, one of the companies we interviewed for the report, is building up a database of alleged wrongdoers, whose images have been submitted by businesses who sign up to its service.
Facewatch says anyone who thinks they've been wrongly added to the database can appeal to the company to be removed. But to do that, you have to know or suspect that you're on the database, and since the company doesn't make it public, that creates a Catch-22 situation.
And that’s the real issue behind facial matching: if your face is now being used to access stores of data about you, who’s controlling those stores? How accurate are they? And how will you ever find out?
All in all, this is a worrying new frontier in the Wild West of personal data exploitation.