[This post was updated on 22nd May 2018 to include a response from NtechLab]
Late on Friday I tweeted the following screenshot:
When I came back after the weekend it had been retweeted 11,000 times, which is a new experience for me.
The company concerned is the Russian firm NtechLab. They’re very good at facial recognition, by all accounts, having won several competitions. And they were behind the controversial FindFace software that promised to match social media profiles simply from a photo taken of a subject.
Watching the debate following my tweet has been fascinating, and I thought it’d help to break down the different responses and discuss them:
1. “Arabic is a language, not an ethnicity!”
This was by far the most common response. On the one hand, it seems unfair to criticise NtechLab for their clumsy language usage. But on the other hand, in an area as fraught as ethnicity, language really matters and the website owners should have got this right, which feeds into response two…
2. “Techies need to see the wider picture”
Otherwise known as: “just because you can, doesn’t mean you should”. A lot of the responses picked up on the fact that tech developers often seem oblivious to the wider ramifications of their inventions. The problem with this argument is: should innovators refrain from inventing new tech on the basis of potential downsides, thereby depriving society of potential upsides?
(Incidentally, I tried to think of positive uses for NtechLab’s software, and I did come up with one: diversity monitoring for businesses. But it’s a pretty weak example, and doesn’t hold up well against the considerable concern felt by many Twitter respondents, particularly those from minority ethnic communities).
3. “Ethnicity isn’t about skin colour”
From a technical standpoint, NtechLab’s offering isn’t that radical (image recognition software can distinguish colours quite easily, and that’s exactly what’s going on here: the tech is simply distinguishing between skin tones).
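To see how little is needed for this kind of “recognition”, here is a deliberately toy sketch (this is my illustration, not NtechLab’s actual method, and the pixel values are hypothetical): average the colours in a face crop and bucket the result by lightness. Anything more sophisticated a real system does is still, at bottom, mapping colour statistics to a label.

```python
# Toy illustration only (not NtechLab's actual method): "classify" a face
# crop by averaging its pixel colours and bucketing by perceived lightness.

def mean_rgb(pixels):
    """Average the R, G and B channels over a list of (r, g, b) tuples."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def tone_bucket(pixels):
    """Bucket a face crop by luma (ITU-R BT.601 weights); thresholds are arbitrary."""
    r, g, b = mean_rgb(pixels)
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    if luma > 170:
        return "light"
    elif luma > 85:
        return "medium"
    return "dark"

# Two hypothetical 2x2 face crops:
print(tone_bucket([(220, 190, 170)] * 4))  # -> light
print(tone_bucket([(90, 60, 45)] * 4))     # -> dark
```

Note that nothing here knows anything about race or ethnicity; it is pure colour arithmetic, which is the point.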
But labelling colour recognition as “ethnicity recognition” is another example of sloppy language use by NtechLab.
As the Twitter responses show, there is considerable debate over how “race” and “ethnicity” are defined (the one thing everyone seemed to agree on was that neither term is solely about skin colour, and for some skin colour is irrelevant).
That’s why I was hesitant about the tweets calling NtechLab’s software “racist”; strictly speaking, NtechLab’s software is “colourist”, since it’s not capable of distinguishing race and ethnicity.
Of course, the tools could be used in a racist way, but perhaps NtechLab would argue that’s up to its clients to decide (which goes back to point 2, above).
(I’m painfully aware at this point that race and ethnicity aren’t my area of specialism).
In the long run, is this technology going to be useful? I think we can safely say it can’t do what it says on the tin, because it’s not actually recognising “ethnicity”. Really then, how useful is it to know skin colour? (Especially given the number of Twitter respondents who reported being racially categorised in a bewildering variety of ways thanks to their mixed heritage).
A spokesperson for NtechLab blamed a “communication and localization issue” and said: “We deeply apologize that the information on our website regarding ethnicity recognition has caused people to respond negatively to our AI technology.
“Our company doesn’t detect racial ethnicity, we recognize and celebrate ethnic diversity across all demographic groups.”
“We… will make due corrections on the web page to resolve this communication issue as soon as possible.”
However, as at 22nd May 2018, NtechLab’s website still includes “Ethnicity Recognition” among its upcoming projects. The photo and description have been removed.