The use of Clearview AI by law enforcement in Canada was widespread. A leaked document revealed that the RCMP, the Vancouver Police Department, the Edmonton Police Service, the Calgary Police Service, the Toronto Police Service, and the Halifax Regional Police, among others, had all been clients of Clearview AI.1-3
That means, with a single uploaded photo and the click of a button, law enforcement could run a virtual police lineup against your images. This effectively makes everyone a suspect.
While Clearview AI recently stopped offering its services in Canada due to an ongoing investigation by the Office of the Privacy Commissioner of Canada, your photos remain in its database and are still available to international law enforcement agencies.
Why does that matter? In the United States, this same technology led to the arrest of two men for crimes that they didn’t commit.4-5
In the first case, a man was arrested after arriving home from work. It happened on his front lawn in front of his wife and young daughters. He was held overnight in police custody for a crime that he didn’t commit.
That’s not an isolated problem with a single piece of software, but an issue with facial recognition technology as a whole. Scientific studies have shown that leading facial recognition systems misidentify women and people of colour at significantly higher rates.6
Because of this, Amazon, Microsoft, and IBM have all imposed moratoriums on the use of their facial recognition technology by law enforcement.7 But there’s always going to be a company willing to put profits ahead of ethics, like Clearview AI.
Find out if you’re part of their database and ask them to delete your data by using the form.