Clearview AI facial recognition worrisome for black, Indigenous, racialized groups, expert says

Legal scholars at the University of Windsor have expressed concerns over Windsor police's use of a controversial artificial intelligence-based facial recognition tool.

Windsor police acknowledged their use of Clearview AI earlier this week

Windsor police confirmed via email that fewer than 10 officers have used Clearview AI facial recognition software since October 2019. (Eric Risberg/Associated Press)

Legal minds at the University of Windsor have expressed concerns over Windsor police's use of a controversial artificial intelligence-based facial recognition tool, saying that the software is especially worrisome for black, Indigenous and racially marginalized communities.

Windsor police on Tuesday confirmed in an email statement that the service is aware of fewer than 10 members who have used Clearview AI's facial recognition tool since October 2019, adding that the service "is currently in the process of determining if there are any additional members who accessed the software."

The tool works by scraping publicly available information about an individual online, including names, phone numbers and addresses, based on little more than a photograph.
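For readers curious about the mechanics, systems of this kind generally convert each face photo into a numeric "embedding" vector and then rank a database of scraped, labelled photos by similarity to the query image. The short Python sketch below illustrates that general idea only; the embed() placeholder and the toy database are illustrative assumptions, not Clearview's disclosed design.

```python
# Illustrative sketch only: Clearview's implementation is proprietary, and
# embed() below is a hypothetical stand-in for a trained face-embedding
# neural network. The matching step itself is just a nearest-neighbour
# search over vectors.
import numpy as np

def embed(image_pixels: np.ndarray) -> np.ndarray:
    """Placeholder embedding: flatten the image and normalise it.
    A real system would return a learned, unit-length feature vector."""
    vec = image_pixels.astype(float).ravel()[:128]
    return vec / np.linalg.norm(vec)

def search(query_vec, database, top_k=3):
    """Rank identities (name -> embedding) by similarity to the query
    face. Because the vectors are unit length, the dot product equals
    cosine similarity; higher scores mean closer matches."""
    scores = {name: float(vec @ query_vec) for name, vec in database.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]

# Toy usage: match a query photo against a database of labelled face images.
rng = np.random.default_rng(0)
database = {f"person_{i}": embed(rng.random((16, 16))) for i in range(100)}
print(search(embed(rng.random((16, 16))), database))
```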

Widespread concern over the Clearview AI tool came to the fore earlier this year, after a New York Times investigation revealed that the software had scraped more than three billion photos from public websites like Facebook and Instagram to build a database used by more than 600 law enforcement agencies across the U.S. and Canada.

Among those agencies is the RCMP, which initially denied using Clearview AI's software, only to later acknowledge that the federal police service had been using the tool for months.

In its Tuesday email statement, Windsor police clarified that Chief Pam Mizuno ordered officers to stop using Clearview AI after a number of privacy commissioners across the country, including the federal privacy commissioner, launched investigations in February into the use of the tool in Canada.

Josh Lamers is a first-year law student at the University of Windsor, as well as an activist and community organizer. (Jacob Barker/CBC)

Josh Lamers, a first-year law student at the University of Windsor as well as a black community organizer and activist, said the use of facial recognition tools like Clearview AI by law enforcement is concerning for marginalized groups.

"Black, Indigenous and racialized communities are often overpoliced and over-surveilled, and these technologies are often built off of our experience," he said.

In addition to issues raised by the use of tools like Clearview AI, Lamers said he's concerned because facial recognition tools have historically done a poor job of identifying people of colour, especially black people.

Google, for example, was forced to issue an apology in 2015 after its Photos application mistakenly identified people with dark skin as "gorillas."

Black, Indigenous and racialized communities are often overpoliced and over-surveilled ... - Josh Lamers, Law student, University of Windsor

"Let's say I get mistaken for another person because this technology is so basically anti-black that it can't tell the difference between two black people," Lamers said. "What if I'm travelling and I get flagged for this and then I'm not able to get back into Canada?"

University of Windsor law professor Kristen Thomasen said the problem isn't just the use of facial recognition software by law enforcement; a larger issue is the wide distribution of tools like Clearview AI in the first place.

"The constitutional and legislative constraints on the state use of surveillance tools like facial recognition aren't perfect, but there are constraints there," Thomasen said. "A big issue with the wider distribution of this technology to commercial and especially to private individuals using facial recognition, is that the legal constraints become a little more challenging."

She added that enforcing those constraints becomes an additional problem.

Despite a watchful eye over city streets, privacy expert Kristen Thomasen said people expect a level of privacy while in public. (Tom Addison/CBC)

Following the publication of the New York Times story in January, Clearview AI published a blog post on its website explaining that the company's facial recognition software isn't intended for consumer use.

"Clearview's app is NOT available to the public," the company wrote. "While many people have advised us that a public version would be more profitable, we have rejected the idea."

Still, Thomasen pointed out that so-called "stalkerware" apps are already readily available, and are often marketed as allowing users to keep track of people who have such software installed on their personal devices.

"The thought of matching [stalkerware] with this kind of identification technology is quite terrifying, to be honest," she said.

At the same time, Thomasen said she's troubled by the lack of "forthcomingness" about how this technology is being used.

"We just don't have any answers," she said. "How are we taking into account the privacy interests of the people whose information was used to design the software and to train the software? How do we know that, if the victim is even identified, that their information is going to be protected?"

Windsor police said the service will "re-evaluate the use of facial recognition software for future investigations" once the federal privacy commissioner publishes the results of its investigation.

For his part, Lamers said he wants to see Windsor police take a lesson from places like San Francisco, which in 2019 became the first U.S. city to openly ban the use of facial recognition tools.

"It shouldn't take a privacy commissioner," he said. "Windsor police [should] just say 'We don't have to wait for a privacy commissioner to tell us not to use this technology. We're going to choose to take a step back away from it.'"

With files from Jacob Barker and Tony Doucette