
Did the London Police Service lie about the use of Clearview AI surveillance technology?

A former Ontario privacy commissioner says she's concerned the London Police Service may have lied to cover up its officers' use of Clearview AI.

Police have issued contradictory statements about the use of the unregulated technology

The London Police Service has issued contradictory statements about the use of Clearview AI by its officers. (Colin Butler/CBC)

Former Ontario privacy commissioner Ann Cavoukian says she's concerned the London Police Service may have lied to cover up its officers' use of Clearview AI, a controversial and unregulated surveillance tool that critics say has a tendency to deliver false matches, especially when it comes to visible minorities.

It comes after the Toronto Star reported the London Police Service is among a dozen police agencies in Canada, including the OPP and RCMP, that use the controversial face-matching application.

The software has the ability to match any photo against a database of three billion images scraped from millions of websites, potentially telling officers where you live, what you do and with whom you have a relationship.

However, the London Police Service has repeatedly denied it had any records relating to its use, even after multiple inquiries by CBC News, including a Freedom of Information request dating back to August of 2019.

'This information may have been actively concealed'

Former Ontario privacy commissioner Ann Cavoukian. (Onnig Cavoukian)

"My concern is that this information may have been actively concealed," Cavoukian said.

"I can't prove that, I have no evidence to that effect, but the fact that virtually every law enforcement agency that's been using this never fessed up to using it and never fessed up until it went public, that's very telling."

CBC News began asking about the use of such technology by London Police on Aug. 7, 2019, through a Freedom of Information request. The request asked for "any documents or correspondence" regarding the "possible or planned acquisition of facial recognition equipment or software and/or service agreements."

The London Police Service wrote back in September of 2019, refusing to confirm or deny the existence of such records. CBC News appealed the decision to the Ontario Information and Privacy Commissioner.

The case went to mediation and the London Police Service changed its stance, writing in a letter to CBC News dated Feb. 10, 2020 that "the records do not exist."

CBC News followed up with an email on Feb. 19, this time asking about Clearview AI by name. A police spokeswoman replied that officers do not use, nor have they ever used, Clearview's services.

Police admit software 'may have' been used

Visitors check their phones behind the screen advertising facial recognition software during the Global Mobile Internet Conference in Beijing in 2018. (Damir Sagolj/Reuters)

Eight days later, London Police changed its stance again, saying the controversial software "may have" been used by its officers.

"It has come to our attention that well intentioned officers may have accessed the technology that has been marketed and shared as a possible investigative tool amongst the law enforcement community," police wrote.

"To date the London Police Service as an organization has not engaged in the use of Clearview AI technology in any formalized manner."

The statement also noted London Police Chief Stephen Williams has given clear direction "that any use of such technology is to stop pending further review of the matter."

Why can't police seem to control their officers?

A screengrab from Clearview AI's webpage shows that in order to access the software, officers must send an email from a bona fide account. The police response to an FOI request filed by CBC News claimed the records 'do not exist.' (Clearview AI)

Cavoukian said that while she's disappointed by the response from law enforcement regarding Clearview AI, she does give credit to police leadership for acting responsibly.

"Most of the police chiefs across Canada whose law enforcement people, police were using this, weren't aware of it and when they learned about it, they stopped it," she said.

"One should perhaps question why they wouldn't know about this before and how is this possible, how junior police officers can just choose to do this on their own."

What also doesn't add up is why the Freedom of Information request filed by CBC News revealed no records.

In order to request access to Clearview AI's software, an interested officer would have to submit a professional email address to prove they're bona fide. The company then replies to the verified officer with a link to test drive the software.

If more than one London Police officer sent emails to Clearview AI, then why wasn't that correspondence revealed by a simple email search, let alone the Freedom of Information request filed by CBC News?

"The Service's response to the FOI request was based on request as submitted," police spokeswoman Const. Sandasha Bough wrote in an email."The London Police Service maintains that there are no responsive records to that request."

FOI expert says poor police administration a possibility

Ken Rubin, an Ottawa-based researcher and a Freedom of Information expert, said the fact the emails were missed could illustrate a lack of accountability or poor administration when it comes to record keeping by city law enforcement.

"There's a lack of accountability procedures. Records and what people are doing are not always accountable for in their management process," he said.

"It's done through emails or blackberries or whatever so you don't know what's being decided upon, what's being collected or exchanged and so if you don't have that you have a problem," he said.

Rubin said the case of Clearview AI illustrates a need for better regulation of digital surveillance technology, which is more widely used than most people think.

"It's not a matter of just a few rogue officers doing things out of turn and 'Oh we're sorry about that'airports and other places are using this sort of technology," he said."We have a serious problem."

"Some of the information and privacy commissioners are starting to realize that and so are other people."

A spokeswoman for the Ontario Privacy Commissioner said Brian Beamish would not be available for comment until early next week.