
CBSA to use facial recognition app for people facing deportation amid privacy concerns

The mobile reporting app would use biometrics to confirm a person's identity and record their location data when they use the app to check in.

Spokesperson confirmed an app called ReportIn will be launched this fall

A CBSA spokesperson confirmed that an app called ReportIn will be launched this fall. (Jeff McIntosh/Canadian Press)

The Canada Border Services Agency plans to implement an app that uses facial recognition technology to keep track of people who have been ordered to be deported from the country.

The mobile reporting app would use biometrics to confirm a person's identity and record their location data when they use the app to check in. Documents obtained through access to information indicate the CBSA proposed such an app as far back as 2021.

A spokesperson confirmed that an app called ReportIn will be launched this fall.

Experts are flagging numerous concerns, questioning the validity of user consent and potential secrecy around how the technology makes its decisions.

Each year, about 2,000 people who have been ordered to leave the country fail to show up, meaning the CBSA "must spend considerable resources investigating, locating and in some cases detaining these clients," says a 2021 document.

The agency pitched a smartphone app as an "ideal solution."

Getting regular updates through the app on a person's "residential address, employment, family status, among other things, will allow the CBSA to have relevant information that can be used to contact and monitor the client for any early indicators of non-compliance," it said.

"Additionally, given the automation, it is more likely that the client will feel engaged and will recognize the level of visibility the CBSA has on their case."

Plus, the document noted: "If a client fails to appear for removal, the information gathered through the app will provide good investigative leads for locating the client."

Visitors check their phones behind a screen advertising facial recognition software during a Global Mobile Internet Conference in Beijing. (Damir Sagolj/Reuters)

An algorithmic impact assessment for the project, which has not yet been posted on the federal government's website, said biometric voice technology the CBSA tried using was being phased out due to "failing technology," and that the agency developed the ReportIn app to replace it.

It said a person's "facial biometrics and location, provided by sensors and/or the GPS in the mobile device/smartphone" are recorded through the ReportIn app and then sent to the CBSA's back-end system.

Once people submit photos, a "facial comparison algorithm" will generate a similarity score to a reference photo.

If the system doesn't confirm a facial match, it triggers a process for officers to investigate the case.

"The individuals' location is also collected every time they report and if the individual fails to comply with their conditions," it said. The document noted individuals will not be "constantly tracked."

The app uses technology from Amazon Web Services. (Ivan Alvarado/Reuters)

The app uses technology from Amazon Web Services. That's a choice that grabbed the attention of Brenda McPhail, the director of executive education in McMaster University's public policy in digital society program.

She said while many facial recognition companies submit their algorithms for testing to the U.S. National Institute of Standards and Technology, Amazon has never voluntarily done so.

An Amazon Web Services spokesperson said its Amazon Rekognition technology is "tested extensively including by third parties like Credo AI, a company that specializes in Responsible AI, and iBeta Quality Assurance."

The spokesperson added that Amazon Rekognition is a "large-scale cloud-based system and therefore not downloadable as described in the NIST participation guidance."

"That is why our Rekognition Face Liveness was instead submitted for testing against industry standards to iBeta Lab," which is accredited by the institute as an independent test lab, the spokesperson said.

The CBSA document says the algorithm used will be a trade secret. In a situation that could have life-changing consequences, McPhail asked whether it's "appropriate to use a tool that is protected by trade secrets or proprietary secrets and that denies people the right to understand how decisions about them are truly being made."

Kristen Thomasen, an associate professor and chair in law, robotics and society at the University of Windsor, said the reference to trade secrets is a signal there could be legal impediments blocking information about the system.

There has been concern for years, she explained, that people who are subject to errors in such systems can be legally prevented from getting more information about them because of intellectual property protections.

Kristen Thomasen is an associate professor and chair in law, robotics and society at the University of Windsor. (Tom Addison/CBC)

CBSA spokesperson Maria Ladouceur said the agency "developed this smartphone app to allow foreign nationals and permanent residents subject to immigration enforcement conditions to report without coming in-person to a CBSA office."

She said the agency "worked in close consultation" with the Office of the Privacy Commissioner on the app. "Enrolment in ReportIn will be voluntary, and users will need to consent to both using the app, and the use of their likeness to verify their identity."

Petra Molnar, the associate director of York University's refugee law lab, said there is a power imbalance between the agency implementing the app and the people on the receiving end.

"Can a person really, truly consent in this situation where there is a vast power differential?"

If an individual doesn't consent to participate, they can report in-person as an alternative, Ladouceur said.

Thomasen also cautioned there is a risk of errors with facial recognition technology, and that risk is higher for racialized individuals and people with darker skin.

Molnar said it's "very troubling that there is basically no discussion of human rights impacts in the documents."

A CBSA document says the facial recognition algorithm that will be used will be a trade secret. (Jeff McIntosh/The Canadian Press)

The CBSA spokesperson said Credo AI reviewed the software for bias against demographic groups, and found a 99.9 per cent facial match rate across six different demographic groups, adding the app "will be continuously tested after launch to assess accuracy and performance."

The final decision will be made by a human, with officers overseeing all submissions, but the experts noted humans tend to trust judgments made by technology.

Thomasen said there is a "fairly widely recognized psychological tendency for people to defer to the expertise of the computer system," where computer systems are perceived to be less biased or more accurate.