In an effort to identify potential deportees, the federal government quietly tested facial recognition technology on millions of unsuspecting travellers at Toronto’s Pearson International Airport in 2016.
The six-month initiative, meant to pick out people the Canada Border Services Agency (CBSA) suspected might try to enter the country using fake identification, is detailed in a document obtained by The Globe and Mail through a freedom of information request. The project is the largest known government deployment of the technology in Canada to date.
As travellers walked through the international arrivals border control area at Pearson’s Terminal 3, 31 cameras captured images of their faces. Whenever the system returned a match against a 5,000-person list of previously deported people, a border officer would review the data and pass the traveller’s information along to an officer on the terminal floor, who would track the traveller down and pull them into a “secondary inspection.”
[Graphic: Facial recognition system setup at Toronto Pearson International Airport, Terminal 3. The CBSA-controlled area had 31 facial recognition cameras and 14 capture zones, covering the cross-border and international border control areas, automated border control kiosks, and adjoining escalators and stairs. Source: Face4 Systems Inc.]
Facial recognition has come under intense scrutiny in recent years. Experts have warned facial data, once collected, could lead to the permanent loss of a person’s anonymity in public. Much like a fingerprint, a face is highly identifiable and can’t easily be changed.
Research has shown facial recognition technology to be less accurate for non-white faces, in part because of assumptions that researchers build into face-matching algorithms. In the United States, several Black people have been misidentified by law enforcement and arrested on the basis of faulty facial recognition matches.
Details about the project, dubbed “Faces on the Move,” are scarce. But presentation slides posted online by Face4 Systems Inc., an Ottawa-based contractor hired by the CBSA to provide the technology and run the pilot, say it resulted in 47 “real hits” – travellers whose faces were matched against the CBSA’s database.
It is unclear if any travellers were deported following facial recognition matches.
In a statement, the CBSA told The Globe and Mail that matches were “processed in accordance with [the agency’s] operating procedures,” and later followed up to say “no individual was removed” as a result of the pilot. After being asked to clarify if that meant no one was deported during the project, the CBSA said facial recognition “would not have been the only indicator used in the traveller’s border clearance process or in determining their admissibility.”
According to Face4 Systems’ presentation, the technology was used on 15,000 to 20,000 travellers a day. The CBSA told The Globe that 2,951,540 travellers passed through border control at Pearson’s Terminal 3 between July and December, 2016, while the pilot project was running.
“I’m very concerned that the government chose to do this,” said Tamir Israel, a lawyer at the University of Ottawa’s Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic who has studied border agencies’ use of facial recognition.
“This was deployed in a context where there was no public discussion in advance, with a technology that’s known to have flaws in terms of both accuracy and, in particular, racial biases,” he said. “In such a high-stakes environment, that’s really concerning.”
While Mr. Israel said he was aware the CBSA had tested the technology, he did not know the project’s details until now.
The CBSA’s privacy impact assessment – a document designed to identify privacy risks in government operations – breaks down the agency’s procedures during the pilot.
When facial recognition turned up a possible match and triggered a CBSA inspection, the flagged traveller would not be told the name of the previously deported person with whom they’d been matched, nor shown that person’s photo, in order to protect that person’s privacy, the document says. If the CBSA confirmed the traveller was eligible for deportation, they would be “immediately deported without any judicial review,” according to the document – standard practice when someone isn’t allowed in at a port of entry, though legal bids to enter Canada would still be possible.
The CBSA did not put up signage within Pearson airport informing travellers the technology was being used, the privacy assessment says, as doing so could have led to “inaccurate information,” defeated the purpose of collecting the information or prejudiced the information being collected.
The agency opted instead to post a summary of the initiative to its website, but that overview intentionally did not specify which airport would be used for the initiative, when it would occur or what database the agency would be testing, according to the CBSA document.
Mr. Israel said he was concerned that travellers may have been identified – and possibly deported – without ever knowing they’d been picked out via facial recognition.
“They would not even know that they may need to challenge this,” he said.
In an e-mailed statement, CBSA spokesperson Jacqueline Callin said the agency “takes the issue of personal information and privacy seriously.”
During the project, “the CBSA explored the ability and use of cameras to capture images of travellers that were compared with an established operational database of high-risk individuals known to be inadmissible to Canada,” she continued.
Ms. Callin said the CBSA completed a privacy impact assessment, which was submitted to the Office of the Privacy Commissioner for review, and that at the end of the project facial recognition technology was removed from the airport and travellers’ images were erased. She also said the agency had no record of a complaint regarding the project, and that the CBSA had not used facial recognition at airports since the 2016 project and is not currently using it “in any other capacity.”
Face4 Systems president and CEO Robert Bell said in an e-mail that the initiative was “carefully designed to account for privacy and security at all levels.” Mr. Bell said the company took several steps to mitigate bias in its technology, including picking algorithms that minimize error. He also said the company keeps close tabs on the performance of its projects.
He added that “a few” organizations in Canada had run similar facial recognition tests “in a similar capture environment” using Face4 Systems’ technology, but that none were currently in service. According to the company’s website, Face4 Systems has done work outside of Canada for Manila International Airport in the Philippines and for the Eurostar rail service.
Facial recognition technology became a flashpoint last year after the New York Times revealed Canadian police forces had used Clearview AI, an American service that collected images en masse from public sources like Facebook, Google and YouTube.
The Royal Canadian Mounted Police, Ontario Provincial Police, Toronto Police Service and 45 other Canadian entities used Clearview AI before the company was forced to pull out of Canada following an outcry over privacy concerns. In a blunt February report, federal Privacy Commissioner Daniel Therrien said the service amounted to mass surveillance and was illegal.
Petra Molnar, a lawyer and associate director of the Refugee Law Lab at York University, said she was concerned that Ottawa’s use of facial recognition would further tilt the power imbalance that already exists at the border.
“The starting point has to be that this technology hurts people,” she said.
As a result of concerns about the technology, facial recognition has been banned by several U.S. cities, including Boston and Berkeley, Calif.
Mr. Israel said Ottawa should institute policies requiring agencies like the CBSA to get a public licence – such as by going directly to Parliament – before employing this kind of technology.
Given the project’s intrusiveness and the fact that facial recognition systems perform differently for people of different racial groups, travellers’ rights may have been violated, he added.
“I think there’s a strong case to be made that this type of implementation is not constitutional,” Mr. Israel said.