
Funding issues are reportedly hampering an RCMP initiative designed to ensure the force uses facial recognition technology in accordance with Canada’s privacy laws. Amr Alfiky/The New York Times News Service

An RCMP initiative to ensure that the force uses intrusive technological tools in accordance with Canada’s privacy laws is dealing with a lack of funding and staff, says an internal report obtained by The Globe and Mail.

Concerns about the fate of the Mounties’ new National Technologies Onboarding Program (NTOP) unit are contained in a draft risk-assessment document written by the police force this spring and released under access-to-information laws.

“Without external pressure … there is a possibility that the resources required to implement and institutionalize the program will not materialize,” the assessment says.

The NTOP unit was launched by the Mounties last year as the federal Office of the Privacy Commissioner was investigating the RCMP’s use of a form of facial-recognition software.

Dozens of police forces in Canada had also used that same software, known as Clearview AI, in 2019 and 2020, though this largely ended once media reports exposed the practice. Subsequent reviews by privacy officials found that use of the software violated Canada’s privacy laws and that rank-and-file officers had acquired it with inadequate oversight.

Several police forces are now reviewing technology acquired by their detective bureaus in hopes of staving off similar controversies.

The RCMP is putting this oversight in the hands of the new NTOP unit, which ran as a pilot program last year and was written into the force’s operations manual this spring as a permanent fixture.

The new unit’s work is under way on nearly 50 technologies used by police, and released records show that NTOP plans to delve into privacy issues around police software, drones, databases, algorithms, body-worn cameras, and cellphone-hacking and interception tools.

The crime-fighting capabilities of these technologies can come at a cost to collective privacy, so the NTOP unit plans to give green-light, amber-light or red-light recommendations about these tools to the RCMP’s 20,000-officer force.

Budget forecasts say the program will need 15 employees and a fixed $3-million annual budget to meet this mission. So far, however, the unit is getting by on much less because of the RCMP’s budget constraints and staff shortages.

The draft risk assessment says it takes the RCMP one to two years to fill any open position. Released records from April show that the new unit was then staffed by four primary employees – assisted by 12 others trying to “support this initiative while maintaining normal duties.”

Such dynamics could bog down reviews, and that could heighten frictions between the new unit and RCMP investigators. Detectives want clarity about what kinds of police technology they can and cannot use as they struggle to solve cases that come with their own considerable time pressures.

“The intensive NTOP assessment process will inevitably slow down the procurement and deployment of operational capabilities,” reads the risk assessment. “This will undoubtedly create concerns and barriers.”

Pushback may happen on other fronts. The risk assessment says that RCMP officers in the field can dislike centrally planned initiatives because such efforts “are commonly being viewed as Headquarters attempting to exert control.”

The risk assessment appears in a release of 400 pages of RCMP documents first acquired by researcher Daniel Konikoff, a PhD student at the University of Toronto studying police and technology issues.

He says the NTOP process could have an outsized impact on policework in Canada, given how other forces take their cues from the Mounties. “What I’m curious to see is whether the RCMP has any influence over provincial and municipal police departments,” he said.

In September, the Mounties submitted a document to Parliament saying that the police force has stopped using some kinds of software that had been employed for years to advance child-exploitation and human-trafficking investigations.

In May, a senior Mountie told a parliamentary committee that vulnerable people will be in jeopardy if police lose access to tools such as facial-recognition software. “I’m waiting on a decision from our national technical operations, our NTOP process,” Chief Superintendent Gordon Sage testified. “ … Until I get that process completed from my NTOP people, I cannot use it, and victims are at risk today.”

Corporal Kim Chamberland, a spokeswoman for the Mounties, says that the NTOP initiative has received inquiries about 47 investigative technologies from other branches of the force.

“The RCMP has taken the decision to not adopt certain types of technologies for operational use,” she said.

In an e-mailed response to questions from The Globe, she did not reveal details about NTOP’s current staffing, funding or its findings to date.

“The RCMP generally will not comment on specific investigative tools, technologies or techniques, including those that have been assessed and not adopted,” said Cpl. Chamberland. The RCMP is still figuring out resourcing, she said, including by trying to “actively identify a source of funding to allow for the permanent staffing of all positions.”

In the summer of 2021, the federal Privacy Commissioner’s office released recommendations about how to fix the problems that led to the police force’s use of Clearview AI. The office says that work continues with the Mounties’ new unit.

“The RCMP has worked closely with our compliance monitoring team,” spokesman Vito Pilieci said.
