Opinion

A software engineer works on a facial recognition program that identifies people when they wear a face mask at the development lab of the Chinese electronics manufacturer Hanwang (Hanvon) Technology. THOMAS PETER/Reuters

Vass Bednar is the founder of Regs to Riches and a senior fellow at the Centre for International Governance Innovation. Brenda McPhail created the Right2YourFace coalition. Together, they lead McMaster’s Master of Public Policy in Digital Society Program.

An M&Ms vending machine recently reminded us that our faces are like candy for companies – tempting, but ultimately unnecessary to ingest. As Canada continues to refine privacy and artificial intelligence legislation, we must reconsider whether the use of facial recognition technology (FRT) in advertising contexts is tolerable or just junk.

A University of Waterloo student recently noticed that a vending machine selling candy on campus had an error message indicating its facial recognition app was on the fritz. Reasonably concerned that they were getting a side of surveillance with their sugar high, the student shared this information on Reddit and subsequently with the student newspaper.

In response, the vending machine company, Adaria Vending Services, assured students that no personal information had been captured. The company stated that the machines do not take or store any photos or images, and that individuals cannot be identified using them – practices it said comply with the European Union’s General Data Protection Regulation framework. But while the GDPR sets a high standard, it is not the legal bar in Canada.

The claim that scanning student faces didn’t count as collecting personally identifiable information is an interpretation of Canada’s private-sector privacy law that conflicts with our regulators’ recent investigation into a similar incident at Cadillac Fairview malls. (And it is Canada’s Personal Information Protection and Electronic Documents Act, not the GDPR, that applies here.) The University of Waterloo expressed surprise that the machines were running facial analytics and said it would remove them.

The machine’s error message is a reminder of a systemic problem that we ignore at our peril if we value our privacy and all the rights it enables. That includes our right to move through our lives with dignity and autonomy, untracked and – controversially – unmonetized.

Consider the setting: How reasonable is it to subject students on campus to corporate surveillance? Even if best practices were followed in procurement, the passive surveillance of students without proper understanding or public conversation is a concern. The fairness of facial analytics and surveillance technology on campus should be constructively debated, not deployed at random without notice.

The example was reminiscent of Canadian Tire’s recent use of the technology in British Columbia, where the provincial privacy commissioner found that four stores did not properly notify people entering the store that FRT was in use, failed to demonstrate a reasonable purpose for using FRT and did not obtain consent from those entering the store.

Why do retailers see biometric information as fair game for marketing? Because we have allowed companies to get greedy with the data they absorb from us in an effort to prompt us to spend more and shop more often through personalized incentives.

Ultimately, these “smart” vending machines offer a tangible case study to imagine the future use of the Consumer Privacy Protection Act’s “legitimate interest” provision, which would allow businesses to collect personal information without knowledge or consent if they think their interest outweighs any adverse impact on individuals, and that a reasonable person would expect the use.

Would a “reasonable person” consider it appropriate to scan young people’s faces when they are buying candy from a vending machine? The company seems curious about the age and gender of its clientele, which is somewhat comical when you consider the narrow age band of students on a university campus. There’s also the fact that many of these machines no longer accept cash, prompting customers to tap their debit or credit card – which can communicate information such as their name, demographic information and purchase history.

In the recent past, stakeholders have called for facial recognition technology to be banned until sufficient governance structures were in place. Other jurisdictions such as San Francisco; Portland, Ore.; Massachusetts; and some cities in India have banned the use of FRT by governments.

In Canada, the Right2YourFace coalition has been sounding the alarm about the need to appropriately regulate facial recognition, and to define the contexts where we believe the tool should and should not be used. Canadians can still decide that scanning our faces so companies can sell us more stuff is an illegitimate use of the tool, but if we make that decision, our data governance frameworks need to support this stance.

The embedding of facial recognition technology in vending machines prompts us to look beyond the foundation of sufficient consent. Do we want or actually need it to be used in corporate contexts? It certainly shouldn’t have taken an ominous error message to alert students of the use of this software. Perhaps the application of such technology itself is the failure we should be focusing on.
