Maggie MacDonald is a PhD candidate at the University of Toronto specializing in platform policy research, and an advisory board member with Ethical Capital Partners, which manages Pornhub through its parent company Aylo.
Canadians have witnessed a recent legislative push for online age verification, ostensibly to keep kids safe from harm. Following trends toward age restriction in Utah, Texas and several other states, controversial U.S. laws backed by evangelical conservative groups are now being echoed in Canada. As the Online Harms Act (C-63) makes its way through the House of Commons, another federal bill aimed at “protecting young persons from exposure to pornography” (S-210) has been approved by the Senate; digital IDs have been endorsed by the leaders of the Conservative Party and Bloc Québécois.
We can all agree that young people deserve protection from harm, so keeping kids safe online is a position that politicians are quick to endorse; support for anti-porn bills offers an easy route to positive PR. But good intentions don’t make for good laws.
While optimistic in tone, these alarmingly vague proposals gloss over the enormous technical challenges and critical privacy concerns they’d provoke if implemented. Michael Geist, the Canada Research Chair in internet and e-commerce law, found that if C-63 were passed, members of the Digital Safety Commission charged with enforcing its anti-harm measures would not be “bound by any legal or technical rules of evidence.” S-210, meanwhile, was called “fundamentally flawed” by the Canadian Heritage Minister’s office.
Age verification poses significant privacy risks for all internet users in Canada. Systems that authenticate age typically rely on government-issued ID being checked through third-party services. These methods are easy to circumvent with stolen IDs, so they’re often paired with biometric facial scans; some even involve bank account logins. Experts in cybersecurity, law and privacy rights warn that these surveillance-based models do far more harm than good.
If porn-watchers across the country are tracked through a centralized archive, it will also become a prime target for hackers. Records of our personal information tied to details of adult content we watch mean it would only take a single data breach for millions of people to be made vulnerable to harassment, blackmail and exploitation – and digital-ID databases are notoriously leaky.
Because of the steep privacy and liability concerns, websites can’t collect or manage this sensitive information on their own existing infrastructure, either. Centralized third-party systems run by private companies would need to partner with our government to collect and store IDs, raising further concerns about handing sensitive data to private enterprise. Digital-security firms will compete fiercely for these exclusive and lucrative contracts, but as we’ve seen in parallel sectors, the most financially attractive bids – frequently the ones chosen in government procurement – are no guarantee of the rigorous service delivery these systems require.
Though marketed as porn-focused, these age-verification requirements reach well beyond adult sites. Search engines and social media companies could easily be swept up by vague risky-content thresholds and required to verify users, too. Understandably, many citizens would be uncomfortable registering with a government ID to log on to Facebook or X. The chilling effect of this precedent would muzzle our digital commons and impede free speech online.
There’s a fundamental flaw in the premise of age verification: the notion that pornography is inherently harmful and should be managed through censorship. Not only does this approach run up against Charter-protected freedoms of expression, but peer-reviewed research in medicine, sociology and psychology consistently refutes the beliefs – primarily driven by moneyed American lobby groups – linking porn to addiction, violent dysfunction or child abuse. Studies in human-computer interaction and child development show that policies of enclosure don’t limit harm or even prevent porn-watching. Instead, they mostly alienate young people.
Should we choose censorship, porn won’t cease to exist. This March, Pornhub blocked access in Texas after its age-verification law went into effect; in the hours that followed, Google searches for virtual private networks (VPNs) quadrupled in the state. Young people today understand – far better than most lawmakers do – how to easily sidestep censorship using such tools. And as Ethical Capital Partners notes, leading platforms already comply with Canada’s Criminal Code and child protection laws. Cracking down on responsible operators will drive traffic toward hundreds of thousands of anonymous sites that don’t maintain records or moderate harm.
In short, these laws won’t halt porn-watching; they’ll just push it underground, making conditions worse for the many Canadians who rely on legal porn work for income, and access significantly riskier for audiences who will continue to seek it out.
Young people deserve our protection and support, but throttling porn sites is not the answer. Rather than creating a digital surveillance state that imperils us all, let’s shape a safer internet through evidence-based approaches.