On September 14, the Consulate General of Canada in New York spotlighted Canada’s expert and public consultation process on online harms and tech governance at “Participatory Democracy to Govern Big Tech: The Canadian Experience”, an event co-hosted with McGill University’s Centre for Media, Technology, and Democracy (CMTD) and All Tech is Human. The event also featured speakers from the Canadian Commission on Democratic Expression, Expert Advisory Group on Online Safety, and Citizens Assemblies on Digital Rights and Safety.
“Canadians should be able to express themselves freely and openly without fear of harm online,” said Canada’s Deputy Consul General André Frenette. “We are committed to taking the time to get this right – because this issue is too important to not get right.”
McGill Senior Fellow Frances Haugen credited the Government of Canada for undertaking the consultations after committing in the 2021 Speech from the Throne to fight harmful content online. Haugen said Canada’s experience “can serve as a model” to move the international tech governance conversation forward.
In partnership with the Public Policy Forum and MASS LBP, CMTD organized four Citizen Assemblies: groups of 36 to 48 individuals chosen at random to examine how the Government of Canada should regulate digital service providers to create a safe environment where Canadians can express themselves.
In all, 90 Canadians from across the country representing a wide range of cultural identities and experiences gave a combined 6,000 hours of their time to voice their views and concerns on online harms, digital platform governance, and democratic expression online.
McGill Professor Taylor Owen said the consultation process was key not only to inform policy development in Canada, but also to build public support on a polarizing issue.
Owen said, “How you get to these digital policies matters – maybe even more than the outcome. You need buy-in for these policies to work. These are sweeping governance regimes, often setting up new regulators. They need buy-in from civil society and industry of all sorts, both domestic and international. You need citizen buy-in, which all too often these bills don’t have – they’re imposed on citizens.”
The Assemblies’ input was striking in its unanimity. In a report on the Assemblies’ recommendations, MASS LBP co-founder Peter McLeod noted that despite participants’ varying levels of experience with online communities, each Assembly expressed concern about the significance and social impact of digital platforms: “Each Assembly has been unanimous on the need for immediate and far-reaching regulations to curb what they see as the pernicious and largely unconstrained ability of bad actors to exploit, harass, and victimize Canadians online.”
McLeod also noted the Assemblies’ lack of confidence that digital platforms would address the issues on their own: “They express deep skepticism about the sincerity or ability of many digital service providers to take the necessary steps to curb these harms.”
The Assemblies recommended that the government adopt a risk-based “duty of care” approach to regulating online platforms, grounded in seven shared values.
Alongside the Assemblies, the government consulted experts on online harms and tech governance through the Canadian Commission on Democratic Expression and the Expert Panel on Public Safety in the Digital Age.
Speaking to her experience on the Expert Panel, University of Calgary Professor Emily Laidlaw said she was impressed that the group was able to find consensus on key issues, despite their diverse and often opposing points of view.
Laidlaw said, “Once you put people in a room to have these conversations, [you find that] people agree on the basics. It’s the details people disagree on. These details get too much public airtime, and that makes the debate appear a lot more divisive than it really is.”