By Maryna Polataiko
The Government of Canada has pursued a wide range of avenues to address mis- and disinformation, including legislation, statements of principle, government initiatives, and civil society funding. Recent and proposed legislation carries potential implications for intermediaries.
Attempts to address mis- and disinformation were prominently highlighted in a parliamentary committee report issued in December 2018 and entitled “Democracy Under Threat: Risks and Solutions in the Era of Disinformation and Data Monopoly.” The resulting recommendations include attempts to create greater transparency around political content through measures such as the regulation of political advertisements and limits on foreign funding of electoral content.
The Report also recommended algorithmic transparency requirements and the ability to audit said algorithms, and related transparency obligations for targeted advertising, e.g. showing what the target audience is, and reasons for targeting. Additional recommendations included new obligations to: label bot-produced content; find and take down fake accounts imitating people “for malicious reasons”; label advertising; and the adoption of a code prohibiting “deceptive or unfair practices” and compelling the removal of, among other types of content, “fraudulent” as well as “maliciously manipulated content (e.g. ‘deep fake’ videos).”
More recently, spurred by COVID-19 misinformation, a top Canadian official announced his support for criminalizing disinformation in April 2020, calling for legislation that would protect Canadians from “people who are actively working to spread disinformation, whether it's through troll bot farms, whether [it's] state operators or whether it's really conspiracy theorist cranks who seem to get their kicks out of creating havoc.” While the idea of a criminal approach appears to have since been abandoned, some of the other proposals have found legislative purchase.
The 2018 Elections Modernization Act (the “EMA”) amended the Canada Elections Act to include several disinformation measures aimed at protecting electoral integrity, with a focus on political advertisement transparency. Notably, platforms must publish a registry of political advertising messages when selling advertising space to political entities during an election. The registry must include a copy of each message and specify who authorized its publication.
Some platforms have elected to ban political advertising during elections instead of creating ad registries, citing compliance difficulties for real-time bidding mechanisms as well as struggles with definitional breadth.
Legislative reforms seeking to update Canada’s federal privacy law—the Personal Information Protection and Electronic Documents Act (“PIPEDA”)—may be used to address the spread of mis- and disinformation. Bill C-11 would introduce several new transparency obligations around automated decision-making into Canada’s privacy regime, and it is possible these measures will apply to algorithmic content curation.
The new transparency obligations will apply to ‘automated decision systems,’ defined as “any technology that assists or replaces the judgement of human decision-makers using techniques such as rules-based systems, regression analysis, predictive analytics, machine learning, deep learning and neural nets.” Organizations will need to make available “a general account of the organization’s use of any automated decision system to make predictions, recommendations or decisions about individuals that could have significant impacts on them.” Users will also be able to request explanations of predictions, recommendations or decisions made by automated decision systems, and to seek explanations of how the personal information used to make the prediction, recommendation or decision was obtained.
If algorithms used to curate and moderate content on online platforms fall within these definitions, and decisions to remove or refuse to remove content amount to ‘significant impacts,’ these privacy reforms would be in line with the “Democracy Under Threat” report’s transparency recommendations. In practice, this may look like Facebook’s “Why am I seeing this ad?” feature.
Canada’s Digital Charter is said to set out “what Canadians can expect from the Government in relation to the digital landscape…”. These reforms may be an outgrowth of the Charter’s ‘strong democracy’ principle, which states that the “Government of Canada will defend freedom of expression and protect against online threats and disinformation designed to undermine the integrity of elections and democratic institutions.” It remains to be seen what other measures may emerge.