Wishlist for the Justice Minister’s priorities on digital issues and law reform
27 January 2020
One of the key themes in the mandate letter to the Minister of Justice and Attorney General is digital issues: combating online hate, harassment and extremism; enhancing the powers of the Office of the Privacy Commissioner (OPC); and advancing the Digital Charter, which the Government rolled out in May 2019. The focus of the Digital Charter commitment is on delivering a suite of online rights, and the list is broad and worth replicating in its entirety:
Work with the Minister of Innovation, Science and Industry and the Minister of Canadian Heritage to advance Canada’s Digital Charter and enhanced powers for the Privacy Commissioner, in order to establish a new set of online rights, including:
- data portability;
- the ability to withdraw, remove and erase basic personal data from a platform;
- the knowledge of how personal data is being used, including a national advertising registry and the ability to withdraw consent for the sharing or sale of data;
- the ability to review and challenge the amount of personal data that a company or government has collected;
- proactive data security requirements;
- the ability to be informed when personal data is breached with appropriate compensation; and
- the ability to be free from online discrimination including bias and harassment.
The mandate is a potential game-changer for the Canadian legal landscape in tech. It comes at a critical time. Our laws are out of date and out of touch with the digital economy. For example, our private sector privacy legislation (PIPEDA) is almost 20 years old; while it was drafted with the goal of promoting trust in e-commerce, it needs to evolve to address the types of challenges we now face (e.g. artificial intelligence, automated decision-making and data mining).
And the nature of cybersecurity and privacy threats challenges traditional legal categories: the idea that we have public law that binds the state and private law for the private sector, and that issues in, for example, defamation law are siloed from data protection, contract law or competition law, to name a few. These categories are collapsing, meaning that any new law must be designed with the whole picture in mind (on this see point four below). The consequence of failing to tackle this subject matter collaboratively across legal disciplines is that the law will lack coherence, be riddled with regulatory gaps, and fail to effectively address the major cybersecurity challenges we face. The Ashley Madison cyberattack is illustrative: a website for cheaters was hacked and its client list posted online. Here, privacy and reputation became cybersecurity issues. More generally, data breaches usually involve a private sector entity, whether or not national security is at issue (e.g. the hacking of sensitive client information held by a bank or retailer). We are seeing similar issues with deepfakes, which are altered images and videos. These can be used to humiliate an ex-spouse or target a politician to disrupt the democratic process.
Another example of this collapsing of categories is platform liability. This refers to the legal obligations of online providers such as social media and search engines for content posted by third parties. This is a key issue in cybersecurity and national security governance discussions, and the questions explored include the responsibilities of social media companies for radicalization, hate speech and disinformation. Most laws (and I am speaking more broadly than Canada here) on platform liability draw from defamation law and general content regulation laws. Thus, to understand the national security threats we face concerning disinformation, to name one, we need to look to the history of internet and communications regulation.
Further, any regulatory decision inevitably implicates the innovation economy and human rights. If a private company is deemed critical infrastructure for national security matters, how does this public/private relationship operate? What are the human rights responsibilities of a private company that is enlisted in a role critical to state security? And the regulatory burdens imposed on companies for any of the above have an impact on innovation, risking entrenchment of the big players at the top and shutting out new entrants to the market. This is a weighty mix to balance.
With that in mind, these are a few of my wishes for this mandate, big and small. First, the mandate letter rightfully prioritizes reform of private sector privacy legislation (PIPEDA). There is mounting pressure in Canada in light of Europe’s General Data Protection Regulation (GDPR) and California’s new Consumer Privacy Act. The suite of online rights identified in the mandate is telling as to the priorities of PIPEDA reform: enhancing the powers of the OPC and redesigning individuals’ rights in relation to the data about them (e.g. data portability, erasure, transparency, increased user control). I would like to see consent, and its limits, more meaningfully revisited in this reform process (there are times, I suggest, where consent is not an adequate metric of privacy protection). And we need to tackle the ‘sensorized’ nature of our lives: that our lives will be mapped by a network of gadgets and online services enhanced by artificial intelligence is our reality now, and will be even more so in the future. We often do not know what is happening to us and rely on scandal to find out (ranging from Cambridge Analytica to specific breaches of trust, such as those involving children’s apps). My hope is that law, ethics and technology are the foundation of this reform, and that the focus on algorithmic accountability that the Government has started will be further developed.
Second, I do not want to see the complications of regulating this space underplayed. Part of what makes regulation here so difficult is that most of our interactions online and with technology are shared experiences, and we have not yet settled on an ideal analytical framework for allocating decision-making power over shared experiences. This is most evident in debates about the right to be forgotten, also known as the right to have content delisted from search engine results. How much of the story of our lives is ours to tell? When do others have rights to tell it? The recent proposal of the OPC for a right to delisting from search engines relies heavily on the concept of public interest to achieve this balance, and I caution that this is a problematic conceptual underpinning to use without further scrutiny (see my criticism here).
Third, and similarly, platform governance is an important unifying issue in the mandate letter, and in the Digital Charter it seeks to advance. Tackling online hate speech, disinformation, online abuse, security vulnerabilities and privacy entails developing a framework for the roles and responsibilities of platforms. Platform regulation has been central to my scholarship (e.g. see here and here), and my caution would be to tread carefully. To meaningfully address this field means being aware of unintended consequences for the innovation economy, for the balancing of rights and for business regulation (ranging from competition to corporate responsibility). Further, properly addressing platform regulation requires that we tackle the subject matter holistically. Our current laws have significant gaps (e.g. there are no intermediary liability rules in the tort of privacy, although courts would likely draw from defamation law), and what does exist is subject-matter specific (e.g. defamation, copyright). In an ideal world, law reform in this area would bring together experts from behavioural economics, psychology, media studies, computer science and multiple fields of law to address, at a macro level, what platform responsibility should look like.
Fourth, the name of the game is nimble regulation. The mandate letter mentions a digital advertising registry. Recall that a recent report of the Standing Committee on Access to Information explored the idea of creating a social media council and regulatory oversight of algorithms. This push for regulatory oversight is evident elsewhere. The United Kingdom House of Lords Committee recently proposed a Digital Authority to facilitate coordination between regulators on digital issues, and another United Kingdom proposal is for a digital regulator to administer a statutory duty of care on online companies. Regulation has the advantage of flexibility in responding to the changing technology landscape, but unless carefully targeted it will overburden online companies without solving the problem it aims to address. For example, the impulse is to impose significant regulatory burdens on platforms when tragedies happen, such as the New Zealand mosque shooting and its streaming on social media. However, it is not obvious that greater regulation would have restricted the ease of streaming this content or hastened its takedown. Regulation, in its widest sense here, requires that we consider the most effective strategies, legal and non-legal, to achieve an objective. Some of these strategies should include technological solutions, victim support and education.
Finally, I hope to see the wider body of law on online abuse prioritized, with access to justice as the central driver of reform. Online abuse is identified in the mandate only to the extent that the abuse might be hate speech or harassment, or falls within the suite of reform priorities concerning PIPEDA. However, the remit of the OPC is quite narrow, most online abuse occurs outside that regulatory bubble, and many forms of abuse are not hate speech or harassment. Consider all the ways that technology can be used to abuse or manipulate: sexualized shaming such as through deepfakes, revenge pornography and fake profiles; cyberstalking and harassment (to track victims, control their experiences and humiliate them); and abusive comments and posts ranging from the offensive to the defamatory to hate speech. The OPC has a limited role to play in addressing most online abuse, and traditional courts, for many reasons, are not ideal to deliver timely, affordable or effective resolution. Given the volume of abusive content posted online, and the particular vulnerability of marginalized groups, youth and partners in domestic abuse, online abuse should be tackled more widely than the mandate calls for, and access to justice should be explicitly engaged as a central pillar in that pursuit.
Emily Laidlaw is an Associate Professor at the University of Calgary, Faculty of Law