Bill C-63: The Online Harms Act

Adisa Ameeri

Executive Summary

Bill C-63, otherwise known as the Online Harms Act, is a proposed policy initiative intended to protect minors and other vulnerable groups from digital crimes and online hate.1 Its key objectives are to protect the rights of Canadians while limiting polarization and online crime against vulnerable communities.

Policy Overview

Bill C-63 covers a range of issues, but it focuses on four main points:

  1. To address online abuse and harm, including content that incites violence and the nonconsensual distribution of intimate images.2
  2. To amend the Criminal Code to create new offences and increase penalties for hate speech-related crimes, up to and including life imprisonment. The Bill also proposes a recognizance to ensure peaceful behaviour.3
  3. “Would reintroduce a section of the Canadian Human Rights Act to enable human rights complaints during and for internet communications.”4
  4. Would implement mandatory reporting measures to protect children and to prevent and stop the spread of child pornography.5

Problem Policy Addresses

The issues this policy addresses stem from the digital world being fairly new and largely anarchic, held together only by companies’ terms of service. Canada’s Bill C-63 is a way to limit crime in this anarchic digital environment and support those who fall victim to predatory behaviour. It is needed to implement safety requirements that platforms operating in Canada must follow to keep individuals safe.

Figure 1: Use of Online Platforms by Age in Canada

Figure 1 draws on a Toronto Metropolitan University survey of social media use across age groups; it shows that individuals aged 16-29 make up the largest share of most platforms’ audiences.6 The demographics in Figure 1 reveal the reach of social media across audiences.

Figure 2: Reported News Sources – by Age in Canada

Figure 2 shows how each age group engages with news on each respective platform.7

Figure 3: Canadians’ Trust to Act in the Best Interest of the Public

Figure 3 reveals Canadians’ trust in each respective platform to act in the best interest of its consumers.8

Figure 4: Canadians’ Trust in Social Media Platforms Continues to Fall

Figure 4 indicates that Canadians’ trust in social media platforms, and in the media they consume on those platforms, continues to fall.9 For example, TikTok has replaced Facebook as the least trusted social media platform.10 Together, the figures point to a decline in public trust in media, driven by misinformation and disinformation, and to the many people who fall victim to predatory behaviour.

Analysis of the Policy

The proposed bill sparked conversations around online safety; however, due to limitations such as the possible threat of censorship and the weaponization of a well-intended bill, the policy was turned down. One of the main concerns was the proposed change to s. 810.012 of the Criminal Code, which would add a new type of “pre-crime”:11 if a judge and the attorney general believed someone to be at “high-risk for voicing online hate,” they could impose bond conditions on that person even if they had not committed a crime.12 Rather than implementing such an arguably drastic measure, it would be better to keep the parts of the Bill that would help society, such as the Canadian Human Rights Act amendments, while removing the pre-crime clause.

Conclusion

The proposed bill is somewhat effective at addressing the issues it attempts to solve; however, many parts of the bill risk “…hurting Canadian’s fundamental rights”13 if wrongfully applied or if individuals are wrongfully convicted.14 There is much risk, but if the bill were split, it could still prove helpful to Canadian society and protect Canadians from predatory behaviour online.

Digital Rights and Deepfake Proliferation

Joshua Evangelista

Runner-up: Innovation Path

Digital rights are increasingly recognized as fourth-generation human rights, especially in light of the rapid advancement of artificial intelligence and the digitalization of society. Major tech companies such as Google and Facebook dominate the digital landscape, often leveraging user data for profit. Furthermore, AI can reinforce and amplify societal biases, manipulating public opinion and undermining individual autonomy (Botes, 2023). Emerging technologies like deepfakes pose additional risks by generating convincing but false content, such as fabricated videos of political figures making controversial statements, which can severely threaten democratic institutions and public trust (Coeckelbergh, 2023). To address the intersecting challenges of privacy, equity, and innovation, this brief proposes the creation of a U.S. Digital Rights and Innovation Act (DRIA): a dynamic federal framework modelled on proven policy responses and grounded in academic research. The DRIA specifically addresses access to digital rights for all and combats the harms of deepfakes.

Digital Rights for All

The United States should implement federal data privacy legislation that guarantees all residents the right to access, delete, and control their personal information. This is modelled after the European Union’s General Data Protection Regulation (GDPR), which has successfully enhanced transparency and user trust in digital ecosystems (Greenleaf, 2018). This framework would address critical gaps in current state-level laws, such as the California Consumer Privacy Act (CCPA).

A federal standard must:

  1. Strengthen Enforcement: Provide robust mechanisms to ensure compliance and avoid
    the ambiguities seen in the CCPA.
  2. Clarify Regulations: Offer precise guidelines for businesses and require transparency
    about third-party data sharing.
  3. Bridge the Legal-Tech Divide: Foster collaboration between legal and technology teams
    to operationalize privacy protections effectively, for example by sharing know-how in terms that someone without a technical background can understand.

Deepfake Protection

The proliferation of deepfakes poses significant threats to freedom of expression and democracy in the United States. AI-generated products can spread misinformation, manipulate public perception, and disproportionately harm marginalized communities—particularly in rural and less educated areas where digital literacy is lower. Without the ability to discern real from fake, individuals in these communities may develop false beliefs, leading to social and economic harm (Barber, 2023).

To mitigate these risks, the following policy measures should be implemented:

1) Dual Policy Response

  • Permissible Use: Deepfakes created for satire, parody, or entertainment where their
    artificial nature is obvious should remain protected under free speech.
  • Restricted Use: Deepfakes designed to deceive, such as manipulated political speeches
    or fraudulent financial announcements (e.g. a fake video of Donald Trump announcing tariff removals to manipulate markets) should be banned or heavily regulated to prevent harm.

2) Public Education & Digital Literacy

  • Expand initiatives like MIT’s “Detect Fakes” toolkit to marginalized communities,
    equipping individuals with the skills to identify deepfakes. For instance, lip movements
    are one of the most obvious indicators of a deepfake: unnatural or mismatched lip
    movements often signal that a video has been manipulated.
  • Teach critical media literacy, including source verification and cross-referencing with
    trusted outlets. Evaluating the credibility of sources is a skill that should be taught
    throughout the United States.

3) Technological Countermeasures

  • Image & Video Verification Tools: Platforms should integrate AI detection tools to flag
    suspected deepfakes. Verification and content-flagging systems like those already used
    on Twitter and Wikipedia should be widely adopted.
  • Content Labeling & Flagging: Mandate clear disclaimers on synthetic media to prevent
    deception.

References

Barber, A. (2023). Freedom of expression meets deepfakes. Synthese, 202(2). https://doi.org/10.1007/s11229-023-04042-0

Botes, M. (2023). Autonomy and the social dilemma of online manipulative behavior. AI and Ethics, 3(1), 315–323. https://link.springer.com/article/10.1007/s43681-022-00157-5

Coeckelbergh, M. (2023). Democracy, epistemic agency, and AI: Political epistemology in times of artificial intelligence. AI and Ethics, 3(4), 1341–1350. https://doi.org/10.1007/s43681-022-00279-9

Greenleaf, G. (2018). Global data privacy laws 2017: 120 national data privacy laws, including Indonesia and Turkey. Privacy Laws & Business International Report.

Bridging Canada’s Digital Divide and Safeguarding Data Sovereignty

Lindsay Chavez

Winner: Innovation Path

Although Canada’s digital environment is robust in certain urban areas, it remains uneven throughout the country’s diverse regions. Disenfranchised groups often have limited or unaffordable internet connectivity, especially in rural or isolated areas. This disparity limits access to vital digital services, such as telemedicine and e-government platforms, in addition to affecting people’s ability to work from home or pursue an online education. Adequate internet access is a basic extension of human rights in the modern era, not a luxury.

An important problem that goes beyond internet connectivity is data sovereignty. The majority of Canadians’ data is stored in or transmitted through the United States, despite Canada’s own data privacy laws. American authorities may have extensive access to this data through the U.S. legal system, which includes laws like the USA PATRIOT Act. Minority groups are at greater risk, as their personal data may be subject to scrutiny or misuse as a result. Canadians need further assurances that their information will be secured under domestic legal frameworks at a time when increasingly aggressive data-gathering practices are endangering online anonymity.

This proposal is aimed at federal lawmakers and regulators who have the authority to create and implement digital policy, particularly those in Innovation, Science and Economic Development (ISED) Canada. To ensure that policy solutions take into account the various geographic and socioeconomic circumstances of each region, the government agencies at the provincial and territorial levels are important in the implementation and financing of digital initiatives.

A new, comprehensive strategy should be pursued to handle these interconnected issues of data sovereignty and internet access. The policy would first formally establish national internet connectivity requirements. The government would provide targeted subsidies and infrastructure investments for rural broadband by utilizing partnerships at the federal and provincial levels. To ensure that these investments result in dependable, fast connections, this strategy would combine performance-based benchmarks with subsidies for ISPs that commit to constructing networks in marginalized areas.

Second, to safeguard Canadians’ data rights, the policy would require that personal information belonging to Canadian citizens or residents be kept on servers physically located in Canada, or in jurisdictions that meet or surpass Canadian privacy requirements. This requirement would provide a clear legal framework to protect sensitive information and would be consistent with other countries’ existing approaches to data localization. Additionally, the regulation would uphold the right to repair, guaranteeing that small firms and individuals can repair electronic devices without hindrance from manufacturers. Such legislation would prolong the life of vital technologies and help lower-income Canadians stay connected by lowering the cost of maintenance for both personal and public devices.

Finally, the new policy would include strong oversight procedures. A dedicated government digital ombudsperson could be created to handle complaints, carry out audits, and suggest improvements that keep standards current with rapidly changing technology. To maintain objectivity and accountability, this position would be independent of internet service providers and device makers.

When combined, these actions would close significant gaps in Canada’s digital infrastructure. They would strengthen national sovereignty over people’s data, broaden the definition of human rights to include meaningful online access, and protect vulnerable and low-income populations from the growing expenses of device maintenance. These regulations are an important step towards a digital future that is inclusive for all Canadians in a world where civic engagement is increasingly defined by digital connectedness.


Intra-Canada Digital Policy on Data Sovereignty

Eijiro Kakihara

Winner: Research Path

This policy assessment argues that a legal contradiction exists between U.S. surveillance practices and Canadian privacy protections, and that trade agreements like CUSMA restrict Canada’s ability to enact robust data localization laws. These factors have led to a fragmented policy landscape for data sovereignty.

Canada and the U.S. have differing legal frameworks, and this misalignment means U.S. law can infringe on Canadian privacy rights through a legal loophole. Canadians are broadly protected from unreasonable search and seizure by authorities under Section 8 of their Charter (Department of Justice). Furthermore, Canadians’ data is protected during transfers outside Canada’s borders as well. The federal PIPEDA requires organizations to practice due diligence and be transparent with users about how foreign recipients handle Canadian data, even if those jurisdictions have less stringent privacy laws (Office of the Privacy Commissioner of Canada). Meanwhile, the U.S. National Security Agency, under Section 702 of FISA, is authorized to intercept and analyze foreign data routed through the U.S., including from Canada, without a warrant (Director of National Intelligence 2-3). This law has significant impacts on Canada: according to a 2019 report, nearly two-thirds of Canadian data and over half of Canadian government website traffic are routed through the U.S. before returning to Canada (Orr 2-4). In other words, even if Canadian data is sent and received within Canada, the significant portion that passes through the U.S. is subject to warrantless U.S. surveillance. This is a legal contradiction; while Canadians’ data is protected domestically under the Charter, and internationally under PIPEDA when it is transferred outside their borders, it is not protected from the NSA if the data routes through the U.S. This contradiction creates a data policy landscape that significantly undermines Canadians’ data protection, especially given the large amount of data that travels across the U.S.

Canada-U.S. trade agreements pose significant barriers to resolving this contradiction. Article 19.12 of CUSMA restricts any member country from requiring a company to store or process data within that country as a condition for doing business (Global Affairs Canada). The only exception is when a digital good or service is provided to a government. This means that, generally, if a private company is offering a digital product or service, a member country cannot mandate that its data be localized domestically. This provision of CUSMA has been criticized by various politicians and academics, especially concerning how little analysis was done of its impact on Canada. The chief negotiator of CUSMA, Steve Verheul, and his team revealed in a Standing Committee hearing that “no analysis was completed” showing that “Chapter 19 of CUSMA will not prevent Canada from adopting laws that would create similar provisions such as those contained in Article 20 of GDPR” (“Trade Officials, Dairy Farmers on New NAFTA Bill” 11:14 – 12:33), the GDPR being the framework that allows the EU to retain legislative freedom to enact data privacy and localization rules against CPTPP (Intersoft Consulting). In other words, according to law professor Michael Geist, “the very foundation of national digital and data strategy will be dictated by a trade agreement in which the government conducted little analysis”. This restriction threatens Canada’s data privacy laws, as any policy that attempts to force the private sector to localize data would need to carefully navigate CUSMA provisions or amend CUSMA altogether. The political turmoil surrounding CUSMA further disincentivizes the government from making any policy that may violate the treaty.

The dual challenges of a legal contradiction between U.S. and Canadian surveillance regimes and the restrictive provisions of CUSMA have resulted in an inconsistent policy framework for data sovereignty in Canada. These unresolved tensions continue to undermine Canada’s ability to protect its citizens’ digital privacy effectively.
