Government of Canada introduces legislation to combat harmful content online, including the sexual exploitation of children
The Online Harms Act would establish a baseline standard to keep young people safe online and uphold freedom of expression. Now, more than ever, online platforms must take responsibility, and be held accountable, for protecting kids from the harms found on their services. Everyone in Canada deserves to be safe in all aspects of their lives, including online.
OTTAWA, ON, Feb. 26, 2024 /CNW/ - The digital world can pose significant risks. Social media can be used to sexually exploit children, promote self-harm among children, incite violence, put people's safety at risk and foment hate. Online harms have real-world impact with tragic, even fatal, consequences.
Today, the Honourable Arif Virani, Minister of Justice and Attorney General of Canada, introduced Bill C-63, the Online Harms Act. The Bill would create stronger online protections for children and better safeguard everyone in Canada from online hate and other types of harmful content. The Bill sets out a new vision for safer and more inclusive online participation. It would hold online platforms, including livestreaming and adult-content services, accountable for the design choices that lead to the dissemination and amplification of harmful content on their platforms, and would ensure that platforms employ mitigation strategies to reduce users' exposure to harmful content.
For too long, we have tolerated a system where online platforms have offloaded their responsibilities onto parents, expecting them to protect their kids from harms that platforms create or amplify.
The Bill would do this by:
- Creating and implementing a new legislative and regulatory framework through a new Online Harms Act. This framework would mandate online platforms, including livestreaming and user-uploaded adult-content services, to adopt measures that reduce the risk of harm in seven specific categories of harmful content. The Online Harms Act would also require services to remove (1) content that sexually victimizes a child or revictimizes a survivor, and (2) intimate content communicated without consent. Non-compliance could lead to strict penalties;
- Requiring, through the new Online Harms Act, that services provide clear and accessible ways to flag harmful content and block users, implement safety measures tailored for children, and take other measures to reduce exposure to seven categories of harmful content, including content that involves bullying children or promotes self-harm among young people;
- Creating stronger laws to help protect all people in Canada from hatred, on and offline, by creating a definition of "hatred" in the Criminal Code, increasing penalties for existing hate propaganda offences, creating a standalone hate crime offence and creating an additional set of remedies for online hate speech in the Canadian Human Rights Act;
- Enhancing the Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to better protect young people; and
- Establishing a new Digital Safety Commission to oversee and enforce the Online Harms Act's regulatory framework and a new Digital Safety Ombudsperson to act as a resource and advocate for the public interest with respect to systemic issues related to online safety.
Everyone in Canada should be able to access an online environment where they can express themselves freely, without fearing for their safety or their life. The Government will always uphold Canadians' constitutional right to freedom of expression, which is essential in a healthy democracy. There is also an urgent need for better safeguards for social media users, particularly children. This is why the new framework focuses on seven categories of the most damaging and harmful content online: content that sexually victimizes a child or revictimizes a survivor; intimate content communicated without consent; violent extremist and terrorist content; content that incites violence; content that foments hatred; content used to bully a child; and content that induces a child to harm themselves.
Online platforms, including livestreaming and adult content services, must be transparent and they must be held accountable. The safety of everyone in Canada, especially children—society's most vulnerable—depends on it.
"I am the parent of two young boys. I will do whatever I can to ensure their digital world is as safe as the neighbourhood we live in. Children are vulnerable online. They need to be protected from online sexual exploitation, hate and cyberbullying. Now more than ever, especially given the evolving capabilities of AI, online platforms must take responsibility for addressing harmful content and creating a digital world where everyone can participate safely and freely. This legislation does just that."
— The Hon. Arif Virani, Minister of Justice and Attorney General of Canada
The Online Harms Act would set out obligations for online platforms, including livestreaming and adult-content services, like Facebook, Twitch and PornHub. When it comes to these services, there is currently little accountability and transparency in terms of what platforms need to do to help ensure the safety of their users.
Under this legislation, services would be required to reduce exposure to seven categories of harmful content and be open and transparent about the steps they are taking to do so. They would also be required to expeditiously remove content that sexually victimizes a child or revictimizes a survivor, as well as intimate content communicated without consent. Services would be required to be transparent with Canadians about how they are working to protect users, especially children and survivors. All users should have the ability to express themselves freely without risk of harm, and to better curate their own online experience through accessible ways of flagging harmful material.
The Online Harms Act would see the creation of a new Digital Safety Commission of Canada to administer the framework and to help foster a culture of online safety in Canada. The new Digital Safety Commission would:
- Enforce legislative and regulatory obligations and hold online services accountable for their responsibilities through auditing for compliance, issuing compliance orders and penalizing services that fail to comply;
- Collect, triage and administer user complaints and reports about services' obligations under all three duties;
- Enforce the removal of content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent; and
- Set new standards for online safety by providing guidance to services on how to mitigate risk, performing research, working with stakeholders and developing educational resources for the public, including children and parents.
The Online Harms Act would establish a new Digital Safety Ombudsperson of Canada. The Ombudsperson would act as a point of contact and a resource for users and victims, and would advocate for users' needs and interests on systemic issues regarding online safety. Appointed for a five-year term, the Ombudsperson would:
- Gather information from users on an ongoing basis and issue calls for written submissions to solicit views on specific issues;
- Conduct consultations with users and victims;
- Direct users to proper resources such as law enforcement or help lines; and
- Develop advice, publish public reports and advocate to the Commission, the Government, and online platforms calling attention to frequent, severe or systemic issues from a user perspective.
The changes are being made following extensive consultations by the Government of Canada since 2021, including public consultations, an Expert Advisory Group on Online Safety, a Citizens' Assembly on Democratic Expression focused on online safety, and 22 online and virtual roundtables across Canada, as well as consultations held in 2020 by the Minister of Justice, when he was Parliamentary Secretary to the Minister of Justice.
Government of Canada introduces legislation to combat harmful content online, including the sexual exploitation of children.
On February 26, 2024, the Government of Canada introduced the Online Harms Act, legislation to make online platforms responsible for addressing harmful content and for creating a safer online space that protects all people in Canada, especially children.
The Bill would create stronger online protections for children and better safeguard everyone in Canada from online hate and other types of harmful content. It would hold online platforms, including livestreaming and user-uploaded adult content services, accountable for reducing users' exposure to harmful content on their platforms and help prevent its spread.
For too long, we have tolerated a system where online platforms have offloaded their responsibilities onto parents, expecting them to protect their kids from harms that platforms create or amplify.
The Bill's core components are:
- The introduction of a new legislative and regulatory framework, the Online Harms Act, to reduce exposure to seven kinds of harmful content on online platforms, including livestreaming and adult content services. This new Act would also create a new Digital Safety Commission to enforce the framework and a Digital Safety Ombudsperson to provide support for users and victims;
- Changes to the Criminal Code to better address hate crime and hate propaganda;
- Changes to the Canadian Human Rights Act to allow individuals and groups to file complaints against people who post hate speech online; and
- The enhancement of the laws to protect children from sexual exploitation through amendments to An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.
Everyone in Canada should be able to access an online environment where they can express themselves freely, without fearing for their safety or their life. The Government of Canada will always respect Canadians' constitutional right to freedom of expression, which is essential in a healthy democracy. However, there is also an urgent need for better safeguards for social media users, particularly children. This is why the new framework focuses on seven categories of the most damaging and harmful content online.
Online platforms, including livestreaming and adult content services, must be transparent, and they must be held accountable. The safety of everyone in Canada, especially children and society's most vulnerable, depends on it.
Legislative and Regulatory Framework for Online Platforms: the Online Harms Act
The Online Harms Act would set out obligations for online platforms, including livestreaming and adult-content services like Facebook, Twitch and PornHub. When it comes to these services, there is currently little accountability and transparency in terms of what services need to do to help ensure the safety of their users. Under this legislation, services would be required to reduce exposure to seven categories of harmful content and be open and transparent about the steps they are taking to do so. They would also be required to make two categories of content inaccessible in Canada: content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent. Services would be required to be transparent with Canadians about how they are working to protect users, especially children and survivors. All users should have the ability to express themselves freely without risk of harm, and to better curate their own online experience through accessible ways of flagging harmful material.
Categories of Harmful Content
The Online Harms Act would specifically target seven categories of harmful content:
- Content that sexually victimizes a child or revictimizes a survivor;
- Intimate content communicated without consent;
- Content that foments hatred;
- Content that incites violent extremism or terrorism;
- Content that incites violence;
- Content used to bully a child; and
- Content that induces a child to harm themselves.
Obligations related to these seven categories of harmful content would be organized under three duties: the duty to act responsibly; the duty to protect children; and the duty to make certain content inaccessible.
Duty to Act Responsibly
Services would be required to enhance the safety of children and adults in Canada on their platforms by reducing the risk of exposure to the seven types of harmful content. Specifically, services would be required to:
- Assess the risk of exposure to harmful content, adopt measures to reduce exposure to harmful content, and assess the effectiveness of those measures;
- Provide users with guidelines and tools to flag harmful content and to block other users. Services would also have to set up an internal point of contact for user guidance and complaints;
- Label harmful content when it is communicated in multiple instances or artificially amplified through automated communications by computer programs. This requirement would include harmful content shared widely by bots or bot networks, for example;
- File and publish Digital Safety Plans containing the measures the service is taking, the effectiveness of those measures, the indicators they use to assess effectiveness and any analysis of new risks or trends related to online safety. Services would also need to identify the data sets they use and keep, and provide those data sets to qualified researchers, when appropriate.
Duty to Protect Children
Services would have a statutory duty to protect children online. To make the digital world safer for kids, services would be required to implement age-appropriate design features and to take the interests of children into account when designing products and features.
These requirements would be set out in regulations issued by the Digital Safety Commission and could include things like defaults for parental controls, default settings related to warning labels for children, or safe search settings for a service's internal search function. They could also include design features to limit children's exposure to harmful content, including explicit adult content, cyberbullying content and content that incites self-harm. Allowing the Digital Safety Commission to enact guidelines and regulations under this duty would allow the legislation to be adaptable and to grow over time as the landscape of harmful content affecting children changes.
Duty to Make Certain Content Inaccessible
This duty would require services to make two specific categories of harmful content inaccessible to their users: (1) content that sexually victimizes a child or revictimizes a survivor, and (2) intimate content communicated without consent, including sexualized deepfakes. These two categories represent the most harmful content online, and a single piece of such content can cause substantial and lasting harm. The only way to reduce the risk of exposure and to protect victims is to make the content inaccessible.
Creation of a Digital Safety Commission of Canada
The Online Harms Act would see the creation of a new Digital Safety Commission of Canada to administer the framework and to enhance online safety in Canada. A new Digital Safety Commission would:
- Enforce legislative and regulatory obligations and hold online services accountable for their responsibilities through auditing for compliance, issuing compliance orders and penalizing services that fail to comply;
- Collect, triage and administer user complaints and reports about services' obligations under all three duties;
- Enforce the removal of content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent; and
- Promote societal resilience to harms online and set new standards for online safety by providing guidance to services on how to mitigate risk, performing research, working with stakeholders, and developing educational resources for the public, including children and parents.
Creation of a Digital Safety Ombudsperson
The Online Harms Act would establish a new Digital Safety Ombudsperson of Canada. The Ombudsperson would act as a point of contact and a resource for users and victims, and would advocate for users' needs and interests on systemic issues regarding online safety. Appointed for a five-year term, the Ombudsperson would:
- Gather information from users on an ongoing basis and issue calls for written submissions to solicit views on specific issues;
- Conduct consultations with users and victims;
- Direct users to proper resources such as law enforcement or help lines; and
- Develop advice, publish public reports and advocate to the Commission, the Government, and online platforms calling attention to frequent, severe or systemic issues from a user perspective.
Definition of "hatred"
The proposed legislation seeks to define "hatred" in the Criminal Code based on previous Supreme Court of Canada rulings. Because "hatred" is a key element of two of the hate propaganda offences, defining it helps clarify for everyone in Canada what falls within the scope of these offences and what does not.
New hate crime offence applying to every Criminal Code offence motivated by hate
Changes to the Criminal Code would create a standalone hate crime offence applying to any offence in the Criminal Code or any other Act of Parliament, where the underlying act was motivated by hate.
In order to recognize the serious harm caused by hate-motivated crime and to explicitly condemn hateful conduct, this new hate crime offence would cover instances motivated by hatred related to race, national or ethnic origin, language, colour, religion, sex, age, mental or physical disability, sexual orientation or gender identity or expression. The offence would carry a maximum sentence of life imprisonment. The new offence would also make the charging and prosecution of hate crimes easier to track nationwide.
Increasing penalties for existing hate propaganda offences
The Criminal Code contains four hate propaganda offences. To better reflect the dangers caused by those who spread hate propaganda and to denounce these acts more appropriately according to their degree of harm, the maximum punishments for all four offences would be increased.
The maximum punishment for advocating or promoting genocide against an identifiable group would be increased to life imprisonment, and the maximum for the other three hate propaganda offences would be increased to five years' imprisonment when prosecuted as indictable offences.
New peace bond designed to prevent the commission of hate propaganda offences and hate crimes
Changes to the Criminal Code would allow any person who reasonably fears that someone will commit a hate propaganda offence or hate crime to ask a court to impose a peace bond on that person.
The peace bond would allow a judge to impose conditions on an individual where there are reasonable grounds to fear that they will commit a hate propaganda offence or hate crime, such as wilfully or intentionally promoting hatred against an identifiable group. As this is a preventative measure to protect all people in Canada, there would be no need for evidence that an offence has actually been committed.
As with some other peace bonds, it would require Attorney General consent before it could be used.
Changes to the Canadian Human Rights Act
The Bill would amend the Canadian Human Rights Act (CHRA) to define a new discriminatory practice of communicating hate speech online. Changes include reinstating an improved section 13 (originally designed to prevent and remedy the distribution of hate speech by telecommunication and repealed in 2014), enhancing the complaints process and adding remedies to address communication of hate speech.
Definition of "hate speech"
The Bill would define "hate speech" in the CHRA and empower individuals and groups to file complaints with the Canadian Human Rights Commission against users who post hate speech online.
As part of the proposed amendments, "hate speech" would be defined based on Supreme Court of Canada decisions. The Bill defines "hate speech" as the content of a communication that expresses detestation or vilification of an individual or group of individuals on the basis of prohibited grounds of discrimination.
The grounds of discrimination are race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability, or conviction for an offence for which a pardon has been granted or a record suspension has been ordered.
In order to constitute a discriminatory practice, the hate speech would need to be communicated where it is likely to foment detestation or vilification of an individual or group on any of these prohibited grounds. The provisions would focus on both the content of the speech and its likely consequences—as online threats too often turn into real-world harm.
Speech would not fall within the definition of hate speech solely because it expresses dislike or disdain, or it discredits, humiliates, hurts or offends. This distinction is intended to reflect the extreme nature of hate speech captured by the amendments.
Application of amendments to the Canadian Human Rights Act
Amendments to the Canadian Human Rights Act would not apply to operators of online platforms. The focus is on users communicating with other users.
The CHRA amendments would apply to public communications by individual users on the internet, including on social media, on personal websites and in mass emails.
They would not apply to private communications (e.g. private emails and direct messages), nor to intermediaries who supply hosting (e.g. allocating space on a web server for a website to store its files), caching (temporarily storing copies of content to speed up delivery) and other technical infrastructure. They also would not apply to telecommunications service providers or broadcasters, which may be regulated by other means.
By empowering people to file complaints against users who spread hateful messages online, the Bill targets those who are causing direct harm to others.
Enhancing the complaints process
The Bill would also outline procedures for the Canadian Human Rights Commission to rapidly dismiss complaints that do not involve hate speech as defined, and to protect the confidentiality of complainants and witnesses as appropriate.
The Bill would seek to improve the complaints process by:
- Empowering the Canadian Human Rights Tribunal to ensure fair and efficient hearings by awarding litigation costs against parties who abuse the process; and
- Empowering the Commission and the Tribunal throughout the process to take measures to protect the confidentiality of complainants, victims and witnesses if there is a risk of reprisals, while ensuring that fully open proceedings remain the default approach.
Protection of rights and freedoms
Hate speech is carefully defined to target only an extreme and specific type of expression, leaving almost the entirety of political and other discourse untouched. The Supreme Court of Canada has repeatedly upheld laws that combat hate speech as justified limits on freedom of expression.
Changes to mandatory reporting of Internet child pornography
Amendments to the Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service would enhance law enforcement's response to crimes of child sexual exploitation.
The proposed amendments would:
- Create a new regulatory authority to permit centralizing mandatory reporting of child pornography offences through a designated law enforcement body to ensure law enforcement experts receive the necessary evidence in a timely way;
- Enhance transparency by creating authority for regulations to require an annual report to the Ministers of Public Safety and Justice from the designated law enforcement body;
- Clarify that the Act applies to all types of internet services, including online platforms and other application-based services;
- Impose a 12-month data preservation requirement for computer data related to the reports to police (as opposed to the current 21 days), to help facilitate investigations into child pornography;
- Require internet service providers to include transmission data (defined in the Criminal Code as information related to telecommunication functions, such as routing, addressing, and date and time, and not including the content of the communication) when reporting material to the designated enforcement body where a child pornography offence is manifestly evident, in order to assist law enforcement in locating the source of illegal materials; and
- Change the limitation period for prosecution from two years to five years to ensure perpetrators can be brought to justice.
The proposed changes are being made following extensive consultations by the Government of Canada since 2021, including public consultations, an Expert Advisory Group on Online Safety, a Citizens' Assembly on Democratic Expression focused on online safety, and 22 online and virtual roundtables across Canada, and consultations in 2020 held by the Minister of Justice, when he was Parliamentary Secretary to the Minister of Justice.
Associated links
- Proposed Bill to address Online Harms
- The Government's commitment to address online safety and hate
- What We Heard: The Government's proposed approach to address harmful content online
SOURCE Government of Canada
For further information (media only), please contact: Chantalle Aubertin, Deputy Director, Communications, Office of the Minister of Justice and Attorney General of Canada, 613-992-6568, [email protected]; Media Relations, Canadian Heritage, 1-819-994-9101, 1-866-569-6155, [email protected]; Media Relations, Department of Justice Canada, 613-957-4207, [email protected]