Frequently Asked Questions

Make sense of Bill C-63. Our FAQ below breaks down the essential topics and helps you navigate the debate around C-63.

  • The Bill regulates large social media platforms, which it defines as “websites or apps that help people communicate and share content online, including adult content sites and live streaming platforms.”

    Private messaging features on platforms are excluded from the regulation.

  • The Bill identifies seven types of harm that fall within its scope. 

    1. Content that sexually victimizes a child or revictimizes a survivor;

    2. Intimate content communicated without consent;

    3. Content used to bully a child;

    4. Content that induces a child to harm themselves;

    5. Content that incites hatred;

    6. Content that incites violence; and

    7. Content that incites violent extremism or terrorism.

  • The Bill creates four duties and a data transparency requirement.

    1. Duty to Act Responsibly

    • Main goal: Minimize the risk of harmful content without eliminating it entirely, while protecting free speech.

    • Services must submit a Digital Safety Plan. 

    • Services must report on:

      • How they meet regulations;

      • Measures for protecting children;

      • The amount of harmful content moderated;

      • User complaints and feedback.

    • Platforms must provide tools for blocking or flagging harmful content, inform users about flagged content, and label automated content (i.e., bots).

    2. Duty to Protect Children

    • Main goal: Operators must ensure that age-appropriate design features are in place to protect children.

    3. Duty to Make Certain Content Inaccessible

    • Operators must remove:

      • Content that sexually victimizes a child or revictimizes a survivor, and

      • Non-consensual intimate content.

    • Suspected harmful content must be made inaccessible within 24 hours, and the user who posted it must be notified. Users can appeal the decision if they believe content was wrongly removed.

    4. Duty to Keep Records

    • Operators must maintain records and data. 

    Data Transparency Requirement

    • Regulated services must share data with qualified individuals for research purposes. 

    • Researchers can hold services accountable for the accuracy of their Digital Safety Plans and content moderation practices.

  • 1. Amendments to the Criminal Code to:

    • create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;

    • create a recognizance to keep the peace relating to hate propaganda and hate crime offences;

    • define “hatred” for the purposes of the new offence and the hate propaganda offences; and

    • increase the maximum sentences for hate propaganda offences.

    2. Amendments to the Canadian Human Rights Act to:

    • provide that it is a discriminatory practice to communicate, or cause to be communicated, hate speech by means of the Internet or any other means of telecommunication, in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination;

    • authorize the Canadian Human Rights Commission to accept complaints alleging that discriminatory practice; and

    • authorize the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.

    3. Amendments to an Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to:

    • clarify the types of Internet services covered by that Act;

    • simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;

    • require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;

    • extend the period of preservation of data related to an offence;

    • extend the limitation period for the prosecution of an offence under that Act; and

    • add certain regulation-making powers.

  • The Bill creates three new bodies:

    1. Digital Safety Commission of Canada

    • Promotes online safety and reduces harm from harmful online content by:

      • Enforcing the Act and ensuring transparency from operators.

      • Investigating complaints about harmful content.

      • Developing online safety standards through research and education.

      • Facilitating participation from Indigenous peoples and collaborating with stakeholders.

    • Composed of three to five full-time members appointed by the Governor in Council.

    2. Digital Safety Ombudsperson of Canada

    • Supports users and advocates for the public on online safety issues by:

      • Gathering information on online safety and harmful content.

      • Publicly sharing non-personal information gathered.

      • Directing users to relevant resources.

    3. Digital Safety Office of Canada

    • Supports the Commission and Ombudsperson by managing their operations.

    • Headed by a CEO appointed by the Governor in Council, responsible for daily management and oversight.

  • The regulator, the Digital Safety Commission, has two main powers:

    1. Investigative Powers:

    • Can compel people to appear and provide testimony or documents under oath.

    • Can hold public or private hearings to investigate issues.

    • Appoints inspectors to verify compliance with the Act. Inspectors can enter places to gather relevant documents or information, even remotely, if the owner agrees.

    2. Power to Issue Orders:

    • The Commission can issue orders to operators to ensure compliance if it believes they are violating the Act.

    • Orders can be made enforceable in Federal Court, where they can be executed like any court order.

  • Punishments vary depending on whether the conduct is treated as a violation or an offence.

    Violations

    • Penalties: Administrative monetary penalties for non-compliance with the Act, including:

      • Violating the Act, Commission orders, or inspector requirements, or making false statements.

      • Obstructing Commission or inspector actions.

    • Maximum Penalty: Up to 6% of gross global revenue or $10 million, whichever is greater (a worked example follows at the end of this answer).

    • Considerations include the nature of the violation, compliance history, benefits gained, ability to pay, and more.  

    Offences

    • Offences: Operators commit an offence if they:

      • Violate Commission orders, obstruct the Commission or inspectors, or make false statements.

    • Penalties:

      • On conviction: Up to 8% of gross global revenue or $25 million on indictment, or up to 7% of gross global revenue or $20 million on summary conviction.

      • Personal liability: Individuals and non-individuals alike face fines, based on gross revenue or fixed amounts, depending on the type of conviction.

    • Operators may avoid liability by proving they exercised due diligence.

    • There is also personal liability for individuals who commit an offence.
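
    To make the "whichever is greater" penalty caps concrete, here is a minimal illustrative sketch in Python. It is not part of the Bill: the function names are ours, the dollar figures and percentages come from this FAQ, and we assume the greater of the two amounts also applies to offence penalties, as it does for violations.

      # Illustrative only: computes the maximum penalty caps described above.
      # Figures come from this FAQ; function names are hypothetical.

      def max_violation_penalty(gross_global_revenue: float) -> float:
          # Administrative monetary penalty: 6% of gross global revenue
          # or $10 million, whichever is greater.
          return max(0.06 * gross_global_revenue, 10_000_000)

      def max_offence_penalty(gross_global_revenue: float, by_indictment: bool) -> float:
          # Offence penalty: 8% or $25M on indictment, 7% or $20M on summary
          # conviction (assuming the greater amount applies, as for violations).
          if by_indictment:
              return max(0.08 * gross_global_revenue, 25_000_000)
          return max(0.07 * gross_global_revenue, 20_000_000)

      # Example: an operator with $1 billion in gross global revenue.
      revenue = 1_000_000_000
      print(max_violation_penalty(revenue))       # 60000000.0 (6% exceeds the $10M floor)
      print(max_offence_penalty(revenue, True))   # 80000000.0
      print(max_offence_penalty(revenue, False))  # 70000000.0

    For a smaller operator, the fixed floor dominates instead: at $50 million in revenue, 6% is only $3 million, so the violation cap stays at $10 million.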

  • Yes. A person in Canada can submit complaints to the Commission about harmful content on a regulated service or the operator's compliance with the Act.

    Complainants who work for the operator are also protected.

    Complaints about content that sexually victimizes a child, revictimizes a survivor, or involves intimate content shared without consent may be investigated. If not dismissed, the Commission will:

    • Notify the operator and user involved.

    • Order the operator to make the content inaccessible in Canada until a decision is made.

    The Commission will then determine if the content falls under these categories and may order permanent removal.

  • The Minister must review the Act every five years, starting five years after it comes into force. A report on the review must be presented to Parliament within one year of its completion.