
Bill C-63 Explained: Helping or Harming Canada?

Why do we even need an Online Harms Act or Bill C-63?

There is growing concern about the impact of social media on mental health, especially among adolescents. Studies have shown a correlation between problematic social media use and poorer mental health outcomes, including psychological symptoms, emotional problems, and reduced life satisfaction. A Statistics Canada study found that around one-fifth of social media users aged 15 to 64 reported losing sleep, getting less physical activity, or having trouble concentrating on tasks or activities as a result of their social media use. The Centre for Media, Technology and Democracy’s research found that almost all Canadian youth have seen hate on their social media feeds and experienced the effects of misinformation in their communities.

A study commissioned by the youth charity Ditch the Label analyzed 263 million conversations in the UK and US between 2019 and mid-2021. It found over 50 million discussions about, or examples of, racist hate speech in that time. Another study, conducted by the Institute for Strategic Dialogue (ISD) in 2020, concluded that Canadians were using more than 6,600 online channels, pages, groups, and accounts across several social media platforms to spread white supremacist, misogynistic, or other extremist views. While calls for genocide are less common, they do occur. However, it’s important to note that definitions of what constitutes hate speech or a call for genocide vary.


Another study, by the Cyber Civil Rights Initiative, found that 37% of the 980 respondents who had shared a nude video or photo of themselves with a partner had been victims of revenge porn. Even more disturbing, 93% of victims (mostly women aged 18-30) said they had suffered significant emotional distress because of it. 49% said they had been harassed or stalked online by users who had seen their material, and 30% said they had been harassed or stalked outside the Internet (in person or over the phone) by users who had seen the material online. More than half had contemplated or attempted suicide as a result.

Data from Statistics Canada consistently indicates that Jews are the religious minority most frequently targeted by hate crimes and are the second most targeted group overall. The internet is a primary space where anti-Semitism thrives with little restraint.

There is no question that the internet is full of hate and can be used to perpetrate harm. Currently, there is no remedy other than going to the police, whose hands are tied, or suing for civil damages, which is costly, time-consuming, and guarantees no result. The question is: what can we do to protect the most vulnerable and prevent harm and hatred from spreading, without infringing on free speech or other protected rights?

What is the Canadian Online Harms Act, or Bill C-63?

The Canadian Online Harms Act (Bill C-63) is a significant piece of legislation introduced by the Government of Canada to promote online safety, reduce online harms, and ensure transparency and accountability from social media services in Canada. Here are some key points about the bill:

  • The bill was introduced for its First Reading on February 26, 2024. It differs significantly from the approach floated in the 2021 consultation.
  • The purpose of the bill is to promote the online safety of persons in Canada, reduce harms caused to Canadians as a result of harmful content online, and ensure that the operators of social media services are transparent and accountable with respect to their duties.
  • The bill establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce the Act.
  • It creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services and advocate for the public interest in relation to online safety.
  • The bill imposes on the operators of social media services a duty to act responsibly in respect of the services they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to seven types of harmful content on their service:
    1. Child sexual abuse material (CSAM) or content that revictimizes a survivor (the bill also strengthens mandatory reporting of child pornography);
    2. Content encouraging children to harm themselves;
    3. Content used to bully children;
    4. Non-consensually distributed intimate images (NCDII), also known as revenge porn;
    5. Content that incites violence;
    6. Content that incites violent extremism or terrorism;
    7. Content that foments hatred.
  • The bill applies to platforms such as Facebook, X, Pornhub, and Twitch.
  • The precise application of the Act will not be known until regulations are promulgated.

This bill represents a significant expansion of Canada’s hate speech laws and creates one of North America’s most stringent regulatory environments for media and social media companies. It aims to hold online platforms accountable for the content they host and to create stronger protections for kids online.

C-63: The Good, the Bad, and the Ugly

The idea behind the bill is laudable: protect children, foster accountability on social media platforms, and stem the flow of violent hate speech. The tricky part is how to do that. The legislation must be effective enough to have real consequences and provide the tools to accomplish these tasks, while being careful not to infringe on protected rights like free speech. Let’s delve into how it aims to do this and some of the areas of concern surrounding this proposed legislation.


The Good

Bill C-63, the new Online Harms Act, is well-crafted legislation that understands the internet and how platforms will react. Here’s what it does right:

The Act focuses on the right targets – large social media and streaming services, leaving most of the web untouched. These platforms are asked to create and share their own strategies to mitigate risks. They must account for any of the seven illegal harms happening on their platforms and show us how they’re tackling them. It requires platforms to make non-consensually distributed intimate images and child sex abuse material inaccessible within 24 hours.

If social media services contravene the Act, such as by failing to take down content or not implementing the measures set out in their digital safety plans, they may be liable for administrative monetary penalties of up to 6% of gross global revenue or C$10 million, whichever is greater. These penalties are intended to ensure that social media platforms take their responsibilities seriously and comply with the Act.

If C63 becomes law, we won’t just rely on what platforms tell us. Researchers can use anonymized data from platforms to study what’s really happening. This will give us a clear understanding of how and why harmful content spreads online, something we’ve lacked so far.

The Act rightly takes a tough stance on the worst and most easily identifiable content, like child abuse material and non-consensually shared adult content. Platforms will be required to take down such content automatically.

The Act also makes sense when it comes to children’s safety. It doesn’t try to make the whole internet safe for kids, but it does ask platforms to use standard safety features for children’s accounts. This recognizes that young people have rights to expression and privacy, but also face unique risks.

Finally, C-63 avoids many bad ideas and learns from other countries’ experiences. It doesn’t threaten private messaging or encrypted communications, doesn’t require the mandatory takedown of any content beyond child sexual abuse material and non-consensual intimate images, and doesn’t call for proactive surveillance of all Canadians. Instead, it relies on the user report systems that most platforms already have, with clear rules on how they should support their users.

The Bad

Bill C63, the Online Harms Act, has been bundled with significant changes to Canada’s Criminal Code and Human Rights Act. These changes could potentially suppress Canadians’ lawful online speech.

The bill introduces a new definition of “hatred”: “the content of a communication that expresses detestation or vilification of an individual or group of individuals on the basis of prohibited grounds of discrimination”. The bill does not provide a specific definition of genocide. Harmful content, which includes hate propaganda, is defined in the legislation as content that incites violence, foments hatred, incites violent extremism or terrorism, is used to bully a child, sexually victimizes a child, induces a child to harm themselves, or involves intimate content communicated without consent. It’s important to note that the precise application of these definitions will be determined by the courts in accordance with Canadian law.

The changes include life imprisonment for promoting genocide or for committing a new ‘hate offense’. The bill would also allow Canadians to file complaints about others’ online speech and seek penalties of up to $70,000, with up to $20,000 payable to them as potential victims. This process doesn’t have to follow normal court evidence rules or assess whether the offending speech was true.

Beverley McLachlin, Canada’s former Chief Justice, has said these proposals could face serious constitutional challenges. They seem to go too far, fail basic justice tests, and risk being misused to silence lawful speech.

Moreover, changes to the Criminal Code introduce a new ‘pre-crime’. A judge and attorney general could impose bond conditions on someone they believe is at high risk of voicing online hate, even if they’ve never committed a crime. The government justifies this ‘pre-crime’ as similar to existing peace bond conditions for those at risk of committing terrorism or domestic violence. But these are for clear and imminent risks of violence to specific people, not for speech that could theoretically harm someone. There’s no direct comparison.

Many, including civil liberties advocates, academics, and Canada’s largest newspaper, criticize these proposals as poorly designed, disproportionate, and out of place in Bill C-63.

The Ugly

We don’t need to discard the beneficial aspects of C-63 just to eliminate its problematic parts. It would be unwise to reject all the well-thought-out, balanced sections of Bill C-63, which aim to tackle the worst aspects of the Internet, because of a deeply flawed but non-essential component of the Bill.

While it’s true that more should be done to address online hate, it’s also valid to have concerns about the approach of Bill C-63. Many people, including myself, are disturbed by some posts on the Internet. There’s a lot of “lawful but awful” speech – blatantly offensive and disrespectful expressions, and subtle cues that hint at unspoken hateful thoughts.

People will naturally have different views on whether some speech is just offensive and terrible, harmful to people, or pushing an important social debate. This is even more evident in the social climate of 2024. Some European democracies have stricter hate speech laws than Canada, while our neighbors to the south have none; both approaches seem to work in thriving democracies with broad expression and participation.

However, no other Western democracy has imposed “pre-crime” restrictions on people for the risk of offensive speech, as opposed to direct violence. No wise democracy overloads their human rights adjudication system with complainants who not only bear no personal cost for frivolous or malicious complaints but also have a direct financial incentive to try their luck. And a thoughtful legislature does not create new life imprisonment offenses without a clear understanding of why they’re doing so or which existing crimes they believe need to be handled differently.

Coming from a community that has consistently experienced antisemitism and genuine threats, I believe we need strategies to tackle online harms. Given the dramatic rise in antisemitism, this legislation comes at a time when it is most needed. However, three concerns stand out. First, the definitions risk being interpreted too broadly, which could impact freedom of expression. Second, the Digital Safety Commission, which will be primarily responsible for enforcing the law, is granted immense power. The range of powers is striking: it can rule on making content inaccessible, conduct investigations, hold hearings that can sometimes be closed to the public, establish regulations and codes of conduct, and impose penalties of up to 6% of the global revenue of services that violate the law. There’s a lot to consider here, and questions about the Commission’s oversight and accountability are crucial. Lastly, the provisions amending the Criminal Code and the Canadian Human Rights Act need thorough examination, as they propose penalties as severe as life imprisonment and could lead to a surge in hate speech complaints.

Even if you think Canada’s current approach to illegal hate speech could be stricter, I believe you should see C-63’s approach to speech as extreme and inappropriate in its current form. If any changes to Canada’s hate laws are needed, they should be as carefully planned and tested for problems as the rest of C-63, not hastily added alongside more mature measures.

What should be done?

Express your concerns about Bill C-63 to your MP, but also impress upon them that the bill is very much needed. It’s important for every MP to understand that there’s a beneficial way forward with Bill C-63. Canadian women and children shouldn’t have to wait for essential protections from bullying and abuse while the government debates the bill’s severe speech-punishment provisions. My suggestion: divide Bill C-63, retain the positive aspects, and move quickly to fix the negative ones.

Sources

Parliament of Canada BILL C-63
https://www.parl.ca/DocumentViewer/en/44-1/bill/C-63/first-reading

Proposed Bill to address Online Harms
https://www.canada.ca/en/canadian-heritage/services/online-harms.html

Canada’s new Online Harms Act (C-63): what you need to know
https://www.osler.com/en/resources/regulations/2024/canada-s-new-online-harms-act-c-63-what-you-need-to-know

Explaining Bill C-63, The Online Harms Act: An OpenMedia FAQ
https://openmedia.org/article/item/explaining-bill-c-63-the-online-harms-act-an-openmedia-faq

Study: Frequent social media use disrupts sleep, physical activity in teen girls
https://publications.aap.org/aapnews/news/13874/Study-Frequent-social-media-use-disrupts-sleep

The effect of social media interventions on physical activity and dietary behaviours in young people and adults: a systematic review
https://ijbnpa.biomedcentral.com/articles/10.1186/s12966-021-01138-3

The Centre for Media, Technology and Democracy
https://www.mediatechdemocracy.com/

DITCH THE LABEL: BUILT ON OVER A DECADE OF RESEARCH
https://www.ditchthelabel.org/research-report

2017 NATIONWIDE ONLINE STUDY OF NONCONSENSUAL PORN VICTIMIZATION AND PERPETRATION
https://www.cybercivilrights.org/wp-content/uploads/2017/06/CCRI-2017-Research-Report.pdf

An Online Environmental Scan of Right-wing Extremism in Canada
https://www.isdglobal.org/isd-publications/canada-online/

Police-reported hate crime in Canada, 2020
https://www150.statcan.gc.ca/n1/pub/85-002-x/2022001/article/00005-eng.htm

Canadians’ assessments of social media in their lives
https://www150.statcan.gc.ca/n1/pub/36-28-0001/2021003/article/00004-eng.htm
