Understanding the fight
Want to learn more about why we're calling on Canadian leaders to make online harms legislation a priority now? This section walks you through everything you need to know: what we're asking for and why, what's already been tried, how Canada compares to our allies, and why your email to your MP matters more than you might think.
What are we actually asking for?
Accountability. We want platforms to be legally responsible for the systemic choices they make that encourage the harms they profit from. Food companies can't sell unsafe products. Car manufacturers must meet safety standards. Drug companies must prove their products safe in clinical trials before going to market. Every major industry is held accountable for the risks it creates, except Big Tech.
That has to change. Large platforms must assess the risks their services create for our welfare and our rights, publish plans to address them, and face real consequences if they're systematically negligent. That's a basic duty of care, and getting it right would give us the same accountability every other industry already has to provide.
What kinds of online harms are we talking about?
We’re asking for a duty of care that covers the worst of the worst: content that's already illegal. Child sexual abuse material. Intimate images shared without consent. Content that incites hatred or violence toward specific communities, targeting racialized groups, LGBTQ+ communities, or religious groups, and driving real-world harm offline. Content that bullies children or pushes them toward self-harm.
And increasingly, we need to see harms driven by AI addressed, as these tools are now being used to plot real-world violence, like the Tumbler Ridge tragedy, where the shooter used an AI chatbot to plan the attack months in advance and the platform failed to alert authorities before lives were lost.1
AI is amplifying these harms at a scale and speed we've never seen before. When we leave it to Big Tech to police themselves, we're not just hoping for the best—we're actively leaving Canadians' safety and rights unprotected. That is not a policy. That is an abdication.
But won't this just lead to censorship?
Not if a platform duty of care is defined appropriately in this legislation. We've spent years fighting for exactly that: a duty of care that requires platforms to think seriously about the impact of their systems on both our welfare and our rights. It's also why we've shut down proposals that would attack our rights as users by removing our posts, surveilling us en masse, or breaking our privacy.
We need to put the onus to change where it belongs: on how Big Tech companies design their systems, not on individual users. Large platforms are built on surveillance capitalism. They have no strong interest in protecting your rights or your safety, only in profiting from you. We cannot keep leaving Canadians' safety and rights in the hands of foreign corporations whose decisions are neither transparent nor accountable to anyone. Our government has a responsibility to step up. The question isn't whether to regulate; it's how.
Rights-respecting regulation means no surveillance of users, no encryption backdoors, no platform liability for individual posts, and no sweeping takedown powers that could silence legitimate speech. Safety measures must be proportionate in their impact on privacy and free expression—and debated and decided openly, not behind closed doors.
Hasn't Canada tried to regulate online harms before?
Yes, and we almost got there. Bill C-63, the Online Harms Act, was Canada's best attempt yet at holding platforms accountable while protecting our rights. The core of the bill was genuinely promising, but the government bundled it with controversial changes to the Criminal Code that raised serious free expression concerns. (View our explainer on Bill C-63 for more details.)
Like many rights advocates, we pushed hard to split the bill, keeping Part 1 while dropping the overreaching measures. In December 2024, the government finally agreed.2 Then Parliament was prorogued in January 2025,3 and the bill died. Years of hard-won progress, wiped out overnight.
Since then, the Carney government has shown little sign of picking up where we left off. Yet the risks posed by AI-era online harms are real and growing, and Canada cannot afford further delay in fighting them. We need to push this government to revive the progress made on Bill C-63, keep its best parts, and make it stronger, so Canadians get the protections we deserve.
Why does your voice matter most now?
The rules being written now will shape your life directly and profoundly. A surge of direct constituent emails on a specific issue gets noticed, and with AI dominating headlines, now is the time to be heard. If we stay silent, those rules get written by lobbyists and governments without public scrutiny. If we speak up now, we force the digital world we actually want to the top of the agenda at the moment it can make a real difference for Canadians.