Online harassment is a serious problem for every member of the online community, including companies like Change.org. While the majority of petitions are deeply inspiring stories of people power, we occasionally get a sobering reminder of the worst of the Internet when we see petitions calling for acts of suicide, or petitions used by students to bully their classmates.

We also hear first-hand from many of our petition starters, particularly women, about the terrifying bullying and safety threats they experience online, and offline, when they bring their causes into the spotlight.

As well as taking down content that violates our Community Guidelines, and addressing bullying in our product and policies, we work with our users, law enforcement and anti-bullying organizations to ensure that people who report serious cases of harassment get the support they need. As an open platform, balancing our users' freedom of speech with community safety is not easy, and as a small company of only 300 employees, we have a long way to go in this area. That's one of the reasons we consult with experts like the Family Online Safety Institute to make sure we're on the right track.

All open Internet platforms need to invest time and resources into addressing these critical user safety challenges. But the answer to this complex problem is not to tear down the protections offered to intermediaries in the U.S. under Section 230 of the Communications Decency Act, as a TechCrunch op-ed proposed last week — that would just see the rise of a different form of online harassment: forced silence.

Almost every country has laws around "intermediary liability." More interesting than they sound, these laws are foundational to protecting free speech online. They apply to social media platforms, service providers and other services that enable people to upload and share content online, and they establish the situations in which the intermediary platform becomes liable for its users' content. Put more simply, they determine when the messenger becomes responsible for the message.

U.S. law gives intermediaries immunity over most of their users’ content; this doesn’t just protect companies, it protects every single person who has ever spoken out about an issue online. Outside the U.S., harsh liability laws can compel platforms to silence their users, and this can have chilling consequences.

People use Change.org to speak out about deeply private injustices. In my home state of New South Wales in Australia, a 14-year-old girl used Change.org to speak out about the domestic violence she had experienced that led to her mother’s suicide. Josie’s petition resulted in statewide changes to the education curriculum: Domestic violence will now be taught in all schools, so students like her can seek help before it is too late. Josie is now campaigning for those education changes to be made nationally.

Harsher intermediary liability laws would make it too easy for such tragic, but difficult to verify, personal stories to be silenced. Australia, India and countries across the EU and Latin America have a “notice and takedown” regime where platforms can be compelled to remove content or face legal risks if just one person claims that the content is untrue or otherwise violates a law, even if that claim has no merit.

In addition, laws in many countries are unclear about whether you need a court to rule that a statement is defamatory in order for it to be removed from a website. This is particularly challenging because staff at Internet companies are not best placed to judge what’s true and false — that’s typically the role of the courts.

In Turkey, Russia, Thailand and Indonesia, intermediaries face even higher risks.

In Turkey, for example, entire websites can be shut down because of a single post that someone considers to be defamatory. Such harsh liability laws can discourage smaller Internet companies from entering new markets, because they might not be able to bear the risk. That’s particularly sad, considering that the countries with the harshest intermediary liability laws are often the ones that would benefit the most from new free-speech platforms that give people a place to tell their stories and connect with others.

Had someone objected to the details of Josie’s private story, Change.org might have been compelled to remove it, and she would not have been empowered the way that she was online.

Many organizations are raising awareness about this important issue. Civil society organizations came together in March and outlined the Manila Principles on Intermediary Liability, which argue that intermediaries should be given immunity over their users' content, as they have in the U.S., and that content should not be restricted without a judicial order. The United Nations included the importance of global reform in this area in a report on the rights to freedom of opinion and expression. Stanford University has put together a fantastic interactive map where people can learn about the intermediary liability laws in their country.

It's true that intermediary liability laws do compel platforms to act responsibly by providing a legal incentive to act. Yet overreach under these laws can cause myriad other problems, the greatest of which is that people will not have a safe space to speak out online. Internet companies must make every effort to create safe environments for their users, and we need to work with our users and expert organizations to come up with meaningful solutions to prevent and address distressing online harassment.

But the answer is not to force the hand of the messenger to remove content. Silencing the messenger silences the message, and can discourage millions of people worldwide from speaking out about the issues that matter to them most.

Article source: TechCrunch.com
