Facebook stands for free expression

At Facebook, we’re focused on addressing viral misinformation that could lead to imminent physical harm, like misleading health advice. We’ve built specific systems to remove threats such as child exploitation. In countries at risk of conflict, we take down content that could lead to imminent violence or genocide, and we’ve built systems that can detect risks of self-harm within minutes.

There are diverging views on what people consider dangerous. If someone shares a video of a racist attack, are they condemning it or glorifying it? Are they using normal slang, or repurposing an innocent word to incite violence? Now multiply those challenges across more than 100 languages.

Or take misinformation. No one tells us they want to see misinformation. But people enjoy satire, and they often tell stories that have inaccuracies but speak to a larger truth. Even with a common set of facts, media outlets emphasize different angles. I worry about an erosion of truth, but I don’t think most people want a world where you can share only things that tech companies judge to be 100% true.

I don’t think it’s right for a private company to censor politicians or the news in a democracy.
