Expecting Internet platforms to eliminate hate and harassment is likely to disappoint. As the number of users climbs, community management becomes ever more difficult. If moderators err just 0.01 percent of the time, that could still mean tens of thousands of mistakes. And for a community looking for clear, evenly applied rules, mistakes are frustrating. They erode trust. Turning to automation to enforce standards leads to a lack of human contact and understanding. No one has figured out the best place to draw the line between bad and ugly — or whether that line can support a viable business model.
So it’s left to all of us to figure it out, to call out abuse when we see it. As the trolls on Reddit grew louder and more harassing in recent weeks, another group of users became more vocal. First a few sent positive messages. Then a few more. Soon, I was receiving hundreds of messages a day, and at one point thousands. These messages were thoughtful, well-written and heartfelt, in stark contrast to the trolling messages, which were usually made up of little more than four-letter words. Many shared their own stories of harassment and thanked us for our stance.
The writers of these messages often said they could not imagine the hate I was experiencing. Most apologized for the trolls’ behavior. And some apologized for standing on the sidelines. “I didn’t do anything, and that is why I am sorry,” one user wrote. “I stayed indifferent. I didn’t attack nor defend. I am sorry for my inaction. You are a human. And no one needs to be treated like you were.” Some apologized for their own trollish behavior and promised they had reformed.