There’s plenty of hate speech on the Internet, and while some people think it’s funny, it’s starting to look like Facebook isn’t exactly laughing. Facebook already has policies against hate speech, but today it responded to the Everyday Sexism Project and the boycott described in the group’s open letter from last week.
In its blog post, Facebook discusses the difficulty of defining what exactly qualifies as hate speech:
While there is no universally accepted definition of hate speech, as a platform we define the term to mean direct and serious attacks on any protected category of people based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or disease. We work hard to remove hate speech quickly, however there are instances of offensive content, including distasteful humor, that are not hate speech according to our definition. In these cases, we work to apply fair, thoughtful, and scalable policies.
Facebook plans to take action in a few different ways. First, it will improve how reports of hate speech on the platform are handled; second, it will “update the training for the teams that review and evaluate reports of hateful speech or harmful content on Facebook”; and third, it will work with groups that focus on these issues to strike a balance between freedom of speech and tolerance.
It’s difficult to moderate hate speech on a site built on user-created content, but Facebook seems to be taking the proper steps toward dealing with the problem.