But where there are people on the Internet, there is also horrible, negative commentary. For simplicity's sake, we will refer to communities that support racist or otherwise discriminatory viewpoints as hate speech communities. In the wake of the recent upheaval at Reddit, the new leadership has committed to helping moderators mitigate these types of speech. This is a great step, as combating it on our own is untenable.

The things people say online, with or without a connection to their real identity, provide a disturbing glimpse into some of the darker parts of humanity. Many times, our female moderators have endured rape threats in public comments and private messages (though it's not always limited to women). Most moderators on our teams have been called disgusting and racist names, threatened with physical violence, and some have been subjected to doxxing attempts. When users are unsure of a moderator's gender, they tend to hurl rape threats and insults based on sexual orientation, often at the same time for added effect. More than once, female moderators were told they needed to just "shut up, lay back, and be a good woman."

Social media platforms have deployed a few different strategies for coping with this; most revolve around the outright banning of content. Sites like Facebook and YouTube hire thousands of content moderators to keep the most unsavory elements out of their feeds. Instagram uses banned hashtags to mitigate pornography. Tumblr banned blogs that encouraged self-harm. But the fundamental issue with content bans is that they are inherently limiting and fail to kill off the "group think" that drives the content in the first place. The problem with controversial content or hate speech is typically tied to how social media is built: users pick out viewpoints that reflect their own, exclude other ideas, and exist in echo chambers that amplify and reinforce their thoughts.

Once a community like this hits critical mass, the ideas it screams propagate across the site, and its users take great offense at dissenting or competing viewpoints. This is not limited to "hate speech" but applies to all ideas. The behavior gets noticed when ideas that would typically be frowned upon are suddenly being screamed through a bullhorn. The ACLU recently did an AMA relating to the one-year anniversary of the Michael Brown incident in Ferguson. This AMA was linked to in some of the racist communities on Reddit and in their related off-site chat rooms. As a result, the AMA was flooded with comments suggesting blacks have lower IQs than whites and are inherently prone to violence, among other things. These are not the kinds of comments you typically see in an AMA, yet due to the growth of these communities, the ideas are rapidly spreading across Reddit. Reddit is navigating relatively uncharted waters now.