As of today, Facebook officially no longer allows the pages "Violently Raping Your Friend Just for Laughs" and "Kicking your Girlfriend in the Fanny because she won't make you a Sandwich" to exist on its social network, and — what do you know? — it only took a well-publicized media campaign and angry advertisers to do it. In a message to its users, Facebook has outlined a new policy for dealing with violent and hateful speech to better handle — though not outright ban — this kind of "distasteful humor," which, of course, raises the question: How come Facebook wasn't doing anything about this in the first place? Here are three explanations:

Free Speech: Like so many online platforms (ahem, Reddit), Facebook's previous policy allowed its users to bend free speech into an excuse for sexism. The social network doesn't explicitly ban "offensive" or "controversial" content because that would, in a way, violate its users' rights. "We work hard to remove hate speech quickly, however there are instances of offensive content, including distasteful humor, that are not hate speech according to our definition," explains Facebook's company message, in the form of a Facebook Note. But this is the same Internet that produced a photo of Rihanna's bloodied face with the caption "Chris Brown's Greatest Hits" — and a picture of a man holding a rag over a woman's mouth that read: "Does this smell like chloroform to you?"

Facebook now realizes — unlike, say, the leaders of 4chan, or Reddit, which was also forced into a half-apology — that free speech only goes so far. "That being said, we realize that our defense of freedom of expression should never be interpreted as license to bully, harass, abuse or threaten violence," continues the company note, posted last night by Facebook Global Policy VP Marne Levine.

Technology: Even with Facebook's generous leeway for "offensive" or "controversial" posts, that stuff shouldn't really have gotten through the system. Some of the sexist postings could have fallen under the "hate speech" umbrella, which, by Facebook's own admission, has a notably vague definition. "There is no universally accepted definition of hate speech," reads the note. But: "As a platform we define the term to mean direct and serious attacks on any protected category of people based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or disease." By that definition, Facebook rape groups should have been axed in the first place.

But Facebook's backend wasn't catching the creeps, the company admitted: "In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate. In other cases, content that should be removed has not been or has been evaluated using outdated criteria." In other words, Facebook's technology wasn't working the way it should have. 

Accountability: The social network suggests that the creators of these hate-filled and violent postings didn't face enough accountability for their actions. So, much like its already-strict real-name policy, for murkier content that might fall on the free-speech side of things, Facebook will add requirements for posters of these "humorous" rape jokes. Such as: "A few months ago we began testing a new requirement that the creator of any content containing cruel and insensitive humor include his or her authentic identity for the content to remain on Facebook."

Even though the new policy doesn't ban rape jokes, per se, women's and anti-defamation groups have taken the company release as a good sign. They do say acceptance is the first step.