A reporter from the Guardian was selected to participate in a survey on Facebook over the weekend. The questions he was asked surprised him. “In thinking about an ideal world where you could set Facebook’s policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures,” Facebook asked. The possible answers to this question ranged from “no opinion” to “this content should not be allowed.” Notably missing from the list of options: call the police.

A follow-up question asked users who should decide whether this content was viewable on the site. Again, the possible answers ranged from Facebook itself to Facebook’s users. Not mentioned in the follow-up: the laws against grooming and soliciting children.

The survey went on to ask similar questions about extremism. A Facebook VP responded on Twitter, suggesting the company would, of course, contact authorities in cases of such behavior.

From the Guardian:

In a statement, a Facebook spokesperson added: “We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey.

“We have prohibited child grooming on Facebook since our earliest days; we have no intention of changing this and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice.”

So why wasn’t the response Facebook claims it actually takes in these cases an option in the survey? And if having Facebook set policy on illegal behavior isn’t really on the table, why pose it as a possible answer to a question about how to respond?

The premise of both questions seems to posit an alternate reality in which Facebook is not part of the real world and therefore not subject to the existing laws that already regulate such behavior. None of these rules are up to Facebook, because they have already been set by law. It’s very odd that no one preparing this survey seems to have thought about that.