Facebook acknowledged today that its policies for managing offensive content may “seem contradictory” because they are built on the observations of experts from different fields, but said the company seeks to be “objective” when evaluating “gray areas”.
Monika Bickert, Facebook’s Head of Global Policy Management, responded in a statement to the leaked information on these policies recently published by the British newspaper The Guardian, which showed how difficult it is for reviewers to draw the line between the acceptable and the unacceptable.
Although Facebook does not share these internal policies by default, to prevent people from devising ways to circumvent them, Bickert argued that the firm does publish evolving community standards and also seeks expert advice.
“We are in constant dialogue with experts and local organizations on everything from child safety to terrorism to human rights. Sometimes this means our policies may seem contradictory,” she said.
“Experts on self-harm have advised us that it may be better to let live videos of this kind stay up so that people can be alerted and offer help, but that we must remove them afterward to avoid copycats,” she said.
The leaked documents revealed that last summer moderators faced more than 4,500 reports of self-inflicted harm in two weeks, and this year’s figures cited 5,400 in another two-week period, a growing number.
Even so, pornography and sextortion generate more potential cases on Facebook: according to another leaked document, these amounted to 54,000 in a single month, and they carry the most complex policies for reviewers.
The company, led by Mark Zuckerberg, announced in April that it would hire 3,000 new reviewers to tackle such content, amid incidents such as murder, torture and sexual assault being broadcast on the social network and seen by hundreds of people.
Bickert stressed that those responsible for reviewing potentially offensive content must overcome the “obstacle” of “understanding the context” around it.
“Someone posts a graphic video of a terrorist attack: will it inspire people to emulate the violence, or to speak out against it? Someone posts a joke about suicide: is it just their sense of humor, or a cry for help?” asked the Facebook executive, who was a prosecutor for more than a decade.
She said the company’s policy team tries to “stay objective” in cases that are not easy to review because they fall “in a gray area where people do not agree.”
For Facebook, she said, the benefits of sharing outweigh the risks. “But we also recognize that society is still deciding what is acceptable and what is harmful, and on Facebook we can be an important part of that conversation,” she concluded.