Dispatches investigation reveals how Facebook moderates content
An undercover investigation by Firecrest Films for Channel 4 Dispatches has revealed for the first time how Facebook decides what users can and can’t see on the platform (Inside Facebook: Secrets of the Social Network, Channel 4 Dispatches, 9pm, 17 July). Dispatches’ investigation reveals:
- Violent content, such as graphic images and videos of assaults on children, remaining on the site despite being flagged by users as inappropriate and despite requests to have it removed.
- Thousands of reported posts remaining unmoderated and on the site while we were filming, beyond Facebook’s stated aim of a 24-hour turnaround, including posts potentially relating to suicide threats and self-harm.
- Moderators told not to take any action if content shows a child who is visibly below Facebook’s minimum age of 13, rather than reporting it as posted by an underage user, even if the content includes self-harming.
- Allegations from an early Facebook investor and mentor to Mark Zuckerberg that Facebook’s business model benefits from extreme content, which engages viewers for longer and generates higher advertising revenue.
- Pages belonging to far-right groups with large numbers of followers allowed to exceed the deletion threshold and given different treatment, in the same category as pages belonging to governments and news organisations.
- Policies allowing hate speech towards ethnic and religious immigrants, and trainers instructing moderators to ignore racist content in accordance with Facebook’s policies.
Dispatches sent an undercover reporter to work as a content moderator in Facebook’s largest centre for UK content moderation. The work is outsourced to a company called Cpl Resources plc in Dublin, which has worked with Facebook since 2010. The investigation reveals the training given to content moderators, demonstrating how they decide whether content reported by users, such as graphic images and videos of child abuse, self-harming and violence, should be allowed to remain on the site or be deleted. Dispatches also films day-to-day moderation of content on the site, revealing:
Violent content:
One of the most sensitive areas of Facebook’s content rulebook concerns graphic violence. When dealing with graphically violent content, moderators have three options – ignore, delete, or mark as disturbing, which places restrictions on who can see the content.
Dispatches’ undercover reporter is seen moderating a video showing two teenage schoolgirls fighting. Both girls are clearly identifiable and the video has been shared more than a thousand times. He’s told that, under Facebook’s rules, because the video has been posted with a caption condemning the violence and warning people to be careful about visiting the location where it was filmed, it should not be deleted and should instead be left on the site and marked as disturbing content. Dispatches speaks to the mother of one of the girls involved, who tells the programme about the distress the video caused her daughter and who struggles to understand the decision to leave it on the site. “To wake up the next day and find out that literally the whole world is watching must have been horrifying. It was humiliating for her, it was devastating for her. You see the images and it’s horrible, it’s disgusting. That’s someone’s child fighting in the park. It’s not Facebook entertainment.”
Facebook told Dispatches that the child, or the parent of a child, featured in videos like this can ask for them to be removed. Richard Allan, VP of Public Policy at Facebook, said: “Where people are highlighting an issue and condemning the issue, even if the issue is painful, there are a lot of circumstances where people will say to us, look Facebook, you should not interfere with my ability to highlight a problem that’s occurred.”
Online anti-child abuse campaigner Nicci Astin tells Dispatches about another violent video which shows a man punching and stamping on a toddler. She says she reported the video to Facebook in 2012 and received a message back saying it didn’t violate its terms and conditions. The video is used during the undercover reporter’s training period as an example of what would be left up on the site, and marked as disturbing, unless posted with a celebratory caption. The video is still up on the site, without a graphic warning, nearly six years later. Facebook told Dispatches they do escalate these issues and contact law enforcement, and the video should have been removed.
One moderator tells the Dispatches undercover reporter: “If you start censoring too much then people lose interest in the platform…. It’s all about making money at the end of the day.”
Venture Capitalist Roger McNamee was one of Facebook’s earliest investors, a mentor to CEO Mark Zuckerberg, and the man who brought Sheryl Sandberg to the company. He tells Dispatches that Facebook’s business model relies on extreme content:
“From Facebook’s point of view this is, this is just essentially, you know, the crack cocaine of their product right. It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform. Facebook understood that it was desirable to have people spend more time on site if you’re going to have an advertising based business, you need them to see the ads so you want them to spend more time on the site. Facebook has learned that the people on the extremes are the really valuable ones because one person on either extreme can often provoke 50 or 100 other people and so they want as much extreme content as they can get.”
Richard Allan told Dispatches: “Shocking content does not make us more money, that’s just a misunderstanding of how the system works…. People come to Facebook for a safe secure experience to share content with their family and friends. The vast majority of those 2 billion people would never dream of sharing content like that, to shock and offend people. And the vast majority of people don’t want to see it. There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material. But I just don’t agree that that is the experience that most people want and that’s not the experience we’re trying to deliver.”
Underage users:
No child under 13 can have a Facebook account. However, a trainer tells the undercover reporter not to proactively take any action regarding a user’s age if the reported content contains an image of someone who is visibly underage, unless the user admits to being underage: “We have to have an admission that the person is underage. If not, we just like pretend that we are blind and we don’t know what underage looks like.” Even if the content contains images of self-harm, for example, and the image is of someone who looks underage, the user is treated as an adult and sent information about organisations which help with self-harming issues, rather than being reported for being underage: “If this person was a kid, like a 10-year-old kid we don’t care, we still action the ticket as if they were an adult.” Facebook confirmed to Dispatches that its policy is not to take action on content posted by users who appear to be underage, unless the user admits to being underage.
Hate speech:
Dispatches’ undercover reporter is told that, while content which racially abuses protected ethnic or religious groups violates Facebook’s guidelines, if the posts racially abuse immigrants from these groups, the content is permitted. Facebook’s training for moderators also presents, as an example of permitted content, a post containing a cartoon comment which describes drowning a girl if her first boyfriend is a negro. Facebook confirmed to Dispatches that the picture violates their hate speech standards and that they are reviewing what went wrong to prevent it from happening again.
“Shielded Review” – popular pages kept up despite violations:
Our undercover reporter is told that if any page is found to have five or more pieces of content that violate Facebook’s rules, the entire page should be taken down, in accordance with the company’s policies. But we have discovered that posts on Facebook’s most popular pages, with the highest numbers of followers, cannot be deleted by ordinary content moderators at Cpl. Instead, they are referred to the Shielded Review Queue, where they can be assessed directly by Facebook rather than Cpl staff. These pages include those belonging to jailed former English Defence League leader Tommy Robinson, who has over 900,000 followers and who has been given the same protected status as governments and news organisations. A moderator tells the undercover reporter that the far-right group Britain First’s pages were left up despite repeatedly featuring content that breached Facebook’s guidelines because “they have a lot of followers so they’re generating a lot of revenue for Facebook”. The Britain First Facebook page was finally deleted in March 2018 following the arrest of deputy leader Jayda Fransen.
Facebook confirmed to Dispatches that they do have special procedures for popular and high-profile pages, which include Tommy Robinson’s and included Britain First’s.
They say Shielded Review has been renamed ‘Cross Check’. Lord Allan told Dispatches: “If the content is indeed violating it will go…. I want to be clear this is not a discussion about money, this is a discussion about political speech. People are debating very sensitive issues on Facebook, including issues like immigration. And that political debate can be entirely legitimate. I do think having extra reviewers on that when the debate is taking place absolutely makes sense and I think people would expect us to be careful and cautious before we take down their political speech.”
Delays in moderating content:
Facebook’s publicly stated aim is to assess all reported content within 24 hours. However, during the period of the undercover filming, Dispatches found a significant backlog. Moderators told the undercover reporter that, due to the volume of reports, or tickets, they are supposed to moderate, they are unable to check up to 7,000 reported comments a day. At one point there is a backlog of 15,000 reports which have not been assessed, with some tickets still waiting for moderation up to five days after being reported. Facebook told Dispatches that the backlog filmed in the programme was cleared by 6 April.
…/ends