Ex-Facebook Worker Claims Disturbing Content Led to PTSD

Discussion in 'General Discussion' started by Hadeel Abdelkariem, Dec 6, 2019.

  1. Hadeel Abdelkariem

    A former Facebook moderator is suing the company, alleging that his work scouring the site of violent and obscene content caused his post-traumatic stress disorder.

    Chris Gray, who now works as a tour guide, is seeking damages from both Facebook Ireland and CPL, the contracting firm that directly employed him. The case, filed on Wednesday in the Irish high court in Dublin, is thought to be the first time a former moderator has taken the social network to court.

    According to court documents, Gray’s work required him to review “approximately a thousand tickets per night”, initially focused on pornography, and later “on content that had been reported as being threatening, hateful, bullying, or otherwise dangerous”.

    Two years on, a number of specific pieces of content remain “particularly marked” in his memory, the legal writ says, including “a video in which a woman wearing an abaya is seen being stoned to death”, “a video in which persons, who appear to be migrants in Libya, are tortured with molten metal”, and “video footage of dogs being cooked alive”.

    Aggravating the trauma, Gray’s lawyers argue, was the fact that the nature of the work required him to “obsess” over particular videos. Facebook and CPL “valued accuracy … above all else”, and tracked whether individual calls were made correctly or not. But the system did not allow moderators “to hold or skip a ticket pending a decision from above”, requiring them instead to focus deeply on particular pieces of content, often featuring violent or upsetting material.

    The complaint details one video, for instance, “which collaged various scenes of people dying in different accidents … set to a musical soundtrack. [Gray] had a long argument with the quality point of contact [a senior role] about whether the music meant that the person posting it was ‘celebrating’ or whether it just counted as disturbing content.”

    Speaking to the Guardian before the case was filed, Gray said: “You would wake up and you’re remembering the video of someone machine-gunning people in the Middle East somewhere, trying to think whether there was an Isis flag, and so whether it should be marked as terrorism-related or not.

    “It took me a year after I left to realise how much I’d been affected by the job. I don’t sleep well, I get in stupid arguments, have trouble focusing.”

    Gray is being supported in his case by Foxglove, an international NGO that backs efforts to hold big tech to account through the legal system. Cori Crider, Foxglove’s director, said: “The reason we’ve got involved is that we think that social media factory floors are unsafe and need to be cleared up. In a decade we’re going to look back on this as we did at meat packing plants at the turn of the century.

    “Facebook’s only going to pay attention to things when they know that they’ve got a typhoon bearing down on them. What I’d like to see is the moderators realising how much power they have if they just organise. Because let’s face it, social media as we know it could not exist without the labour people like Chris provide.”

    In October, leaked audio from a Facebook all-staff meeting revealed Mark Zuckerberg describing reports about poor working conditions in the company’s moderation centres as “a little overdramatic”.

    “From digging into them and understanding what’s going on, it’s not that most people are just looking at just terrible things all day long,” Zuckerberg told employees. “But there are really bad things that people have to deal with, and making sure that people get the right counselling and space and ability to take breaks and get the mental-health support that they need is a really important thing. It’s something we’ve worked on for years and are always trying to probe and understand how we can do a better job to support that.”

    A Facebook spokesperson said: “We are committed to providing support for those that review content for Facebook as we recognise that reviewing certain types of content can sometimes be difficult. Everyone who reviews content for Facebook goes through an in-depth, multi-week training program on our Community Standards and has access to extensive psychological support to ensure their wellbeing. This includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to graphic material as much as possible. This is an important issue, and we are committed to getting this right.”

    Source