Can Facebook Manage the Mental Health and Wellbeing Challenges of its Content Moderators?
Case Code: HROB234 | Case Length: 10 Pages | Period: 2018-2022 | Pub Date: 2022 | Teaching Note: Available
Price: Rs.300 | Organization: Facebook Inc. | Industry: Technology & Communications | Countries: United States, Ireland, India | Themes: Stress & Motivation, Job Satisfaction, HR Policy, Performance Management
Excerpts
Issues with Content Moderators
Facebook content moderators repeatedly complained about inhumane conditions at the facilities of Facebook’s contractors. They alleged that while full-time Facebook employees were treated well, those working through outsourcing partners did not receive the same benefits or pay, simply because they did not qualify as Facebook employees. Yet they were often given the thankless and brutal job of keeping Facebook clear of hate speech, graphic violence, and images of child abuse, as well as beheadings, terrorist attacks, suicides, and self-harm.
Measures taken by Facebook
To address questions, misunderstandings, and accusations around its content review practices, work, and pay, Facebook put in place various new measures for content moderators. “We want to continue to hear from our content reviewers, our partners and even the media – who hold us accountable and give us the opportunity to improve...
Road Ahead
Content moderation played a key role in maintaining and attracting users to Facebook’s social media business. As the number of users grew, the task of cleaning up user-uploaded content had scaled up steadily since 2007, requiring 15,000 moderators, including contract workers, who spent all day wading through three million posts per day.
Exhibits
Exhibit I: Major Contract Partners
Exhibit II: Growth of Facebook’s Content Moderators