Facebook content moderators in Ireland met with Deputy Prime Minister Leo Varadkar today to demand work-from-home rights. They say the company forced them back into the office, even as COVID-19 cases spiked.
“We should be working from home just like Facebook employees,” said Ibrahim Halawa, a former political prisoner and law student who works as a Facebook content moderator. “We should have the same mental health care they get, the same benefits—but we don’t. Facebook can’t exist without us, and it should stop treating us as second-class citizens.”
Since December 30th, Ireland has been in a level five lockdown, meaning household visits are banned and nonessential retail businesses are closed. The country currently has 192,645 confirmed COVID-19 cases.
In August, Facebook announced employees would be able to work from home until July 2021. Two months later, content moderators working for the subcontracting firm CPL in Dublin were told to return to the office. The company had categorized them as essential workers, according to The Guardian. High-risk contractors were exempt, but those with vulnerable family members had to comply. Moderators had previously been told the office would close for 72 hours if there was a confirmed COVID-19 case. But The Guardian reported that after three cases, the office remained open.
During a press conference today, Halawa said Varadkar agreed to send a letter to Facebook and Covalen, a CPL subsidiary, inquiring about the disparate treatment between contractors and employees.
“They’ve made it abundantly clear that our health and safety and lives don’t matter to them,” said Paria Moshfeghi, another Facebook content moderator. “They’re forcing us into the office, putting us and our families at risk of COVID-19, even though our colleagues keep getting COVID. Facebook employees working on similar content to us are safe and allowed to work from home. Why aren’t we?”
Both Halawa and Moshfeghi stressed that other Facebook moderators want to come forward but are afraid to because of the strict NDAs workers are required to sign.
Tech companies have faced mounting pressure to moderate content during the coronavirus pandemic. In March, YouTube announced it would be relying more heavily on AI to enforce its content moderation policy. In September, the company said it was bringing back human moderators, noting AI hadn’t been as effective.
Facebook did not immediately respond to a request for comment from The Verge.
This article was first published on The Verge.