A security lapse that affected more than 1,000 workers forced one moderator into hiding – and he still lives in constant fear for his safety
Facebook moderators like him first suspected there was a problem when they started receiving friend requests from people affiliated with the terrorist organizations they were scrutinizing.
An urgent investigation by Facebook’s security team established that personal profiles belonging to content moderators had been exposed.
The investigation revealed that the moderators’ personal Facebook profiles had been automatically appearing in the activity logs of the groups they were shutting down.
In one exchange, before the Facebook investigation was complete, D’Souza sought to reassure the moderators that there was “a good chance” any suspected terrorists notified about their identity would fail to connect the dots.
“Keep in mind that when the person sees your name on the list, it was in their activity log, which contains a lot of information,” D’Souza wrote, “there is a good chance that they associate you with another admin of the group or a hacker …”
The bug was not fixed until 16 November 2016, two weeks later. By that point the glitch had been active for a month. Moreover, it had also retroactively exposed the personal profiles of moderators who had censored accounts as far back as August 2016.
Facebook offered to install a home alarm monitoring system and provide transport to and from work for those in the high-risk group. The company also offered counseling through its employee assistance program, over and above the counseling offered by the contractor, Cpl.
“Our investigation found that only a small fraction of the names were likely viewed, and we never had evidence of any threat to the people impacted or their families as a result of this matter,” the spokesman said.
He was paid just €13 ($15) per hour for a role that required him to develop specialist knowledge of global terror networks and sift through often highly disturbing material.
“You come in every morning and just look at beheadings, people getting butchered, stoned, executed,” he said.
The moderator said that when he started, he was given just two weeks’ training and was required to use his personal Facebook account to log into the social media giant’s moderation system.
In an attempt to boost morale among agency staff, Facebook launched a monthly award ceremony to celebrate its top performers. The prize was a Facebook-branded mug. “The mug that all Facebook employees get,” he noted.