
The new Facebook job that no one really wants

You can spend your day in your pajamas surfing the internet and earning up to six figures — so long as you don’t mind watching suicide, rape and murder.

Facebook announced Wednesday that it will hire an additional 3,000 “content moderators” around the world in a bid to keep the worst of the web off its social media platform. That’s on top of the 4,500 employees who already work to identify hate speech, pornography, child exploitation and other violent and disturbing content.

The move follows a string of gruesome episodes that were livestreamed on Facebook, including a father in Thailand who hanged his 11-month-old baby girl before killing himself. The video lingered on the site for roughly 24 hours before it was finally removed.

“This is important,” CEO Mark Zuckerberg said in a post announcing the move. “Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”

But in trying to keep the web’s underbelly at bay, Facebook and other social media companies are subjecting a growing group of workers to trauma and emotional distress. Critics say the job can be debilitating, leaving moderators with “permanent videos in their heads” and a need to avoid everyday items that have become potential triggers, like knives and computers.

“They’re exposed to the worst things you can imagine,” said Ben Wells, a lawyer who is representing two former Microsoft moderators who claim they developed post-traumatic stress disorder. “If you can imagine the worst person in the world and what they would want to watch, that’s what they’re exposed to, whether it’s on a daily basis or very frequently. And some of this stuff you just cannot un-see and there are things you can’t get rid of. It can be debilitating.”

Henry Soto and Greg Blauert, two former Microsoft online safety program employees, claim the company failed to warn them of the dangers of the job and did not provide adequate psychological support. The men sued in December and are seeking damages, alleging negligence, disability discrimination and violations of the Consumer Protection Act.

Microsoft disagreed with the plaintiffs’ claims, saying it applies “industry-leading, cutting-edge technology” to identify questionable content and bans the users who shared that material. The company also said it has “robust wellness programs” to ensure employees who view the content are properly supported, Courthouse News Service reported.

A hearing on Microsoft’s motion to dismiss the claim is scheduled for June, Wells said.


As part of his job, Soto saw “many thousands of photographs and video of the most horrible, inhumane and disgusting content you can imagine,” according to the lawsuit.

“In fact, many people simply cannot imagine what Mr. Soto had to view on a daily basis as most people do not understand how horrible and inhuman the worst people in the world can be,” the lawsuit reads. “Mr. Soto’s supervisors knew that the work was hard on Mr. Soto and he was commended in his employee review for his ‘courage.’”

According to the suit, Soto started having auditory hallucinations after seeing footage of a girl being abused and murdered, and ultimately went on medical leave in February 2015.

“Soto was embarrassed by his symptoms, which included panic attacks in public, disassociation, depression, visual hallucinations, and an inability to be around computers or young children, including, at times, his own son, because it would trigger memories of horribly violent acts against children that he had witnessed,” the lawsuit reads.

Wells said he “wouldn’t assume” that Facebook could be at risk of being named in similar lawsuits going forward, but noted that content moderators from other social media platforms have contacted him to ask about their legal rights.

“I’d want to know how Facebook is taking care of their moderators,” he said.

Facebook declined to indicate whether the additional 3,000 jobs will be located in the United States or overseas in developing countries like the Philippines or India. The company also did not specify whether the new workers will be full-time employees or contractors, or what the typical workload is for a content moderator at the world’s largest social media network.

A spokesperson for the company told The Post that Facebook recognizes the position can be difficult, and that every employee who reviews content is offered psychological support and wellness resources. There is also a program designed specifically to support content reviewers, and its offerings are reviewed annually.

Lance Ulanoff, chief correspondent and editor-at-large for tech site Mashable, likened the position to working a 24-hour crisis hotline.

“It’s very intense work,” Ulanoff told The Post. “These are people who are looking specifically for language or images that might indicate self-harm, violence or anything that would indicate someone might harm others. These monitors are seeing potentially intense information on a constant basis. At the same time, that’s what they signed up to do.”

Ulanoff said Facebook’s response to the spate of violence and suicides broadcast on the site has been good so far, and that the company recognizes the significant role it plays in the broader conversation.

“They’re coming around to the idea that they have to become stewards of this content platform and maybe make people’s lives better and improve their product at the same time,” Ulanoff said. “Keeping track of these monitors and maybe refreshing the group every now and then is a good idea, but they had to do something and I think this is a very good step … They’re doing what they need to do for these monitors, at least for now.”