JammedUp News

Facebook announces new job that no one really wants

May 5, 2017  |  Posted by: Francesca Falzarano

A new job opportunity lets Facebook employees spend the workday at home in pajamas, browsing the internet and earning up to six figures, as long as they don’t mind watching live streams of suicide, rape, and murder.

On Wednesday, Facebook announced that it plans to hire 3,000 “content moderators” across the globe to keep the worst of the internet off its social network. The new hires will join the 4,500 staffers who already work to find hate speech, porn, child exploitation, and other violent and disturbing content on the site.

The decision comes after a series of horrific episodes were live streamed on Facebook, including a father in Thailand who hanged his 11-month-old infant before committing suicide. The footage remained on the site for approximately 24 hours before it was taken down.

“This is important,” Mark Zuckerberg, the company’s CEO, said in a post declaring the move. “Last week, we got a report that someone on Live was considering suicide. We reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t fortunate.”

However, in trying to keep the internet’s underbelly under control, Facebook and other social media platforms are subjecting a growing group of employees to trauma and emotional distress. Critics say the position can be debilitating, leaving moderators with “permanent videos in their heads” and driving them to avoid everyday objects that have become potential triggers, such as knives and computers.

“They’re exposed to the worst you can imagine,” said Ben Wells, an attorney representing two former Microsoft moderators who developed post-traumatic stress disorder. “If you can imagine the worst person in the world and what they would want to watch, that’s what they’re exposed to… And some of this stuff you can’t un-see, and there are things you can’t get rid of… It can be debilitating.”

Henry Soto and Greg Blauert, two former Microsoft safety program employees, argued that the company neglected to warn them of the risks of the job and did not provide psychological support. The men filed lawsuits in December and are seeking damages, claiming negligence, disability discrimination, and violations of the Consumer Protection Act.

Microsoft disputed the plaintiffs’ claims, arguing that it applies “industry-leading, cutting-edge technology” to identify such content and bans the users who share it. The company also stated that it has “wellness programs” to ensure that employees who view the content are supported.

A hearing has been set for June to determine whether the case will be dismissed, following a motion filed by Microsoft.

As part of his position, Soto observed “thousands of photographs and video of the most horrible, inhumane and disgusting content,” the lawsuit stated.

“Many people cannot imagine what Soto had to view on a daily basis as most people don’t understand how horrible and inhuman the worst people can be,” the lawsuit added. “[His] supervisors knew that the work was hard on him and [he] was commended in his review for his ‘courage.’”

Soto began experiencing auditory hallucinations after seeing a video of a girl who was abused and killed, the suit indicated, which ultimately led him to go on medical leave in 2015.

“Soto was embarrassed by his symptoms, which included panic attacks, disassociation, depression, hallucinations, and an inability to be around computers or children, including, at times, his son, because it would trigger memories of violent acts against children that he had witnessed,” the lawsuit stated.

Wells added that he “wouldn’t assume” Facebook will face similar lawsuits going forward, but said he had been contacted by moderators from other social media sites seeking to discuss their legal rights.

Facebook declined to say whether the additional 3,000 jobs will be located in the U.S. or in developing countries such as the Philippines or India. The company also did not indicate whether the new hires will be full-time employees or contractors.

A spokesperson told The New York Post that Facebook understands the position can be challenging. To that end, every staffer who reviews content is offered psychological support and wellness resources, including a program designed specifically for content reviewers, and the offerings are reviewed annually.

Lance Ulanoff, chief correspondent and editor-at-large for the tech site Mashable, compared the job to staffing a 24-hour crisis hotline.

“It’s intense work,” Ulanoff told The New York Post. “These are people who are looking for language or images that might indicate self-harm, violence or anything that indicates that someone might harm others. These monitors see intense information on a constant basis. At the same time, that’s what they signed up for.”

Ulanoff added that Facebook’s response to the surge of violence and suicides broadcast on the social media site has been appropriate thus far, and said the company recognizes the role it plays in a much larger conversation.

“They’re coming around to the idea that they have to become stewards of the platform and make lives better and improve their product at the same time,” Ulanoff stated. “Keeping track of these monitors and refreshing the group now and then is a good idea, but they had to do something, and I think this is a good step … They’re doing what they need to do for these monitors, for now.”
