The social media giant is working to go a step further and help prevent some of these acts by identifying material that may indicate a user is having suicidal thoughts, a prospect that has led some to question the implications of “mind reading” technology.
“A lot of what we’re trying to do is not just about taking the content down but also about helping people while they are using the platform,” Facebook CEO Mark Zuckerberg said last week.
Facebook has 60 highly skilled engineers in its secretive research center, Building 8, building software to solve the “problem” that we cannot currently type directly from our minds. Facebook calls it a “brain-computer speech-to-text interface,” but critics wonder whether it is just another step toward a dystopian scene from Aldous Huxley’s “Brave New World.”
The company says “we are not talking about decoding your random thoughts,” according to Facebook executive Regina Dugan, as that may “be more than any of us care to know.” Still, the prospect of the technology falling into the wrong hands is virtually impossible to ignore.
For now, Facebook is teaming up with law enforcement to use features already available on the platform to bring people back from the edge of crisis. In April, Facebook was notified of a live video in which a user appeared to be at risk of suicide; rather than taking the video offline, first responders used that “live video to communicate with that person and help save their life,” Zuckerberg said.
In one case, 911 was called about a teenager in the act of self-harm. Because of the user’s privacy settings, only friends could view the stream, but a police officer had a family member who was friends with the girl. “It was very serious and we needed to get to it right away,” Sgt. Linda Howard of the local sheriff’s office said. Police found the teenager unresponsive in a bathroom, “but one of our sergeants was able to find a pulse,” Howard said.
Facebook wants to detect at-risk users in cases “where it’s likely someone is expressing thoughts of suicide,” NBC Bay Area reported, and to provide resources to those users without waiting for troubling behavior to be reported. It’s “new territory,” said psychologist Dan Reidenberg, who is partnering with Facebook, “but really important.”
Last week, Zuckerberg also announced that the Menlo Park tech giant would hire 3,000 people to monitor and analyze videos featuring crime and self-harm.
“We take our responsibility to keep people safe on Facebook very seriously and will remove videos that depict sexual assault and are shared to glorify violence,” a Facebook spokesperson told Sputnik in March, following the Facebook Live broadcast of a sexual assault viewed by more than 40 people.