In recent months, several Facebook Live users have used the service to livestream their own deaths.
Following criticism of the platform, Facebook has been working with suicide prevention organizations to develop more effective intervention tools.
It has developed a computer program that scans a user's posts, and the comments left by friends, for signs that the person may be at risk of self-harm.
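Facebook has not published how its detection program works; as a rough illustration only, a minimal pattern-based flagger over posts and comments might look like the sketch below. The phrase list and function names here are hypothetical, and a production system would rely on a trained classifier rather than fixed patterns.

```python
import re

# Hypothetical risk phrases for illustration; the real system's signals
# are not public and would be learned from data, not hand-listed.
RISK_PATTERNS = [
    r"\bwant(?: to)? die\b",
    r"\bkill myself\b",
    r"\bno reason to live\b",
]

def flag_for_review(post_text: str, comments: list[str]) -> bool:
    """Return True if the post or any friend's comment matches a risk
    pattern, so the case can be escalated to a human reviewer."""
    texts = [post_text, *comments]
    return any(
        re.search(pattern, text, re.IGNORECASE)
        for text in texts
        for pattern in RISK_PATTERNS
    )

if __name__ == "__main__":
    # Flags on the post itself:
    print(flag_for_review("there's no reason to live anymore", []))        # True
    # Flags on a concerned friend's comment:
    print(flag_for_review("goodbye everyone", ["please don't kill myself talk, call me"]))  # True
```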
It has also added a reporting tool to its livestream service, so that content contravening the social network's "community standards" can be removed more quickly.
"Ideally this new artificial intelligence will be able to pick up on that very quickly and then escalate the case so that resources can be given both to the person streaming the video as well as to the people that are commenting," Dr. Dan Reidenberg, executive director of Suicide Awareness Voices of Education, one of the organizations advising Facebook, told Radio Sputnik.
"If you're that person in distress, a message will come up and it will give you access to immediate resources in your local area, whether it's a messaging service or a crisis service in your area, it'll give you tips on dealing with stress so it'll connect you with a resource for support right away," Reidenberg explained.
"When we give people information on risk factors and warning signs for suicide, that helps them better understand what to watch for and recognize in terms of people that they're interacting with."
"Technology … is able to provide direct links to resources and local communities, whether that's a crisis center, a hotline to call, a texting service. Most people don't know all of those things exist. We now can identify somebody very rapidly that might be at risk of suicide, whatever level of acuity that is, and by giving them resources and direct help we can actually lower their risk of dying by suicide," Reidenberg said.