
Facebook turns to artificial intelligence to tackle suicides

  • Published at 10:26 pm March 1st, 2017

Facebook plans to use artificial intelligence and update its tools and services to help prevent suicides among its users. The world's largest social media network said it plans to integrate its existing suicide prevention tools for Facebook posts into its live-streaming feature, Facebook Live, and its Messenger service.

Artificial intelligence will be used to help spot users with suicidal tendencies, the company said in a blog post on Wednesday.

In January, a 14-year-old foster child in Florida reportedly broadcast her suicide on Facebook Live, according to the New York Post.

Facebook is already using artificial intelligence to monitor offensive material in live video streams.

[caption id="attachment_49820" align="aligncenter" width="800"]Users judged to be at risk will see a message from Facebook advising them how to seek help. Photo: Facebook[/caption]

The company said on Wednesday that the updated tools would give an option to users watching a live video to reach out to the person directly and report the video to Facebook.

Facebook Inc will also provide the user reporting the live video with resources, such as options to reach out to a friend or contact a helpline.

Suicide is the second-leading cause of death among 15-29-year-olds worldwide, according to the World Health Organization.

[caption id="attachment_49821" align="aligncenter" width="800"]Facebook Live users who discuss killing themselves will be given advice but will not have their stream interrupted. Photo: Facebook[/caption]

Pattern recognition

Facebook has offered advice to users thought to be at risk of suicide for years, but until now it had relied on other users to bring the matter to its attention by clicking on a post's report button.

It has now developed pattern-recognition algorithms, trained on examples of posts previously flagged by users, to recognise when someone may be struggling.

Talk of sadness and pain, for example, would be one signal. Responses from friends with phrases such as "Are you OK?" or "I'm worried about you," would be another.
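The signals described above can be illustrated with a toy sketch. This is not Facebook's actual system, which uses trained machine-learning classifiers; the phrase lists and function below are hypothetical, chosen only to show how text in a post and in friends' replies could both feed a flag-for-review decision.

```python
# Toy illustration (NOT Facebook's real classifier): flag a post for human
# review if the post itself, or replies from friends, contain phrases
# associated with distress or concern.

DISTRESS_PHRASES = ["sadness", "pain"]                       # signals in the post
CONCERN_PHRASES = ["are you ok", "i'm worried about you"]    # signals in replies


def flag_for_review(post_text, replies):
    """Return True if the post should be queued for human review."""
    text = post_text.lower()
    # Signal 1: talk of sadness or pain in the post itself.
    if any(phrase in text for phrase in DISTRESS_PHRASES):
        return True
    # Signal 2: concerned responses from friends.
    return any(
        phrase in reply.lower()
        for reply in replies
        for phrase in CONCERN_PHRASES
    )


print(flag_for_review("So much pain lately", []))        # True
print(flag_for_review("Great day!", ["Are you OK?"]))    # True
print(flag_for_review("Great day!", ["Nice!"]))          # False
```

A real system would use a statistical model scored over many such features rather than fixed keyword lists, but the input signals are the same ones the article describes.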

[caption id="attachment_49822" align="aligncenter" width="800"]Users watching a Facebook Live stream will be advised how to help a user they are concerned about. Photo: Facebook[/caption]

Once a post has been identified, it is sent for rapid review to the network's community operations team.

When someone watching the live stream clicks a menu option to declare they are concerned, Facebook displays advice to the viewer about ways they can support the broadcaster. The stream is also flagged for immediate review by Facebook's own team, who then overlay a message with their own suggestions if appropriate.

The new system is being rolled out worldwide. A new option to contact a choice of crisis-counselling helplines via Facebook's Messenger tool, however, is limited to the US for now.

Facebook said it needed to check whether other organisations would be able to cope with demand before it expanded the facility.