Facebook trains artificial intelligence to spot suicidal signs

November 27, 2017
In this April 18, 2017, file photo, conference workers speak in front of a demo booth at Facebook's annual F8 developer conference in San Jose, Calif. Facebook is expanding its use of artificial intelligence to help prevent suicides. The social media giant says it's doing this by scanning people's posts and live videos to detect if someone might be thinking about harming themselves, before the posts are even reported. (AP Photo/Noah Berger, File)

Facebook said Monday it is stepping up its use of artificial intelligence to identify members of the leading social network who may be contemplating suicide.

Software will look for clues in posts or even in videos being streamed on Facebook Live, then fire off reports to human reviewers and speed up alerts to responders trained to help, according to the social network.

"This approach uses pattern recognition technology to help identify posts and live streams as likely to be expressing thoughts of suicide," Facebook vice president of product management Guy Rosen said in a blog post.

Signs the software watches for include the text of people's posts and the comments on them, such as a friend asking whether someone is all right.
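To illustrate the general idea of this kind of pattern recognition, here is a minimal toy sketch (not Facebook's actual system, whose models and signals are not public): flag a post for human review when its comments contain phrases that often signal concern. The phrase list and function name are illustrative assumptions.

```python
# Toy illustration only: a real system would use trained machine-learning
# models, not a fixed phrase list.
CONCERN_PHRASES = ["are you ok", "can i help", "i'm worried about you"]

def flag_for_review(comments):
    """Return True if any comment contains a known concern phrase."""
    for comment in comments:
        text = comment.lower()
        if any(phrase in text for phrase in CONCERN_PHRASES):
            return True
    return False
```

A flagged post would then go to a human reviewer rather than trigger any automatic action, mirroring the review step described above.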

Facebook already has tools in place for people to report concerns about friends who may be considering self-harm, but the software can speed the process and even detect signs people might overlook.

"There have been terribly tragic events—like suicides, some live-streamed—that perhaps could have been prevented if someone had realized what was happening and reported them sooner," Facebook chief executive Mark Zuckerberg said early this year in a post at the social network focused on building global community.

"Artificial intelligence can help provide a better approach."

Facebook is rolling out the artificial intelligence tool outside the US and plans eventually to make it available everywhere except the European Union, where data usage is restricted by privacy regulations.

Facebook has been collaborating with mental health organizations for about a decade on ways to spot signs users may be suicidal and get them help.

