
Russell Webster

Criminal Justice & substance misuse expert and author of this blog.

Facebook artificial intelligence spots suicidal users

New measures from Facebook seek to help suicidal users by using artificial intelligence to identify those at risk and then offer them resources and support.

Facebook gets proactive to help suicidal users

Facebook has begun using artificial intelligence to identify members that may be at risk of killing themselves.

The social network has developed algorithms that spot warning signs in users’ posts and the comments their friends leave in response. After confirmation by Facebook’s human review team, the company contacts those thought to be at risk of self-harm to suggest ways they can seek help.

The tool is being tested only in the US at present. It comes in the wake of a 14-year-old girl streaming her suicide on Facebook Live earlier this year, although Facebook was apparently already working on the issue before that incident.

It marks the first use of AI technology to review messages on the network since founder Mark Zuckerberg announced last month that he also hoped to use algorithms to identify posts by terrorists, among other concerning content.

Facebook also announced new ways to tackle suicidal behaviour on its Facebook Live broadcast tool and has partnered with several US mental health organisations to let vulnerable users contact them via its Messenger platform.

Pattern recognition

Facebook has offered advice to users thought to be at risk of suicide for years, but until now it had relied on other users to bring the matter to its attention by clicking on a post’s report button.

It has now developed pattern-recognition algorithms that detect when someone may be struggling, trained on examples of posts that have previously been flagged.

Talk of sadness and pain, for example, would be one signal. Responses from friends with phrases such as “Are you OK?” or “I’m worried about you,” would be another.

Once a post has been identified, it is sent for rapid review to the network’s community operations team.

Key issues

Getting the right response as quickly as possible is critical. There is an ongoing discussion between Facebook and suicide prevention organisations on how best to do this.

One suggestion is that Facebook automatically contacts the family and friends of a user it is concerned about. Facebook has acknowledged that contact from friends or family is typically more effective than a message from the company itself, but said this approach raises complex privacy issues.

Facebook has already introduced new measures to its live streaming platform amid concerns that troubled young people may engage in copycat behaviour. The organisation is now trying to help at-risk users while they are broadcasting, rather than waiting until their completed video has been reviewed some time later.

Now, when someone watching the stream clicks a menu option to declare they are concerned, Facebook displays advice to the viewer about ways they can support the person they are concerned about.

The stream is also flagged for immediate review by Facebook’s own team, who then overlay a message with their own suggestions if appropriate.

[Image, copyright Facebook: users watching a Facebook Live stream will be advised how to help a person they are concerned about.]

Facebook considered shutting down such streams immediately, but was advised by experts that this would remove the opportunity for people to reach out and offer support.

Although the new system is being rolled out worldwide, a new option to contact a choice of crisis counsellor helplines via Facebook's Messenger tool is limited to the US for now.

Facebook said it needed to check whether other organisations would be able to cope with demand before it expanded the facility.

 

[Much of the information in this post was taken from an article by Leo Kelion for BBC News.]

 

All innovation posts are kindly sponsored by Socrates 360 which provides a complete solution for staff, prisoners, probationers, etc. combining engaging content, simple set-up and an easy tracking system. Socrates 360 has no influence over editorial content.

