Is Facebook's Suicide-Prevention Tool Doing Any Good? (sfgate.com) 99
"Facebook knew there was a problem when a string of people used the platform to publicly broadcast their suicides in real time," reports Business Insider, raising questions about what the company has done since:
Facebook has a suicide-monitoring tool that uses machine learning to identify posts that may indicate someone is at risk of killing themselves. The tool was involved in sending emergency responders to locations more than 3,500 times as of last fall. A Harvard psychiatrist is worried the tool could worsen health problems by homing in on the wrong people or escalating mental-health crises... "We as the public are partaking in this grand experiment, but we don't know if it's useful or not," Harvard psychiatrist and tech consultant John Torous told Business Insider last week....
Without public information on the tool, Torous said big questions about Facebook's suicide-monitoring tool are impossible to answer... "It's one thing for an academic or a company to say this will or won't work. But you're not seeing any on-the-ground peer-reviewed evidence," Torous said. "It's concerning. It kind of has that Theranos feel...." Because of privacy issues, emergency responders can't tell Facebook what happened at the scene of a potential suicide, said Antigone Davis, Facebook's global head of safety. In other words, emergency responders can't tell Facebook if they reached the scene too late to stop a death, showed up to the wrong place, or arrived only to learn there was no real problem.
Torous, a psychiatrist who's familiar with the thorny issues in predicting suicide, is skeptical of how that will play out with regard to the suicide monitoring tool. He points to a review of 17 studies in which researchers analyzed 64 different suicide-prediction models and concluded that the models had almost no ability to successfully predict a suicide attempt. "We know Facebook built it and they're using it, but we don't really know if it's accurate, if it's flagging the right or wrong people, or if it's flagging things too early or too late," Torous said.
No. (Score:3)
Re: No. (Score:1)
Why doesn't Facebook just literally ask the people it flagged? It's not like Facebook is infested with morons, is it?
Re: (Score:2)
Re: (Score:2)
Why did Slashdot start deleting whole threads in the comments section? What's great is I have screenshots this time.
Of course not. (Score:2, Funny)
Facebook should kill itself.
Works great (Score:4, Insightful)
It works great when you understand what its purpose is. It's not to help people or improve mental health, it's to prevent Facebook getting so much bad press.
Now Facebook can say it's helping people and sending emergency services to them, saving lives! It doesn't matter if it actually works.
Re: Works great (Score:5, Insightful)
I'm reminded of a scene from M*A*S*H
"Do something...do anything!"
"I agree with Frank. I think we should do anything."
Or, of course, the classic from Yes Minister
"Something must be done. This is something. Therefore we must do it."
Re: Works great - MASH Theme Song (Score:4, Insightful)
"Suicide Is Painless"
Re: (Score:1)
Facebook can get feedback, at least some of it ... through Facebook. If the user logs on and posts stuff after the intervention (or responds to a message from FB asking if they're OK), then there's data to go on. Comparing that aggregated data against similar situations prior to the implementation of the program would provide an indication of whether it is helping.
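The comparison described above could look something like this back-of-the-envelope sketch (every figure here is invented for illustration; Facebook publishes no such numbers):

```python
# Hypothetical proxy metric: did flagged users post again within 30 days?
# All counts are made up for illustration only.
flagged_after_tool = {"active_within_30d": 3200, "total": 3500}
similar_cases_before_tool = {"active_within_30d": 850, "total": 1000}

rate_after = flagged_after_tool["active_within_30d"] / flagged_after_tool["total"]
rate_before = similar_cases_before_tool["active_within_30d"] / similar_cases_before_tool["total"]

print(f"post-intervention activity rate: {rate_after:.1%}")
print(f"comparable pre-tool rate:        {rate_before:.1%}")
```

It's only a proxy, of course: posting again doesn't prove the intervention helped, and constructing the "similar situations" baseline is the hard part.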
Re: (Score:2, Troll)
And that is exactly the problem with the SJW virtue signaling these days. As long as you do 'something', you're doing good in the intersectional left's eyes; it doesn't matter if it's actually harmful, you sent out the right 'signals' and that's what's most important.
In the meantime, suicide rises even though society is by and large ever more 'accepting' of anything. Perhaps it's not acceptance by your peers, but a good moral and social support fabric, that's necessary for people to get the feeling they 'fit in'.
Re: (Score:1, Flamebait)
You know, not everything is about SJWs. You seem kinda obsessed.
Re: (Score:2)
He's really quite a skilled troll, I think in part because he's of the sort that can switch on and off the earnestness as needed.
Fuck this second guessing ... (Score:3, Insightful)
It gets moderated by a human twice: the first time at Facebook, the second time by the emergency responders. This system almost certainly only sends responders when someone outright says they're going to kill themselves.
Facebook is acting responsibly, comparing this to Theranos is irresponsible.
Re: (Score:1)
The left doesn't care about jews or asians, they are too successful to be considered part of the intersectional left's agenda. Perhaps you haven't noticed the very outspoken anti-semites running the Democratic party right now.
Re: (Score:2, Interesting)
Facebook has no person-to-person interaction and no human ability to distinguish a truly suicidal person from a clinically depressed person, from a situationally depressed person, from a joker, etc. Even trained professionals have trouble telling the difference. Better to let actual human beings make the decision to contact the authorities than Facebook.
I've seen plenty of dystopian movies where the Com
Re: (Score:2)
Facebook staff aren't human?
Re:Fuck this second guessing ... (Score:4, Informative)
So far I have seen no conclusive proof.
Re: (Score:3)
The real question (Score:2, Interesting)
Why are so many Facebook users prone to suicide?
Re: The real question (Score:2, Interesting)
Who says they are prone to suicide? It's got billions of users; finding 3,500 potential suicides is way below the norm, so what are you basing that on?
Whose fault is it anyway? (Score:4, Interesting)
Why is there a problem for FB? Does FB induce suicide? Why does FB have to do anything just because people use it as intended -- to update others about their current status?
Re: (Score:1)
It's for the same reason we hold DuPont responsible for reporting and preventing suicides that another human may witness while looking through a window made with DuPont glass.
Re: (Score:2)
* not to be confused with "pain of ass", which FB often definitely is.
Re: (Score:2)
One is a crime against nature, one is entertainment...
Everytime I consider going back.. (Score:2)
..I want to kill myself. Does that count?
Re: (Score:1)
Who the fuck uses cpanel? lol..
Re: (Score:2)
Does said loser also "Go broke"? Cause thats what I've been hearing...
The laziest of journalism (Score:2)
So ... Facebook actually created and deployed a tool that sent responders to a location 3,500 times. The journalist who wrote this article is questioning its usefulness, but you would think there would be some way to reach people whose suicides were prevented by these deployments -- or some way to reach the responders whose time was wasted by them.
It appears that the article hasn't even bothered to gather anecdotal evidence on the topic ...
Indeed. And Facebook DOES know some outcomes (Score:3, Insightful)
Indeed, there's no evidence that it isn't useful. It seems rather likely the tool has *some* usefulness; even if it had a 90% false-positive rate, it would still be good to have.
The author apparently didn't think through the fact that Facebook DOES get some feedback about outcomes. Don't know if it was too late? Maybe the person is already dead? These are people who post on Facebook. Facebook knows if they ever posted again. Heck, Facebook knows what their friends and family posted in the following weeks, and probab
Re: Indeed. And Facebook DOES know some outcomes (Score:3, Interesting)
If you get a false positive on the wrong person, you could, instead of preventing a suicide, cause a veteran-style PTSD murder situation.
You're also using resources that might have gone to a person having a heart attack.
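The trade-off these two comments are arguing about can be sketched with hypothetical numbers (none of these figures come from Facebook; the 90% false-positive rate is just the figure posited above):

```python
# Hypothetical numbers only -- Facebook publishes no accuracy figures for this tool.
interventions = 3500         # responder dispatches reported as of last fall
false_positive_share = 0.90  # the 90% false-positive rate posited above

false_alarms = interventions * false_positive_share
genuine_crises = interventions - false_alarms

print(f"genuine crises reached:     {genuine_crises:.0f}")  # lives possibly saved
print(f"wasted responder call-outs: {false_alarms:.0f}")    # resources diverted
```

Whether roughly 350 genuine crises justify roughly 3,150 wasted dispatches is exactly the question that can't be answered without the outcome data the article says responders can't share.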
Re: (Score:2)
And let me not-so-guess (Score:5, Interesting)
"Torous, a psychiatrist who's familiar with the thorny issues in predicting suicide, is skeptical of how that will play out with regard to the suicide monitoring tool. He points to a review of 17 studies in which researchers analyzed 64 different suicide-prediction models and concluded that the models had almost no ability to successfully predict a suicide attempt. "We know Facebook built it and they're using it, but we don't really know if it's accurate, if it's flagging the right or wrong people, or if it's flagging things too early or too late," Torous said."
Facebook will keep a database of all the Facebook users they consider suicide threats and sell it to big pharma and local psychologists and insurance companies to make a buck. It will have to be given to Homeland Security by law. Parents will be notified. Schools will be given names to "help" the student before he shoots up the school. And, finally, the family dog doesn't play with the kid anymore. All in his best interest. And that doesn't even consider what it would do to adults. After all, we KNOW it will be hacked and become generally available. That's just the way things work today.
Re: (Score:2)
There are already consequences if you say in public that you are going to kill yourself. That's enough to get you an involuntary stay in a psych unit. And your parents and school will hopefully be involved if you are a minor.
Re: (Score:3)
Re: (Score:2)
Do you think saying something like: "I'm going to kill myself!" is all there is to the analysis?
***ALERT** **ALERT*** The automatic comment scanning engine has detected posting keywords indicating a possible self-harm risk by Sqreater.
**AUTOMATICALLY DISPATCHING AUTHORITIES NOW**
Erm... pretty much, probably.
Same deal if someone happens to read out loud the message containing that quoted phrase.
Maybe it is working. Maybe I am too cynical. (Score:5, Interesting)
Re: Maybe it is working. Maybe I am too cynical. (Score:1)
Insurance companies have intelligent contract writers and don't turn down appropriate revenue.
People complain about insurance companies all the time. Complain about prices, but I guarantee insurance companies are way ahead of you in any calculation of risk or value. It's why the least profitable business model is insurance fraud.
Facebook doesn't do anything good (Score:1)
Next.
Out of Charity, just let them die (Score:2, Interesting)
Obviously people who want to commit suicide are very unhappy and in pain. It's sadistic to prolong someone's suffering just to promote one's parochial "morality".
Let the suffering individuals find peace in the sleep of death.
Re: (Score:2)
I've read somewhere that most people saved from their first suicide attempt do not try again; many of them decide they do in fact still want to live.
Comment removed (Score:4, Insightful)
Re: (Score:1)
Parent is asking to beta-test the project for a while, not to cancel it.
A Harvard psychiatrist is worried the tool could worsen health problems by homing in on the wrong people or escalating mental-health crises. Seems like a good enough reason for the review board to approve the parent's proposed study.
Re: (Score:2)
I suppose you could turn it around: for one half of the populace, call first responders for those FB thinks are potential suicides. For the second half, call first responders for *everyone*. Then see how rates compare. We would need a lot more first responders to do that for the population at large, but if we selected, say 200 r
Depends how you define "good" (Score:3)
...is keeping Facebook-addicts from offing themselves really helping us as a species?
What's so appealing about Facebook? (Score:2)
Re: (Score:2)
Re: (Score:3)
On the other side... (Score:3)
False positives may lead to visits by police force (Score:3)