AI | Facebook | Communications | Medicine | Social Networks | Software | The Internet

Facebook Rolls Out AI To Detect Suicidal Posts Before They're Reported (techcrunch.com) 171

Facebook is rolling out "proactive detection" artificial intelligence technology that will scan all posts on the site for patterns of suicidal thoughts and, when necessary, send mental health resources to the at-risk user or their friends, or contact local first responders. The goal is to use AI to decrease how long it takes to get help to those in need. TechCrunch reports: Facebook previously tested using AI to detect troubling posts and more prominently surface suicide reporting options to friends in the U.S. Now Facebook will scour all types of content around the world with this AI, except in the European Union, where General Data Protection Regulation privacy laws on profiling users based on sensitive information complicate the use of this tech. Facebook will also use AI to prioritize particularly risky or urgent user reports so they're addressed more quickly by moderators, and will deploy tools to instantly surface local-language resources and first-responder contact info. It's also dedicating more moderators to suicide prevention, training them to deal with these cases 24/7, and now has 80 local partners such as Save.org, the National Suicide Prevention Lifeline and Forefront from which to provide resources to at-risk users and their networks.
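To make the "flag and prioritize" flow concrete, here is a minimal, purely hypothetical sketch in Python: a hard-coded phrase list stands in for whatever classifier Facebook actually trains, and a priority queue stands in for its moderator review tooling. Every name, weight and threshold below is invented for illustration and says nothing about Facebook's real system.

    # Hypothetical sketch of a "flag and prioritize" pipeline.
    # Nothing here reflects Facebook's actual models or infrastructure.
    from dataclasses import dataclass, field
    import heapq

    # Toy phrase list standing in for a trained text classifier.
    RISK_PHRASES = {
        "kill myself": 0.9,
        "no reason to live": 0.8,
        "want to die": 0.8,
        "end it all": 0.7,
    }

    def risk_score(post_text: str) -> float:
        """Crude 0..1 risk score: highest-weighted phrase found in the post."""
        text = post_text.lower()
        score = 0.0
        for phrase, weight in RISK_PHRASES.items():
            if phrase in text:
                score = max(score, weight)
        return score

    @dataclass(order=True)
    class Report:
        priority: float                      # negated score, so heapq pops the riskiest first
        post_text: str = field(compare=False)

    class ReviewQueue:
        """Toy moderation queue that surfaces the highest-risk posts first."""
        def __init__(self, threshold: float = 0.5):
            self.threshold = threshold
            self._heap = []

        def ingest(self, post_text: str) -> None:
            score = risk_score(post_text)
            if score >= self.threshold:      # only flagged posts enter the queue
                heapq.heappush(self._heap, Report(-score, post_text))

        def next_for_review(self):
            return heapq.heappop(self._heap).post_text if self._heap else None

    if __name__ == "__main__":
        q = ReviewQueue()
        q.ingest("Great game last night!")                         # not flagged
        q.ingest("I feel like there's no reason to live anymore.")  # flagged, 0.8
        q.ingest("Honestly I just want to end it all.")             # flagged, 0.7
        print(q.next_for_review())   # the 0.8 post is surfaced first

A real deployment would obviously use a trained model and keep humans in the loop at every step; the sketch only shows the flag-then-rank ordering described in the summary.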
This discussion has been archived. No new comments can be posted.


  • Troll bait (Score:5, Insightful)

    by countach ( 534280 ) on Monday November 27, 2017 @08:33PM (#55634123)

    How long before this is trolled into oblivion?
    How long before people sue Facebook for false positives and violating their privacy?

    • by b0s0z0ku ( 752509 ) on Monday November 27, 2017 @08:49PM (#55634197)
      Hopefully, the lawsuits come soon and will be painful -- FB should be a communication tool and not snoop on the content of non-public (i.e. privacy set to anything but "public") communications.
      • by AmiMoJo ( 196126 )

        Do you really think Facebook doesn't already scan private messages to build up your advertising profile and to look for banned content like child pornography?

        I doubt lawsuits will get very far - an entirely computer generated notice along the lines of "if you feel suicidal, you can call XXX-XXX-XXX to talk to someone" is going to fail tests for invasion of privacy and not exactly play well with a jury that can see Facebook is trying to do the right thing.

        If you want that level of privacy there are plenty of end-to-end encrypted channels.

        • I doubt lawsuits will get very far - an entirely computer generated notice along the lines of "if you feel suicidal, you can call XXX-XXX-XXX to talk to someone" is going to fail tests for invasion of privacy

          Sigh. Even TFS outright says "or contact local first-responders." Calling the cops on you because you might be suicidal is never the right thing to do, especially since there is a significant chance [washingtonpost.com] that the cops will just show up [miamiherald.com] and murder you [nytimes.com], even if people are standing by begging them not to [npr.org].

          and not exactly play well with a jury that can see Facebook is trying to do the right thing.

          You mean "believe that" Facebook is trying to do the right thing, because they are not.

        • If you want that level of privacy there are plenty of end-to-end encrypted channels.

          If you want ANY level of privacy, then you shouldn't be using facebook in the first place.
          At least suicide prevention is a mostly non-evil use for their massive profiling.
          And if you are using the end-to-end encrypted channel to connect to facebook then it's pretty much like sanitizing a straw to drink out of a toilet.

      • Hopefully, the lawsuits come soon and will be painful -- FB should be a communication tool and not snoop on the content of non-public (i.e. privacy set to anything but "public") communications.

        Let's also remember this action by FB stems from the Won't-Someone-Think-Of-The-Children crowd, who are also likely to sue...for not thinking of the children.

        It's sad when bad parenting is so easily excused by throwing blame and litigation at everything but a mirror.

      • by gnick ( 1211984 )

        FB should be a communication tool and not snoop on the content of non-public (i.e. privacy set to anything but "public") communications.

        That made me chuckle. Anything you post to FB, you're volunteering to them. It will be mined and used 'against' you. Your job is to sit back and get injected, inspected, detected, infected, neglected and selected. Just keep feeding them info while they perfect their 'b0s0z0ku' model. It's not really snooping if they tell you up front that you're the product.

    • by Anonymous Coward on Monday November 27, 2017 @10:05PM (#55634511)

      If Facebook keeps doing this I'll kill myself

    • by Anonymous Coward

      how long before they start alerting 'the authorities' about illegal drug use, underage drinking, academic fraud, and other things, illegal or not... just because facebook doesn't like it (or if someone pays them enough to do it)....

    • Yeah, but really... wouldn't you love to see police arrive at the door of every vaguebook post?

      If anyone cares...

    • How long before this is trolled into oblivion? How long before people sue Facebook for false positives and violating their privacy?

      Well, I won't post on Facebook what I've said here: that when my time draws close, I'm going to choose how and when I shuffle off this mortal coil. I'm anything but suicidal, but I have no intention of spending a decade or more in a nursing home, wearing Depends and not having much idea of who I am.

      This really transcends creepiness and turns into intrusion.

      • by Anonymous Coward

        When that time comes, you may not be coherent enough to realize it.
        Anyway, this is the lead-up to thoughtcrime.

          When that time comes, you may not be coherent enough to realize it.

          My biggest fear is being incapacitated via a major stroke, boxed in and heroic efforts taken to keep me alive. I do have a very detailed advance directive in place, so that I will at least go fairly quickly in that event.

    • Not too long ago I read an article about a doctor sending the police to someone's home because a Facebook post by one of his patients worried him... And also something about one of those smart speakers you can talk to (OMG THE INNOVATION, but it looks cool and has a nifty name) calling the cops because a bit of a quarrel was going on in the house. I can imagine me and some of my exes setting one of those off (LMAO). So now Facebook gets to send the white jackets to your house? I knew I had to quit al
  • How about no? (Score:3, Insightful)

    by Frosty Piss ( 770223 ) * on Monday November 27, 2017 @08:34PM (#55634129)

    There are going to be some serious privacy issues and a lot of false positives. If all goes according to plan, expect the cops to send a SWAT team to bust down someone's door and "accidentally" pump two dozen bullets into them...

    • expect the cops to send a SWAT team to bust down someone's door and "accidentally" pump two dozen bullets into them

      There have to be better ways to stop someone from killing himself.

      • Re: (Score:3, Informative)

        by b0s0z0ku ( 752509 )

        American cops and injustice system be like...

        "We'll kill/incarcerate the fuck out of you to save you."

        Remember, we're the country that locks up the most people for what they choose to take into their own bodies. We also used to lock up or kill people for their choice of partners -- wrong color was almost a capital offense in many parts of the country.

        Don't underestimate the stupidity and brutality of the American criminal injustice system.

        • by arth1 ( 260657 )

          Some states had laws making suicide a felony, some of them until the 1990s. If you survived, you could theoretically go to jail for the attempted murder of yourself.

          I guess that by that logic, the police would be justified in shooting your murderer. You'd die anyhow, but the crime would be prevented.

    • There are going to be some serious privacy issues and a lot of false positives. If all goes according to plan, expect the cops to send a SWAT team to bust down someone's door and "accidentally" pump two dozen bullets into them...

      Let's remember what S.W.A.T. stands for, in order to understand what justifies their presence.

      An individual being flagged for suicidal thoughts does not qualify as a SWAT-level threat, nor does a SWAT team have staff trained to handle one properly.

    • by AmiMoJo ( 196126 )

      Can anyone confirm that in the US the response to a potentially suicidal person is a SWAT team?!

      In most places it's considered a medical emergency. If any law enforcement is involved it will only be to help medical personnel gain access to the patient. In this case the first response would probably be calling the phone number that the user supplied to Facebook.

      • Can anyone confirm that in the US the response to a potentially suicidal person is a SWAT team?!

        No. That was what they call a "joke". In moderately bad taste....

      • Can anyone confirm that in the US the response to a potentially suicidal person is a SWAT team?!

        It doesn't take SWAT to show up and murder an innocent. The "normal" cops manage to do that all the goddamned time. There's something like that in the news about every month, and those are just the cases that happen to someone sufficiently mediagenic to be worth reporting on.

      • by Anonymous Coward

        In England, it is the responsibility of the police to deal with reports of suicidal people. They will talk to the person reported or self-reporting as suicidal, take them home if not already at home, and they will try to arrange for someone they know to keep them company without involving health services, collecting them if needed.

        It is only if the person is refusing to comply or will be left alone or has already caused themselves physical damage that they'll consider calling in a medical professional (of w

    • by Kiuas ( 1084567 )

      There are going to be some serious privacy issues and a lot of false positives. If all goes according to plan, expect the cops to send a SWAT team to bust down someone's door and "accidentally" pump two dozen bullets into them

      If the cops respond to a suspected suicidal person with a SWAT-team, then the AI reporting to them about a suicidal individual is the least of your problems.

  • by Anonymous Coward on Monday November 27, 2017 @08:36PM (#55634141)
    They're just attempting to get ahead of things because they know damn well at some point someone is going to do some research and show how many suicides Facebook actually CAUSES. Legally, they can say they are taking all the reasonable measures possible to prevent it.
    • Facebook already admitted to conducting experiments on manipulating user emotions. [nytimes.com] Developing AI to accurately detect if someone is depressed would make sense as they would need to be able to determine how effective their methods are.
  • by Anonymous Coward

    I wonder if they have ever considered that some people may be suicidal precisely because of this impersonal, constantly surveilled machine the modern internet has thrust upon everyone which gives the illusion of caring ("getting suicidal people the help they need, faster than ever before!") without actually giving a shit.

    Maybe what we need isn't even more ways to present the illusion of being connected without really being so. Maybe what we need is actual deeper, meaningful human interaction.

    Frankly, the w

    • Yep, mod this person up. The constant pressure of surveillance, over-monetization, deforestation, global warming, geopolitical unrest, high population, pollution, animal exploitation... Looking at the world is like looking through a crystal: it changes depending on what angle you view it from. View it from the wrong angle and it seems hopelessly pointless. Frankly it's easier to cope by plugging one's ears and going "la-la-la" to most of it and just trying to get by with a few laughs and close friends/family
  • Can I sue Facebook for bunchteen million dollars if they report a false positive?

    • Even funnier, if Bitcoin's value keeps going up, sue them for 1000 Bitcoins. By the time the trial is over, it's going to cost them so much money they'll have to close shop.

  • Say someone's FB account gets hacked and a suicidal message is posted. Cops arrive at their door to take them away to the loony-bin for a psych hold. Fun times! Also, will there be a human in the loop for posts like "I'll jump under a freakin' train if the Patriots lose the Superbowl"?
    • How super can this bowl really be if people keep losing it?

      Just buy another one!

      • How super can this bowl really be if people keep losing it?
        Just buy another one!

        Or smoke one ... I hear w33d helps with suicidal ideations. But don't mention it on Facebook since their AI bot might report you to the local po-po.

    • Also, will there be a human in the loop for posts like "I'll jump under a freakin' train if the Patriots lose the Superbowl"?
      Flag as Inappropriate

      And what if they don't send anyone and you DO throw yourself under a train? I mean, the Patriots are America's Team, it would be quite understandable if you became despondent if they suck in the Super Bowl.

    • by raind ( 174356 )
      Or "I'll be dead or in jail if Obama wins"
      - Ted Nugent
  • I don't have a FB account and suspect this will make me suicidal. Should I be worried? Should I not lie in the kitchen with my shotgun pointed at the door, waiting for the FB police to serve a no-knock warrant cuz I'm "not normal"? Curious minds want to know: how long do I have to lie on my kitchen floor? When can I feed the cat? When can I safely watch TV? I do aerobics 3x a week; should I ignore the concealed carry laws in California (hint, you ain't gonna get one here) and carry to my aerobics class?

    I turn
  • by alvinrod ( 889928 ) on Monday November 27, 2017 @09:00PM (#55634249)
    I was under the impression that people who are actually suicidal don't often post about it on Facebook. If you really want to kill yourself, bringing more attention to yourself isn't a good way to accomplish this. Don't get me wrong, the petty narcissists that try to get attention by acting suicidal clearly need help as well, but I don't think this will do much to deter those who are actually suicidal.

    If Facebook really cared about the mental health and wellbeing of their users, they'd kick people off after more than fifteen minutes of daily use or just outright pull the plug on the whole works.
    • by mark-t ( 151149 ) <markt AT nerdflat DOT com> on Monday November 27, 2017 @09:11PM (#55634311) Journal
      True, but most people who are suicidal do not genuinely want to die so much as they want their living circumstances to be different from whatever they are.
      • by arth1 ( 260657 )

        True, but most people who are suicidal do not genuinely want to die so much as they want their living circumstances to be different from whatever they are.

        How do you know? I think quite a few people would like a better life than what they have but still would prefer no life, and quite a few others are rationally certain that different living circumstances would in all likelihood be worse than what they have. Especially if they also have to pay huge psychiatric bills, which is the likely outcome of seeking help.

        There's also quite a bit of begging the question in modern pseudo-psychology, where people are now conditioned to think that those who try to kill themselves suffer from a mental illness, and they know this because they try to kill themselves.

        • by TheSync ( 5291 )

          There's also quite a bit of begging the question in modern pseudo-psychology, where people are now conditioned to think that those who try to kill themselves suffer from a mental illness, and they know this because they try to kill themselves.

          The fact is that nearly 50% of suicides are due to clinical depression. Those suffering from depression are at 25 times greater risk for suicide than the general population.

             

          • by arth1 ( 260657 )

            The fact is that nearly 50% of suicides are due to clinical depression.

            The fact is that what the fact is is seldom clear.
            The nearly 50% includes those who have been diagnosed with clinical depression because of their suicidal actions, i.e. a classic example of affirming the consequent. That is a real problem.
            Also, "due to" is not necessarily correct. There can be cases where someone suffers from clinical depression but chooses suicide for other reasons. There may be a common cause for both the clinical depression and the suicide, or the two may be unrelated. After all, "ne

        • How do you know? I think quite a few people would like a better life than what they have but still would prefer no life

          I felt that way after my wife died. My religion kept me from being actively suicidal (fear of hell and all of that). That said, I did not communicate this on facebook or any social medium. I had no desire for the publicity, felt shame, and did not want others feeling responsible if something did happen to me. I also didn't want to possibly die in an accident and have everyone assume I had killed myself. Not the legacy I wanted for my kids.

          Many years have now passed and there are many times where I feel numb to life, but I am glad I am alive for my kids' sake and even for my own.

          • by arth1 ( 260657 )

            Many years have now passed and there are many times where I feel numb to life, but I am glad I am alive for my kids' sake and even for my own.

            But you did not make the choice to kill yourself, which puts you in a different category from those who did.

            Suicide should have a stigma. It is permanent. At the very least, there should be a 5 year pre-registration.

            Who gave you a right to decide what's right for others?
            Of course it is permanent. That's the whole point.
            Being dead is painless and without remorse. Regrets are for the living; the dead are completely unaffected.

      • by Anonymous Coward
        Suicidally depressed people are convinced there is no way things can possibly get better, so even if you suggest a plausible better alternative it is immediately dismissed because the person is feeling so miserable and emotion robs reason.
        • by mark-t ( 151149 )

          Suicidally depressed people are convinced there is no way things can possibly get better, so even if you suggest a plausible better alternative it is immediately dismissed because the person is feeling so miserable and emotion robs reason.

          Absolutely true... but my point is only that they would generally still prefer to live a life in better circumstances than to actually die; it is genuinely unusual for someone to wish themselves dead in preference to simply having better life circumstances.

    • by AHuxley ( 892839 )
      If the SJWs can get this working, they will push for an AI to track a lot of other content too.
      Connect a diagnosis handbook up with an AI, let it wander the social media internet, and report users and comments.
      Don't like a movie? Should any negative movie review be searchable? Why is the movie review so negative? Ban the account.
      The SJWs could code up a technique for psychological reporting on any set of comments they don't like.
      Comment on the teachings, politics and history of a faith? Could an AI b
      • One just needs to look at how Jordan Peterson was treated by SJW-controlled social media to understand that using AI to punish thoughtcrime and stifle dissent is inevitable. The corporations that assume the role of institutions in society will dictate the terms of our participation as merely a matter of trying to appease a complex and obfuscated algorithm.

        To get a glimpse of this wonderful future waiting for us, have a look at what China is already doing with their own social credit system [bbc.com]

    • If Facebook really cared about the mental health and wellbeing of their users, they'd kick people off after more than fifteen minutes of daily use or just outright pull the plug on the whole works.

      This tends to highlight why Greed N. Corruption is the CEO of Capitalist America.

      Nothing else matters, no matter how damaging.

  • Hello, this is Raj, I am understanding that you have feelings of suicide? Could you please confirm your email address and I will give you the numbers of a counselor who can make you feel better. Please hold the line.
  • by Anonymous Coward

    I sincerely hope that the synopsis is incorrect, because it would be a massive violation of privacy if Facebook just starts notifying random people in your social network that you've authored a suicidal post - especially if, as it sounds like, they are looking at the content of posts that haven't even been submitted yet. Most depressed folks I know have had enough suicidal episodes to have written a suicide note or three, but end up using the writing of the note to work through the issue at hand an

  • by Anonymous Coward

    Since when has "AI" = "an algorithm"?

    Fucking buzzwords

    Add it to the list of cloud = internet, etc.

  • It looks like you are trying to kill yourself.
    Would you like help?

  • My nephew committed suicide today. Several hours before, he posted a link on FB to the music video for "Logic - 1-800-273-8255."
    I don't have any wisdom to share, just the sadness.
  • by n329619 ( 4901461 ) on Monday November 27, 2017 @10:59PM (#55634625)

    Expectation:

    Guy1: I've lost my job and my family in an accident on the same day. There's no hope anymore.

    FB: Looks like you need some help! Go visit Save.org today!

    Guy1: Thanks FB. It really helped.

    Reality:

    Guy2: Omg, this guy on the internet is so stupid. I am literally banging my head so hard that it's killing me.

    FB: Looks like you need some help! Go visit Save.org today!

    Guy2: Is this the part where I continue to bang my head?

    FB: Looks like you need some help! Go visit Save.org today!

    Guy2: Damn it. Stop spamming me. You're killing me.

    FB: Looks like you need some help! Go visit Save.org today!

    Guy2: Arrrrrrugh!!! Do you want me dead or something?

    FB: We have booked you an appointment with the Suicide Prevention Lifeline. Thank you for using FB's newest AI chatbot technology. This chatbot is sponsored by Nice-Long-Rope, the best $1.99 rope to hang things from the ceiling.

  • by Solandri ( 704621 ) on Tuesday November 28, 2017 @12:56AM (#55634977)
    "Man commits suicide after becoming depressed that Facebook flagged his regular posts as suicidal."
    • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday November 28, 2017 @06:43AM (#55635567) Homepage Journal

      "Man commits suicide after becoming depressed that Facebook flagged his regular posts as suicidal."

      Man commits suicide after Facebook ruins his life by referring him to a state-sponsored mental health care system that thrives primarily on overprescription. But if Facebook wants more fake-ass posts that don't tell how people are really thinking, I guess this is one way to get them. I know that I will now fear Facebook referring me to some legal organization for "help", and adjust my posts accordingly.

  • What, exactly, are they going to do with this information? What will the false-positive rate be? Intuitively, I expect it to be very high: 90% or more of the reported cases will be false positives.

    So: The suicide prevention lines will suddenly be overloaded with incorrect reports? And random individuals will have the police knocking on their doors at 3am, saving them from...nothing?

    Prediction: This is yet another feel-good idea that will have predominantly negative effects. Why do SJWs insist on sticking th

  • It's so FB can sell your friends targeted advertising from funeral directors and florists
  • I somehow get the feeling that this will go about as well as YouTube's automatic flagging and demonetization of videos, which has completely baffled independent content creators (traditional media outlets like CNN and the BBC are excluded from this system) by arbitrarily demonetizing massive amounts of completely benign content.

    However, what worries me even more is whether they'll try their hand at using the same flagging scripts for other things. I probably ought to remove the post I mad
  • Basically, the way this is supposed to work: Facebook detects the billable mental disorder and reports you to the authorities for a forcible psychiatric lock-up. The hospital/doctors then bill your insurance against your will, prescribing drugs that disable your brain and create mental illness for reals, permanently making you disabled. Doctors/hospital/investors in the pharma company/congressmen walk away with a mint at the victim's expense.
    Psychiatry isn't treatment, it's a business model.

    https://www.trumpsweapon. [trumpsweapon.com]

  • Comment removed based on user account deletion
  • Sounds like naïve emotionalistic public relations propaganda to me. “We detect & report pre-crime!”

    If it were really that simple to detect suicidal tendencies, wouldn't FB also announce efforts to detect/prevent:
    * terrorist attacks
    * domestic abuse
    * human trafficking
    * illegal drug trade
    * illegal border crossings
    * tax evasion
    * racism
    * blasphemy
    * insults against the State
    * thoughtcrime
    * impure thoughts
    * sarcasm

    Etc. What a bunch o' hooey!

    They might well intend this as a

  • All I want is a Pepsi and she wouldn't give it to me
    All I wanted was a Pepsi, just one Pepsi
    And she wouldn't give it to me, just a Pepsi

  • All I wanted was a Pepsi, just one Pepsi
  • I would expect the AI to become suicidal having been forced to wade through umpteen million posts about "what I'm currently eating," regurgitated cute-cat videos, and various flavors of tween- and teen-drama.

    AI: "Oh god, not ANOTHER bathroom selfie ..."
  • this will surely be welcomed by those whose despair and depression is caused by the stress of living in the panopticon of an ever-increasing surveillance state that makes Orwell seem like a naive optimist.
