An Algorithm May Decide Who Gets Suicide Prevention (medium.com)
An algorithm, it seems, could determine, in some cases, who gets shown lifesaving information, and who doesn't. From a report: The researchers behind the New Media & Society paper set out to understand this odd quirk of Google's algorithm, and to find out why the company seemed to be serving some markets better than others. They developed a list of 28 keywords and phrases related to suicide, Sebastian Scherr at the University of Leuven says, and worked with nine researchers from different countries who accurately translated those terms into their own languages. For 21 days, they conducted millions of automated searches for these phrases, and kept track of whether hotline information showed up or not. They thought these results might simply, logically, show up in countries with higher suicide rates, but the opposite was true.
Users in South Korea, which has one of the world's highest suicide rates, were only served the advice box about 20% of the time. They tested different browser histories (some completely clean, some full of suicide-related topics), with computers old and new, and tested searches in 11 different countries. It didn't seem to matter: the advice box was simply much more likely to be shown to people using Google in the English language, particularly in English-speaking countries (though not in Canada, which Scherr speculates was probably down to geographical rollout). "If you're in an English-speaking country, you have over a 90% chance of seeing these results -- but Google operates differently depending on which language you use," he said. Scherr speculates that using keywords may simply have been the easiest way to implement the project, but adds that it wouldn't take much to offer it more effectively in other countries, too.
A Google spokesperson, who asked not to be quoted directly, said that the company is refining these algorithms. The advice boxes require the cooperation of local organizations which may not always be available, they said, but that relevant resources will still show up in regular search results. Google said the service does not have comprehensive global coverage, and while it is actively working on new languages and locations, rolling that out takes time.
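For readers curious about the mechanics, here is a minimal, hypothetical sketch of the kind of automated check the study describes: issue a search query in a given language, then test whether the results page contains a crisis-hotline box. The query terms, the marker strings, and the use of the requests library are assumptions for illustration only; none of this is taken from the paper or from Google's actual markup.

    # Hypothetical sketch, not the researchers' code: fetch a results page and
    # look for assumed hotline marker text. Marker strings and queries are
    # placeholders, not the study's 28 keywords.
    import requests

    QUERIES = ["suicide hotline", "how to get help"]          # placeholder terms
    HOTLINE_MARKERS = ["Help is available", "Crisis Lifeline"]  # assumed marker text

    def advice_box_shown(query: str, lang: str = "en") -> bool:
        """Return True if the (assumed) hotline marker text appears in the results page."""
        resp = requests.get(
            "https://www.google.com/search",
            params={"q": query, "hl": lang},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=10,
        )
        return any(marker in resp.text for marker in HOTLINE_MARKERS)

    if __name__ == "__main__":
        for q in QUERIES:
            print(q, advice_box_shown(q))

In the study itself, a check along these lines would be repeated across languages, countries, browser histories, and days, with the hit rate per market being the quantity of interest.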
Google uses algorithms? (Score:2)
To try to customize search results? Really?
Doooooood. This is going to be a long holiday weekend.
Suicide booth, baby! (Score:4, Funny)
A company that serves (Score:2)
The researchers behind the New Media & Society paper set out to understand this odd quirk of Google's algorithm, and to find out why the company seemed to be serving some markets better than others
I'd be interested in statistics that show this company is actually 'serving,' that is, do they actually prevent suicides?
Re: (Score:2)
The recent goof-up by Watson shows that medical diagnosis is still more complex than just an algorithm. The human body is complex; people may tick all the checkboxes for a particular illness yet still not have it.
Re: (Score:3)
A large number of people have a strong desire to inflict their misguided religious beliefs on others. Today it's suicide; tomorrow it will be identifying people at high risk of having an abortion.
Suicide isn't a topic the world is ready to discuss rationally.
Re: (Score:2, Insightful)
Suicide and abortion are pretty unrelated.
My body, my choice.
Re: (Score:2)
So for those people an algorithm flags as being at high risk of having an abortion, will we make sure they have resources and guidance to raise a family without upending their lives?
I am pro-life; I don't like abortions. But making it illegal or hard to obtain doesn't solve the underlying problem of why someone would want an abortion in the first place. It isn't because they are a bad person or lack morals. Rather, they are in a situation where they need to make a tough decision. Because we live in a world where af
Re: (Score:2)
You are right, the world may not be ready to discuss it rationally, yet I think some people could. A rational discussion should disregard slippery-slope arguments, because every slippery-slope argument is both a false dichotomy and a strawman.
Re: (Score:3)
You are right, the world may not be ready to discuss it rationally, yet I think some people could. A rational discussion should disregard slippery-slope arguments, because every slippery-slope argument is both a false dichotomy and a strawman.
And next thing ya know, everything is a slippery slope leading to nothing but strawman arguments!
Wow, I'm even more depressed now (Score:2)
Even the algorithm doesn't care enough about me to try to prevent me from killing myself.
Where's my popup? (Score:2)