Fired Tech Workers Turn To Chatbots for Counseling (bloomberg.com)
An anonymous reader shares a Bloomberg report: For months Lovkesh Joshi was quietly terrified of losing his job as a manager at a top Indian tech services company. Joshi didn't want to burden his wife or friends so he turned to a chatbot therapist called Wysa. Powered by AI, the app promises to be "loyal, supportive and very private," and encourages users to divulge their feelings about a recent major event or big change in their lives. "I could open up and talk," says the 41-year-old father of two school-age children, who says his conversations with the bot flowed naturally. "I felt heard and understood." Joshi moved to a large rival outsourcer two months ago. The upheaval in India's $154 billion tech outsourcing industry has prompted thousands of Indians to seek solace in online therapy services. People accustomed to holding down prestigious jobs and pulling in handsome salaries are losing out to automation, a shift away from long-term legacy contracts and curbs on U.S. work visas. McKinsey & Co says almost half of the four million people working in India's IT services industry will become "irrelevant" in the next three to four years. Indians, like people the world over, tend to hide their mental anguish for fear of being stigmatized. That's why many are embracing the convenience, anonymity and affordability of online counseling startups, most of which use human therapists.
Re: (Score:2)
> No one is doing that
Oh yes they are; some people are really strange.
The early chat bots - and I mean EARLY, as in 'about as likely to pass a Turing test as a passage from your preferred dictionary' - had people seeking therapy from them.
Re: (Score:2)
Both can be true, you know. Stories can be hyped for financial gain AND be true.
I suggest you read about Eliza, a much more primitive bot.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
The Diceman? Bill Burr?
$600/hour is kind of cheap.
Re: (Score:2)
Eliza, and chatbots like it, are the ideal embodiment of "Rogerian" therapy. See https://en.wikipedia.org/wiki/... [wikipedia.org]
Don't knock Eliza; I use Eliza to help with my life struggles: http://www.manifestation.com/n... [manifestation.com]
Re: (Score:2)
> No one is doing that
Oh yes they are; some people are really strange.
The early chat bots - and I mean EARLY, as in 'about as likely to pass a Turing test as a passage from your preferred dictionary' - had people seeking therapy from them.
I would think a therapist chat bot would be an extra easy one to pass a Turing test.
All you have to do is program it to respond to every comment with "And how does that make you feel?" and no one will know they're not talking to a real-live therapist.
Re: (Score:2)
In fact, I think that's pretty much the algorithm behind Eliza/DOCTOR, which might be the first therapist chat bot.
Eliza was more of a "rephrase it as a question that ends in 'how does that make you feel?'" bot, so I guess that makes it unnecessarily sophisticated!
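For the curious, below is a minimal sketch of that rephrase-and-reflect idea in Python. The pattern rules, pronoun table, and canned responses are illustrative guesses at the style of rule, not Weizenbaum's actual DOCTOR script:

import random
import re

# Swap first- and second-person words so the user's statement can be
# reflected back at them ("around me" -> "around you").
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "you": "I", "your": "my"}

# A few illustrative pattern -> response-template rules.
RULES = [
    (re.compile(r"i feel (.*)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["How does being {0} make you feel?"]),
    (re.compile(r"(.*)", re.I),
     ["Please tell me more.", "And how does that make you feel?"]),
]

def reflect(fragment):
    # Lowercase and swap pronouns word by word.
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement):
    cleaned = statement.strip().rstrip(".!")
    for pattern, templates in RULES:
        match = pattern.match(cleaned)
        if match:
            template = random.choice(templates)
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."  # unreachable thanks to the catch-all rule

if __name__ == "__main__":
    print(respond("I feel like many people around me are chatterbots"))
    # e.g. "Why do you feel like many people around you are chatterbots?"

The real DOCTOR script had many more keyword rules plus a ranking mechanism for choosing which one fires, but the reflect-the-pronouns-and-ask-a-question loop is essentially the whole trick.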
Re: (Score:2)
Paging Dr. Eliza....Dr Freud Eliza....
Get a damn friend (Score:1, Interesting)
>the 41-year-old father of two school-age children, who says his conversations with the bot flowed naturally. "I felt heard and understood."
No way would I hire someone who feels 'heard and understood' after an exchange with a chat bot. This is somebody without the social skills to have anyone in their life to talk to, and will spill to a dumb text parser. How can you have the intellectual capacity to understand what a chat bot is and still gain any emotional benefit from interacting with one?
Root cause
Re:Get a damn friend (Score:5, Insightful)
I really feel like many people around me are chatterbots, just waiting for positive or negative reinforcement in order to engage predetermined responses.
Re:Get a damn friend (Score:5, Funny)
That is interesting. Please go on.
Response courtesy of Eliza: https://www.eclecticenergies.c... [eclecticenergies.com]
Re: (Score:2)
>Response courtesy of iPhone next word prediction.
Ugh. I have enough trouble with my iPhone autocorrecting what I've already typed without dealing with it telling me what it thinks I'm going to type next.
Apparently this is something that is (thankfully) off by default, because I have never experienced it.
Re: (Score:2)
ELIZA: Can you tell me what is going on in you right now ?
YOU: I really feel like many people around me are chatterbots, just waiting for positive or negative reinforcement in order to engage predetermined responses.
ELIZA: Can you tell me more about what you feel about this ?
YOU: I really feel like many people around me are chatterbots, just waiting for positive or negative reinforcement in order to engage predetermined responses.
ELIZA: So, you really feel like many people around you are chatterbots. What d
This about sums up what I have found. (Score:1)
My generation hasn't really had lasting friendships since I was in my teens. My parents' generation saw most of their friendships dissolve a few years later than that, due to a combination of alcoholism, migration to other regions, or gossip in extended social circles that caused some bridges to burn.
Me personally, avoiding drugs, alcohol, and religion, and wanting privacy, pretty much alienated me from everyone in my age group. Doesn't mean I can't do a good job, but it does affect the social network for other
Re: (Score:3)
I generally put people into two categories with regard to their ideas of friendship:
1) Anyone they know and don't hate is a 'friend'. Their relationships are shallow and unreliable, and contact may be infrequent.
2) Friends are rare people they know, like, and have enough of a social bond to depend on them without question in an emergency.
If you, like me, are in the second group... you should be aware it is quite possible you might never find that kind of relationship (I exclude my wife from the count, si
Re: (Score:2)
1) Anyone they know and don't hate is a 'friend'. Their relationships are shallow and unreliable, and contact may be infrequent.
Maybe they've got a finer-grained grading scale between 'close friend', 'friend', 'acquaintance', and 'enemy' than that, but they don't expose the details of that grading scale to anyone else, mainly because it's subject to change for any individual.
It's like in software. You don't document internal details if you think they might change later. Same with how much you trust people, which is really what differentiates close friends from acquaintances.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The fact that you and baron yam responded exactly the same proves my point.
Re: (Score:1)
Root causes, buddy, root causes. Figure out why you don't have an actual intelligent human in your life to discuss this stuff with, maybe work on that. Because humans are social primates, and if you're not taking care of your social needs, everything else will eventually crumble anyway.
Re: (Score:2)
>If it's programmed intelligently there's no reason it couldn't be equivalent to (or better than) a human therapist.
Even though humans are doing the same thing - receive stimulus/apply rule/respond - there's no program out there yet that is anywhere near complex enough to do what we do.
So far as I am aware, the Turing test has only been 'passed' by severely constraining the breadth of conversation.
Re: (Score:1)
So far as I am aware, the Turing test has only been 'passed' by severely constraining the breadth of conversation.
How long have you been aware, the Turing test has only been 'passed' by severely constraining the breadth of conversation?
Re: (Score:3)
It looks like you way overestimate your own people skills.
People pray and gain emotional benefit from that. A chatbot at least answers.
Re: (Score:1)
>People pray and gain emotional benefit from that.
I don't trust people who pray, either... but as a general rule religious people seem to manage to compartmentalize their irrational thinking.
>A chatbot at least answers.
A person who prays provides their own answer even if they don't realize that is the case, so unlike a chat bot there's actually some intelligence there.
Re: (Score:2)
Not really, because they don't consider it to be irrational, while a person conversing with a chatbot merely pretends that it is a conversation with a real person. No different than a bit of daydreaming.
So you don'
Re: (Score:2)
You appear to be deliberately misunderstanding me just to be contrary. That's not very productive.
Re: (Score:2)
You want productive? Hire an IT engineer who has a chatbot therapist.
Re: (Score:1)
I now feel more sad for you than I did for the displaced workers. I hope you recover from such a sad mental state soon.
Re: (Score:3)
A person who prays provides their own answer even if they don't realize that is the case, so unlike a chat bot there's actually some intelligence there.
Have you ever started talking to a coworker about some sort of issue and in the process of explaining it figured out the solution yourself? How often does a therapist really have any true insight versus simply talking you through your emotions? I think you've fallen off the deep end if you think a chat bot "understands you", but I have no doubt that verbalizing your thoughts can have a big effect, even if you're talking to a teddy bear or rag doll, a picture or grave of the deceased, or some other inanimate object.
Re: (Score:3)
>the 41-year-old father of two school-age children, who says his conversations with the bot flowed naturally. "I felt heard and understood."
No way would I hire someone who feels 'heard and understood' after an exchange with a chat bot. This is somebody without the social skills to have anyone in their life to talk to, and will spill to a dumb text parser. How can you have the intellectual capacity to understand what a chat bot is and still gain any emotional benefit from interacting with one?
Root causes, buddy, root causes. Figure out why you don't have an actual intelligent human in your life to discuss this stuff with, maybe work on that. Because humans are social primates, and if you're not taking care of your social needs, everything else will eventually crumble anyway.
Easy to say in your culture. In other cultures where hierarchy can make you or break you (literally), this is just not possible.
You never understand your own culture (the pros and cons of it) until you have actually stepped out of it, at least for long enough to allow some reflection.
Re: (Score:3)
Don't know; he is the father of two school-age children, so presumably a human woman agreed to unprotected sex with him at some point.
That suggests his social skills go far beyond the typical Slashdot poster.
Re: (Score:1)
He's from a primitive third world culture where women have no rights and are arranged to be married to their uncle or cousin.
Re: (Score:3)
Re: (Score:3)
No way would I hire someone who feels 'heard and understood' after an exchange with a chat bot. This is somebody without the social skills to have anyone in their life to talk to, and will spill to a dumb text parser. How can you have the intellectual capacity to understand what a chat bot is and still gain any emotional benefit from interacting with one?
Many people become IT engineers because this type of job allows them to isolate themselves from society as much as possible while still being productive and earning money. Yes, they would feel more comfortable talking to a chat bot.
There's also the type of people who were severely betrayed by others while their character was forming (e.g. during childhood or teenage years). They grew up to distrust other human beings. Yes, it's pathological, but the issue is still there. Those chatbots might be a step towards opening up.
Re: (Score:2)
You grossly misunderstand the size of a nuclear explosion.
The largest bombs can affect an area within a 30-mile radius, which would devastate a city and 8 other nearby cities and towns (most towns being 10 miles apart).
This is huge... However not enough to kill off a subcontinent.
Re: (Score:2)
Re: (Score:2)
From the sound of things, India and China are not happy with each other over some border issues. We might all soon be breathing in the fallout of two billion freshly vaporized potential H1Bs if they decide to swap nukes.
China thinks that every piece of land that a Chinese citizen has ever set foot on belongs to China and is historically Chinese. China and India ARE both mature enough not to nuke each other over a barely habitable stretch of mountains though.
Re: (Score:2)
Re: (Score:2)
You can quit.
You don't have to work. However if you want a steady income, it is a good idea to stay.
To the sound of a really tiny violin... (Score:4, Funny)
At least they don't have to train their replacements.
Re: (Score:2)
It'd be able to train itself. Or is all this AI stuff largely hype? I'd be shocked, shocked if that was the case!
Re: (Score:2)
At least they don't have to train their replacements.
Dang straight.
I don't blame them personally, of course ... in their place I probably would have done the same. I blame our own countrymen who sold out so many.
Still, hard to work up too much sympathy. Oh dear, the robot works cheaper, does it? How about that.
I would feel insulted. (Score:2)
That would be "we care about your mental health, but you aren't worth a human's time."
Re: (Score:3)
That would be "we care about your mental health, but you aren't worth a human's time."
If you're paying for a human's time, you're not really worth their time either. They're not doing it for you, they're doing it for the money. Head shrinkers can't afford to care about you; if they cared about everyone as real people, they wouldn't have any emotional energy left to care for themselves. They just go through the motions for money, and then you hopefully feel better for expressing yourself. A chatbot can go through the same motions.
Re: (Score:2)
And if your employer is not paying for your technical skills, they are not worth your time either, right?
That is life. I like to think people do care about other people. It's just that everyone has issues and problems and is overwhelmed in their own world. They do not need to hear yours. The good news is most people don't give a shit about your employment, so the only gorilla in the room is the man in the mirror.
It would be nice if everyone could do what they love without the stupid money, connections, and sc
Re: (Score:2)
Re: (Score:2)
Shrinks are nuts. They choose their career in the hope of someday diagnosing and fixing themselves.
Re: (Score:2)
While a therapist does have to avoid becoming overly emotionally attached to individual patients, many of them, like many other medical professionals, chose their careers because they want to help people.
That there are many medical professionals that want to help people doesn't change the fact that there are many people getting into medical professions for the money.
Re: (Score:2)
A person can see what's going on and make recommendations; a chat bot isn't going to be able to do that.
I don't think it's as much about them caring as them actually being able to help in situations where it's required.
Re: (Score:2)
Re: (Score:2)
That would be "we care about your mental health, but you aren't worth a human's time."
Humans are not nice or altruistic in any sense. Sure, we have built-in empathy as a group so we can work together, but we have killed species, ruined land and the environment, and hurt other people for personal gain. What makes you think such a bad species cares about you unless they can get something off you?
Well, when I hit hard times I acknowledged that I am worth exactly 0 to everyone... if I provide no value in return.
It sucks and is depressing, but your Momma is the only one who will ever love you unconditionally.
Re: (Score:2)
That would be "we care about your mental health, but you aren't worth a human's time."
Yell at the people who pushed to shut down the big mental health hospitals in the 1970s and 1980s, then further pushed the revolving-door medication system. We're still suffering from that massive screw-up and probably will for another 30 years.
Re: (Score:2)
Those people are called the ACLU. Take it up with them.
Commitment has historically been a mechanism of police states. Never forget that. The flipside of loonies shitting in the streets is people with wrong opinions sitting in the loony bin.
So where are the... (Score:2)
If you can't.... (Score:2)
If you can't confide to and rely upon your friends and family for support in hard times, then you really have no friends and family. Time to find better people to surround yourself with.
Re: (Score:2)
"you're the outsourced people who we pay do this shit, and not only can't you actually fucking to it, but you think I'm going to do this shit?"
In a lot of cases, sending the instructions is the most useful response. If its something that's going to come up often, and the person who raised the item is capable... just getting them to do it instead of having them open tickets and wait each time is a better use of everyone's time.
It also is sometimes the best way to deal with people who are never available. I'v
Re: (Score:2)
You know, I've had several occasions to bump up against Indian IT services ... and on all occasions, it was impossible not to notice that at the start of the contract you might have gotten a couple of intelligent people with an actual skillset, and that as time went on you got utter morons who could do nothing but follow a script.
There's usually one really smart guy in the group and a dozen who don't know what they're doing. Usually the smart guy is sent overseas to the client to act as a local contact and at that point the idiots get left alone at home without the smart guy holding their hands and correcting them.
It would be much more useful if they sent one of the idiots and left the smart guy to supervise the ones left behind.
I use vi (Score:3)
I don't even have a command line psychotherapist available.
Re: (Score:2)
And how does that make you feel?
Just dial 999-999-9999 (Score:2)
Of course your secrets will not be used against you. Only the oppressors will be outed and held accountable.
I am Dr. Spaitso from Creative Labs (Score:2)
All conversation will be held in the strictest of confidence.
Memory will be wiped after you leave.
So, tell me about your problems.
(forgive me if I've flubbed a line or two, it's been close to thirty years since my second sound card).
Why so sympathetic (Score:1)
Male strength (Score:2)
Why not Life Coaching? (Score:1)