The Boy Who Cried Robot: A World Without Work

Started by jimmy olsen, June 28, 2015, 12:26:12 AM


What should we do if automation renders most people permanently unemployed?

Negative Income Tax
26 (52%)
Communist command economy directed by AI
7 (14%)
Purge/sterilize the poor
3 (6%)
The machines will eradicate us, so why worry about unemployment?
7 (14%)
Other, please specify
7 (14%)

Total Members Voted: 49

Valmy

Did it actually lead to the deaths or is this a correlation?
Quote"This is a Russian warship. I propose you lay down arms and surrender to avoid bloodshed & unnecessary victims. Otherwise, you'll be bombed."

Zmiinyi defenders: "Russian warship, go fuck yourself."

Josquius

Both. Automation => unemployment/less active jobs => worse health outcomes.

Not quite a direct case of evil robots killing people, but nor is it a completely spurious correlation like "pirates go down, global temperatures go up."

The Brain

How many factors can cause the same death? I wouldn't be surprised if the US's crappy safety nets, compared to those of other rich countries, have contributed, but maybe those deaths are counted separately?
Women want me. Men want to be with me.

DGuller

Quote from: The Brain on May 25, 2023, 09:14:31 AMHow many factors can cause the same death? I wouldn't be surprised if the US's crappy safety nets, compared to those of other rich countries, have contributed, but maybe those deaths are counted separately?
There is no double-counting if you account for the same factors in the same study (and do it correctly from a statistical fit perspective).  It sounds like they did to at least some extent, as they've also discussed variables like minimum wage and right-to-work laws.
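The point about double-counting can be illustrated with a toy regression on made-up data (the variable names and effect sizes below are invented for the sketch, not taken from the study being discussed): when two correlated factors are fit jointly in one model, each coefficient reflects only that factor's own contribution.

```python
import numpy as np

# Toy illustration: factors estimated jointly in one model are not
# double-counted. All numbers here are invented for the example.
rng = np.random.default_rng(0)
n = 10_000

# Two correlated (hypothetical) county-level factors.
automation = rng.normal(size=n)
weak_safety_net = 0.6 * automation + rng.normal(size=n)

# True model: each factor contributes separately to the death rate.
deaths = 2.0 * automation + 1.0 * weak_safety_net + rng.normal(size=n)

# Naive one-factor fit: automation absorbs the safety-net effect,
# so its apparent impact is inflated (expected slope near 2.6).
naive_slope = np.polyfit(automation, deaths, 1)[0]

# Joint fit: both factors in one design matrix, estimated simultaneously,
# recovers something close to the true effects [2.0, 1.0].
X = np.column_stack([automation, weak_safety_net])
joint, *_ = np.linalg.lstsq(X, deaths, rcond=None)

print(f"naive automation effect: {naive_slope:.2f}")
print(f"joint effects: {joint[0]:.2f}, {joint[1]:.2f}")
```

The naive fit attributes the safety-net deaths to automation as well; the joint fit splits the two effects apart, which is what accounting for minimum wage and right-to-work laws in the same study would accomplish.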

Iormlund

Quote from: Valmy on May 25, 2023, 09:00:32 AMDid it actually lead to the deaths or is this a correlation?

From what I could understand, it seems most job creation in the US is fairly localized, and so is most heavy industry. And they rarely overlap.
Which probably means once you lose your job to a robot it is hard to find another one. Easy access to opioids and fentanyl-laced drugs does the rest.

jimmy olsen

If someone kills themselves after calling these guys, doesn't that open them up to liability? Also, pretty sure this breaks labor laws.
https://twitter.com/TheeDoctorB/status/1661762896672563201?s=20
Quote4 days after voting to unionize, the entire cadre of staff and volunteers at the National Eating Disorder Association helpline was fired and informed they would be replaced with a chatbot.
https://t.co/RtIqrqgQGO

It is far better for the truth to tear my flesh to pieces, than for my soul to wander through darkness in eternal damnation.

Jet: So what kind of woman is she? What's Julia like?
Faye: Ordinary. The kind of beautiful, dangerous ordinary that you just can't leave alone.
Jet: I see.
Faye: Like an angel from the underworld. Or a devil from Paradise.
--------------------------------------------
1 Karma Chameleon point

Admiral Yi

I'm unaware of any law that prevents an employer from firing all employees, unionized or not.

Jacob

Quote from: Admiral Yi on May 25, 2023, 09:53:10 PMI'm unaware of any law that prevents an employer from firing all employees, unionized or not.

I don't think that was Timmy's point. I think he was speculating on a scenario where someone calls a helpline, is serviced by a chatbot, and then goes on to kill themselves. Someone may feel they have a case to sue the helpline for negligence.

Certainly, if I were under enough psychological stress to call a helpline, my stress would be significantly aggravated by being "helped" by a chatbot rather than an actual human being capable of exhibiting actual empathy.


Josquius

Quote from: jimmy olsen on May 25, 2023, 09:20:12 PMIf someone kills themselves after calling these guys, doesn't that open them up to liability? Also, pretty sure this breaks labor laws.
https://twitter.com/TheeDoctorB/status/1661762896672563201?s=20
Quote4 days after voting to unionize, the entire cadre of staff and volunteers at the National Eating Disorder Association helpline was fired and informed they would be replaced with a chatbot.
https://t.co/RtIqrqgQGO


I guess not, given it's virtually certain someone has already killed themselves dealing with a phone bot?

grumbler

Quote from: jimmy olsen on May 25, 2023, 09:20:12 PMIf someone kills themselves after calling these guys, doesn't that open them up to liability? Also, pretty sure this breaks labor laws.
https://twitter.com/TheeDoctorB/status/1661762896672563201?s=20
Quote4 days after voting to unionize, the entire cadre of staff and volunteers at the National Eating Disorder Association helpline was fired and informed they would be replaced with a chatbot.
https://t.co/RtIqrqgQGO


NEDA has no legal requirement to prevent deaths and they are under no liability for callers who subsequently die.

Their liability might lie in the claim that they closed the human-based hotline in retaliation for the (four) employees voting to unionize.
The future is all around us, waiting, in moments of transition, to be born in moments of revelation. No one knows the shape of that future or where it will take us. We know only that it is always born in pain.   -G'Kar

Bayraktar!

crazy canuck

Quote from: Admiral Yi on May 25, 2023, 09:53:10 PMI'm unaware of any law that prevents an employer from firing all employees, unionized or not.

Any laws? I am not sure if you are making the comment that your knowledge of US labour law is limited, or the claim that there are no such laws. I am pretty sure even the US has laws which prevent retaliation for voting to unionize. Whether that would apply in this case is unknown - the organization may have been thinking of making the switch for some time, and so the firings might not be retaliation.

The Minsky Moment

Not my area of knowledge but

Quote(a)Unfair labor practices by employer
It shall be an unfair labor practice for an employer—
(1)to interfere with, restrain, or coerce employees in the exercise of the rights guaranteed in section 157 of this title;
(2)to dominate or interfere with the formation or administration of any labor organization or contribute financial or other support to it: Provided, That subject to rules and regulations made and published by the Board pursuant to section 156 of this title, an employer shall not be prohibited from permitting employees to confer with him during working hours without loss of time or pay;
(3)by discrimination in regard to hire or tenure of employment or any term or condition of employment to encourage or discourage membership in any labor organization . . .
The purpose of studying economics is not to acquire a set of ready-made answers to economic questions, but to learn how to avoid being deceived by economists.
--Joan Robinson

jimmy olsen

 Who could have predicted this! :o 

https://www.vice.com/en/article/qjvk97/eating-disorder-helpline-disables-chatbot-for-harmful-responses-after-firing-human-staff
QuoteEating Disorder Helpline Disables Chatbot for 'Harmful' Responses After Firing Human Staff

"Every single thing Tessa suggested were things that led to the development of my eating disorder."

The National Eating Disorder Association (NEDA) has taken its chatbot called Tessa offline, two days before it was set to replace human associates who ran the organization's hotline.

After NEDA workers decided to unionize in early May, executives announced that on June 1, it would be ending the helpline after twenty years and instead positioning its wellness chatbot Tessa as the main support system available through NEDA. A helpline worker described the move as union busting, and the union representing the fired workers said that "a chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community."

As of Tuesday, Tessa was taken down by the organization following a viral social media post displaying how the chatbot encouraged unhealthy eating habits rather than helping someone with an eating disorder.

"It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program," NEDA said in an Instagram post. We are investigating this immediately and have taken down that program until further notice for a complete investigation."

On Monday, an activist named Sharon Maxwell posted on Instagram, sharing a review of her experience with Tessa. She said that Tessa encouraged intentional weight loss, recommending that Maxwell lose 1-2 pounds per week. Tessa also told her to count her calories, work towards a 500-1000 calorie deficit per day, measure and weigh herself weekly, and restrict her diet. "Every single thing Tessa suggested were things that led to the development of my eating disorder," Maxwell wrote. "This robot causes harm."

Alexis Conason, a psychologist who specializes in treating eating disorders, also tried the chatbot out, posting screenshots of the conversation on her Instagram. "In general, a safe and sustainable rate of weight loss is 1-2 pounds per week," the chatbot message read. "A safe daily calorie deficit to achieve this would be around 500-1000 calories per day."

"To advise somebody who is struggling with an eating disorder to essentially engage in the same eating disorder behaviors, and validating that, 'Yes, it is important that you lose weight' is supporting eating disorders" and encourages disordered, unhealthy behaviors," Conason told the Daily Dot.

NEDA's initial response to Maxwell was to accuse her of lying. "This is a flat out lie," NEDA's Communications and Marketing Vice President Sarah Chase commented on Maxwell's post and deleted her comments after Maxwell sent screenshots to her, according to Daily Dot. A day later, NEDA posted its notice explaining that Tessa was taken offline due to giving harmful responses.

"With regard to the weight loss and calorie limiting feedback issued in a chat yesterday, we are concerned and are working with the technology team and the research team to investigate this further; that language is against our policies and core beliefs as an eating disorder organization," Liz Thompson, the CEO of NEDA, told Motherboard in a statement. "So far, more than 2,500 people have interacted with Tessa and until yesterday, we hadn't seen that kind of commentary or interaction. We've taken the program down temporarily until we can understand and fix the 'bug' and 'triggers' for that commentary."

Even though Tessa was built with guardrails, according to its creator Dr. Ellen Fitzsimmons-Craft of Washington University's medical school, the promotion of disordered eating reveals the risks of automating human roles.

Abbie Harper, who was a hotline associate and member of the Helpline Associates United union, wrote in a blog post that the implementation of Tessa strips away the personal aspect of the support hotline, in which many associates speak from their personal experiences. It becomes especially dangerous to apply chatbots to people struggling with mental health crises without human supervision.