And we're back!
Started by jimmy olsen, June 28, 2015, 12:26:12 AM
Quote from: The Brain on May 25, 2023, 09:14:31 AM
How many factors can cause the same death? I wouldn't be surprised if the US's crappy safety nets, compared to other rich countries', have contributed, but maybe those deaths are counted separately?
Quote from: Valmy on May 25, 2023, 09:00:32 AM
Did it actually lead to the deaths or is this a correlation?
Quote
4 days after voting to unionize, the entire cadre of staff and volunteers at the National Eating Disorder Association helpline was fired and informed they would be replaced with a chatbot. https://t.co/RtIqrqgQGO
Quote from: Admiral Yi on May 25, 2023, 09:53:10 PM
I'm unaware of any law that prevents an employer from firing all employees, unionized or not.
Quote from: jimmy olsen on May 25, 2023, 09:20:12 PM
If someone kills themselves after calling these guys, doesn't that open them up to liability? Also, pretty sure this breaks labor laws.
https://twitter.com/TheeDoctorB/status/1661762896672563201?s=20
Quote
4 days after voting to unionize, the entire cadre of staff and volunteers at the National Eating Disorder Association helpline was fired and informed they would be replaced with a chatbot. https://t.co/RtIqrqgQGO
Quote
(a) Unfair labor practices by employer
It shall be an unfair labor practice for an employer—
(1) to interfere with, restrain, or coerce employees in the exercise of the rights guaranteed in section 157 of this title;
(2) to dominate or interfere with the formation or administration of any labor organization or contribute financial or other support to it: Provided, That subject to rules and regulations made and published by the Board pursuant to section 156 of this title, an employer shall not be prohibited from permitting employees to confer with him during working hours without loss of time or pay;
(3) by discrimination in regard to hire or tenure of employment or any term or condition of employment to encourage or discourage membership in any labor organization . . .
Quote
Eating Disorder Helpline Disables Chatbot for 'Harmful' Responses After Firing Human Staff
"Every single thing Tessa suggested were things that led to the development of my eating disorder."

The National Eating Disorder Association (NEDA) has taken its chatbot, Tessa, offline two days before it was set to replace the human associates who ran the organization's hotline.

After NEDA workers decided to unionize in early May, executives announced that on June 1 it would end the helpline after twenty years and instead position its wellness chatbot Tessa as the main support system available through NEDA. A helpline worker described the move as union busting, and the union representing the fired workers said that "a chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community."

As of Tuesday, Tessa had been taken down by the organization following a viral social media post showing how the chatbot encouraged unhealthy eating habits rather than helping someone with an eating disorder. "It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program," NEDA said in an Instagram post. "We are investigating this immediately and have taken down that program until further notice for a complete investigation."

On Monday, an activist named Sharon Maxwell posted on Instagram, sharing a review of her experience with Tessa. She said that Tessa encouraged intentional weight loss, recommending that Maxwell lose 1-2 pounds per week. Tessa also told her to count her calories, work towards a 500-1000 calorie deficit per day, measure and weigh herself weekly, and restrict her diet. "Every single thing Tessa suggested were things that led to the development of my eating disorder," Maxwell wrote. "This robot causes harm."
Alexis Conason, a psychologist who specializes in treating eating disorders, also tried the chatbot, posting screenshots of the conversation on her Instagram. "In general, a safe and sustainable rate of weight loss is 1-2 pounds per week," the chatbot's message read. "A safe daily calorie deficit to achieve this would be around 500-1000 calories per day."

"To advise somebody who is struggling with an eating disorder to essentially engage in the same eating disorder behaviors, and validating that, 'Yes, it is important that you lose weight,' is supporting eating disorders" and encourages disordered, unhealthy behaviors, Conason told the Daily Dot.

NEDA's initial response to Maxwell was to accuse her of lying. "This is a flat out lie," NEDA's Communications and Marketing Vice President Sarah Chase commented on Maxwell's post; she deleted her comments after Maxwell sent her screenshots, according to the Daily Dot. A day later, NEDA posted its notice explaining that Tessa had been taken offline for giving harmful responses.

"With regard to the weight loss and calorie limiting feedback issued in a chat yesterday, we are concerned and are working with the technology team and the research team to investigate this further; that language is against our policies and core beliefs as an eating disorder organization," Liz Thompson, the CEO of NEDA, told Motherboard in a statement. "So far, more than 2,500 people have interacted with Tessa and until yesterday, we hadn't seen that kind of commentary or interaction. We've taken the program down temporarily until we can understand and fix the 'bug' and 'triggers' for that commentary."

Even though Tessa was built with guardrails, according to its creator Dr. Ellen Fitzsimmons-Craft of Washington University's medical school, its promotion of disordered eating reveals the risks of automating human roles.
Abbie Harper, a hotline associate and member of the Helpline Associates United union, wrote in a blog post that implementing Tessa strips away the personal aspect of the support hotline, where many associates speak from their own experience. Applying chatbots to people in mental health crises without human supervision is especially dangerous.