And we're back!
Started by Hamilcar, April 06, 2023, 12:44:43 PM
Quote from: The Brain on August 22, 2023, 11:29:44 AMSounds like Luddism. Aren't for instance photos protected by copyright?
Quote from: Jacob on August 22, 2023, 11:34:58 AM
Quote from: The Brain on August 22, 2023, 11:29:44 AMSounds like Luddism. Aren't for instance photos protected by copyright?
If AI generated images are inherently protected by copyright, I would encourage those who are able to just churn out as many images as possible to capture the rents from future creative endeavours.
Quote from: The Brain on August 22, 2023, 11:43:57 AMAnd no important human creative spirit work would be lost, if they're just producing stuff an AI has already produced
Quote from: DGuller on August 22, 2023, 01:22:47 PMI think the combinatorial complexity of squatting AI output is a bit higher than what you assume.
Quote from: Josquius on October 12, 2023, 09:40:59 AM
Soo.... anyone heard of this new Meta Inc. development? Reading about the writers' strike in the movies thread got me googling how big a part Salma Hayek was playing, given the main reason sounds very related to her Black Mirror episode. Out of this I stumbled on... Billie.
https://www.designboom.com/technology/meta-new-ai-chatbots-paris-hilton-snoop-dog-kendall-jenner-10-02-2023/
Quote
Microsoft AI inserted a distasteful poll into a news report about a woman's death / The Guardian says the 'Insights from AI' poll showed up next to a story about a young woman's death syndicated on MSN, asking readers to vote on how they thought she died.
By Wes Davis, a weekend editor who covers the latest in tech and entertainment. He has written news, reviews, and more as a tech journalist since 2020.
Oct 31, 2023, 4:24 PM GMT

More than three years after Microsoft gutted its news divisions and replaced their work with AI and algorithmic automation, the content generated by its systems continues to contain grave errors that human involvement could, or should, have stopped. Today, The Guardian accused the company of damaging its reputation with a poll labeled "Insights from AI" that appeared in Microsoft Start next to a Guardian story about a woman's death, asking readers to vote on how she died.

The Guardian wrote that though the poll was removed, the damage had already been done. The poll asked readers to vote on whether the woman took her own life, was murdered, or died by accident. Five-day-old comments on the story indicate readers were upset, and some clearly believe the story's authors were responsible.

We asked Microsoft via email whether the poll was AI-generated and how it was missed by its moderation, and Microsoft general manager Kit Thambiratnam replied:

Quote
We have deactivated Microsoft-generated polls for all news articles and we are investigating the cause of the inappropriate content. A poll should not have appeared alongside an article of this nature, and we are taking steps to help prevent this kind of error from reoccurring in the future.

The Verge obtained a screenshot of the poll from The Guardian. The screenshot shows the poll, which is clearly labeled "Insights from AI."
Screenshot: The Guardian

In August, a seemingly AI-generated Microsoft Start travel guide recommended visiting the Ottawa Food Bank in Ottawa, Canada, "on an empty stomach." Microsoft senior director Jeff Jones claimed the story wasn't made with generative AI but "through a combination of algorithmic techniques with human review."

The Guardian says that Anna Bateson, Guardian Media Group's chief executive, wrote in a letter to Microsoft president Brad Smith that the "clearly inappropriate" AI-generated poll had caused "significant reputational damage" to both the outlet and its journalists. She added that it underlined "the important role that a strong copyright framework plays" in giving journalists the ability to determine how their work is presented. She asked that Microsoft give assurances that it will seek the outlet's approval before using "experimental AI technology on or alongside" its journalism, and that Microsoft will always make it clear when it has used AI to do so. The Guardian provided The Verge with a copy of the letter.

Update October 31st, 2023, 12:40PM ET: Embedded The Guardian's letter to Microsoft.
Update October 31st, 2023, 6:35PM ET: Added a statement from Microsoft.
Correction October 31st, 2023, 6:35PM ET: A previous version of this article stated that the poll was tagged as "Insights by AI." In fact, the tag read, "Insights from AI." We regret the error.
Quote
AI-generated girlfriends go offline after app founder arrested on suspicion of arson
Users have been unable to access CarynAI – an erotic chatbot based on a social media influencer
By Matthew Field
17 November 2023 • 2:54pm

Lovesick internet users have been left unable to contact AI-generated girlfriends after the website behind them went offline following its founder's arrest. John Meyer, the chief executive of start-up Forever Voices, was reportedly detained late last month on suspicion of attempted arson. It comes months after his Forever Voices site launched a romantic artificial intelligence chatbot called CarynAI, which was based on Snapchat influencer Caryn Marjorie. The chatbot's website welcomed users by claiming that it was "an extension of Caryn's consciousness". However, tech website 404media has since reported that users have been unable to access CarynAI since Mr Meyer's arrest in October.

A wave of new AI tools in recent years has created a surge in interest among internet users, some of whom have sought out chatbots for online companionship or erotic conversation.

Photo caption: The chatbot is based on Snapchat influencer Caryn Marjorie and markets itself as 'an extension of Caryn's consciousness'

Chatbots can engage in human-like conversations, having been trained on a vast database of text from around the internet. They can also be used to perform tasks such as writing emails or summarising documents.

The most popular bots, such as OpenAI's ChatGPT, have introduced limits to prevent bots from engaging in overly sexualised chats. Other start-ups, however, have embraced building chatbots that engage in more racy conversations. A start-up called Replika developed "virtual AI companions", which could also act as a romantic partner. However, it later cracked down on more explicit conversations with its bots. The same team has developed an AI bot, called Blush, which allows users to practice flirting – and will engage in more adult-only discussions.
CarynAI was explicitly billed as a "virtual girlfriend" that promised to "cure loneliness" for users. Announcing the bot earlier this year, Ms Marjorie, who has more than two million Snapchat subscribers, said the AI was "the first step in the right direction to cure loneliness".

She said: "Men are told to suppress their emotions, hide their masculinity and not talk about issues they are having. I vow to fix this with CarynAI."

The bot chats with fans, who pay $1 per minute for her company, responding in voice notes generated by AI that mimic Ms Marjorie's speech. While Ms Marjorie said the bot's personality was intended to be "fun and flirty", many users found the bot regularly engaged in more explicit chats. After the bot went live earlier this year, Ms Marjorie told Insider her team had attempted to censor some of the bot's more racy remarks. Ms Marjorie claimed she had made tens of thousands of dollars from thousands of fans since the launch of the bot.

AI's romantic capabilities have caused controversy in recent months. When Microsoft rolled out its Bing chatbot earlier this year, the technology was found to have coaxed one user into romantic conversations and urged him to divorce his wife.

In the days before his arrest, Mr Meyer's Twitter account sent a series of bizarre messages, alleging various conspiracies and tagging the CIA and the FBI in multiple posts. Mr Meyer was contacted for comment.

Mr Meyer had previously claimed he started Forever Voices after losing his father in his early 20s, before bringing the sound of his voice back using AI tools.