The AI dooooooom thread

Started by Hamilcar, April 06, 2023, 12:44:43 PM


DGuller

Lots of powerful AI comes free; you just need the knowledge and the compute.  It's not like Google or OpenAI have proprietary algorithms for making naked pictures of underage girls.

Josquius

What are these free image AIs?
I've casually looked for them but never come across them.
There does seem to be a shit tonne of pay-for porn ones out there, though.
██████
██████
██████

Tonitrus

This is where political AI is going...  (NSFW due to language)


grumbler

Quote from: Tonitrus on July 19, 2023, 10:13:10 PM
This is where political AI is going...  (NSFW due to language)

(snip)

Comedy writers everywhere breathe a sigh of relief when they watch that.
The future is all around us, waiting, in moments of transition, to be born in moments of revelation. No one knows the shape of that future or where it will take us. We know only that it is always born in pain.   -G'Kar

Bayraktar!

Syt

"Slightly" biased article but still an interesting summary of the current conflict.

https://theintercept.com/2023/07/25/strike-hollywood-ai-disney-netflix/

Quote
AS ACTORS STRIKE FOR AI PROTECTIONS, NETFLIX LISTS $900,000 AI JOB

Rob Delaney said, "My melodious voice? My broad shoulders and dancer's undulating buttocks? I decide how those are used!"


AS HOLLYWOOD EXECUTIVES insist it is "just not realistic" to pay actors — 87 percent of whom earn less than $26,000 — more, they are spending lavishly on AI programs.

While entertainment firms like Disney have declined to go into specifics about the nature of their investments in artificial intelligence, job postings and financial disclosures reviewed by The Intercept reveal new details about the extent of these companies' embrace of the technology.

In one case, Netflix is offering as much as $900,000 for a single AI product manager.

Hollywood actors and writers unions are jointly striking this summer for the first time since 1960, calling for better wages and regulations on studios' use of artificial intelligence.

Just after the actors' strike was authorized, the Alliance of Motion Picture and Television Producers — the trade association representing the TV and film companies negotiating with the actors and writers unions — announced "a groundbreaking AI proposal that protects actors' digital likenesses for SAG-AFTRA members."

The offer prompted comparisons to an episode of the dystopian sci-fi TV series "Black Mirror," which depicted actress Salma Hayek locked in a Kafkaesque struggle with a studio which was using her scanned digital likeness against her will.

"So $900k/yr per soldier in their godless AI army when that amount of earnings could qualify thirty-five actors and their families for SAG-AFTRA health insurance is just ghoulish," actor Rob Delaney, who had a lead role in the "Black Mirror" episode, told The Intercept. "Having been poor and rich in this business, I can assure you there's enough money to go around; it's just about priorities."

Among the striking actors' demands are protections against their scanned likeness being manipulated by AI without adequate compensation for the actors.

"They propose that our background performers should be able to be scanned, get paid for one day's pay and their company should own that scan, their image, their likeness, and to be able to use it for the rest of eternity in any project they want with no consent and no compensation," Duncan Crabtree-Ireland, chief negotiator for the actors' union, SAG-AFTRA, said.

Entertainment writers, too, must contend with their work being replaced by AI programs like ChatGPT that are capable of generating text in response to queries. Writers represented by the Writers Guild of America have been on strike since May 7 demanding, among other things, labor safeguards against AI. John August, a screenwriter for films like "Big Fish" and "Charlie's Angels," explained that the WGA wants to make sure that "ChatGPT and its cousins can't be credited with writing a screenplay."

Protecting Actors' Likenesses

The daily rate for background actors can be around $200, per the SAG-AFTRA contract. A job posting by the company Realeyes offers slightly more than that: $300 for two hours of work "express[ing] different emotions" and "improvis[ing] brief scenes" to "train an AI database to better express human emotions."

Realeyes develops technology to measure attention and reactions by users to video content. While the posting doesn't mention work with streaming companies, a video on Realeyes's website prominently features the logos for Netflix and Hulu.

The posting is specially catered to attract striking workers, stressing that the gig is for "research" purposes and therefore "does not qualify as struck work": "Please note that this project does not intend to replace actors, but rather requires their expertise," Realeyes says, emphasizing multiple times that training AI to create "expressive avatars" skirts strike restrictions.

Experts question whether the boundary between research and commercial work is really so clear. "It's almost a guarantee that the use of this 'research,' when it gets commercialized, will be to build digital actors that replace humans," said Ben Zhao, professor of computer science at the University of Chicago. "The 'research' side of this is largely a red herring." He added, "Industry research goes into commercial products."

"This is the same bait-switch that LAION and OpenAI pulled years ago," Zhao said, referring to the Large-scale Artificial Intelligence Open Network, a German nonprofit that created the AI chatbot OpenAssistant; OpenAI is the nonprofit that created AI programs like ChatGPT and DALL-E. "Download everything on the internet and no worries about copyrights, because it's a nonprofit and research. The output of that becomes a public dataset, then commercial companies (who supported the nonprofit) then take it and say, 'Gee thanks! How convenient for our commercial products!'"

Netflix AI Manager

Netflix's posting for a $900,000-a-year AI product manager job makes clear that the AI goes beyond just the algorithms that determine what shows are recommended to users.

The listing points to AI's uses for content creation: "Artificial Intelligence is powering innovation in all areas of the business," including by helping them to "create great content." Netflix's AI product manager posting alludes to a sprawling effort by the business to embrace AI, referring to its "Machine Learning Platform" involving AI specialists "across Netflix." (Netflix did not immediately respond to a request for comment.)

A research section on Netflix's website describes its machine learning platform, noting that while it was historically used for things like recommendations, it is now being applied to content creation. "Historically, personalization has been the most well-known area, where machine learning powers our recommendation algorithms. We're also using machine learning to help shape our catalog of movies and TV shows by learning characteristics that make content successful. We use it to optimize the production of original movies and TV shows in Netflix's rapidly growing studio."

Netflix is already putting the AI technology to work. On July 6, the streaming service premiered a new Spanish reality dating series, "Deep Fake Love," in which scans of contestants' faces and bodies are used to create AI-generated "deepfake" simulations of themselves.

In another job posting, Netflix seeks a technical director for generative AI in its research and development tech lab for its gaming studio. (Video games often employ voice actors and writers.)

Generative AI is the type of AI that can produce text, images, and video from input data — a key component of original content creation but which can also be used for other purposes like advertising. Generative AI is distinct from older, more familiar AI models that provide things like algorithmic recommendations or genre tags.

"All those models are typically called discriminatory models or classifiers: They tell you what something is," Zhao explained. "They do not generate content like ChatGPT or image generator models."

"Generative models are the ones with the ethics problems," he said, explaining how classifiers are based on carefully using limited training data — such as a viewing history — to generate recommendations.

Netflix offers up to $650,000 for its generative AI technical director role.

Video game writers have expressed concerns about losing work to generative AI, with one major game developer, Ubisoft, saying that it is already using generative AI to write dialogue for nonplayer characters.

Netflix, for its part, advertises that one of its games, a narrative-driven adventure game called "Scriptic: Crime Stories," centered around crime stories, "uses generative AI to help tell them."

Disney's AI Operations

Disney has also listed job openings for AI-related positions. In one, the entertainment giant is looking for a senior AI engineer to "drive innovation across our cinematic pipelines and theatrical experiences." The posting mentions several big name Disney studios where AI is already playing a role, including Marvel, Walt Disney Animation, and Pixar.

In a recent earnings call, Disney CEO Bob Iger alluded to the challenges that the company would have in integrating AI into their current business model.

"In fact, we're already starting to use AI to create some efficiencies and ultimately to better serve consumers," Iger said, as recently reported by journalist Lee Fang. "But it's also clear that AI is going to be highly disruptive, and it could be extremely difficult to manage, particularly from an IP management perspective."

Iger added, "I can tell you that our legal team is working overtime already to try to come to grips with what could be some of the challenges here." Though Iger declined to go into specifics, Disney's Securities and Exchange Commission filings provide some clues.

"Rules governing new technological developments, such as developments in generative AI, remain unsettled, and these developments may affect aspects of our existing business model, including revenue streams for the use of our IP and how we create our entertainment products," the filing says.

While striking actors are seeking to protect their own IP from AI — among the union demands that Iger deemed "just not realistic" — so is Disney.

"It seems clear that the entertainment industry is willing to make massive investments in generative AI," Zhao said, "not just potentially hundreds of millions of dollars, but also valuable access to their intellectual property, so that AI models can be trained to replace human creatives like actors, writers, journalists for a tiny fraction of human wages."

For some actors, this is not a struggle against the sci-fi dystopia of AI itself, but just a bid for fair working conditions in their industry and control over their own likenesses, bodies, movements, and speech patterns.

"AI isn't bad, it's just that the workers (me) need to own and control the means of production!" said Delaney. "My melodious voice? My broad shoulders and dancer's undulating buttocks? I decide how those are used! Not a board of VC angel investor scumbags meeting in a Sun Valley conference room between niacin IV cocktails or whatever they do."
I am, somehow, less interested in the weight and convolutions of Einstein's brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops.
—Stephen Jay Gould

Proud owner of 42 Zoupa Points.

Iormlund

$900k/year is not exactly outlandish. I personally know at least two guys who are in that pay range, both doing AI work. One for Meta, one for Google. So there's bound to be a lot* more.

*Relatively speaking. Both guys are basically geniuses.

Tonitrus

AI keeps getting out of hand...




Jacob

Federal judge rules that work authored by AI cannot be copyrighted: https://www.businessinsider.com/ai-generated-art-cant-by-copyrighted-federal-judge-rules-2023-8

Interesting twist. We'll see how long it lasts.

It seems obvious to me that the major IP holding corporations are aiming for an environment in which they can use AI to generate content (at low cost), while they control distribution and marketing (making it harder for challengers to arise with new IP), and maintain the rights to as much of the IP as possible. It'll be interesting to see how the lobbying and legislation goes after this.

Valmy

Yeah I totally agree. AI art should not be copyrightable.

The whole idea of copyright is to incentivize art; letting AI art be copyrighted achieves the exact opposite of that purpose.
Quote"This is a Russian warship. I propose you lay down arms and surrender to avoid bloodshed & unnecessary victims. Otherwise, you'll be bombed."

Zmiinyi defenders: "Russian warship, go fuck yourself."

DGuller

I don't think it's as straightforward as it sounds.  There is a lot of art and science, at least as of now, to getting what you need out of AI.  It even gave rise to a whole new job called prompt engineering.  The output of the AI may not be something that you created, but figuring out the prompts to get it is.

Valmy

Quote from: DGuller on August 21, 2023, 10:20:35 PM
I don't think it's as straightforward as it sounds.  There is a lot of art and science, at least as of now, to getting what you need out of AI.  It even gave rise to a whole new job called prompt engineering.  The output of the AI may not be something that you created, but figuring out the prompts to get it is.

So what? You own a string of words for your whole life+75 years because you were the first to enter it into that AI? Copyright is already a necessary evil at best and abused to hell and back. It should be reduced and constrained, not rapidly expanded to some absurdity like this.
Quote"This is a Russian warship. I propose you lay down arms and surrender to avoid bloodshed & unnecessary victims. Otherwise, you'll be bombed."

Zmiinyi defenders: "Russian warship, go fuck yourself."

Syt

Quote from: Valmy on August 21, 2023, 11:15:53 PM
Quote from: DGuller on August 21, 2023, 10:20:35 PM
I don't think it's as straightforward as it sounds.  There is a lot of art and science, at least as of now, to getting what you need out of AI.  It even gave rise to a whole new job called prompt engineering.  The output of the AI may not be something that you created, but figuring out the prompts to get it is.

So what? You own a string of words for your whole life+75 years because you were the first to enter it into that AI? Copyright is already a necessary evil at best and abused to hell and back. It should be reduced and constrained, not rapidly expanded to some absurdity like this.

I don't think it's as easy as that, because the same string of words entered into different generative AIs, with different random seeds, will create vastly different results (rough example below), depending on, e.g., the content the model has been trained on (Adobe Photoshop's new generative AI, for instance, is trained on Adobe's stock images and public domain content).
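To illustrate the seed point, here's roughly what it looks like with the open source Stable Diffusion tooling (the diffusers library) - treat this as a sketch rather than anything definitive, since the model name and prompt are just placeholders I picked:

import torch
from diffusers import StableDiffusionPipeline

# Same prompt, three different seeds -> three noticeably different images.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompt = "portrait of a knight in a rainstorm, oil painting"
for seed in (1, 42, 1234):
    generator = torch.Generator("cuda").manual_seed(seed)
    image = pipe(prompt, generator=generator).images[0]
    image.save(f"knight_seed_{seed}.png")

And if you swap in a model trained on a different corpus, the same prompt drifts even further from what another tool would give you.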

When it comes to imagery, I think it gets more complicated - are you generating images with likenesses of real people? Generating images of a movie with a different cast is one thing, but what about creating images of celebrities (or people you know personally) committing crimes or sex acts?

Are you generating content with copyrighted assets (e.g. Star Wars characters)? If you generate something new, how much of the final image contains anything that might be considered copyrighted by someone else that the AI drew from? And if it does contain recognizable material, does this count as transformative work? And, on a more philosophical level, how different is it from conventional artists drawing on their knowledge of pop culture, classical art and the real world when creating new works (except that an AI can obviously draw - in theory - from a much bigger pool of content)?

Having dabbled with Midjourney, DALL-E and Adobe PS in recent weeks, I can say there's certainly some skill (or trial and error) required to generate the images you want. Current generative models can deliver impressive results, but where it usually breaks down is once you get very detailed in your instructions or want to create overly complex scenes (unless you use a lot of inpainting, i.e. making corrections/additions to parts of the generated image via additional AI prompts - see the sketch below).
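For anyone who hasn't tried it, inpainting in code terms looks something like this (again just my own rough sketch using the diffusers library; the prompt and file names are placeholders):

import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

# Regenerate only the masked region of an existing image with a new prompt.
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
).to("cuda")

base = Image.open("knight_seed_42.png").convert("RGB")  # image from the earlier sketch
mask = Image.open("hand_mask.png").convert("RGB")       # white = region to redo

fixed = pipe(
    prompt="a gauntleted hand gripping a sword hilt",
    image=base,
    mask_image=mask,
).images[0]
fixed.save("knight_fixed.png")

The rest of the image stays untouched; only the masked area gets regenerated, which is how people patch up hands, faces and other detail problems.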

That said, there seem to be plenty of artists out there who generate an image via AI and then use it as a basis for further refinement/transformation in PS - I feel they should not lose out on their copyright.

The whole area is very wild west and very loosey-goosey at the moment. It will settle down eventually, I'd presume, but for now I would not assume that any AI-generated creative work should be copyrighted, just to err on the side of caution - there's just too much derivative, generic and very similar content being churned out at the moment to apply the "old rules", IMHO.
I am, somehow, less interested in the weight and convolutions of Einstein's brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops.
—Stephen Jay Gould

Proud owner of 42 Zoupa Points.

Sheilbh

Quote from: Jacob on August 21, 2023, 07:19:30 PM
Federal judge rules that work authored by AI cannot be copyrighted: https://www.businessinsider.com/ai-generated-art-cant-by-copyrighted-federal-judge-rules-2023-8

Interesting twist. We'll see how long it lasts.

It seems obvious to me that the major IP holding corporations are aiming for an environment in which they can use AI to generate content (at low cost), while they control distribution and marketing (making it harder for challengers to arise with new IP), and maintain the rights to as much of the IP as possible. It'll be interesting to see how the lobbying and legislation goes after this.
It's been about 5-6 years, but I went to a session of IP lawyers on this point (from an English law perspective) and there wasn't really much of a conclusion.

From memory, I think their main options were that IP in the output of an AI would be owned by whoever developed the AI (from a T&Cs perspective - I think that's true of most open AIs at the minute), whoever did the prompts to get that output (in a work context this would likely mean their company) or, potentially, in some way the AI itself (that is, the IP gets bundled with the AI in some way).

I don't think it's clear. My instinct is that, from a public policy perspective, the more open we are on the use of AI, the lower the IP protection should be for its output; and vice versa, if the use is constrained and heavily regulated, then IP is more protected (though probably not under current IP rules). Basically, options for companies to benefit either from AI or from the artificial monopoly rights of IP law. Not sure how you'd do it, but that's my instinct.

Of course, working at a publisher and aware that every gen AI out there is, as far as we can tell, built by massively hoovering up and using IP-protected work without paying anyone, I have limited sympathy for the IP risks of the output. Although this is another reason adoption might be low in newsrooms for a while - if we don't clearly own our content and can't license it out, it carries a big commercial risk.
Let's bomb Russia!

Syt

FWIW, the relevant part of Midjourney's ToS:

https://docs.midjourney.com/docs/terms-of-service

Quote
4. Copyright and Trademark
In this section, Paid Member shall refer to a Customer who has subscribed to a paying plan.

Rights You give to Midjourney
By using the Services, You grant to Midjourney, its successors, and assigns a perpetual, worldwide, non-exclusive, sublicensable no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute text, and image prompts You input into the Services, or Assets produced by the service at Your direction. This license survives termination of this Agreement by any party, for any reason.

Your Rights
Subject to the above license, You own all Assets You create with the Services, provided they were created in accordance with this Agreement. This excludes upscaling the images of others, which images remain owned by the original Asset creators. Midjourney makes no representations or warranties with respect to the current law that might apply to You. Please consult Your own lawyer if You want more information about the state of current law in Your jurisdiction. Your ownership of the Assets you created persists even if in subsequent months You downgrade or cancel Your membership. However, You do not own the Assets if You fall under the exceptions below.

If You are an employee or owner of a company with more than $1,000,000 USD a year in gross revenue and You are using the Services on behalf of Your employer, You must purchase a "Pro" or "Mega" membership for every individual accessing the Services on Your behalf in order to own Assets You create. If You are not sure whether Your use qualifies as on behalf of Your employer, please assume it does.

If You are not a Paid Member, You don't own the Assets You create. Instead, Midjourney grants You a license to the Assets under the Creative Commons Noncommercial 4.0 Attribution International License (the "Asset License").
The full text is accessible as of the Effective Date here: https://creativecommons.org/licenses/by-nc/4.0/legalcode.

Please note: Midjourney is an open community which allows others to use and remix Your images and prompts whenever they are posted in a public setting. By default, Your images are publically viewable and remixable. As described above, You grant Midjourney a license to allow this. If You purchase a "Pro" or "Mega" plan, You may bypass some of these public sharing defaults.

If You purchased the Stealth feature as part of Your "Pro" or "Mega" subscription or through the previously available add-on, we agree to make best efforts not to publish any Assets You make in any situation where you have engaged stealth mode in the Services.

Please be aware that any image You make in a shared or open space such as a Discord chatroom, is viewable by anyone in that chatroom, regardless of whether Stealth mode is engaged.
I am, somehow, less interested in the weight and convolutions of Einstein's brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops.
—Stephen Jay Gould

Proud owner of 42 Zoupa Points.

Valmy

Quote from: Syt on August 22, 2023, 01:15:26 AM
That said, there seem to be plenty of artists out there who generate an image via AI and then use it as a basis for further refinement/transformation in PS - I feel they should not lose out on their copyright.

Well, that's different from straight up copyrighting whatever the AI spits out, isn't it?

But it kind of feels like me taking art assets from BG3, doing some stuff to them, and then claiming them as mine.

The point of copyright is to encourage original artwork, not to encourage the mass production of computer-generated derivative crap.
Quote"This is a Russian warship. I propose you lay down arms and surrender to avoid bloodshed & unnecessary victims. Otherwise, you'll be bombed."

Zmiinyi defenders: "Russian warship, go fuck yourself."