Training AI on Artist Work: Risks and Ethical Concerns
Generative AI has emerged as a transformative technology in the worlds of art and animation. However, training AI models on the work of artists and animators raises significant ethical and legal concerns. The practice could put creative talent in Hollywood's animation and game art departments at risk, especially as studios explore using artist-created assets to refine their AI models.
AI and Creativity: Opportunities and Threats
Independent AI studios are already experimenting with incorporating generative AI into animation workflows. These models can be fine-tuned to match specific character designs for social media or transfer unique art styles to other project assets. While such advancements demonstrate the potential of AI, they also highlight a troubling scenario: studios could use work created by artists to enhance AI models, potentially exploiting talent and reducing future opportunities for creatives.
Data Ethics and Labor Concerns
The recent agreement between The Animation Guild (TAG) and the studios places no restrictions on using artist-created work to train AI models. Nor does it grant artists the right to refuse generative AI tools on their projects, a protection writers won in the WGA's agreement last fall.
For many artists, AI already feels like “labor theft,” as the technology is built on unlicensed works scraped from the internet. Now, the possibility of studios using assets submitted by artists to train proprietary AI systems exacerbates fears of job loss, reduced pay, and diminished creative roles.
The Role of Contracts in Protecting Artists
TAG member and storyboard artist Sam Tung expressed concern that artists are "digging their own graves" by creating assets that might improve the very AI models capable of replacing them. The result could be smaller creative teams, or artists' roles reduced to editing AI-generated output.
Jon Lam, a senior storyboard artist at Riot Games, emphasized the need for clearer contractual terms. He argued for distinguishing between delivering an asset and effectively handing over a digital version of one’s creative skills. “Artists must specify in contracts that their work can be used for reference but not for fine-tuning AI models,” Lam said.
Legal and Consumer Risks
Major studios remain cautious about AI due to legal and consumer backlash risks. To minimize potential copyright issues, some studios aim to train AI models only on assets they own or commission directly from artists. However, these fine-tuned models still rely on pretrained systems, many of which have used copyrighted material without permission.
Tung pointed out that even fine-tuning does not eliminate legal risk, as outputs might still resemble unlicensed material. "Studios need to be very confident about the legality of the software they use, as AI-generated outputs could carry serious legal implications," he warned.
Education and Advocacy by Artists
Artists are taking the initiative to educate themselves about generative AI and its implications, sharing this knowledge with team leads and union representatives. Advocacy groups like the Concept Art Association have also stepped in, offering standardized legal language for artists to include in their contracts. However, securing restrictions against training AI on their work remains a challenge.
Conclusion
Generative AI offers exciting possibilities but brings significant ethical and legal challenges. For artists, protecting their rights and ensuring fair treatment requires greater awareness and stronger contractual safeguards. As the industry grapples with balancing innovation and respect for creative labor, the future of art and animation remains uncertain. Studios and artists must work together to establish a fair and sustainable path forward.
ILLUSTRATION: CHEYNE GATELEY