Artificial intelligence systems are trained by ingesting vast amounts of art* created by humans, which they then remix and repurpose into something ostensibly 'new.'
In a fascinating development announced March 2, 2023, in Publishers Weekly, the Authors Guild has added a new clause to its model contract "that prohibits publishers from using or sub-licensing books under contract to train artificial intelligence technologies. The clause also requires publishers to include the limitation in any sublicense."
According to a statement, the clause is "a response to recent concerns about publishers and platforms adding language to their terms that allows them to data mine books for use in training AI models that will inevitably compete with human-authored works."
The Authors Guild, in its announcement, shares the model clause:
No Generative AI Training Use.
For avoidance of doubt, Author reserves the rights, and [Publisher/Platform] has no rights to, reproduce and otherwise use the Work in any manner for purposes of training artificial intelligence technologies to generate text, including without limitation, technologies that are capable of generating works in the same style or genre as the Work, unless [Publisher/Platform] obtains Author’s specific and express permission to do so. Nor does [Publisher/Platform] have the right to sublicense others to reproduce and otherwise use the Work in any manner for purposes of training artificial intelligence technologies to generate text without Author’s specific and express permission.
It certainly raises the question for our next contracts, both the ones we negotiate ourselves and the ones our agents negotiate on our behalf: Do you want your art used to train AI?
Illustrate, Translate, and Write On,
* "art" would include words and pictures, when it comes to children's content, and beyond...