When news broke last year that OpenAI and Axel Springer had reached a financial agreement and partnership, it seemed to portend harmony between the people who write words and the tech companies that use them to build and train AI models. OpenAI had also reached an agreement with the AP by that point, for reference.
Then, as the year ended, The New York Times sued OpenAI and backer Microsoft, alleging that the AI company's models were "created by copying and using millions of copyrighted Times stories, extensive research, opinions, reviews, how-to guides and more." The result of what the Times considers "illegal use of [its] work to create artificial intelligence products," the suit argues, is that OpenAI's tools "can produce output that recites Times content verbatim, closely summarizes it, and mimics its expressive style, as demonstrated by scores of examples."
The Times added in its lawsuit that it “objected after discovering that the defendants were using Times content without permission to develop their models and tools” and that “negotiations have not resulted in a resolution” with OpenAI.
How to balance respect for copyright against continued development of artificial intelligence is a question that will not be answered quickly. But the agreements, and the more lasting disagreements, between creators and the AI companies that want to consume their work to build models make for an uncomfortable moment for both sides of the conflict. Tech companies are busy shipping new AI models, trained on data that includes copyrighted material, inside their software products; Microsoft is a leader in this work, it's worth noting. And media companies that have spent enormously over time to create a body of reported and otherwise crafted material are incensed that their efforts are being fed into machines that give nothing back to the folks who provided their training data.