Creative Artists Agency (CAA), one of the leading entertainment and sports talent agencies, hopes to be at the forefront of AI protection services for celebrities in Hollywood.
With many stars seeing their digital likenesses used without permission, CAA has created a virtual media storage system for A-list talent — actors, athletes, comedians, directors, musicians and more — to store their digital assets, such as their names, images, digital scans, voice recordings and so on. The new development is part of ‘theCAAvault’, the company’s studio where actors record their bodies, faces, movements and voices using scanning technology to create AI clones.
CAA has partnered with AI technology company Veritone to provide its digital asset management solution, the company announced earlier this week.
The announcement comes amid a wave of AI deepfakes of celebrities, often created without their consent. Tom Hanks, a famous actor and a client on the CAA roster, fell victim to an AI scam seven months ago. He claimed a company used an AI-generated video of him to promote an unlicensed dental plan.
“Over the past two years or so, there has been massive misuse of our clients’ names, images, likenesses and voices without consent, without credit and without proper compensation. It’s very clear that the law is not currently in place to protect them, and so we’re seeing a lot of open lawsuits out there right now,” said Alexandra Shannon, CAA’s head of strategic development.
A significant amount of personal data is necessary to create digital clones, which raises numerous privacy concerns due to the risk of sensitive information being compromised or misused. CAA clients can now store their AI digital doubles and other assets in a secure personal hub within theCAAvault that only authorized users can access, allowing them to share and monetize their content as they see fit.
“This enables us to start setting precedents for what consensual AI use looks like,” Shannon told TechCrunch. “Honestly, our view was that the law would take time to catch up, and so with talent creating and owning their digital likenesses in [theCAAvault]… there is now a legitimate way for companies to work with one of our clients. If a third party chooses not to work with them in the right way, it becomes much easier to prove in legal cases that their rights have been violated, and that helps protect clients over time.”
Notably, the vault also ensures that actors and other talent are fairly compensated when companies use their digital likenesses.
“All these assets belong to the individual client, so it is very much up to them whether they want to grant access to anyone else… It is also up to the talent to decide the right business model for each opportunity. This is a new space, and it is very much still taking shape. We believe these assets will increase in value and opportunity over time. This shouldn’t be a cheaper way to work with someone… We see [AI clones] as an improvement rather than a cost savings,” Shannon added.
CAA also represents Ariana Grande, Beyoncé, Reese Witherspoon, Steven Spielberg and Zendaya, among others.
The use of AI cloning has sparked a lot of debate in Hollywood, with some believing it could lead to fewer job opportunities as studios may choose digital clones over real actors. That was a major point of contention during the 2023 SAG-AFTRA strike, which ended in November after members approved a new agreement with the AMPTP (Alliance of Motion Picture and Television Producers) recognizing the importance of human performers and including guidelines on how ‘digital replicas’ may be used.
There are also concerns about the unauthorized use of AI clones of deceased celebrities, which can be upsetting to family members. For example, Robin Williams’ daughter expressed her disdain for an AI-generated voice recording of the star. However, some argue that when done ethically, it can be an emotional way to preserve an iconic actor and recreate their performances in future works for all generations to enjoy.
“Artificial intelligence clones are an effective tool that allows legacies to live on in future generations. CAA takes a consent- and permission-based approach to all AI applications and will only work with estates that own and have permission to use such assets. It is up to the artists to decide who they wish to grant ownership and licensing rights to after their death,” Shannon noted.
Shannon declined to share which of CAA’s clients currently store their AI clones in the vault, but she said it is only a select few at the moment. CAA also charges clients a fee to join the vault, though she would not say exactly how much it costs.
“The ultimate goal will be to make this available to all our clients and anyone in the industry. It’s not cheap, but over time, the cost will continue to come down,” she added.