Equity Warns of Mass Action Over AI Use of Actors' Images
What Happened
British performers' union Equity is warning of large-scale direct action if actors' images and likenesses are used in AI-generated content without their explicit consent. The union raised the alarm following increasing reports of performers' likenesses appearing in synthetic media and deepfakes for commercials and productions. Equity is demanding stronger industry regulation and legal measures to protect actors against unauthorized exploitation by artificial intelligence technologies. The warning comes amid growing concern in the UK creative sector about AI disrupting traditional roles and rights.
Why It Matters
The dispute highlights an urgent need for clear legal safeguards as AI rapidly transforms the entertainment industry. The outcome could redefine how performers' rights are protected as automation and machine learning reshape creative work.