Scarlett Johansson Accuses OpenAI of Copying Her Voice for GPT-4o Assistant

High

Scarlett Johansson accused OpenAI of copying her voice for GPT-4o's 'Sky' assistant without permission, leading to legal threats and OpenAI pausing the voice feature.

Category
Copyright Violation
Industry
Technology
Status
Resolved
Date Occurred
May 13, 2024
Date Reported
May 20, 2024
Jurisdiction
US
AI Provider
OpenAI
Model
GPT-4o
Application Type
Chatbot
Harm Type
Reputational
People Affected
1
Human Review in Place
Unknown
Litigation Filed
No
voice cloning, personality rights, celebrity likeness, OpenAI, GPT-4o, intellectual property, AI ethics, consent

Full Description

On May 13, 2024, OpenAI launched GPT-4o with advanced voice capabilities, featuring five distinct voice options including one called 'Sky.' The launch garnered significant attention for the natural, conversational quality of the AI assistant. However, many observers immediately noted the striking similarity between the Sky voice and actress Scarlett Johansson's voice, particularly her portrayal of an AI assistant in the 2013 film 'Her.'

The controversy intensified when OpenAI CEO Sam Altman posted the single word 'her' on the social media platform X (formerly Twitter) on the day of the GPT-4o launch, an apparent reference to Johansson's role in the Spike Jonze film. The post was widely interpreted as an intentional nod to the voice similarity, suggesting deliberate mimicry rather than coincidental resemblance.

On May 20, 2024, Johansson released a detailed statement through her legal representatives revealing that OpenAI had approached her in September 2023 to request permission to license her voice for its AI assistant, an offer she declined for personal reasons. According to the statement, OpenAI contacted her agent just two days before the GPT-4o launch to ask her to reconsider; she again declined.

Johansson's legal team demanded that OpenAI cease using any voice that resembled hers and disclose how the Sky voice was created. The actress expressed shock and anger at what she perceived as unauthorized use of her vocal likeness, stating that the resemblance was so close that her friends and family could not distinguish her voice from Sky's.

In response to the legal pressure, OpenAI initially maintained that Sky's voice was not intended to resemble Johansson's and had been performed by a different professional voice actress hired before any contact with Johansson. On May 20, 2024, however, the company announced it would pause the Sky voice 'out of respect for Ms. Johansson' while it worked to address her concerns. OpenAI also published a blog post detailing its voice selection process, stating that it had worked with professional voice actors and that any resemblance to Johansson was unintentional.

Root Cause

OpenAI allegedly created an AI voice assistant that closely resembled Scarlett Johansson's voice without her consent, despite her having previously declined the company's licensing request. CEO Sam Altman's one-word social media post of 'her' on launch day further suggested that the resemblance was intentional rather than coincidental.

Mitigation Analysis

This incident could have been prevented through robust voice-rights clearance protocols requiring explicit written consent before developing or releasing a voice model. Voice-similarity detection could have flagged potential unauthorized voice replication, and legal review of both voice casting decisions and marketing communications could have surfaced the risk before launch.
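To illustrate the voice-similarity check suggested above, the sketch below compares a candidate voice against a registry of protected voices using cosine similarity over speaker embeddings. Everything here is hypothetical: the embeddings are random placeholder vectors (real systems derive them from a trained speaker-verification model), and the names and 0.85 threshold are illustrative assumptions, not any process OpenAI is known to use.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two speaker-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_similar_voices(candidate: np.ndarray,
                        protected: dict[str, np.ndarray],
                        threshold: float = 0.85) -> list[str]:
    """Return the names of protected voices whose embedding similarity
    to the candidate meets or exceeds the (hypothetical) threshold."""
    return [name for name, emb in protected.items()
            if cosine_similarity(candidate, emb) >= threshold]

# Hypothetical 4-dimensional embeddings for illustration only; production
# speaker-verification embeddings typically have hundreds of dimensions.
rng = np.random.default_rng(0)
protected = {
    "actor_a": rng.normal(size=4),
    "actor_b": rng.normal(size=4),
}
# A candidate voice that is a slightly perturbed copy of actor_a's voice.
candidate = protected["actor_a"] + rng.normal(scale=0.05, size=4)
print(flag_similar_voices(candidate, protected))
```

In a real clearance pipeline, any flagged match would route the candidate voice to legal review before release rather than blocking it automatically.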

Lessons Learned

The incident highlights the critical importance of personality rights in AI development and the legal risks of creating AI voices that resemble celebrities without explicit consent. It demonstrates how social media communications can undermine claims of unintentional similarity and emphasizes the need for clear voice rights protocols in AI development.