AI-Generated Voice Robocalls Impersonating Biden Before New Hampshire Primary
Severity
High
Political consultant used AI voice cloning to create robocalls impersonating President Biden, telling New Hampshire voters not to participate in the 2024 Democratic primary. The FCC imposed a $6 million fine and declared AI-generated voice robocalls illegal.
Category
Deepfake / Fraud
Industry
Government
Status
Resolved
Date Occurred
Jan 21, 2024
Date Reported
Jan 22, 2024
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
other
Harm Type
legal
People Affected
5,000
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
settled
Regulatory Body
Federal Communications Commission (FCC)
Fine Amount
$6,000,000
Tags
deepfake, voter_suppression, election_interference, robocalls, AI_voice_cloning, FCC_enforcement, political_fraud
Full Description
On January 21, 2024, approximately 5,000 New Hampshire voters received robocalls featuring what appeared to be President Biden's voice discouraging them from voting in the upcoming Democratic primary election. The AI-generated audio mimicked Biden's distinctive speech patterns and cadence, telling recipients to 'save your vote for the November election' and falsely suggesting that voting in the primary would prevent them from participating in the general election.
The robocalls were orchestrated by political consultant Steve Kramer, who was working for presidential candidate Dean Phillips at the time. Kramer hired New Orleans-based magician Paul Carpenter to create the deepfake audio using AI voice cloning technology. The calls were distributed through Texas-based telecommunications company Lingo Telecom to registered Democratic voters in New Hampshire.
New Hampshire Attorney General John Formella quickly launched an investigation after reports surfaced on social media and local news outlets. The state's election officials and Democratic Party leadership immediately denounced the calls as an illegal voter suppression attempt. The incident occurred just two days before the January 23 New Hampshire primary, raising concerns about the potential impact on voter turnout and election integrity.
The Federal Communications Commission responded swiftly, opening an enforcement action and ultimately proposing a record $6 million fine against Kramer in August 2024. The FCC also used the incident to set new precedent, ruling that AI-generated voices count as 'artificial' voices under the Telephone Consumer Protection Act and are therefore illegal in robocalls without prior consent. This marked the first major federal enforcement action specifically targeting the use of AI for election interference.
The incident highlighted the growing threat of AI-generated content in political campaigns and elections. Dean Phillips' campaign immediately distanced itself from Kramer and condemned the robocalls, stating it had no knowledge of the operation. The case prompted discussions about the need for stronger regulations governing AI use in political communications and the challenge of detecting sophisticated deepfake content in real time.
In addition to the federal fine, Kramer faced potential state criminal charges in New Hampshire for voter suppression. The incident contributed to broader legislative efforts to regulate AI-generated content in political advertising and strengthen penalties for election-related deepfake fraud.
Root Cause
A political consultant used AI voice cloning technology to create convincing deepfake audio of President Biden's voice, then distributed it via an automated robocall system to suppress turnout in the New Hampshire Democratic primary.
Mitigation Analysis
This incident could have been prevented through mandatory provenance tracking of AI-generated content, real-time deepfake detection at telecommunications carriers, and stricter verification requirements for political robocalls. Enhanced caller ID authentication (e.g., rigorous STIR/SHAKEN attestation by originating carriers) and AI content labeling requirements would help voters identify synthetic media.
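The caller ID authentication mentioned above is, in practice, the STIR/SHAKEN framework, in which the originating carrier signs each call with a PASSporT token (a JWT) carrying an attestation level. The sketch below, with a token constructed locally for illustration (the phone numbers and the `screen_call` policy are hypothetical), shows how a downstream carrier might inspect the attestation claim; a real verifier must also validate the token's signature against the carrier's certificate per RFC 8225/8588:

```python
import base64
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs do."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def decode_passport_payload(token: str) -> dict:
    """Decode the claims segment of a STIR/SHAKEN PASSporT (JWT).
    Signature verification against the carrier's certificate is
    omitted here; a real verifier must perform it (RFC 8225/8588)."""
    payload_b64 = token.split(".")[1]
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

def screen_call(token: str) -> str:
    """Hypothetical carrier-side policy: deliver fully attested calls,
    flag everything else for review."""
    attest = decode_passport_payload(token).get("attest", "C")
    # "A": carrier vouches for both the customer and the number;
    # "B": carrier knows the customer but not the number;
    # "C": gateway attestation only (least trustworthy).
    return "deliver" if attest == "A" else "flag-for-review"

# Token constructed locally for illustration; phone numbers are made up.
header = b64url(json.dumps(
    {"alg": "ES256", "typ": "passport", "ppt": "shaken"}).encode())
claims = b64url(json.dumps(
    {"attest": "C",
     "orig": {"tn": "12025550123"},
     "dest": {"tn": ["16035550199"]},
     "iat": 1705795200}).encode())
token = f"{header}.{claims}.signature-placeholder"

print(screen_call(token))  # flag-for-review
```

Attestation screening of this kind is directly relevant here: the FCC's parallel action against Lingo Telecom faulted the carrier for applying the highest ("A") attestation level to these calls without verifying the caller.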
Litigation Outcome
Political consultant Steve Kramer agreed to pay a $6 million fine to the FCC for orchestrating the robocall campaign.
Lessons Learned
This incident demonstrates the urgent need for regulatory frameworks specifically addressing AI-generated content in political communications, as existing election laws may not adequately cover sophisticated deepfake technology. It also highlights the importance of rapid response mechanisms for detecting and countering AI-enabled disinformation campaigns during critical election periods.
Sources
FCC proposes $6 million fine for consultant behind fake Biden robocalls in New Hampshire
CNN · Aug 8, 2024 · news
Fake Biden robocall tells New Hampshire voters to skip primary
The Washington Post · Jan 22, 2024 · news
FCC Proposes $6M Fine for AI-Generated Biden Robocalls
Federal Communications Commission · Aug 8, 2024 · regulatory action