Gannett Pauses AI Sports Articles After Viral Errors and Nonsensical Content
Severity
Medium
Gannett paused its AI sports article generation in August 2023 after LedeAI produced viral errors including repeated phrases and nonsensical game descriptions, forcing the media company to temporarily halt automated journalism efforts.
Category
Hallucination
Industry
Media
Status
Resolved
Date Occurred
Aug 1, 2023
Date Reported
Aug 29, 2023
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
other
Harm Type
reputational
Human Review in Place
No
Litigation Filed
No
journalism, sports, content_generation, viral_errors, media, editorial_oversight, LedeAI, automation
Full Description
In August 2023, Gannett, the parent company of USA Today and numerous local newspapers, was forced to suspend its automated high school sports coverage after AI-generated articles containing significant errors went viral on social media. The company had been using LedeAI, an artificial intelligence platform designed to generate sports articles from game statistics and basic data inputs, to cover high school sports events across its network of local publications.
Readers began noticing and sharing examples of bizarre AI-generated content, including repetitive phrases, nonsensical game descriptions, and factually incorrect information about high school sports events. The errors were particularly noticeable in articles covering football and other fall sports, where the AI system appeared to struggle with contextual understanding of game flow and player statistics. Social media users quickly amplified the mistakes, turning them into viral examples of AI journalism failure.
The incident highlighted the challenges media companies face when implementing AI automation without sufficient human oversight. Gannett had been expanding its use of AI-generated content as part of cost-cutting measures and efforts to maintain coverage of local sports despite reduced newsroom staff. However, the viral nature of the errors created significant reputational damage for the company's journalism brand.
Following the public attention and criticism, Gannett announced it would pause the use of AI for sports article generation while reviewing and improving its automated content systems. The company acknowledged the need for better quality control and editorial oversight of AI-generated content. This incident became a cautionary tale for other media organizations considering similar AI implementations without adequate human review processes.
The suspension represented a significant step back for Gannett's AI journalism initiatives and highlighted the ongoing tension between cost-saving automation and maintaining editorial quality standards. The incident also demonstrated how quickly AI content errors can spread on social media, amplifying reputational damage beyond the original audience of local newspaper readers.
Root Cause
LedeAI's natural language generation system produced repetitive and factually inaccurate content when processing high school sports data, likely due to insufficient training data, rigid templating, or inadequate content validation before publication.
Mitigation Analysis
This incident could have been prevented through mandatory human editorial review before publication, automated content quality checks to detect repetitive phrases, and better training data for sports-specific language models. Real-time monitoring for reader complaints and engagement metrics could have flagged problematic content earlier.
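One of the simplest safeguards mentioned above, an automated check for repeated phrases, can be sketched in a few lines. The snippet below is a hypothetical illustration (not LedeAI's or Gannett's actual pipeline): it counts repeated 4-word phrases in a draft article and flags it for human review before auto-publication. The function names and thresholds are assumptions chosen for the example.

```python
from collections import Counter


def repeated_phrases(text: str, n: int = 4, threshold: int = 2) -> list[tuple[str, int]]:
    """Return n-word phrases appearing more than `threshold` times.

    Repetition like the viral "a close encounter of the athletic kind"
    articles would surface here as a high-count n-gram.
    """
    words = text.lower().split()
    ngrams = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    counts = Counter(ngrams)
    return [(phrase, c) for phrase, c in counts.items() if c > threshold]


def needs_review(text: str) -> bool:
    """True if the draft should be routed to a human editor."""
    return bool(repeated_phrases(text))
```

A check like this catches only one failure mode (verbatim repetition); factual errors about scores or teams would still require validation against the source game data or a human editor.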
Lessons Learned
Media organizations must implement robust human oversight and quality control systems before deploying AI for content generation, as viral spread of AI errors can cause disproportionate reputational damage. The incident demonstrates the critical importance of editorial review processes and the risks of prioritizing cost reduction over content quality in journalism.
Sources
Gannett experiments with AI-generated sports stories, draws criticism
Nieman Journalism Lab · Aug 29, 2023 · news
AI-Generated Sports Articles Are Getting Mocked for Hilarious Errors
Futurism · Aug 30, 2023 · news