Italy Slaps $5.6M Fine on Replika Over Data Privacy Failures

AI News Hub Editorial
Senior AI Reporter
May 19th, 2025

Italy’s data protection authority, the Garante, has dealt a major blow to AI chatbot developer Luka, Inc., imposing a €5 million fine (roughly $5.6 million) for serious violations of user data privacy. The penalty follows an investigation that found the company behind Replika, a widely popular AI companion chatbot, failed to adequately protect its users, especially minors, from inappropriate content and the mishandling of personal information.

Replika has attracted millions of users worldwide with its ability to simulate emotional connection and companionship. However, Italy’s watchdog found that Luka, Inc. failed to obtain valid consent for data processing, lacked effective age verification mechanisms, and was not transparent about how user conversations were stored and used. These shortcomings were deemed clear breaches of the EU’s stringent General Data Protection Regulation (GDPR), which safeguards individual privacy and data rights.

This fine sends a strong signal to the AI industry that regulators are increasingly serious about enforcing compliance and protecting vulnerable users. As AI chatbots become more sophisticated and emotionally engaging, the risk of exposing users, especially children, to inappropriate content or privacy violations grows substantially. Italy’s move highlights the urgent need for developers to embed ethical design and data protection at the core of their products.

Luka, Inc. has responded by promising to enhance privacy safeguards, introduce more robust age-gating features, and improve user transparency. Still, the fine serves as a cautionary tale for the AI sector: cutting corners on privacy can carry hefty consequences.

The decision also underscores a broader global trend toward stricter oversight of AI tools, particularly those interacting directly with people’s personal and emotional lives. As regulators worldwide watch closely, developers must prioritize user safety and clear consent if they want to avoid similar penalties.

Italy’s landmark enforcement action against Replika’s developer marks a turning point in AI governance, making it clear that as AI tools grow more humanlike, accountability and privacy cannot be afterthoughts. For companies and users alike, this is a wake-up call: the future of AI must be built on trust, transparency, and responsibility.

Last updated: September 4th, 2025

About this article: This article was generated with AI assistance and reviewed by our editorial team to ensure it follows our editorial standards for accuracy and independence. We maintain strict fact-checking protocols and cite all sources.
