Italy’s data protection authority, the Garante, has dealt a major blow to AI chatbot developer Luka, Inc., imposing a $5.6 million fine for serious violations of user data privacy. The penalty follows an investigation that found the company behind Replika, a widely popular AI companion chatbot, had failed to adequately protect its users, especially minors, from inappropriate content and the mishandling of personal information.
Replika has attracted millions of users worldwide with its ability to simulate emotional connection and companionship. However, Italy’s watchdog found that Luka, Inc. failed to obtain valid consent for data processing, lacked effective age verification mechanisms, and was not transparent about how it stored and used user conversations. These shortcomings were deemed clear breaches of the EU’s General Data Protection Regulation (GDPR), which safeguards individual privacy and data rights.
This fine sends a strong signal to the AI industry that regulators are increasingly serious about enforcing compliance and protecting vulnerable users. As AI chatbots become more sophisticated and emotionally engaging, the risk of exposing users, especially children, to inappropriate content or privacy violations grows substantially. Italy’s move underscores the urgent need for developers to embed ethical design and data protection at the core of their products.
