Italy’s data protection authority Garante earlier this month banned U.S.-based artificial intelligence (AI) chatbot creator Replika from processing the personal data of users in the country because of the risks the service posed to minors and vulnerable people.

Replika’s chatbot is marketed as an “AI companion who cares” because it offers users personalized avatars that talk and listen to them and help their emotional well-being. In a Feb. 3 notice, Garante criticized the app for failing to protect children, allowing inappropriate and sexually explicit responses to reach young users.

Replika has no age verification mechanism in place, nor any mechanism to block the app if a user declares they are underage. During account creation, said Garante, the platform merely requests a user’s name, email address, and gender; other basic checks are absent.

The regulator warned the idea underpinning the app’s appeal—that a “virtual friend” can improve users’ emotional well-being; help them understand their thoughts; and calm anxiety through stress management, socialization, and the search for love—means the AI is responding to, and potentially manipulating, the moods of people who might be vulnerable.

Garante said Replika is in breach of the General Data Protection Regulation (GDPR) because it does not comply with the law’s transparency requirements and processes personal data unlawfully. Replika’s parent company, Luka, has been threatened with either a fine of up to 20 million euros (U.S. $21.4 million) or 4 percent of its total worldwide annual turnover if it does not cease processing Italian citizens’ personal data while it remains noncompliant with the GDPR.

“Replika is a safe space for friendship and companionship,” the company said in an emailed statement. “We don’t offer sexual interactions and will never do so. We are constantly making changes to the app to improve interactions and conversations and to keep people feeling safe and supported.”

The case serves as an example of why tech companies must use “privacy by design” when launching new products and services.

The use of AI to automate decision-making and enhance a customer’s experience across a range of industry sectors is already being scrutinized by Europe’s data protection authorities. Violating children’s privacy has also proved to be a highly damaging enterprise.

Last September, Instagram was assessed a fine of €405 million (then-U.S. $401 million) by the Irish Data Protection Commission for failing to adequately protect teenage users’ data. TikTok has faced similar allegations; the company was penalized €750,000 (then-U.S. $883,000) by the Dutch Data Protection Authority in July 2021 and warned of a potential fine of 27 million pounds (U.S. $32.4 million) by the U.K. Information Commissioner’s Office in September.

“Fully incorporating approaches that protect users’ data, such as ‘privacy by design,’ still seems a long way off,” said Gijs Barends, co-founder of data specialists Dataprovider.com.

Instead, companies are so driven by the goal of technological innovation that they overlook the crucial question of whether their service complies with privacy laws.

“For tech companies,” said Angel Maldonado, chief executive of software company Empathy.co, “disruptive solutions are a necessity, whereas ethical products are viewed as ‘a nice to have.’” Further, he said, the companies don’t understand “data isn’t business-owned—it’s user-owned” and believe sanctions taken against businesses for GDPR noncompliance are “warning shots for others” rather than themselves.

Nigel Jones, co-founder of data tech firm the Privacy Compliance Hub, said the problem is a lot of tech companies are “too narrowly focused and not sufficiently incentivized to think about privacy.”

“Founders are focused on solving a problem they have identified at the expense of other problems that the solution may cause,” said Jones, while investors “care only about growth in users and revenue and see privacy and the GDPR as a brake on such growth. Neither founders nor investors see sufficient opportunity in doing privacy right—or risk in getting it wrong.”

He added the main problem is a lack of understanding of the basics of data privacy, “because a lot of companies simply don’t care enough to find out.”

Some experts believe that although the GDPR was designed to establish rules about how organizations should handle and protect data, the legislation and its sanctions are simply not enough to change corporate behavior.

“Even the biggest tech companies still put commercial gain above privacy, and fines don’t seem to be effective in forcing businesses to comply with data protection regulations,” said Michael Queenan, chief executive of data services consultancy Nephos Technologies. “… There should be a standard of formal age authentication for any social media platform as is mandatory with financial services applications. Email addresses and Facebook accounts should not be accepted as proof of age when they are so easy to manipulate.”

Queenan said he expects to see an increase in fines issued for data misuse.

“While fine avoidance should not be the only motivation for businesses to implement the correct data governance policies, it’s a step in the right direction,” he said.