Businesses that make false or unsubstantiated claims regarding facial recognition and other biometric technologies could face enforcement from the Federal Trade Commission (FTC), the agency warned Thursday.

Biometric information technologies are increasingly being used to collect personal information about people, including their age, gender, race and heritage, and even their overall demeanor, the FTC said in a 12-page policy statement.

This collection poses new privacy risks for consumers, who might be unaware their data is being gathered and have no way to avoid the practice, said Samuel Levine, director of the FTC’s Bureau of Consumer Protection, in a press release.

“In recent years, biometric surveillance has grown more sophisticated and pervasive, posing new threats to privacy and civil rights,” Levine said. “Today’s policy statement makes clear that companies must comply with the law regardless of the technology they are using.”

Biometric technologies “could reveal sensitive personal information about [users],” including details of their health and their religious, political, or union affiliations, the FTC said. Consumers can be harmed by the secret collection of their data and its subsequent use for marketing purposes, the agency added.

The policy statement, approved in a 3-0 vote, should serve as a warning and a reminder to companies that FTC rules apply to biometric data, the agency said.

Practices the FTC will scrutinize include false or unsubstantiated marketing claims about biometric technologies and uses of biometric data that might result in unfairness to consumers.

Businesses must conduct risk assessments before launching any new technology that collects information from the public, the FTC advised. Companies must promptly address any issues observed and use third-party audits and other means to monitor for risks going forward, the agency said.

In addition, businesses that collect biometric data must secure that data to reduce the chance of it being accessed by unauthorized parties. Biometric datasets might be “attractive targets for malicious actors who could misuse such information,” the FTC said.

Companies that collect biometric information are responsible for evaluating the risks introduced by vendors and other third parties who have access to the data.

Relying solely on contractual agreements with third parties about their handling of biometric information is not enough, the FTC noted. The agency said companies “should also go beyond contractual measures to oversee third parties and ensure they are … not putting consumers at risk” through supervision, auditing, and/or monitoring.

The FTC cited enforcement actions in recent years against Everalbum and Facebook for alleged misrepresentations regarding facial recognition technology. The agency issued a report in 2012 regarding facial recognition technology best practices for protecting consumers’ privacy.

Also on Thursday, the FTC announced proposed changes to its Health Breach Notification Rule to clarify the rule’s applicability to health apps and similar technologies. The proposal followed a case earlier in the week in which ovulation tracking app Premom was accused of sharing users’ sensitive personal information with third parties for advertising purposes without consent.

Easy Healthcare, the operator of Premom, was fined $100,000 as part of the FTC’s proposed order.