The U.K.’s data regulator has launched an investigation into the use of facial recognition technology at London’s King’s Cross, an area of the capital which is home to one of the city’s busiest railway stations.

Last Friday, Information Commissioner Elizabeth Denham said she was “deeply concerned” about why the technology was being used and whether its use was legal.

“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding,” said Denham in a statement.

“I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used,” she added.

Argent, the developer that manages the King’s Cross site, was approached for comment but did not respond before publication.

Facial recognition technology is a “priority area” for the Information Commissioner’s Office (ICO), and the regulator has said it will not hesitate to use its investigative and enforcement powers to protect people’s legal rights.

In the statement, the ICO said it will seek detailed information about how the technology is used, as well as inspect the system and its operation on-site to assess whether or not it complies with data protection law.

“Put simply, any organisations wanting to use facial recognition technology must comply with the law—and they must do so in a fair, transparent and accountable way,” said Denham. “They must have documented how and why they believe their use of the technology is legal, proportionate and justified.”

“We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights,” she added.

London’s Metropolitan Police has carried out 10 trials of facial recognition technology across the capital as part of efforts to incorporate the latest technologies into day-to-day policing. In May, an ethics panel provided guidelines to the Mayor of London, Sadiq Khan, on how the technology could be used appropriately to reduce crime.

Also in May, however, the same police force revealed that the ICO was investigating its potentially illegal use of the technology, which was also found to produce incorrect matches in 81 percent of cases.

Civil liberties campaign group Big Brother Watch says major property developers, shopping centers, museums, conference centers, and casinos already use the technology in the United Kingdom, and that it is also used to scan people attending concerts and even gallery exhibitions.

Last year, the Trafford Centre—Manchester’s main shopping complex—was forced to stop using live facial recognition surveillance following an intervention by the Surveillance Camera Commissioner, the government office that regulates CCTV usage. It was estimated that up to 15 million people had been scanned.