Thanks to advances in technology, companies now have more ways than ever to amass troves of personal data on consumer behavior and to generate business in a more targeted fashion. But those same data mining practices are proving to be a growing source of compliance and enforcement risk.

In recent months, numerous companies have found themselves the target of legal and enforcement actions that easily could have been avoided. Many of these complaints, for example, stem from companies failing to obtain proper consent to collect personal data, overstepping the boundaries of that consent, or not giving individuals the ability to opt out.

“If companies simply complied with their own policies, if they gave full disclosure or obtained full consent, a lot of these actions couldn’t be brought,” says Mark Eisen, an associate with law firm Sheppard Mullin. Consistently updating terms of use and privacy policies to ensure you’re abiding by your own terms can mitigate a lot of the legal liability companies are facing, he says.

Few statutes create causes of action over privacy rights concerning data mining practices, and so “over the last couple of years, the most common type of lawsuit that we’ve seen in the privacy area that has been successful has been your run-of-the-mill breach of contract type claims,” Eisen says. Many of these complaints allege that companies collected personal data when they said they wouldn’t, or did so without any terms of use or privacy policy in place, he says.

Technology giant Google is battling a lawsuit right now, for example, over breach of contract allegations. That case, Svenson v. Google, alleges that Google failed to honor its own privacy policy governing its Google Wallet electronic payment service by giving users’ personal information to third-party app developers. In April, a California federal judge rejected Google’s attempt to dismiss the lawsuit, which was first filed by Alice Svenson, a Google Wallet user, in 2013.

Collecting and sharing personal data without the knowledge or consent of users is also a risky practice for companies, as demonstrated by a class action unfolding right now against Facebook, Twitter, Instagram, Apple, and 14 other technology and social networking companies. According to the lawsuit, the companies collected and shared personal data from Apple’s iPhones and iPads without the knowledge or consent of users.

In March 2015, U.S. District Judge Jon Tigar for the Northern District of California ruled that the class action could proceed. He rejected the app developers’ arguments that Apple device users knew the “Find Friends” feature on the apps would scan their personal contacts, ruling that the users didn’t know the information would be used in unauthorized ways. The original lawsuit was filed in 2012 against app developer Path over allegations that its photo sharing and messaging app was accessing users’ contacts and calendar information without consent.

One important way companies can reduce the risk of liability is to ensure that consumers are fully aware of what personally identifiable information is being collected, why it’s being collected, how it’s being used, and with whom it is shared. “The more transparency there is, the better,” says Francoise Gilbert, managing director of the IT Law Group.

A survey conducted last year by the Global Privacy Enforcement Network (GPEN) found that mobile apps still have a lot of work to do in this area. According to GPEN’s report, 85 percent of the 1,200 mobile apps surveyed failed to clearly explain how they were collecting, using, and disclosing personal information.

Another important consideration is “privacy by design,” says Gilbert. When developing a new website or mobile app, privacy protections should be built in at the design and development stage rather than bolted on as an afterthought at product launch, she says. (The concept is very much in vogue with privacy regulators in Europe.)
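
To make the idea concrete, here is a minimal sketch, in Python, of what a privacy-by-design default could look like in practice: data collection is opt-in per category, each category records the purpose disclosed to the user, and nothing is stored without explicit consent. The class names, the category, and the consent model are hypothetical illustrations, not any particular company’s implementation.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative only: the category names, field names, and consent model are
# assumptions made for this sketch, not any company's actual implementation.

@dataclass
class DataCategory:
    name: str       # e.g. "contacts"
    purpose: str    # why the data is collected (disclosed to the user)
    shared_with: list = field(default_factory=list)  # third parties, if any

@dataclass
class ConsentRecord:
    granted: set = field(default_factory=set)  # categories the user opted in to

    def allow(self, category: str) -> None:
        self.granted.add(category)

    def permits(self, category: str) -> bool:
        return category in self.granted

def collect(consent: ConsentRecord, category: DataCategory, value: str) -> Optional[dict]:
    """Collect a data point only if the user has opted in to this category."""
    if not consent.permits(category.name):
        return None  # privacy-by-design default: collect nothing
    return {"category": category.name, "purpose": category.purpose, "value": value}

if __name__ == "__main__":
    contacts = DataCategory(name="contacts", purpose="find friends you already know")
    consent = ConsentRecord()

    # Before an explicit opt-in, nothing is collected.
    assert collect(consent, contacts, "alice@example.com") is None

    # After the user opts in, collection is limited to the disclosed purpose.
    consent.allow("contacts")
    print(collect(consent, contacts, "alice@example.com"))
```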


FTC Action

In another recent action, Nomi Technologies in April became the first retail tracking company to reach a settlement with the Federal Trade Commission for failing to live up to promises made in its privacy policy. Nomi is a third-party provider of “beacon” technology, which lets retailers track customers inside a store and deliver promotions and offers in real time.

According to the FTC’s complaint, Nomi’s privacy policy promised that it would provide an opt-out mechanism for consumers at retailers using its services, when in reality no such option was made available. Furthermore, consumers were not informed when tracking was taking place, the complaint alleged.
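
An opt-out promise of this kind only protects consumers if it is enforced at the point where tracking data is recorded. The hypothetical sketch below, which is not a description of Nomi’s actual system, shows one way a beacon provider could consult an opt-out registry before storing any sighting; the registry, hashing scheme, and function names are assumptions made for illustration.

```python
import hashlib

# Illustrative only: the registry, hashing scheme, and function names are
# assumptions for this sketch, not a description of any vendor's system.

OPT_OUT_REGISTRY: set = set()  # hashed identifiers of devices that opted out

def _hash_device(mac_address: str) -> str:
    """Hash the device identifier so the registry never stores raw MACs."""
    return hashlib.sha256(mac_address.encode("utf-8")).hexdigest()

def register_opt_out(mac_address: str) -> None:
    """Called when a consumer uses the opt-out mechanism promised in the policy."""
    OPT_OUT_REGISTRY.add(_hash_device(mac_address))

def record_sighting(mac_address: str, store_id: str, log: list) -> bool:
    """Record a beacon sighting only if the device has not opted out."""
    if _hash_device(mac_address) in OPT_OUT_REGISTRY:
        return False  # honor the opt-out: no tracking event is stored
    log.append({"device": _hash_device(mac_address), "store": store_id})
    return True

if __name__ == "__main__":
    sightings: list = []
    register_opt_out("AA:BB:CC:DD:EE:FF")
    assert record_sighting("AA:BB:CC:DD:EE:FF", "store-42", sightings) is False
    assert record_sighting("11:22:33:44:55:66", "store-42", sightings) is True
    print(sightings)
```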

As part of the settlement, Nomi is barred from misrepresenting the options that consumers have to exercise control over information that Nomi collects, uses, discloses, or shares about them or their devices. The company may be subject to civil penalties if it violates any of these prohibitions.

BREACH OF CONTRACT CLAIM

Below is a partial excerpt from Svenson v. Google on the court’s decision to allow the breach of contract claim to move forward.
Plaintiff does not dispute that some users agree to the Google Wallet Terms of Service when initially registering for Wallet. However, she alleges that even after initially registering for Wallet, users are required to enter into a new and separate ‘Buyer Contract’ in order to complete each subsequent App purchase. According to Svenson, a ‘Buyer Contract’ is not complete until the user making the purchase in the Play Store clicks a button that (1) indicates consent to the Google Wallet Terms of Service existing at the time of the purchase and (2) authorizes Google to execute the transaction.

It is not clear from the [first amended complaint] whether Svenson initially registered for Wallet and then later purchased the App or whether she registered for Wallet concurrently with purchasing the App. The court concludes that this lack of clarity makes little difference, as under Svenson’s theory even a user who already has a Wallet account must enter into a new ‘Buyer Contract’ at the time of each subsequent App purchase. Thus, the court does not view Svenson’s current allegations to be irreconcilable with her earlier allegations.

Moreover, to the extent that Google disputes the factual accuracy of the current allegations—that a separate ‘Buyer Contract’ is created each and every time a user purchases an App in the Play Store—resolution of that factual dispute is not appropriate on a motion to dismiss. Nothing in this order precludes Google from challenging Svenson’s allegations regarding ‘Buyer Contracts’ in a motion for summary judgment or at another appropriate point in the litigation. However, for pleading purposes, plaintiff adequately has alleged the existence of a ‘Buyer Contract’ between herself and Google.
Source: Svenson v. Google.

“This case is simply about ensuring that when companies promise consumers the ability to make choices, they follow through on those promises,” FTC Chairwoman Edith Ramirez said in a statement. “The order will also serve to deter other companies from making similar false promises and encourage them to periodically review the statements they make to consumers to ensure that they are accurate and up-to-date.”

Not all the Commissioners agreed with the decision. In a dissent, FTC Commissioner Maureen Ohlhausen argued that Nomi—a third-party contractor with no direct relationship with consumers—was unfairly held liable. Nomi offered consumers the opportunity to opt out on its website, she wrote, while Nomi’s retail clients offered consumers no way to opt out at all.

The Nomi action serves as a warning to other companies about the importance of backing up the claims made in their privacy policies or terms of use, “making sure you’re doing what you say you’re doing,” says Ross Buntrock, a partner with Arnall Golden Gregory. That applies no matter whether the policies are posted in an app, on a website, or elsewhere, he says.

Training and awareness are equally important. “Policies are useless unless you have organization-wide awareness and training on how to comply with them,” Buntrock says.

App developers think only about the technology. “They don’t think about the privacy issues,” Gilbert says. “So as a compliance officer, you have to teach everybody in the company—from the C-suite to the programmers—about privacy and what that means,” she says.

Kristoph Gustovich, director of services at Mitratech, advises that the business units—compliance, legal, IT, and privacy—need to work collaboratively “to come together to come up with a structured policy, standards, guidelines—ultimately the tools to make compliance around data protection as effective as possible.”

“The marketing and the business side are always going to look at legal and compliance as holding them back,” Eisen says. Running new methods of data collection by legal or compliance, however, is an easy way to ensure the business reduces its liability risk, he says. “Getting the right counsel before moving forward with data collection, and especially data sharing and data disclosure, will oftentimes save serious legal headaches.”