A recent flap over a controversial use of Big Data analysis techniques by Facebook has once again spurred calls for the more ethical use of data-gathering tools by companies.

Earlier this month, Facebook drew fire for an “experiment” that studied how users' news feeds could be manipulated to affect their moods. The so-called “emotional contagion experiment” involved nearly 700,000 users, all of whom, according to Facebook, agreed to the privacy parameters set forth in its Terms of Use agreement that allow such tinkering.

With an apology, Facebook insisted it had considered the ethics of conducting the project. “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” Adam Kramer, one of the Facebook researchers involved in the study, said in a statement.

The research and the reaction to it, however, illustrate a dilemma companies have faced since the early years of the Internet: How do we use the powerful data-gathering abilities that the online environment affords without trampling on the privacy of customers and others? “Just because we can do something, doesn't mean we should do it,” warns Deborah Johnson, a professor of applied ethics in the School of Engineering and Applied Sciences at the University of Virginia.

Unfortunately, many companies on the cutting edge of social media and Big Data give in to the rationale that, “if the law doesn't tell me I can't, why shouldn't I?” says Johnson. “The way the information world has developed has been a free-for-all.”

The episode may prove to be a tipping point for the consideration of ethics in Big Data. Moving forward, companies will need to give greater consideration to ethics when finding new and creative ways to collect and parse data. Facebook itself, through Kramer's apology, acknowledged as much. “While we've always considered what research we do carefully, we have been working on improving our internal review practices,” he said. “The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we've learned from the reaction to this paper.”

The power of Big Data to piece together clues that reveal more about the personal lives of customers and potential customers should not be treated cavalierly. Many companies have responded by creating privacy committees and installing privacy officers, measures that help insulate the brand from negative repercussions and associations.

Developing a Process

“Companies have to, at least, take the ethical perceptions of what they are doing into account in the short term,” says Neil Richards, professor of law at Washington University School of Law and co-author of a research paper, “Big Data Ethics.” “If they do something perceived as outrageous, they are going to suffer a short-term business hit. In the long term, developing a process for ethical data usage is essential to ensure productive and profitable relationships with customers, users, and business partners.”

Companies will need to give greater consideration to what they plan to do with the data, even before they begin collecting it. “We are starting to realize that, when it comes to data, the era of digital strip mining is over,” Richards adds. “We can't just, as companies, exploit for the immediate short-term gain. Things we do in the short term have long-term consequences.”

Increasingly, companies will likely find that they need an internal arbiter of what is not just legal when it comes to data, but what is ethical. “Organizations are realizing data ethics are not going away,” says Kord Davis, a digital strategist, business consultant, and author of the book “Ethics of Big Data.”

“The question, however, is who should be in charge of parsing those ethical quandaries? One of the first places companies may turn to is the compliance function,” he adds. “Organizations already have compliance capabilities, legal capabilities, and program managers capable of taking new enterprise initiatives and developing programs around them.”

“Just because we can do something, doesn't mean we should do something ... The way the information world has developed has been a free-for-all.”

—Deborah Johnson, Professor of Applied Ethics, University of Virginia

While compliance may be “a fine place to start,” Davis says companies need to dig deeper. “We are on the cusp of organizations realizing what skill sets and business processes they need to develop,” he says. “This can be formed by compliance, but ethical data handling is not just a compliance issue, and organizations are starting to realize that.”

Leading the Ethical Discussion

Companies will also need to embrace values-based management. “There should always be somebody outside the system who is observing, validating, and analyzing how the system is working and whether it was doing what it was intended to do,” Davis suggests. “Organizations are going to realize this idea of having a '10th man' puts them in a position to do that internal review, analysis, and reporting.”

It may be easier said than done, however. “One of the big challenges I've seen is that organizations just don't know how to have ethical discussions in the context of business,” he says. “Why is ethics so hard? It is a loaded word. It makes people uncomfortable, and it implies that you and your values are going to be judged.” Those fears, however, can quickly dissipate when a company commits to having ethical discussions. “At a minimum, if you just create a space for the explicit conversations, you are going to be in a better position,” he says.

A SENATOR'S QUESTION TO THE FTC

The following is from a letter written by U.S. Sen. Mark Warner (D-Va.) that asked the Federal Trade Commission to provide more information on recent reports that the social network Facebook conducted an experiment involving nearly 700,000 users to study the emotional effect of manipulating information on their News Feeds.

While Facebook may not have been legally required to conduct an independent ethical review of this behavioral research, the experiment invites questions about whether procedures should be in place to govern this type of research.

I am not convinced that additional federal regulation is the answer. Public concerns may be more appropriately addressed through industry self-regulation. As the federal regulator with oversight of privacy and consumer protection policies, I would be interested in your responses to the following questions:

Does the FTC have a role to play in improving transparency, accountability, and consumer trust in industry's use of Big Data?

Are there better ways to educate consumers, or otherwise improve transparency, about the practices consumers agree to through their use of social media platforms? Are there incentives in place for companies to voluntarily create or consult with independent review boards, or to utilize other means of self-regulation before conducting studies such as this? Additionally, are there incentives that could encourage the hiring or designation of Chief Privacy Officers at social media companies, or to establish other credible review programs?

Does the FTC make any distinction between passively observing user data versus actively manipulating it? Should consumers be provided more of an explicit opt-in or opt-out of such studies? Additionally, is it appropriate for any research findings to be shared with participants prior to public dissemination?

Does the FTC or another federal entity require any additional regulatory authority or technology in order to monitor this type of data-mining?

Source: Sen. Mark Warner.

“All the headlines out there talk about how data is the new currency,” says Dave Deasy, a vice president at TRUSTe, a data privacy management company. “Yes, but the old currency is trust. Companies built their brand on this idea of building a trusted relationship with their customers. Yet, you make a couple of missteps with regard to how you are collecting data, and you wipe out years of brand trust you built up over time.”

While companies focus on data from a legal compliance perspective, at least initially, the “ultimate driver” needs to be making sure they can continue to have a trusted relationship with their customers. “Companies are in an unprecedented place in terms of their ability to do creative things from a marketing perspective, but at the same time that creates lots of challenges, and it is only going to get harder,” Deasy says.

Deasy's advice for companies is to start with privacy and transparency as a cornerstone for all business decisions. “It is all about letting people know what data you are collecting, what you are doing with that data, and giving them the ability to control it,” he says.

“Step one is making sure the company understands where all the data is being collected and conducting a data audit,” he suggests. The second step is putting internal procedures and guidelines in place around who gets access to data and what they can do with it. Next, there must be a “big focus on training” so those procedures and policies are communicated throughout the company.

Companies also need to carefully vet and audit any third parties that gain access to customer data. “Sometimes they may not know exactly who those third parties are and how they got there,” Deasy says. “It can be a complex thing for a company to understand all that tracking activity and be able to manage it.”

“One of the fundamental focus areas of compliance is having proper vendor management procedures in place,” he adds. “These third parties are not as easy to figure out and, in a lot of cases, there are fourth parties that can have access to your Website through other third parties. If you don't have the right tools to see that, and manage it, there can be unintended consequences.”