Companies that use Big Data analytics and want to avoid unintentionally running afoul of employment discrimination laws will want to carefully review a new report issued this month by the Federal Trade Commission.
The FTC’s 50-page Big Data report, released on Jan. 6, warns companies about the ethical, legal, and compliance risks they could encounter when using data analytics practices that fly in the face of consumer protection and equal opportunity laws. The report also poses a series of questions for companies to consider when using Big Data to mitigate these risks.
In remarks this month at the FTC’s PrivacyCon conference, FTC Commissioner Julie Brill noted that data analytics “have developed more quickly than have frameworks for specific concrete guidance on legal and ethical issues.” The report marks the “first step toward providing such guidance,” she said.
Big Data refers to the collection of massive troves of consumer information by companies, which they can then analyze to reveal consumer and employee patterns and trends. “The analysis of this data is often valuable to companies and to consumers, as it can guide the development of new products and services, predict the preferences of individuals, help tailor services and opportunities, and guide individualized marketing,” the FTC stressed in its report.
Because no single law comprehensively addresses data analytics, however, companies may wrongly assume they can use it without regulatory constraint. The FTC report therefore aims to alert companies to existing consumer protection laws that already apply to data analytics practices. The FTC further warned in its report that it will “continue to monitor areas where Big Data practices could violate existing laws … and will bring enforcement actions where appropriate.”
“Make sure you’re not just looking at data without considering other things that have to go into the mix before you make an employment decision.”
Joseph Lazzaroti, Principal & Co-Leader, Privacy and Data Security Practice, Jackson Lewis
Below is a discussion of those relevant laws and how companies can mitigate legal and compliance risks.
Fair Credit Reporting Act
The Fair Credit Reporting Act (FCRA) applies to consumer reporting agencies (CRAs) that compile and sell consumer reports containing consumer information intended to be used for credit, employment, insurance, housing, or other eligibility considerations. The risk posed by Big Data stems from an emerging trend whereby companies purchase predictive analytics products to make eligibility determinations.
“If an unaffiliated firm regularly evaluates companies’ own data and provides the evaluations to the companies for eligibility determinations, the unaffiliated firm would likely be acting as a CRA, each company would likely be a user of consumer reports, and all of these entities would be subject to Commission enforcement under the FCRA,” the FTC said.
Consider the following example: A company asks a consumer to provide his zip code and shopping behavior on an application, strips any identifying information, and sends the application to a data analytics firm. That firm analyzes the creditworthiness of other consumers in the same zip code and provides that analysis to the company, knowing it will be used to decide the consumer’s credit eligibility.
QUESTIONS FOR LEGAL COMPLIANCE
Below is an excerpt from the FTC’s report, “Big Data: A Tool for Inclusion or Exclusion,” which offers the following considerations for companies already using or considering engaging in Big Data analytics:
If you compile Big Data for others who will use it for eligibility decisions (such as credit, employment, insurance, housing, government benefits, and the like), are you complying with the accuracy and privacy provisions of the FCRA? These include requirements to (1) have reasonable procedures in place to ensure the maximum possible accuracy of the information you provide, (2) provide notices to users of your reports, (3) allow consumers to access information you have about them, and (4) allow consumers to correct inaccuracies.
If you receive Big Data products from another entity that you will use for eligibility decisions, are you complying with the provisions applicable to users of consumer reports? For example, the FCRA requires that entities that use this information for employment purposes certify that they have a “permissible purpose” to obtain it, certify that they will not use it in a way that violates equal opportunity laws, provide pre-adverse action notice to consumers, and thereafter provide adverse action notices to those same consumers.
If you are a creditor using big data analytics in a credit transaction, are you complying with the requirement to provide statements of specific reasons for adverse action under ECOA? Are you complying with ECOA requirements related to requests for information and record retention?
If you use big data analytics in a way that might adversely affect people in their ability to obtain credit, housing, or employment: Are you treating people differently based on a prohibited basis, such as race or national origin? Do your policies, practices, or decisions have an adverse effect or impact on a member of a protected class, and if they do, are they justified by a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact?
Are you honoring promises you make to consumers and providing consumers material information about your data practices?
Are you maintaining reasonable security over consumer data?
Are you undertaking reasonable measures to know the purposes for which your customers are using your data?
- If you know that your customer will use your big data products to commit fraud, do not sell your products to that customer. If you have reason to believe that your data will be used to commit fraud, ask more specific questions about how your data will be used.
- If you know that your customer will use your big data products for discriminatory purposes, do not sell your products to that customer. If you have reason to believe that your data will be used for discriminatory purposes, ask more specific questions about how your data will be used.
The FTC said that FCRA requirements and protections could be triggered any time a company makes a credit decision based on a consumer report. Therefore, it is important from a legal and compliance standpoint to conduct a fact-specific analysis as to whether the intent of a Big Data initiative or product is to make an eligibility decision about the consumer. In contrast, the FCRA does not apply when companies use data derived from their own relationship with their customers for purposes of making decisions about them, the FTC said.
Equal Employment Laws
When engaging in Big Data analytics, companies also need to consider numerous federal equal opportunity laws, including Title VII of the Civil Rights Act, the Age Discrimination in Employment Act, the Americans with Disabilities Act, and the Genetic Information Nondiscrimination Act. Generally speaking, companies can be found in violation of these laws if their practices suggest “disparate treatment” or a “disparate impact” on protected characteristics—such as race, gender, religion, age, disability status, national origin, and more.
In an employment context, companies can realize significant benefits through the use of data analytics, but they must weigh those benefits against other considerations, such as potential disability and workers’ compensation issues, says Joseph Lazzaroti, principal and co-leader of the privacy, e-communication, and data security practice at Jackson Lewis. “Make sure you’re not just looking at data without considering other things that have to go into the mix before you make an employment decision,” he says.
Federal Trade Commission Act
Section 5 of the Federal Trade Commission Act prohibits unfair or deceptive acts or practices in or affecting commerce. Unlike the FCRA or equal opportunity laws, Section 5 is not confined to particular market sectors, but is generally applicable to most companies acting in commerce.
Section 5 defines a “deceptive” act or practice as one that involves a material statement or omission that is likely to mislead a consumer acting reasonably under the circumstances. An act or practice is “unfair” if it causes or is likely to cause substantial consumer injury that consumers cannot reasonably avoid and that is not outweighed by countervailing benefits to consumers or competition.
According to the FTC, companies engaging in Big Data analytics should consider whether they are violating any material promises to consumers—whether that promise is to refrain from sharing data with third parties, to provide consumers with choices about sharing, or to safeguard consumers’ personal information—or whether they have failed to disclose material information to consumers.
Lazzaroti says companies often get into trouble when they collect data for one purpose, keep it for several years, and then begin using it for a different purpose, without necessarily intending to be deceptive. Being transparent about how the data is used mitigates this risk.
Legal and compliance officers should consider the following important questions: “What kinds of promises are you making to consumers? What kinds of things are you doing with data that you’re not telling consumers about?” says Reed Freeman, a partner and co-chair of the cyber-security, privacy, and communications practice at WilmerHale.
In addition, companies that maintain Big Data on consumers should reasonably secure consumers’ data. The FTC also directs companies not to sell Big Data analytics products to customers if they know or have reason to know that those customers will use the products for fraudulent or discriminatory purposes. “The FTC is focused not only on the collection of the data itself, but more so that it’s collected under the presumption that people know how it’s going to be used,” says Lazzaroti.
To mitigate the risks of Big Data usage, the FTC recommends that companies consider the following questions:
How representative is your data set? Companies should consider whether their data sets are missing information about certain populations and take steps to address issues of underrepresentation and overrepresentation.
Does your data model account for biases? Companies should consider whether biases are being incorporated at both the collection and analytics stages of Big Data’s life cycle, and they should develop strategies to overcome them.
How accurate are your predictions based on Big Data? Companies should remember that while Big Data is very good at detecting correlations, it does not explain which correlations are meaningful.
Does your reliance on Big Data raise ethical or fairness concerns? Companies should assess the factors that go into an analytics model and balance the predictive value of the model with fairness considerations.
“As Big Data applications become more widespread and cost-efficient, employers may feel the need to use it to remain competitive,” concludes Lazzaroti. “They will need to proceed cautiously, however, and understand the technology, the data collected, and whether the correlations work and work ethically.”