
On January 6, 2016, the Federal Trade Commission (FTC) released a
report on the growing use of "big data" that discusses potential
benefits and risks of big data use and offers practical and legal
considerations for businesses. The report, Big Data: A Tool for
Inclusion or Exclusion? Understanding the Issues, focuses on the
potential impact of big data on low-income and underserved
populations and addresses a host of considerations related to how
such data is used. The report follows the FTC's public workshop, "Big
Data: A Tool for Inclusion or Exclusion," held on September 15, 2014.

The report begins by describing big data and the typical life cycle
phases involved, and then explores possible benefits and risks to big
data use. The report concludes by identifying potentially applicable
laws and offering legal and compliance considerations for businesses
using big data. Notably, the report expressly states that it was "not
intended to identify legal or policy gaps," but rather to guide
companies on existing laws that may apply to big data practices. That
focus suggests that the Commission believes existing laws provide for
meaningful regulation of big data and that the FTC plans to use its
existing authority to address big data practices it deems problematic.

Commissioner Ohlhausen issued a separate statement indicating that
she supported the report. She acknowledged that concerns about the
potential effects of inaccurate data are legitimate, but noted that
businesses have strong incentives to ensure accuracy and that free-
market competition may inherently resolve the identified issues.

What is Big Data and How is it Used?

The report explains that the term "big data" does not have a singular
definition, but refers to "a confluence of factors, including the nearly
ubiquitous collection of consumer data from a variety of sources, the
plummeting cost of data storage, and powerful new capabilities to
analyze data to draw connections and make inferences and
predictions." Big data is often characterized by reference to the "three
Vs": volume (the sheer amount of data that can now be collected and
analyzed); velocity (the speed at which industry can collect it); and
variety (the breadth and diversity of data).

In synthesizing discussions from the workshop and the sixty-five
public comments submitted, the FTC divided the lifecycle of big data
into four phases: (1) collection; (2) compilation and consolidation;
(3) data mining and analytics; and (4) use. The Commission's May 2014
report, Data Brokers: A Call for Transparency and Accountability,
focused on the first three phases of the lifecycle, while this report
focuses on uses of big data and the potential benefits and risks of
those uses for underserved populations.

Potential Benefits of Big Data: In terms of benefits, the Commission
acknowledged that using algorithms to identify patterns could
facilitate the efficient matching of products and services to
consumers. The report identifies a number of ways that big data is
already being used to the benefit of low-income and underserved
communities, including by:

 Increasing educational attainment by analyzing big data to identify students for advanced
classes who might not otherwise be considered eligible, or by examining trends that might
otherwise go unexamined but that yield a better understanding of educational effects.
 Providing access to credit by using big data to provide alternative credit scores based on
non-traditional data such as educational history, professional licensure data, and personal
property ownership data for populations that previously were deemed unscorable.
 Providing tailored health care solutions to individuals and underserved communities by
using big data sets to consider individual variability in genes, environment, and lifestyle in
developing disease prevention and treatment plans.
 Increasing equal access to employment by using big data to consider shortcomings of, and
potential changes to, employer hiring practices.
Potential Risks of Big Data: The Commission noted that researchers
and others have expressed concern that big data analytics could be
used to make predictions that disproportionately impact certain
populations, for example by excluding them from targeted service
offerings based on inaccurate predictions. While big data may be
effective at revealing correlations, large data sets may identify
spurious correlations that lack any element of causation, according
to the report (a brief sketch of this effect follows the list below).
Reliance on such trends may result in detrimental effects on
low-income and underserved populations, such as by:

 Denying opportunities based on the actions of others, for example, by lowering a
customer's credit limit based on an analysis of other customers.
 Reinforcing existing disparities by targeting ads for financial products such that low-
income consumers may never receive ads for better offers.
 Exposing sensitive information such as sexual orientation, ethnic origin, or alcohol, drug
and cigarette use.
 Creating new mechanisms for exclusion, for instance, by using identified trends (e.g.,
individuals who install non-default web browsers are better employees) to support hiring
decisions.
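
To make the report's spurious-correlation point concrete, here is a
minimal Python sketch (our illustration, using made-up data; nothing
below comes from the report) showing how, with enough candidate
variables, some will appear to "predict" an outcome purely by chance:

    import numpy as np

    rng = np.random.default_rng(0)
    n_consumers, n_features = 1000, 500

    # A random outcome (e.g., "repaid a loan") and 500 unrelated traits.
    outcome = rng.integers(0, 2, size=n_consumers)
    features = rng.integers(0, 2, size=(n_consumers, n_features))

    # Correlation of each (meaningless) feature with the outcome.
    correlations = np.array([np.corrcoef(features[:, j], outcome)[0, 1]
                             for j in range(n_features)])

    best = int(np.argmax(np.abs(correlations)))
    print(f"Strongest 'predictor': feature {best}, r = {correlations[best]:.3f}")
    # Every feature is pure noise, yet the best of 500 shows a seemingly
    # meaningful correlation -- a pattern with no causal basis.

A decision rule built on such a chance correlation would misclassify
the very consumers the report is concerned about, which is one reason
to ask whether a correlation is meaningful before relying on it.
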
Current Law Governing Big Data and Related Considerations

The report acknowledges that companies will inevitably continue to
use big data as a reality of today's marketplace, but emphasizes that
companies should ensure they have an adequate understanding of the
laws governing big data before doing so and adapt their practices to
reflect considerations under those laws. While the Commission
acknowledges that current laws may not address every instance of
potential misuse of big data, it notes that the report is intended to
provide an overview of the existing framework and suggests that the
Commission believes it can exercise significant oversight under the
current regime.

The report addresses three statutes specifically: (1) the Fair Credit
Reporting Act; (2) equal opportunity laws, including the Equal Credit
Opportunity Act; and (3) the FTC Act.

The Fair Credit Reporting Act: The Fair Credit Reporting Act (FCRA)
requires consumer reporting agencies (CRAs) that compile and sell
consumer reports to implement certain policies and procedures to
ensure the accuracy of reports, provide consumers access to their own
information, and follow reasonable procedures to correct identified
errors. "Consumer report" is defined broadly and generally includes
reports bearing on a consumer's credit, character, general reputation,
or personal characteristics, if the report is to be used for a
specified purpose such as credit, insurance, or employment.

Consumer reports may only be provided when the receiving entity has a
permissible purpose, which may include the consumer's own written
authorization or use for credit, employment, insurance, or housing
determinations. Moreover, users of consumer reports are required to
take certain steps and provide disclosures when they take adverse
action as a result of information contained in a consumer report.

While consumers typically think of conventional credit bureaus or
background screening services as CRAs, the definition under FCRA is
substantially broader and includes any entity that regularly furnishes
consumer reports to third parties for fees. This means that data
brokers and other entities providing reports with consumer information
may constitute CRAs under FCRA, even if the provided reports do not
contain conventional information typically used in consumer reports.
Indeed, the FTC's report highlights enforcement actions that the
Commission took with respect to online data brokers that were
supplying consumer data for purposes that rendered the information
subject to FCRA but that failed to comply with FCRA's requirements.

For example, in United States v. Spokeo, the Commission brought
charges against a data broker for merging online and offline data to
create detailed personal profiles that were used by human resources
departments for hiring decisions. Because such reports were used for
employment purposes, they constituted consumer reports under FCRA,
but Spokeo had not complied with FCRA's requirements for providing
such reports. The report also notes that the FTC has brought actions
against users of consumer reports who have failed to comply with
FCRA's requirements governing use of the reports. In 2013, the
Commission brought charges against Time Warner Cable as a consumer
report user on the grounds that, according to the complaint, it
should have provided certain consumers with a risk-based pricing
notice under FCRA and the Risk-Based Pricing Rule.

While FCRA provides a potentially powerful tool to address a
company's use of third-party data, it does not generally apply to
companies when they use data derived from their own relationship
with the customer. However, the Commission posits in the report that
an unaffiliated company's aggregation and evaluation of a company's
own data would likely make the unaffiliated company a CRA and the
other company a user of consumer reports subject to FCRA.

The Commission also articulated a novel position that a report could
still constitute a "consumer report" even if it does not identify a
specific consumer, provided it is crafted for eligibility purposes
with reference to a particular consumer or set of particular
consumers. The report distinguished generating an analysis of a group
that shares characteristics with the consumer or consumers being
evaluated (potentially a consumer report) from pulling an existing
analysis of a characteristic that was not created based on a
particular consumer or consumers (likely not a consumer report).

Ultimately, a company's obligations under FCRA relative to big data
practices will be highly fact-specific and will depend on the scope
and specificity of the report and how the report is used, among other
factors.

Considerations for Legal Compliance: The report offers the following
considerations for companies compiling and/or using big data relative
to FCRA:

 CRAs engaged in big data analytics should review the accuracy and privacy provisions of
the FCRA, which include requirements to: (1) have reasonable procedures in place to ensure
the maximum possible accuracy of the information you provide; (2) provide notices to users of
your reports; (3) allow consumers to access information you have about them; and (4) allow
consumers to correct inaccuracies.

 Users of consumer reports should review the provisions applicable to users of consumer
reports under the FCRA, such as the "permissible purpose" provisions and adverse notice
requirements.
Federal Equal Credit and Employment Opportunity Laws: Companies
using big data analytics should also review and familiarize themselves
with the federal equal credit and employment opportunity laws, which
prohibit discrimination based on protected characteristics. These
include the Equal Credit Opportunity Act ("ECOA"), Title VII of the
Civil Rights Act of 1964, the Americans with Disabilities Act, the Age
Discrimination in Employment Act, the Fair Housing Act, and the
Genetic Information Nondiscrimination Act.

Disparate Impact or Disparate Treatment: These sector-specific
anti-discrimination laws require proof of either disparate impact or
disparate treatment, both of which are theories the FTC has advanced
successfully in its ECOA-related enforcement actions. Disparate
treatment occurs when an entity treats an applicant differently based
on a protected characteristic such as race or national origin. Among
other things, the ECOA prohibits discrimination based on whether an
applicant receives public assistance. The report cites several
disparate treatment enforcement actions in which the FTC alleged that
lenders excluded public assistance income in deciding whether to
extend credit to a consumer, in violation of the ECOA.

The report also highlights an FTC ECOA action involving a disparate
impact analysis. Disparate impact arises when a facially neutral
policy has a disproportionate adverse effect on a protected class.
Such a policy may violate the ECOA unless it furthers a legitimate
business need that cannot reasonably be achieved by means that are
less disparate in their impact. In FTC v. Golden Empire Mortgage,
Inc., the FTC alleged that the mortgage lender charged Latino
mortgage loan applicants higher prices than non-Latino white
applicants and failed to appropriately monitor loan officers and
branch managers. In Golden Empire, there was no legitimate business
need justifying this pricing disparity.
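
As a purely illustrative aid (not a test that the report or the ECOA
prescribes), the following Python sketch applies one common first-pass
screen for disparate impact, the "four-fifths" rule of thumb drawn
from employment-selection guidance, to hypothetical approval outcomes
of a facially neutral policy:

    from collections import Counter

    def approval_rates(decisions):
        """decisions: iterable of (group, approved) pairs -> rate per group."""
        totals, approved = Counter(), Counter()
        for group, ok in decisions:
            totals[group] += 1
            approved[group] += ok
        return {g: approved[g] / totals[g] for g in totals}

    def adverse_impact_ratios(rates):
        """Each group's approval rate relative to the most-favored group."""
        top = max(rates.values())
        return {g: rate / top for g, rate in rates.items()}

    # Hypothetical outcomes of a facially neutral approval policy.
    decisions = ([("A", 1)] * 80 + [("A", 0)] * 20 +   # group A: 80% approved
                 [("B", 1)] * 55 + [("B", 0)] * 45)    # group B: 55% approved

    for group, ratio in adverse_impact_ratios(approval_rates(decisions)).items():
        flag = "review for possible disparate impact" if ratio < 0.8 else "ok"
        print(f"group {group}: ratio {ratio:.2f} -> {flag}")

A low ratio would not itself establish a violation; it simply flags a
policy for the kind of closer, fact-specific review the report
describes, including whether a legitimate business need justifies the
disparity.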

Advertising and Anti-Discrimination Laws: Using big data for targeted
advertising may also implicate sector-specific anti-discrimination
laws. Indeed, the FTC's report explored whether a credit product
advertisement targeted to a specific community would implicate equal
opportunity laws. Assuming the offer is open to all applicants and
there is no disparate treatment or unjustified disparate impact in
subsequent lending, there is likely no equal opportunity law
violation. However, the Commission cautioned that companies should
review Regulation B, the implementing regulation for ECOA, when
conducting a targeted advertising campaign. Regulation B imposes
certain record-keeping requirements for prescreened solicitations and
prohibits creditors from making statements to applicants that would
discourage, on a prohibited basis, a reasonable person from pursuing
an application.

As with a FCRA analysis, whether an employment or credit practice is
unlawful will be fact- and case-specific. Nevertheless, it would be
prudent to review credit or employment practices that rely on big
data to ensure they do not discriminate on the basis of protected
characteristics in violation of these laws.

Considerations for Legal Compliance: The report offers the following
considerations for entities using big data relative to compliance
with equal credit and equal opportunity laws:

 Creditors using big data analytics in a credit transaction should review the adverse action
requirements under ECOA and requirements related to requests for information and record
retention.
 Big data analytics decision-makers that approve credit, housing, or employment
applications should consider whether they are treating people differently based on a
prohibited basis, such as race or national origin.
 They should likewise consider whether their policies, practices, or decisions have an
adverse effect or impact on a member of a protected class, and if they do, whether that impact
is justified by a legitimate business need that cannot reasonably be achieved by means that
are less disparate in their impact.
Section 5 of the FTC Act: Companies should also consider whether, in
using big data analytics, they are violating any material promises to
consumers or making omissions of material facts that are likely to
mislead consumers. Such acts or practices may violate Section 5 of
the FTC Act, which prohibits unfair or deceptive acts or practices.
The Commission's report offers several examples of enforcement
actions it initiated against companies for violating material
promises to consumers under its deception authority. For example, the
FTC brought an action against CompuCredit for failing to disclose
that the company used a behavioral scoring model to reduce credit
lines when consumers used their cards for cash advances or certain
types of transactions, even as the company touted consumers' ability
to use their cards for those purposes.

The report also points to the applicability of the FTC's unfairness
authority to the misuse of big data. The elements of an unfairness
claim under the FTC Act are (i) a substantial injury, (ii) that is
not reasonably avoidable by consumers, and (iii) that is not
outweighed by the benefits to consumers or to competition. Cited
examples of unfair practices in the big data context include (1) the
failure to reasonably secure consumers' data and (2) the sale of data
to entities that a company has reason to know will use the data for
fraudulent purposes. For instance, in 2006, ChoicePoint, a consumer
data broker, settled FTC charges that the company compromised the
financial records of more than 163,000 consumers when it furnished
those records to identity thieves posing as legitimate subscribers
and failed to maintain reasonable procedures to screen prospective
subscribers.

Considerations for Legal Compliance: The report offers the following
considerations for legal compliance relative to the FTC Act:

 Honor promises made to consumers and provide consumers material information about
data practices.
 Maintain reasonable security over consumer data.
 Undertake reasonable measures to know the purposes for which customers are using the
company's data.
Policy Considerations for Companies Using Big Data Analytics

Recognizing the potential for big data benefits and the need to limit
possible harms, the Commission report suggests that companies
already using or considering engaging in big data analytics should
consider:

 Representation. Consider whether data sets are missing information from certain
populations and take appropriate steps to overcome this problem (a brief sketch of such a
check follows this list).
 Biases. Review data sets and algorithms to ensure that hidden biases are not having an
unintended impact on certain populations.
 The Accuracy of Predictions. A big data finding of correlation does not necessarily mean
that the correlation is meaningful. Balance the risks of using those results, especially where
policies could negatively affect certain populations. Consider human oversight of data and
algorithms when big data tools are used to make important decisions, such as those
implicating health, credit, and employment.
 Ethical or Fairness Concerns. Consider whether fairness and ethical considerations advise
against using big data in certain circumstances. Consider further whether you can use big data
in ways that advance opportunities for previously underrepresented populations.
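
To illustrate the Representation check above, here is a minimal
Python sketch (all group names and numbers are hypothetical) that
compares each group's share of a training data set against a
reference population to flag under-represented groups:

    # Hypothetical reference shares and training-set counts.
    reference_population = {"group_a": 0.60, "group_b": 0.25, "group_c": 0.15}
    training_sample = {"group_a": 7200, "group_b": 2500, "group_c": 300}

    total = sum(training_sample.values())
    for group, expected_share in reference_population.items():
        observed_share = training_sample[group] / total
        # Flag groups represented at well under their population share.
        if observed_share < 0.5 * expected_share:
            print(f"{group}: {observed_share:.1%} of data vs "
                  f"{expected_share:.1%} of population -- under-represented")

Here group_c supplies only 3% of the training data despite making up
15% of the reference population, the kind of gap the report suggests
companies identify and take appropriate steps to overcome.
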
Conclusion

The report concludes with the FTC encouraging multi-stakeholder
collaborative efforts to maximize the benefits of big data. The
Commission states its intention to continue to bring enforcement
actions where appropriate and to highlight and examine big data
practices that impact underserved populations, both constructively
and adversely.

Big data is certainly here to stay, and so are the laws that big data
practices can potentially violate. Companies in the big data space
should review the considerations for legal compliance identified in the
report to ensure that their practices do not violate existing laws and
expose them to potential liability.
Postscript: The Right to Disconnect

Have you heard about the new "right to disconnect" law in France that
finally came into effect on January 1, 2017? Don't believe all the
hype!

While the law may be the first in the world to attempt to address the
modern reality of around-the-clock connectivity of employees, it
imposes no concrete requirements beyond coming up with a plan to
address the problem, and it attaches no real penalties if such a plan
does not work. French employers of 50 or more employees are already
required to discuss with their works council, on a yearly basis, a
plan to provide for the professional equality of men and women and
for work/life balance.

Article L.2242-8 of the Labor Code now includes the following as one
aspect of the required discussion on work/life balance (translated from
the original French):

The full exercise by the employee of his right to disconnect and the
establishment by the company of means to regulate the use of digital
tools, with a view to ensuring respect for rest periods and leave, as
well as personal and family life. In the absence of agreement, the
employer shall draw up a charter, after consultation with the works
council or (in the absence of a works council) employee delegates.
This charter defines the procedures for the exercise of the right to
disconnect and furthermore provides for the implementation, for
employees and management, of training and awareness-raising
activities on the reasonable use of digital tools.

Despite the lack of concrete requirements to actually limit
connectivity, employers may want to go through the exercise of
determining reasonably feasible guidelines in consultation with the
works council. Doing so may avoid unfavorable consequences such as:
(1) your works council claiming that the company is preventing it
from doing its job; (2) employees claiming "burn out" due to stress;
and (3) employees claiming that they are entitled to overtime
compensation for reading emails outside of work hours.

The French reaction: "Today the digital tools are blurring the
boundary between personal and professional lives," Bruno Mettling,
human resources director of the French telecom giant Orange, wrote in
a report for the government before the new law came into effect.
"With this accumulation of emails, and these employees who return
exhausted from the weekend because they have not disconnected, it is
not the best way to be effective in companies." He added that
employees felt increasingly at ease checking their personal emails in
the office. After all, there was no longer any clear beginning or end
to their work days.

The American reaction: "We need to learn digital manners," said
Orianna Fielding, founder of The Digital Detox Company, which runs
retreats and courses for firms looking to help their employees manage
their digital lives in a healthier way. "I think I would like to see
acceptance and an awareness of how digital overload is negatively
impacting the modern workplace, and finding solutions. I think we
need to have digital wellness programs in every company, and I think
every company needs a head of email," she added.

Finally, it is worth mentioning that the law does not provide a real
right for employees to refuse to work or to "disconnect"; it only
creates an obligation for companies established in France to
promulgate guidelines to ensure the right balance between
professional and personal life.