For a while now, pundits have been talking about the three V’s of big data: Volume – the vast quantity of information that can be gathered and analyzed; Velocity – the speed with which companies can accumulate, analyze, and use new data; and Variety – the breadth of data companies can analyze effectively. A report just issued by the FTC suggests a fourth: Vigilance – the care companies should exercise in complying with the law.
Following up on an FTC conference on big data and a seminar on alternative scoring products, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues analyzes the lifecycle of data, with a focus on how its use affects consumers. Ultimately, does it offer benefits or does it raise concerns? After reviewing the research and considering the public comments we received, we think the answer to both questions is yes.
In areas ranging from individualized healthcare to increased access to non-traditional credit, big data can offer substantial benefits to consumers, including members of underserved communities. But the report also points to areas of concern. What about the potential for reinforcing existing disparities, creating new purported justifications for exclusion, and putting sensitive information at risk?
The report outlines some of the laws that apply in this context, including the Fair Credit Reporting Act, equal opportunity statutes, and Section 5 of the FTC Act. It also suggests compliance questions for businesses to bear in mind. For example:
- If you use big data products for eligibility decisions, is your FCRA house in order? Have you certified that you have a “permissible purpose” to obtain the information and that you won’t use it to violate equal opportunity laws? Do you give people the pre-adverse action notices and adverse action notices the FCRA requires?
- If you use big data analytics in a way that might adversely affect people’s ability to get credit, housing, or employment, are you careful not to treat them differently on a prohibited basis, like race or national origin?
- Do you explain to consumers how you use their information? Are you honoring promises you make about your data practices?
- Do you maintain reasonable security to protect sensitive information?
The report offers additional considerations for companies in this context:
- Consider whether your data sets are missing information from particular populations and, if so, take appropriate steps to address the gap.
- Review your data sets and algorithms to ensure that hidden biases don’t have an unintended impact on certain populations (a rough sketch of this kind of check appears after this list).
- Remember that just because big data analysis found a correlation doesn’t necessarily mean the correlation is meaningful. Consider the risks of relying on those results, especially where your policies could negatively affect certain populations. It may be worthwhile to have human oversight of data and algorithms when big data tools are used to make important decisions, like those implicating health, credit, or employment.
- Consider whether fairness and ethical considerations advise against using big data in certain circumstances. Consider further whether you can use big data in ways that advance opportunities for previously underrepresented populations.
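To make the data-review suggestions above more concrete, here is a minimal, purely illustrative Python sketch of the kind of internal check a company might run: it measures how well each population is represented in a dataset and compares outcome rates across groups. The record structure, the field names (group, approved), and the idea of flagging ratios well below 1.0 are assumptions for the example only; the report does not prescribe any particular method.

```python
# Illustrative sketch only: a simple audit of a hypothetical applicant dataset
# for two of the report's considerations -- under-representation of groups and
# differing outcome rates across groups. All names and thresholds are assumptions.

from collections import Counter

# Hypothetical records: each has a demographic group and a model decision.
records = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

def representation(records):
    """Share of records in each group -- a first check for missing populations."""
    counts = Counter(r["group"] for r in records)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}

def approval_rates(records):
    """Approval rate per group, to compare outcomes across populations."""
    totals, approved = Counter(), Counter()
    for r in records:
        totals[r["group"]] += 1
        if r["approved"]:
            approved[r["group"]] += 1
    return {g: approved[g] / totals[g] for g in totals}

def impact_ratios(rates):
    """Ratio of each group's approval rate to the highest rate; ratios well
    below 1.0 flag outcomes that deserve closer human review."""
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

if __name__ == "__main__":
    print("representation:", representation(records))
    rates = approval_rates(records)
    print("approval rates:", rates)
    print("impact ratios:", impact_ratios(rates))
```

A check like this is only a starting point; it surfaces numbers for a human reviewer to interpret in light of the legal and ethical considerations the report describes, not a substitute for that review.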