What Fintech Companies Need to Know About Key Federal Privacy Requirements

The financial technology (“Fintech”) industry has boomed over the past decade, from mobile payment apps, robo-advisors, lending platforms and consumer-friendly brokerages to cryptocurrency trading platforms. By their very nature, many Fintech companies handle highly sensitive personal and financial information for consumers, and face a range of privacy issues and legal obligations as a result. In the United States, there is no single federal law governing all personal information; instead, several different laws and regulations impose different privacy and data security requirements on different sectors and jurisdictions. This article discusses the scope and requirements of two such laws that are prominent for many Fintechs: the Gramm-Leach-Bliley Act (“GLBA”) and the Fair Credit Reporting Act (“FCRA”).

GLBA

The GLBA regulates the collection, use and disclosure of nonpublic personal financial information and applies broadly to “financial institutions,” meaning businesses that are “significantly engaged” in “financial activities,” as well as to businesses whose services facilitate financial operations on behalf of financial institutions. What constitutes “financial activity” has been interpreted broadly and can potentially encompass many services provided by Fintechs. The Federal Trade Commission (“FTC”) is the primary agency enforcing the GLBA, although state law may require a higher degree of compliance than the GLBA does. The FTC has implemented regulations to carry out the GLBA’s financial privacy provisions (the “Privacy Rule”) and its information security provisions (the “Safeguards Rule”).

Under the Privacy Rule, financial institutions owe several obligations to consumers (defined as individuals “who obtain or have obtained a financial product or financial service from the financial institution to be used primarily for personal, family or household purposes, or that person’s legal representative”) and to customers (defined as “a subclass of consumers who have an ongoing relationship with the financial institution”) whenever the institution handles non-public personal information (“NPI”). The Privacy Rule’s obligations do not extend to information about individuals who obtain financial products or services for commercial or business purposes, nor to information collected from non-consumers, meaning individuals who have not obtained and do not obtain financial products or services from the institution.

A company subject to the Privacy Rule must provide a “clear and conspicuous” written notice describing its privacy policies and practices, including the categories of information collected, the categories of information disclosed, the categories of affiliated and non-affiliated parties to whom NPI will be disclosed, and its policies and practices for protecting NPI in accordance with the GLBA Safeguards Rule (described in detail below). Further, if the company shares NPI with third parties outside of the exceptions listed in the Privacy Rule, it must provide an opt-out notice explaining the consumer’s right to direct the institution not to share NPI, along with a reasonable method for opting out. The Privacy Rule also requires covered entities to provide a privacy notice when an individual becomes a customer and annually thereafter for as long as the customer relationship continues.

The Safeguards Rule requires financial institutions to develop, implement and maintain a comprehensive information security program that describes the specific measures taken to protect customer information. In late 2021, the FTC announced significant new security requirements for non-bank financial institutions subject to the GLBA, which are incorporated into the Safeguards Rule. The amended rule is far more prescriptive than the original, and its key requirements include, among other things:

- designating a qualified individual to implement and oversee the company’s information security program;
- conducting periodic risk assessments;
- implementing security measures to mitigate identified risks, such as encrypting customer information and requiring multi-factor authentication for anyone accessing customer information on the company’s systems;
- regularly testing security measures;
- training staff;
- monitoring service providers;
- keeping the security program up to date;
- maintaining a written incident response plan; and
- reporting regularly to the institution’s board.
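To make the encryption requirement concrete, the following is a minimal, hypothetical Python sketch of encrypting a single piece of customer NPI at rest using the open-source cryptography package; the field, the data and the key handling are illustrative assumptions, and a real program would obtain keys from a secrets manager or HSM rather than generating them in code.

```python
# Minimal, hypothetical sketch of one Safeguards Rule control: encrypting a
# customer NPI field at rest. Uses the third-party "cryptography" package
# (pip install cryptography). In a real system the key would come from a
# secrets manager or HSM, never be generated or stored alongside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # illustrative only; see key-management note above
cipher = Fernet(key)

ssn = b"123-45-6789"                  # hypothetical customer NPI
token = cipher.encrypt(ssn)           # ciphertext safe to persist
assert cipher.decrypt(token) == ssn   # round-trip sanity check
print(token.decode())
```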

Penalties for noncompliance with the GLBA can include fines of up to $100,000 per violation for the institution, and remediation costs have been estimated at roughly $192 per lost record. Officers and directors can be personally fined up to $10,000 per violation and may face criminal penalties of up to five years in prison, as well as revocation of professional licenses.

Fintechs should conduct scoping exercises to carefully assess whether they provide financial products or services to individuals for personal, family or household purposes and, if so, develop adequate privacy and data security practices and notices for NPI collected from consumers and customers.

FCRA

The FCRA limits the circumstances under which consumer credit information can be used and gives consumers the right to know what information is used and when it adversely affects them. The FCRA is very broad and covers a multitude of data types used to make eligibility decisions about consumers. It applies not only to credit bureaus and background check companies, but also to “anyone who: (1) collects or evaluates consumer data and shares it for purposes that determine eligibility for credit, insurance, employment, housing, or other eligibility purposes; (2) purchases credit reports, including credit scores; or (3) provides consumer information to credit bureaus.”

Some Fintechs, such as lead generators, data aggregators and debt collectors, as well as those that use algorithms to make decisions about consumers, may also be subject to the FCRA if their services are used to facilitate decision-making about consumers’ eligibility for credit, housing, employment and other qualification purposes. As a result, it is worthwhile for Fintechs to analyze their data collection purposes and practices to determine whether they are subject to the requirements of the FCRA. The Consumer Financial Protection Bureau (“CFPB”) has primary enforcement authority over the FCRA, but the FCRA may be enforced by other federal agencies and applies to companies outside the CFPB’s jurisdiction. The CFPB also recently issued an interpretive rule stating that preemption under the FCRA is “narrow and targeted,” and encouraged states to enact laws that go beyond the FCRA to protect their residents.

The FCRA allows victims to seek actual and statutory damages, as well as attorneys’ fees, court costs and punitive damages, and companies that violate the FCRA can also be fined by the CFPB and the FTC. A willful violation of the FCRA may result in actual damages or statutory damages ranging from $100 to $1,000 per violation, in addition to any punitive damages awarded by the courts. While a negligent FCRA violation generally exposes a company to actual damages rather than statutory or punitive damages, any regulatory action or lawsuit alleging a violation of the FCRA can be extremely costly for a Fintech.

If a Fintech is subject to the requirements of the FCRA, it should: (i) comply with the FCRA’s privacy and data disclosure requirements, including ensuring that credit report information is used only for permissible purposes as defined by the FCRA (see Orrick’s insights on the CFPB’s recent Advisory Opinion on permissible purposes under the FCRA); (ii) maintain a mechanism to correct erroneous information; and (iii) notify consumers when they are subject to adverse housing, employment, credit or other decisions based on the credit information the company has collected. For example, a company cannot use credit information for targeted marketing purposes, as targeted marketing is not one of the permissible purposes set forth in the FCRA.
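As a purely illustrative sketch of point (i), a Fintech might gate every consumer report pull behind a permissible-purpose check. The fetch_consumer_report function and the purpose names below are hypothetical, and the set shown is an informal, partial rendering of FCRA permissible purposes, not an authoritative list.

```python
# Hypothetical sketch of a "permissible purpose" gate in front of any code
# path that pulls a consumer report. The purpose names are illustrative and
# only a partial, informal rendering of FCRA permissible purposes.

PERMISSIBLE_PURPOSES = {
    "credit_transaction",       # in connection with a credit transaction
    "employment",               # with the consumer's written authorization
    "insurance_underwriting",
    "account_review",
    "written_consumer_consent",
}

def fetch_consumer_report(consumer_id: str, purpose: str) -> None:
    """Refuse to pull a report unless a permissible purpose is asserted."""
    if purpose not in PERMISSIBLE_PURPOSES:
        raise PermissionError(f"'{purpose}' is not a permissible purpose")
    print(f"pulling report for {consumer_id} (purpose: {purpose})")

fetch_consumer_report("c-123", "credit_transaction")      # allowed
try:
    # Targeted marketing is not a permissible purpose under the FCRA.
    fetch_consumer_report("c-123", "targeted_marketing")
except PermissionError as err:
    print(f"blocked: {err}")
```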

Fintechs should be particularly mindful of situations where algorithms are used to make automated decisions about consumers. In such situations, companies must ensure that they have implemented reasonable procedures to maximize accuracy, and that they provide consumers with access to their own information and the ability to correct errors. When an algorithm is used to take adverse action against a consumer (for example, to charge higher rent or to deny credit), the consumer must be given notice of the adverse action and an explanation of the specific reasons for the decision. Furthermore, if not used with care, artificial intelligence (“AI”) may result in discrimination against a protected class, in violation of the Equal Credit Opportunity Act (“ECOA”). The FTC has specifically warned that using algorithms that discriminate against protected classes could be considered an unfair practice subject to enforcement actions and possible fines. In addition, the CFPB recently redesigned its whistleblower website to encourage whistleblowers with knowledge of “potential discrimination or other misconduct within the CFPB’s authority” to report it. Therefore, to avoid regulatory scrutiny, companies using AI to make credit-related decisions should thoroughly test such algorithms before deploying them and periodically audit and monitor their use thereafter to ensure compliance with ECOA and other anti-discrimination laws.
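For instance, a periodic audit might include a disparate impact screen such as the widely used “four-fifths rule” ratio, sketched in Python below. The group labels, decision data and the 0.8 threshold are illustrative assumptions, not a test prescribed by ECOA, the FCRA or any regulator, and a real fairness review would go well beyond a single metric.

```python
# Hypothetical sketch of a periodic fairness audit using the "four-fifths
# rule" adverse impact ratio: each group's approval rate is compared with
# the highest-approval group, and ratios below 0.8 are flagged for review.

def approval_rate(decisions):
    """Fraction of applicants approved; decisions is a list of booleans."""
    return sum(decisions) / len(decisions)

# Illustrative automated credit decisions per demographic group.
outcomes = {
    "group_a": [True, True, False, True, True, True, False, True],
    "group_b": [True, False, False, True, False, False, True, False],
}

rates = {group: approval_rate(d) for group, d in outcomes.items()}
benchmark = max(rates.values())  # highest-approval group as the reference

for group, rate in rates.items():
    ratio = rate / benchmark
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: approval {rate:.0%}, impact ratio {ratio:.2f} [{flag}]")
```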

What’s next?

In addition to federal privacy laws, Fintechs should be aware of the growing number of state laws that regulate how businesses must handle personal information. In many cases, such state privacy laws have partial exemptions for personal information subject to the GLBA or FCRA. For example, the California Consumer Privacy Act (“CCPA”) exempts from its privacy requirements personal information “collected, processed, sold or disclosed pursuant to the federal GLBA and implementing regulations.” Fintechs should therefore carefully consider whether they are subject to the federal requirements discussed above and, if they are not, or if certain information they collect falls outside the scope of those federal requirements, be sure to consider and comply with the applicable requirements of any state privacy laws to which they are subject.

The authors wish to extend special thanks to summer associate Christina Lee of Harvard Law School, 2024, for contributing to this work.
