In June, the Consumer Financial Protection Bureau (CFPB) issued a report (Report) summarizing its research into the use of artificial intelligence (AI) in consumer finance. The Report focused on the shift away from "human support" to "algorithmic support" in the banking and consumer finance industry. The CFPB analyzed the use of chatbots and outlined the risks associated with the technology. In addition to confirming that it will take an active role in monitoring compliance in the context of AI, the CFPB made clear that chatbot technology should only be deployed if it can:
- provide "reliable and accurate responses;"
- understand when a consumer is exercising his or her legal rights and react accordingly;
- avoid leaving consumers in continuous loops of repetitive, unhelpful jargon or legalese without ultimate access to a human customer service representative (referred to as "doom loops"); and
- ensure that consumer privacy and data security rights are protected.
Background
The Report follows the CFPB's research into the current use of chatbots and its review of consumer complaints related to them. The Report notes that 37 percent of the US population (98 million) used a financial institution's chatbot in 2022, and that number is expected to grow to more than 40 percent (110.9 million) in 2023. The Report outlines a number of issues the CFPB is focused on, including:
Violation of Consumer Finance Laws. The Report notes that federal consumer financial laws apply to interactions between consumers and chatbots, and that chatbots should not provide misleading, inaccurate or legally inadequate responses. In particular, the CFPB emphasized that "[p]roviding inaccurate information regarding a consumer financial product or service … could be catastrophic," and the Report states that the CFPB is "actively monitoring the market" and "expects institutions using chatbots to do so in a manner consistent with their customer and legal obligations."
Limited Ability to Solve the Consumer's Problem. The Report notes that certain chatbots are inadequate, such as those that fail to understand the consumer's request or cannot handle messages that fall outside the system's programming, resulting in unproductive or frustrating consumer experiences. Chatbots should be able to (1) accurately identify when a customer "is raising a concern or dispute" and (2) adequately respond to consumers with "limited English proficiency." The Report emphasizes that chatbots that are "backed by unreliable technology, inaccurate data, or [are] little more than a gateway into the company's public policies or FAQs" will be considered inadequate.
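By way of illustration only, the sketch below shows one simplified way a chatbot pipeline might flag messages that appear to raise a concern or dispute so they are routed to a human rather than answered from a script. The keyword list and function name are assumptions made for this example and are not drawn from the Report; a production system would require far more robust language handling, including support for consumers with limited English proficiency.

```python
# Illustrative sketch only -- not taken from the Report or any institution's system.
# It flags messages that appear to raise a concern or dispute so they can be
# escalated for human review. The keyword list is a hypothetical placeholder.

DISPUTE_SIGNALS = (
    "dispute", "unauthorized", "billing error", "fraud",
    "stop payment", "complaint",
)

def needs_human_review(message: str) -> bool:
    """Return True when the message likely raises a concern or dispute."""
    text = message.lower()
    return any(signal in text for signal in DISPUTE_SIGNALS)

# Example: this message would be escalated instead of receiving a canned FAQ reply.
print(needs_human_review("I want to dispute an unauthorized charge on my account"))  # True
```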
Hindering Access to Timely Human Intervention. The Report takes aim at technology that provides automated responses and then leaves customers in never-ending, unhelpful "doom loops" that make it impossible for them to engage in a robust conversation with their financial institution. The CFPB also noted that it may review technology to determine whether it is deployed simply to "grow revenue or minimize write-offs."
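To make the point concrete, the following minimal sketch shows one way a chatbot session could guard against such loops by tracking consecutive unresolved turns and transferring the conversation to a human representative once a threshold is reached. The class, threshold and method names are hypothetical assumptions used for illustration, not a description of the Report's requirements or of any institution's system.

```python
# Illustrative sketch only. A session counts consecutive turns the bot fails to
# resolve and hands the conversation to a human after a (hypothetical) threshold,
# rather than repeating scripted answers indefinitely.

class ChatSession:
    ESCALATION_THRESHOLD = 2  # consecutive unresolved turns before a human takes over

    def __init__(self):
        self.unresolved_turns = 0

    def record_turn(self, resolved: bool) -> str:
        """Record whether the bot resolved the consumer's request this turn."""
        if resolved:
            self.unresolved_turns = 0
            return "continue"
        self.unresolved_turns += 1
        if self.unresolved_turns >= self.ESCALATION_THRESHOLD:
            return "transfer_to_human"
        return "continue"

session = ChatSession()
print(session.record_turn(resolved=False))  # continue
print(session.record_turn(resolved=False))  # transfer_to_human
```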
Privacy and Data Security. The Report identifies privacy and data security risks arising from interactions between chatbots and consumers, and emphasizes that institutions must keep personally identifiable information safe and secure, including information obtained or exchanged via chatbot. In addition, chatbots may cause institutions to run afoul of federal consumer financial laws because "the technology may fail to recognize that a consumer is invoking their federal rights, or it may fail to protect their privacy and data."
Conclusion
The Report is the latest signal of how the CFPB intends to scrutinize the use of AI in the consumer finance marketplace. As institutions continue to deploy AI in their consumer-facing operations, they should ensure that those systems are (1) designed consistently with the Report, including the expectation that they recognize when consumers are raising a concern or dispute; (2) tested prior to deployment; and (3) adequately audited for compliance with the myriad state and federal laws that govern them.