
The Trump Administration Wants to Reboot Redlining

Commentary

Seth Frotman is a visiting senior fellow at Towards Justice and the Center for Consumer Law and Economic Justice at the University of California, Berkeley, School of Law. He is the former general counsel of the Consumer Financial Protection Bureau.

Tara Mikkilineni is a visiting senior fellow at the Consumer Federation of America. She is former special counsel to the Enforcement Director of the Consumer Financial Protection Bureau.


Amidst the blitz of legal challenges to Trump administration actions, it might have been easy to miss a recent decision by a Trump-appointed judge in a little-known case involving the CFPB, currently led by Project 2025 author Russ Vought. Vought had made an extraordinary attempt to reverse a CFPB settlement with Townstone, a Chicago-based mortgage lending company accused by the first Trump administration of violating federal fair lending laws. The judge refused to undo the settlement, calling out the brazenness of this “unprecedented” attempt to back out of the negotiated agreement.

But while the decision in Townstone is a rare and encouraging sign that even Trump-appointed judges are willing to rein in the administration, the Vought CFPB's conduct in the case is a sinister harbinger of the administration's efforts to unwind protections against discrimination in lending. The CFPB has quietly made a series of moves that would enable an unholy alliance of Big Tech and financial institutions to digitally discriminate against people on the basis of race, religion, or gender. If Vought gets his way, there will be little to stop the tech-finance nexus from exploiting its massive troves of personal data to unleash a torrent of surveillance pricing in credit markets and turbocharge modern-day redlining.

How did we get here? And what can we do about it?

The gutting of federal fair lending enforcement

The CFPB under the Trump administration is quickly and systematically dismantling federal enforcement of the Equal Credit Opportunity Act (ECOA), which prohibits discrimination “with respect to any aspect of a credit transaction.” When ECOA was enacted in 1974, women often could not obtain credit without a male co-borrower. Congress passed ECOA to ensure “that financial institutions and other firms engaged in the extension of credit make that credit equally available to all creditworthy customers without regard to sex or marital status.” Two years later, Congress expanded ECOA to ban lending discrimination on the basis of several characteristics, including race, color, religion, national origin, and age.

For the last half-century, the federal government and private litigants have enforced ECOA to stop discriminatory lending practices, including redlining. The term "redlining" comes from the red lines drawn on maps used by the U.S. government to designate neighborhoods — almost all Black — as "risky" for lenders, on the racist theory that property values were likely to decline in places where Black people lived. The resulting equity-stripping and exclusion of Black and low-income home buyers from the mortgage markets left a legacy of racial wealth disparities that persists to this day. Since the passage of the Fair Housing Act in 1968 and of ECOA six years later, federal law has prohibited redlining, as well as acts that discourage borrowers from applying for loans on the basis of characteristics like race, gender, or religion (think: a "whites only" sign on a bank window).

The Trump administration wants to change that.

The Vought CFPB is now terminating settlements and declining to enforce future compliance with the laws prohibiting redlining and discouragement. The CFPB's public actions in Townstone vividly illustrate the Trump administration's dangerous worldview.

In 2020, the CFPB sued Townstone Financial and its CEO for discriminating against Black Chicago residents through redlining and discouragement, pointing to derogatory statements the CEO made about Black neighborhoods on the radio show Townstone used to advertise its services, as well as the company's poor record of lending to Black home buyers. In litigation, Townstone argued that these actions were not illegal because ECOA does not apply until after somebody submits a loan application. In other words, a company could freely discourage or exclude certain groups from applying for loans and escape liability, so long as nobody actually applied.

Townstone’s argument would mean, of course, that ECOA would not apply to redlining, one of the most insidious forms of credit discrimination in our nation’s history. The Seventh Circuit unanimously rejected this argument, holding that Congress plainly intended ECOA to include actions taken by a creditor before an applicant ultimately submits his or her credit application. Soon after that ruling, Townstone settled the case with the CFPB.

The story could have ended there. Instead, Vought sought to undo the consent order and judgment in the case, issuing a bizarre press release that, among other things, accused the previous Trump administration of bringing the case to "further the goal of mandating DEI in lending via regulation by enforcement tactics." Even though this nonsense was roundly rejected by the judge overseeing the case, who called the attempt to undo the negotiated settlement "an act of legal hara-kiri that would make a samurai blush," Vought has similarly undone agreements in other redlining cases brought by the CFPB across multiple administrations.

This radical assault on federal fair lending law goes beyond merely reversing half a century of work and returning us to a world where banks are free to refuse to lend in particular neighborhoods. It also has terrifying implications for a future in which more of our financial lives move online and the surveillance infrastructure of Big Tech and Big Data presents, shapes, and limits our options in the digital marketplace.

Digital discrimination in the modern economy

The market for consumer financial products is rapidly shifting from the physical world to the virtual one. Fewer people shop for auto, home, student, or personal loans by checking the rates posted in a bank window or the local paper. Instead, in a world where we shop for nearly everything online, our options are increasingly derived from data extracted from our online interactions and targeted directly at us.

The mechanisms that companies use to dictate our options come into play before we even know we are being offered a product, much less apply for it. Behind the scenes, financial institutions target us with products tailored to what they perceive as our profile, with the help of tech companies that have vacuumed up data that includes explicit demographic information as well as data that can serve as a nearly perfect proxy for those demographics. A person may not realize that their online activity signals their age or marital status (for example, reviewing a book that suggests as much), or their religion (shopping for Shabbat candles or rosary beads). Companies can show or withhold certain products, and even individualize the prices of those products, based on assumptions they make about a person's characteristics. A consumer may never learn about better products with better rates because they do not fit the "profile" of the ideal customer for that product. Or they may be deliberately targeted with a predatory price because they do.
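To make the proxy problem concrete, here is a minimal sketch in Python of how a single innocuous purchase signal can stand in for a protected characteristic. Every record, category, and number below is synthetic and purely illustrative, drawn from no actual company's data or systems.

```python
# Hypothetical sketch: an innocuous purchase category acting as a
# near-perfect proxy for a protected characteristic.
# All records are synthetic and for illustration only.

records = [
    # (purchase_category, member_of_protected_group)
    ("shabbat_candles", True),
    ("shabbat_candles", True),
    ("shabbat_candles", True),
    ("garden_hose", True),
    ("garden_hose", False),
    ("phone_charger", False),
    ("phone_charger", False),
    ("phone_charger", False),
]

def proxy_strength(records, category):
    """Compare the overall base rate of group membership with the
    rate among consumers who bought the given category."""
    buyers = [member for cat, member in records if cat == category]
    base_rate = sum(member for _, member in records) / len(records)
    buyer_rate = sum(buyers) / len(buyers)
    return base_rate, buyer_rate

base, among_buyers = proxy_strength(records, "shabbat_candles")
print(f"Base rate of group membership:      {base:.0%}")          # 50%
print(f"Rate among shabbat_candles buyers:  {among_buyers:.0%}")  # 100%

# A targeting system that filters on the purchase category never
# "sees" the protected characteristic directly, yet it selects on
# that characteristic almost perfectly.
```

A lender that shows its best offers only to, or hides them from, consumers matching such a filter is discriminating on religion in effect, even though religion never appears as a field in its system.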

This scenario is not a hyperbolic vision of a dystopian future. It has already happened, with technology far less sophisticated than the infrastructure Big Tech is building and using today. In 2022, the United States sued Facebook (now Meta) for approving ads that targeted users on the basis of race, gender, and other protected characteristics. Facebook took its user data, including "likes," "shares," and participation in Facebook groups like "Single Black Mothers" or "Asian Single Women," and made it central to its ad delivery system. That system allowed landlords to exclude from rental housing ads people whom Facebook classified as "non-Christian," "interested in Hispanic culture," or "interested in Judaism," or who lived in certain ZIP codes.

Facebook settled the case, promising to try to ensure equality in the delivery of its ads. It did not, however, change its general approach to harvesting data for ad sales to financial institutions. And in the last few years, technology has changed so rapidly as to outpace the advertising methods companies like Facebook once used. Generative AI models can now aggregate even more data and supercharge companies' ability to personalize and microtarget their advertising and pricing. And unlike older approaches, which required humans to exercise judgment and discretion in segmenting potential customers, generative AI models offer far less visibility into how they parse data to decide whom to target and how.
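To see why visibility drops, consider a toy contrast in Python between a human-written targeting rule, whose criteria anyone can read and audit, and a learned scoring function, whose numeric weights explain nothing on their face. Both functions and all values are hypothetical.

```python
import random

def human_written_rule(profile: dict) -> bool:
    # Auditable: the criteria are explicit, so a reviewer can see
    # exactly what drives the decision (and spot proxies like ZIP code).
    return profile["zip_code"].startswith("606") and profile["age"] > 40

# Opaque: these weights come out of a training process; nothing about
# them reveals whether they encode proxies for race, religion, or gender.
learned_weights = [random.uniform(-1, 1) for _ in range(4)]

def learned_score(features: list) -> float:
    return sum(w * x for w, x in zip(learned_weights, features))

profile = {"zip_code": "60601", "age": 45}
print(human_written_rule(profile))           # True, and we can say why
print(learned_score([0.2, 1.0, 0.0, 0.7]))   # a number, with no "why"
```

Real targeting systems have billions of parameters rather than four, which only widens the gap between what the system does and what anyone can explain about it.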

When combined with the gutting of ECOA, the endpoint of these trends is clear: a two-tier digital financial system, with one tier offering prime products, favorable terms, and financial opportunities to some, and another channeling disfavored groups, if they are served at all, toward higher-cost, less favorable products. Given the way Big Tech pushes us content, neither group would ever know that similarly situated consumers are being offered very different products, which creates a self-fulfilling cycle in which companies extract more money from people who may not be able to shop for anything better.

A world where online consumers see only what they are fed by complex, invisible algorithms, and not what is being offered to others, gives rise to digital segregation. And that carries potentially systemic, catastrophic consequences. One of the direct contributors to the 2008 financial crisis was the development of innovative ways to discriminate in pricing and to steer entire communities of minority borrowers into subprime loans destined to self-destruct. The advent of new technologies only magnifies the destructive potential of these tactics.

Where do we go from here? States must lead the way.

It is clear from the CFPB's recent actions that we are facing the prospect of generational damage to federal enforcement of fair lending laws. The CFPB has gone so far as to rescind longstanding fair lending guidance, even withdrawing seemingly innocuous advisories noting that creditors who use complicated, opaque, or novel algorithms to evaluate credit applicants must still comply with ECOA. This new reality means that state and local law enforcement, and private litigants, must act now.

They have the tools to do so.

First, state attorneys general and bank regulators have authority to enforce ECOA directly through the Consumer Financial Protection Act. Congress presciently granted states the power to bring enforcement actions for any violation of federal consumer protection laws, including ECOA. State actors should aggressively enforce ECOA's prohibitions on discrimination against applicants and prospective applicants alike, and should enforce them against bank and nonbank lenders as well as the tech companies that partner with them. It does not matter that, under the Trump administration, the agencies charged with enforcing ECOA may take a different position on the law's reach; as the Supreme Court has made clear, courts are no longer required to defer to agency interpretations, and what matters is the court's reading of the statute. Courts have roundly rejected the notion that ECOA or the state enforcement provisions of the CFPA are as narrow as those statutes' detractors would like them to be.

Second, states need to prioritize strengthening, expanding, and enforcing their own fair lending laws to ensure that they address the risks of data harvesting and “black box” technologies that determine the availability and terms of credit. This includes:

  • Ensuring that state fair lending protections apply to the variety of ways that service providers and financial institutions push their products to consumers, including using consumer data to do targeted marketing or pre-screening of applicants for credit;
  • Requiring “explainability” in decision models used for financial products targeted to consumers and making sure that companies cannot hide behind assertions of “trade secrets” to prevent discovery into the methods underlying their technologies;
  • Expanding the categories of private plaintiffs that can sue to enforce fair lending laws — for example, organizations that can sue on behalf of their members;
  • Creating robust statutory remedies, such as statutory damages and deletion of misused or tainted data (and technologies developed with tainted data);
  • Validating the use of statistical evidence as a method of proof to establish discrimination claims (a brief illustration follows this list);
  • Exploring bans on the use of known proxy data, as has been done in other markets.
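On the statistical-evidence point, one long-standing screen borrowed from federal employment guidelines is the "four-fifths rule": if a group's approval rate falls below 80 percent of the most favored group's rate, the disparity merits scrutiny. The Python sketch below uses invented approval counts to show the arithmetic; it illustrates the method and is not a legal test for liability.

```python
# Hypothetical disparate-impact screen using the "four-fifths rule."
# The approval counts below are invented for illustration.

applications = {
    "group_a": (480, 600),  # (approved, applied)
    "group_b": (210, 400),
}

approval_rates = {g: ok / total for g, (ok, total) in applications.items()}
benchmark = max(approval_rates.values())  # the most favored group's rate

for group, rate in approval_rates.items():
    impact_ratio = rate / benchmark
    flag = "potential disparate impact" if impact_ratio < 0.8 else "ok"
    print(f"{group}: approval rate {rate:.0%}, "
          f"impact ratio {impact_ratio:.2f} -> {flag}")
```

Courts typically demand far more rigorous analysis, with regression controls and significance testing, but the point stands: unless statutes validate statistical methods of proof, even disparities this stark may never get a plaintiff through the courthouse door.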

States can also address lending discrimination through their own prohibitions on unfair, deceptive, or abusive acts and practices (UDAAP), and should ensure those consumer protection statutes are up to the task; private litigants can invoke the same prohibitions where state law allows. States can further reach the nexus of fair lending and data harvesting through state privacy and data protection laws, as well as laws addressing discriminatory ad targeting and surveillance pricing. On this front, it is critical that states avoid harmful carve-outs to consumer privacy laws and protections related to artificial intelligence that would effectively exempt financial institutions and technology companies from consumer protections.

The Trump administration's attack on fair lending, with the apparent applause of industry trade groups, is a wake-up call. We need more than just a band-aid for the next four years. We must make a long-term investment in private enforcement of federal civil rights statutes, and states and municipalities must develop and enforce laws that will keep consumers safe from the very real harms of these technologies. This requires foresight, creativity, cooperation, and tangible, immediate progress. It is absolutely necessary to fill the void left by the federal government's abdication of its duty to protect us.