Episode 10 – 1/25/2021
Are You Ready for a Fair Lending Revival?
Legal expert Jeffrey P. Naimon discusses fair housing rules and other likely regulatory changes on Arch MI’s PolicyCast video podcast.
0:07 Kirk Willison – Welcome to the Arch Mortgage Insurance PolicyCast. I’m Kirk Willison. There’s a common refrain in Washington that personnel is policy. Well, we now have a new president and Democratic control of both houses of Congress. Soon we’ll have a new HUD secretary and a new director at the Consumer Financial Protection Bureau. If personnel is policy, we’re likely in for quite a shakeup from the previous four years. The new administration has made it clear that it is highly attuned to inequality as it relates to access to credit, homeownership, wealth building and social justice. Rooting out the cause and effect of such inequality will be a paramount concern. The eyes of lenders, fintech companies, mortgage insurers, realtors, borrowers, consumer groups and lawyers will be fixed on how the new administration addresses fair housing and fair lending. My guest for the 10th episode of the PolicyCast is well positioned to offer insights and expertise in this arena. Jeffrey Naimon is a partner with the Buckley law firm here in Washington, D.C., with more than two decades of experience helping financial services providers and investors navigate regulatory, enforcement and transactional matters. Jeff is recognized as one of the best attorneys in the business; one legal publication wrote there is none better at arguing a disputed point. Fortunately, I don’t expect any disputes with Jeff. He’s been a friend for over three decades.
01:54 – Jeff, thank you very much for being on my 10th episode of the Arch Mortgage Insurance PolicyCast. With the new administration, it’s really a great opportunity for us to talk about an issue that’s going to be really prevalent, and that’s fair lending.
02:11 Jeff Naimon – Well, thank you, Kirk. It’s a pleasure to be here and to chat with you. And, you know, as we were discussing last week, the two of us have been working on these issues for fully 25 years now. So, it’s terrific to have this opportunity.
02:19 Kirk Willison – Great. Well, Jeff, let’s start our conversation at the 30,000-foot level. Are we likely to see a continuation from other administrations, or what’s likely to be different this time around?
02:39 Jeff Naimon – Great question. First, I think we will see some strong similarities to some prior Democratic administrations, less so to the prior Republican administrations, and there’s very likely to be an effort to draw clear lines and clear distinctions between the new Biden administration and the immediately past Trump administration. I think they’re going to work very hard on that. Even though it’s a new Democratic administration, there are a couple of priorities that concern the financial services industry. The first is the climate agenda. They’re going to be looking at how to build the climate agenda into various financial services policies. That’s a little bit off topic for us, but I wanted to mention it as one of the two breaking points between, say, the Obama administration and today, where I think it’s going to be a much more prominent piece. The second is what I’ll call, broadly, the social justice agenda. I think there’s going to be an effort to bring social justice concerns and racial justice concerns into basically every financial services issue. We’ve seen a lot of advocates focusing on what they call systemic racism, looking at every industry and every part of society as having created and perpetuated it, and the financial services industry will be viewed through that lens: how have you been perpetuating systemic racism? How are you perpetuating unequal outcomes? So, I think that’s going to be a big difference between this administration and the Obama administration, where you did not hear a lot of that kind of rhetoric or concern.
04:35 Kirk Willison – In addition to Congress, where you can certainly expect Congresswoman Maxine Waters, who chairs the House Financial Services Committee, to be active in this area, a lot of the work is going to be done in the agencies and offices of this administration. There’s going to be new leadership at the Consumer Financial Protection Bureau, and in fact President Biden has now nominated Rohit Chopra for that role. What is it that the housing industry ought to be looking at from the CFPB going forward?
05:08 Jeff Naimon – First and foremost, I think institutionally we are going to see changes at the Office of Fair Lending within the CFPB, which has been run by Patrice Ficklin basically since its inception. It was, in essence, downgraded by Acting Director Mulvaney into a much smaller office without its own enforcement power. Almost everyone I talk to expects that to be changed, with the Office of Fair Lending going back to having both regulatory and enforcement pieces, so it will become a higher priority within the CFPB. Within that, I think there will be alignment from the director on down to focus on the use of statistical claims in fair lending, most particularly what’s called the disparate impact theory of liability, which has been pretty controversial in a number of respects. Industry lawyers like me have written scholarly articles against the use of disparate impact; I wrote an article arguing it shouldn’t be recognized under the Fair Housing Act. However, the Supreme Court may not have read my article, or they disagree with me. So the law of the land is that you can make disparate impact claims under the Fair Housing Act. There is still some argument about whether it’s available under the Equal Credit Opportunity Act, although we know without any doubt how the CFPB will view that issue: they view disparate impact as clearly available under ECOA. So they have, in essence, two different laws available to them. The CFPB can only use ECOA; it doesn’t have authority over the Fair Housing Act. Only the Department of Justice and HUD have that authority. So we’ll see the CFPB using ECOA to bring disparate impact claims.
07:22 Kirk Willison – You talked a bit ago about disparate impact, and that focuses a little bit on how it relates to the Department of Housing and Urban Development. Consumer activists seem to be really engaged, very excited by having a new vision and a new administration at HUD, in particular looking at the disparate impact situation to make a stand for the administration on leveling inequities. Can you walk us through the recent history of that topic, and then what’s at stake today for the housing industry?
7:57 Jeff Naimon – Yeah, well, disparate impact is a really interesting concept because it goes a little bit beyond overt discrimination. Say someone came into my office wanting to rent an apartment, I saw that they were African-American, and I decided I’m not going to rent them an apartment: “You’re African-American, I don’t rent to African-Americans.” That’s overt discrimination. I literally don’t know anyone who doesn’t think that should be illegal; it clearly is. However, you could also have a landlord saying, hey, I have a policy that only people with FICO scores over 720 can get an apartment in my complex. That’s a neutral criterion. It’s not black or white, not male or female; it doesn’t use a protected class as part of the rule. However, we know from the statistics that that rule will have a disparate impact: it will negatively affect many minority groups more than white groups. And that’s where you get into the greater complexity: when is a neutral policy against the law? To my mind, there are two visions of disparate impact that have developed over the years. It started with a case in the early seventies, Griggs v. Duke Power, where all of a sudden the Duke Power Company decided it absolutely needed everyone in the company to have a high school degree, even if you were pushing a broom in a power generation facility. And the point of that new rule, which they instituted once they realized that Title VII and equal employment opportunity applied to them too, was to reduce the number of African-Americans they would hire.
That was why they did it, and it would have had that effect in the labor market they were pulling people from, because many fewer African-Americans had graduated from high school at that time, in the early 1970s. And the Supreme Court said, no, no, no, that’s disguised discrimination. That’s an artificial, arbitrary barrier to employing somebody. Why does someone need a high school degree to push a broom in your factory? That’s ridiculous. So that’s where this whole idea came from. On the one hand, there’s a view of disparate impact where, when people make up policies, really it’s just a disguised, slightly veiled effort to discriminate against African-Americans or women or older people or any other group of people who are a protected class. That’s unfair, and our country unfortunately has a long history of that kind of veiled “neutral” policy; a lot of New Deal legislation had similar pieces snuck in by segregationists to perpetuate segregation. On the flip side, another view of disparate impact is that it’s really a tool to create real equality: anytime you have a disparity between a protected class and a non-protected class, whatever institution created that differential, whether governmental or private, had better be able to explain exactly why it has that policy and why it’s a good policy if it has that effect, right? So, we go back to the landlord with the 720-FICO rental policy. Why do you have that policy? And the landlord would say, well, I don’t have an army of lawyers to go after people for rent. I don’t want to have to evict anyone. I only want to rent my apartments to people with very good credit, who I’m very confident will pay the rent timely, so I won’t have the cost of evictions.
That’s who I want to limit it to, and that’s their business justification. Then, on the flip side in a disparate impact case, even after the industry party says, “Here’s my business justification; this wasn’t irrational, I’m not trying to keep out any kind of person,” the plaintiff can say, “Well, wait a second. Aren’t there other ways? Can’t you look at my rental history and see that I’ve never been evicted before, never made a late payment on my rent, even though my FICO score is 715 and not 720? It’s ridiculous that I can’t rent your apartment.” So, the effort is to look for a less discriminatory alternative, and to ask why the institutional party being sued didn’t use that less discriminatory alternative rather than the more discriminatory mechanism it chose.
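For readers who want to see the kind of arithmetic behind disparate impact claims, here is a minimal sketch of a common first-pass statistical screen, the “four-fifths rule” from employment discrimination analysis, applied to the landlord’s hypothetical 720-FICO policy. The numbers are invented for illustration, and this is not a test Naimon describes; it simply shows how a facially neutral cutoff can be flagged statistically.

```python
# Sketch of a first-pass disparate impact screen (the "four-fifths rule").
# All figures are hypothetical; a real analysis would use actual application data.

def approval_rate(approved: int, applications: int) -> float:
    """Share of applicants who pass the neutral criterion."""
    return approved / applications

def adverse_impact_ratio(protected_rate: float, control_rate: float) -> float:
    """Ratio of the protected group's pass rate to the control group's."""
    return protected_rate / control_rate

# Invented example: the 720-FICO cutoff passes 60 of 100 protected-class
# applicants but 90 of 100 control-group applicants.
protected = approval_rate(60, 100)   # 0.60
control = approval_rate(90, 100)     # 0.90
ratio = adverse_impact_ratio(protected, control)  # about 0.67

# A ratio below 0.80 is commonly treated as preliminary evidence of
# disparate impact warranting further review (and a search for a less
# discriminatory alternative, as described above).
flagged = ratio < 0.80
```

A screen like this only starts the inquiry; as the discussion above notes, the defendant can then offer a business justification, and the plaintiff can point to a less discriminatory alternative.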
13:34 Kirk Willison – Interestingly, when industry has begun to use technology in some cases to try to weed out some of these discriminatory practices, there are risks there too. Just the other day, for instance, Federal Reserve Governor Brainard gave a speech on the use of artificial intelligence. Even though she said it can create great promise for doing good, she noted some significant risks as well: if AI models are built on historical data that reflect racial bias, or are optimized to replicate past decisions that may reflect racial bias, the models might amplify rather than ameliorate racial gaps in access to credit. She even warned that it could lead to digital redlining. So maybe you could give a little background on the implications of using AI for fintech companies, lenders and other providers of financial services, because the goal, from her vantage point, seems to be that we have to increase our focus on equity, not just on optimizing for efficiency in the operations of a financial institution.
14:52 Jeff Naimon – Yeah, and this has been an issue that’s been percolating for years at this point as industry parties, particularly fintechs and particularly credit underwriters and credit pricing entities, say there is a wealth of new data we can use to figure out who’s going to pay their loans back. I saw an interesting study out of Germany finding that, based on what device you used to access the website, how quickly you typed and one other seemingly trivial criterion, they were as good as a credit score at determining who would and wouldn’t pay back a loan, because they have the ability to reach in and find correlations in the data that go way beyond what we’d expect. On the other hand, of course, whenever you get on the internet you’re chased by ads for something you looked for three months ago, so you think, oh, maybe that big data isn’t so brilliant after all, right? But this issue has come up, and for the most part it’s been smaller entities, these new startups, that have been on it, using machine learning and artificial intelligence to determine creditworthiness and pricing, well beyond the credit scoring algorithms we’re all comfortable with. Banks haven’t gone that far, and bank regulators have been very hesitant about it; they’re very worried about the outcome issues from this data. One of the things the bank regulators have always been interested in is what the economists call heuristics: does it make sense? When a credit score asks how many times someone has been 90 days late on their credit, well, being 90 days late seems like it might be relevant to whether you’d be late on another credit obligation.
But what if one of these super smart fintech people finds out that people who buy red sneakers never pay their loans back, or they pay other loans but not auto loans? That’s a correlation they can show is statistically really strong, and they just discriminate against people who buy red shoes. Guess what? The bank regulators are going to say, what does buying red shoes have to do with it? And the data analyst, or data scientist, or whatever we’re calling people doing that work, says: I don’t care why it’s true. I know that it is true, so if I’m an auto lender and I know that people who buy red shoes don’t pay back auto loans, I’m going to find people who buy red shoes and not make them auto loans. That’s been a push and pull for the last three or four years especially; it’s been going on a little longer than that. One of the concerns here, and it’s a very legitimate concern, is that one of the nation’s biggest companies was using a hiring algorithm that looked at who had been successful in engineering jobs in the company. It turned out that the most successful engineers of that company had been men. Their hiring algorithm figured that out and decided, you know what, I’m going to toss all of the women out of my hiring pool, because obviously being male was an important attribute for being a successful engineer in this company. Now, I’m sure my telling of the story has oversimplified it in 11 different ways, but that’s the concern. When Governor Brainard says this existing data may embed past discrimination, that’s what they’re talking about. They’re saying, yeah, your machine may have figured that out, but if you follow that rule, we are never going to hire a female engineer and find out whether she can do the same job as a man can. So we can’t trust these algorithms a hundred percent.
And Governor Brainard is saying, look, financial institutions, we expect you to be looking at these things. Governor Brainard is widely expected to become the next chair of the Federal Reserve Board; I think most people in Washington think that’s what’s going to happen. So she’s a very influential policymaker, very well thought of by those who will be in the White House. That speech is worth a read for everyone who’s interested in these issues.
19:55 Kirk Willison – Governor Brainard talked about digital redlining, but there are other new theories of redlining emerging from policy debates and court cases. What are some of the cases that have caught your eye, and how are they changing our perception of redlining?
20:14 Jeff Naimon – There have been different eras of redlining, as it were, and different kinds of theories. If you look back to the 1940s, the FHA published maps with colors: red, yellow, blue, green. Basically, they didn’t want you to show up and ask for insurance on a property in the red area at all, and probably not one in the yellow area. And you could follow it: the red and yellow neighborhoods were very much the African-American neighborhoods, and if a city back then happened to have a big Hispanic neighborhood, it would be put in yellow while red was for the black neighborhood. It was pretty explicit discrimination on the basis of race. And following those lines, if you look at the first or second big Department of Justice fair lending case, it was against Chevy Chase, a bank here in Washington, D.C. They had a rule like: we only make loans secured by real estate west of 14th Street; east of 14th Street, we do not make loans secured by real estate. Well, guess who lived west of 14th Street here in Washington at the time? That was where the white people lived. And who lived east? That was where the black people lived. The city was still quite segregated, and that was the traditional line. So one of the bigger local banks that did real estate lending just didn’t lend east of 14th Street, and that was a problem. That’s what I would call traditional redlining: a bank draws a line, and that line is a line of racial significance. We then moved on to new versions of redlining where you don’t necessarily have a line that you drew, but it just turned out that all your lending was west of that line.
And none of your lending was east of that line. The government says, looks like you kind of have a red line there, even though you don’t have an official policy saying so. That’s what we call in our office “oops redlining”: it’s just, oh golly, sorry, I didn’t mean to do that. Those cases look a lot like the traditional redlining cases, but without an explicit policy saying I don’t lend on that side of the railroad tracks, on that side of the highway, on that side of that street. It’s more about what just kind of happened. And now the vast majority of the redlining cases in our office involve what you’d call comparative redlining. That’s a relatively new theory, new as of, you know, seven years ago, where the government says: look, your competitors in that market make twenty-five percent of their loans in minority neighborhoods, and you’re only making 10% of your loans there. Why is there such a big difference between you and your competitors? They’re not saying you didn’t do any lending at all; they’re saying you did less than your peers, whoever your peers are. And obviously, when you defend these cases, you say, well, that bank isn’t my peer, that lender isn’t my peer, and you argue about what the peer group is. But in essence, what they’re doing is inferring discriminatory intent, an effort to redline, from the fact that you just didn’t get the applications there. You did not get applications from minority neighborhoods, and they say, you must have intended that when you’re that far off from your peers.
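The comparative-redlining analysis described above is, at bottom, a peer comparison. As a rough sketch, with invented figures and an invented flagging threshold (regulators do not publish a single cutoff), the arithmetic looks something like this:

```python
# Hypothetical sketch of the comparative-redlining comparison described above:
# a lender's share of loans in majority-minority neighborhoods versus its peers.
# All names, figures and the flagging threshold are invented for illustration.

def minority_area_share(loans_in_minority_tracts: int, total_loans: int) -> float:
    """Fraction of a lender's loans made in majority-minority census tracts."""
    return loans_in_minority_tracts / total_loans

lender_share = minority_area_share(100, 1000)  # 10% of this lender's loans
peer_share = minority_area_share(250, 1000)    # peer group at 25%

# One simple screen: flag a lender whose minority-area share is well below
# the peer group's, e.g. less than half the peer share.
shortfall_ratio = lender_share / peer_share    # 0.4
flagged = shortfall_ratio < 0.5
```

As the discussion notes, the defense typically contests the peer group itself: whether the comparator lenders are truly comparable in size, footprint and product mix.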
24:58 Kirk Willison – Final question, Jeff. What advice are you giving your clients to help them mitigate the risk and prepare for this new day in Washington when it comes to fair lending?
25:15 Jeff Naimon – I’m not sure it’s one piece of advice. First, I think it’s important to try to understand what the government is trying to do. They’re not trying to ruin the country or the industry; they’re doing it because they have a policy vision of how they want the world to look, and it’s not a crazy policy vision. So you start by trying to understand what’s coming at you and why. The second thing is that it’s important that everyone in the company knows: look, we’re trying to be non-discriminatory. Our whole company is trying not to discriminate, and it’s all of our jobs together, whether you’re a credit analyst or a customer service worker. We’re all working on this together, so get people, in essence, on board. Third, it’s important to have a good compliance management system in which you’re doing a lot of testing and monitoring. Are you actually applying discretion equally and fairly when you look at your lending statistics? Are loans priced similarly for similar-risk customers? That’s the crux of the issue. And to the extent you’re not hitting the marks you hoped to hit, you need to be digging in: why am I missing my mark? What is causing this to occur? And my last piece of free advice is: make sure your rate sheet really accounts for loan size. The biggest set of issues I see comes from rate sheets that don’t adequately account for the limited profitability of very small loans and the limited margins available on much larger loans. When your rate sheet doesn’t account for that properly, your loan officers end up having to make exceptions in different ways, and those exceptions get scrutinized and end up being difficult to defend.
But when you have it in your rate sheet and it’s policy, you put yourself in a better position in terms of the analysis of your lending.
27:35 Kirk Willison – Jeff, a very informative discussion. I’ve learned a lot today, and I’m sure our viewers have as well. Thanks for taking the time to walk us through it. It’s going to be a fascinating four years under the Biden administration on the issues you’ve talked about today.
27:48 Jeff Naimon – Thank you so much, Kirk.