Breaking the Black Box
Basecamp co-founder David Heinemeier Hansson sparked a national controversy this week when he posted a series of livid tweets about how his wife received a much lower credit limit than he did on their Apple Cards, despite applying with the same financial information. What began as a rant against opaque algorithms turned into a regulatory investigation and more. In this episode, Dr. Ruha Benjamin of Princeton University and entrepreneur Mara Zepeda, co-founder of the XXcelerate Fund and Zebras Unite, talk about how the tech and financial sectors perpetuate systemic inequalities and how to start repairing the damage—or building something more equitable and inclusive from the ground up.
- Apple Card - 1:31
- DHH's Twitter thread - 1:53
- Steve Wozniak's response - 1:58
- The New York Department of Financial Services' announcement of its investigation into Goldman Sachs - 2:11
- "About the Apple Card" (Jamie Heinemeier Hansson) - 2:28
- Ruha Benjamin's website | Twitter - 2:57
- Race After Technology by Ruha Benjamin - 2:59
- University of Michigan article about the failures of the state's MiDAS system - 7:31
- "Racial bias in a medical algorithm favors white patients over sicker black patients" (Washington Post) - 8:18
- "Biased bots: Artificial-intelligence systems echo human prejudices" (Princeton University) - 12:40
- "Amazon scraps secret AI recruiting tool that showed bias against women" (Reuters) - 14:15
- Ruha Benjamin's Resources - 15:29
- Joy Buolamwini on Twitter - 16:52
- "Atlanta Asks Google Whether It Targeted Black Homeless People" (NYT) - 17:40
- Tuskegee Study, 1932-1972 (CDC) - 18:35
- Atlantic article about J. Marion Sims and his experiments on enslaved women - 18:51
- NYT article about China's use of facial recognition technology to oppress the Uighur minority - 20:10
- "Somerville Bans Government Use Of Facial Recognition Tech" (WBUR) - 20:38
- "Can you make AI fairer than a judge? Play our courtroom algorithm game" (MIT Technology Review) - 22:31
- "Brooklyn students hold walkout in protest of Facebook-designed online program" (New York Post) - 23:55
- Data for Black Lives - 26:25
- "St. Paul, Ramsey County to end youth data-sharing agreement after withering criticism" (Pioneer Press) - 26:58
- Mara Zepeda's website - 30:20
- Mara's company, Switchboard - 30:39
- Business for a Better Portland | Zebras Unite | XXcelerate Fund - 30:47
- Mara Zepeda's tweet - 31:00
- "Where Are the Start-Ups? Loss of Dynamism Is Impeding Growth" (NYT, 2018) - 33:21
- "Funding for Female Founders Stalled at 2.2% of VC Dollars in 2018" (Fortune) - 33:42
- Portland Business Journal article on "capital chasm" for black entrepreneurs - 34:37
- "2014: An Important Anniversary for Women and Credit" (NerdWallet) - 36:54
- Washington Post op-ed on redlining and other historical factors contributing to the racial wealth gap - 40:26
- "First Women's Bank would be Chicago's first bank startup in 12 years" (Crain's Chicago Business) - 42:52
- The Equal Credit Opportunity Act - 43:06
- Colorado Lending Source - 43:41
- "Here's Why So Many Americans Feel Cheated By Their Student Loans" (BuzzFeed News) - 44:10
- "Why Women-Owned Startups Are a Better Bet" (Boston Consulting Group) - 44:50
- Kauffman Foundation - 46:02
- Jason Fried's episode on Zebracast - 49:26
- Sen. Ron Wyden's PROGRESS Act - 52:38
- America's New Business Plan (PDF) - 53:40
- The Verge article about the proposed Algorithm Accountability Act - 54:15
The Full Transcript:
[00:00:00] Anyone You Meet Normcore Remix by Clip Art plays.
Shaun: [00:00:02] Rework is a podcast by Basecamp. Introducing Basecamp Personal, the free way to run Basecamp for freelancers, students, families, and personal projects. Give it a try at Basecamp.com/personal.
Ruha: [00:00:15] Our starting assumption should be that automated systems will deepen inequality unless proven otherwise. So it’s kind of like guilty until proven innocent approach. Whereas, I think before 2016 many people just assumed that they were going to be better, more neutral, more equitable than their human counterparts.
Mara: [00:00:36] And it’s not just Apple, it’s not just Goldman Sachs. Those are two products and companies that are sitting on top of a financial system that is inherently biased and flawed and discriminatory.
[00:00:48] Broken By Design by Clip Art plays.
Wailin: [00:00:49] Welcome to Rework, a podcast by Basecamp about the better way to work and run your business. I’m Wailin Wong.
Shaun: [00:00:54] And I’m Shaun Hildner. We have a special bonus episode for you this week because Basecamp’s co-founder and CTO, David Heinemeier Hansson landed in the national news.
NBC Nightly News: [00:01:03] Software developer and millionaire David Heinemeier Hansson says he and his wife share assets and income, but Apple Card gave him a credit line 20 times higher than hers, even though she has a higher credit score.
David: [00:01:17] It seemed very discriminatory that I would get 20 times the credit limit, even though our stats were the same.
Wailin: [00:01:24] To recap what you just heard, here’s what happened. Both David and his wife Jamie applied for the Apple Card. This is the new zero annual fee digital only cash back credit card that Apple developed with Goldman Sachs. They put down the same household income and Jamie has a higher credit score, but David got 20 times the credit limit that Jamie did and then when they contacted Apple’s customer service to ask about this, the first representative told them it’s just the algorithm.
Shaun: [00:01:52] So David sent out a series of very angry tweets. This tweet storm went viral. Steve Wozniak, the co-founder of Apple, piped up and said he also got a much higher limit on his Apple Card than his wife did. Jamie Heinemeier Hansson did get her credit limit raised, but the angry tweets had already caught the attention of the New York State Department of Financial Services. It announced an investigation into Goldman Sachs, which is the bank behind the Apple Card.
Wailin: [00:02:14] David’s already made the rounds on national media about this, so you won’t be hearing from him today. Instead, we’re going to talk about some of the bigger issues that came up as a result of this whole controversy.
[00:02:27] Jamie Heinemeier Hansson wrote an essay about her experience where she said, “This is not merely a story about sexism and credit algorithm black boxes, but about how rich people nearly always get their way. Justice for another rich white woman is not justice at all.”
Shaun: [00:02:41] On today’s show, we dive into injustices faced by the many people who are not rich and white. We talk about the unjust systems that perpetuate oppression against marginalized communities and how this inequality gets baked into the algorithms that govern our lives. We’re going to kick things off with Dr. Ruha Benjamin. She’s the author of a new book called Race After Technology and an Associate Professor of African American studies at Princeton University.
Wailin: [00:03:05] How did you first get intrigued by the idea of looking at the intersection of race and technology?
Ruha: [00:03:13] A few years ago I noticed a bunch of headlines and hot takes about so-called racist robots. And that piqued my interest because I’d been studying the social dimensions of science, technology, and medicine for a while, but not in the context of the data sciences. So I was interested in how the problem was being framed. This sort of shock that seemed to go along with a lot of the stories, that people couldn’t believe that technology could actually be biased. Whereas someone who understands the history and sociology of these fields would probably be less surprised that technology inherits, in some ways, its creators’ biases.
Wailin: [00:03:53] Right. So you were seeing surprise from kind of the general public and some of the people tasked with covering these issues. Surprised that technology is not neutral. Like, you were seeing holes being poked in that fallacy.
Ruha: [00:04:06] Precisely. And also the sense in which people who were discussing it in journals and magazines and on TV were not necessarily drawing upon the fields that have been looking at these issues for a long time. It’s as if the technology was inaugurating some new problem rather than extending an existing problem. So what motivated me was to bring this study of systemic inequality into conversation with the data sciences to generate some new insights.
Wailin: [00:04:40] When you were doing your research and diving into the systems and trying to kind of resurface the history of them, what were some of the earliest examples of an algorithm that you could find? I mean, have algorithms been shaping our lives longer than we might think they have been?
Ruha: [00:04:59] I think so. And I think that in some ways we’re reading algorithmic talk back onto a whole set of practices that have to do with automated decision making, quantification, statistics. And so I do sense that the word algorithm, and what we name in terms of algorithmic discrimination, is being applied in a porous way to many things that at the time may not have been described in those terms. And today, I think we’re really concerned about automated decision systems that seem to black box what goes into the decision, that seem to hide the variables and the ways in which predictions are created. And I think it’s that sense of not being transparent, of not knowing what’s going into the decision, that elicits the caution and the kind of harm that goes along with our current state of play.
Wailin: [00:05:58] If you look back at the way, let’s say, insurance rates used to be calculated by someone poring over an actuarial table, I mean, that’s kind of like a set of data from which you extract certain prediction models and things. And that would have been done maybe by a human with a calculator, and that’s automated now. But certainly even then, those systems would have contained the assumptions and biases of their creators. Right?
Ruha: [00:06:22] Absolutely. And so that’s exactly the way in which we could describe that as an algorithm, or algorithmic decision making. And so part of what we want to think about is not just the abstraction, but also the way in which the decisions that go into it are hidden from view, and fewer people have access to the decisions that are shaping their lives, whether it’s in finance or healthcare or criminal justice. And so there is definitely a continuity with this older form of abstraction that you’re describing.
Wailin: [00:06:57] Can you dive into some real world examples of how you’ve seen modern day algorithms play out in unjust ways? You named a few different industries, healthcare and criminal justice. Can you tease out some of what you’ve observed in those arenas?
Ruha: [00:07:13] Sure. We can look to the way automated decision systems have been employed in the context of administering public benefits like unemployment benefits. We know that the state of Michigan, a few years ago, adopted a system called MiDAS that was going to identify cases of unemployment fraud, and it did so in tens of thousands of cases. Now we know, after the fact, that in about 93% of the cases it incorrectly identified individuals as engaged in unemployment fraud. Meanwhile, people were hit with charges. They lost their homes, they filed for bankruptcy, some people filed for divorce or committed suicide, all because of this prolonged process of being incorrectly identified as trying to defraud the state. And so this is an example where the assumption that this automated system would do better than its human counterparts took a real toll on people’s lives.
[00:08:10] Another example is in the context of health care. A few weeks ago a new study was published that showed that a health care algorithm that identifies patients who need more services and more attention was undercounting black patients. That is, white patients were much more commonly identified for this digital triaging service. In this case, however, the researchers were able to access what was a proprietary algorithm, which is not usually the case. And they were able to identify the mechanism that was creating this racial disparity. In this case, the main predictor that was used to decide which patients would receive services was cost. That is, patients who had incurred more costs in the past were thought to be sicker, and so they would receive more services in the future. But on average, black patients, for a variety of reasons, incur fewer costs, and so they were being under-identified, and so they were not receiving services that would’ve kept them out of the hospital.
[00:09:11] This is one of those examples in which there was no explicit identification of race in the algorithm. It was a race neutral algorithm and it was precisely by ignoring the histories of race and racial exclusion or marginalization in the healthcare system, the algorithm reproduced the inequality. Not by malicious intent but by indifference to this social reality.
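The cost-as-proxy mechanism described above can be sketched as a toy simulation. Every number below is invented purely for illustration; this is not the actual algorithm or data from the study:

```python
import random

random.seed(0)

# Toy simulation of proxy bias: two groups with identical underlying
# health need, but group B historically incurs lower costs for the
# same level of need (e.g., due to unequal access to care).
patients = []
for _ in range(1000):
    need = random.uniform(0, 1)               # true sickness level
    patients.append(("A", need, need * 100))  # cost tracks need
for _ in range(1000):
    need = random.uniform(0, 1)
    patients.append(("B", need, need * 70))   # same need, lower cost

# A "race-neutral" triage rule: flag the top 20% of patients by past cost.
flagged = sorted(patients, key=lambda p: p[2], reverse=True)[:400]
share_b = sum(1 for p in flagged if p[0] == "B") / len(flagged)
print(f"Group B share of flagged patients: {share_b:.0%}")
```

Even though the rule never mentions group membership, group B ends up well below half of the flagged patients, because cost is a biased proxy for need.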
Wailin: [00:09:34] Yeah, I mean, I found that example particularly sobering because I feel like there must’ve been people who looked at that and said, well, you just train the robot not to look at race. But that’s actually how it was designed, and it still came out with this result.
Ruha: [00:09:48] Absolutely. In part that reflects a fundamental, and in some cases willful, ignorance that many people continue to adhere to: the assumption that discrimination is only ever based on explicit markers and explicit attempts to discriminate. That is, if you don’t put up a job ad that says no women need apply or no blacks need apply, if it’s not that explicit, we try to write it off as thereby not discriminatory, when in many more cases we have coded ways of identifying types of people who we don’t want to be part of whatever the issue is, whether it’s a job or housing or healthcare. And so it’s being able to discern the coded ways in which discrimination has always operated that will allow us to understand how it’s now being encoded into our computer systems. It’s partly about moving away from looking for only the most obvious forms of discrimination as the hallmark of bias.
Wailin: [00:10:51] Right. So how are these systems generally trained? They have to be fed some kind of data, right? And so can you talk about the problems that are inherent in just getting the systems off the ground?
Ruha: [00:11:05] So there are many systems that require training data. Let’s say you’re building an algorithm to identify people who you want to hire for a job, a recruitment algorithm. You’re likely to use as your training data people who are already doing well and working in that place of employment. And so the existing employee profile becomes the training set to identify more of the same, which means that, historically, if a particular profession or workplace has discriminated against women or discriminated against members of the Latinx community, that’s going to comprise the training data for who you’re going to identify as future recruits.
[00:11:46] And so that’s, in some ways, a cleaner, more identifiable source of the bias. Then there’s a lot of machine learning where the training is less explicit. So, natural language processing algorithms that are trained on human writing online, where the algorithm is making associations and connections that are not necessarily identifiable, even when you ask the people who’ve designed it, can you tell me how X, Y, and Z came to this conclusion? The actual programmers have a hard time tracing the chain of decision making. And so, in that case, we see again that natural language processing algorithms tend to make the same biased associations that psychologists have identified in humans.
[00:12:31] I have some colleagues over at Princeton, computer scientists, who tested to see whether one very popular natural language processing system would make associations with black sounding names and white sounding names that have been shown in humans when they’re deciding who to hire for different jobs. We know through classic audit studies that employers often use names as a code for race and have been shown to discriminate against black sounding names. So my colleagues wanted to see if this popular algorithm would make the same associations.
[00:13:04] And indeed, it did. It associated black sounding names with more negative sounding words, and white sounding names with positive sounding words. And in this case, it was not the deliberate design of the programmer that made that connection; the system learned it through the associations it found online and in our writing, and reproduced the pattern. And so, in that case, it becomes even harder to think about the intervention. We can see the bias through the outcomes, but how do you actually train the system not to make those associations that are so prevalent in our speech and in our writing online?
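The kind of association test described here can be illustrated with a small sketch. Real audits (like the Princeton study linked above) measure these associations in word embeddings learned from large text corpora; the tiny hand-made vectors below are invented purely to show the arithmetic of a cosine-similarity association score:

```python
import math

# Hypothetical 2-D "embeddings" for demonstration only. In a real audit,
# these vectors would come from a trained model (e.g., GloVe or word2vec).
vectors = {
    "emily":      (0.9, 0.1), "greg":  (0.8, 0.2),  # white-sounding names
    "lakisha":    (0.1, 0.9), "jamal": (0.2, 0.8),  # black-sounding names
    "pleasant":   (1.0, 0.0),
    "unpleasant": (0.0, 1.0),
}

def cosine(a, b):
    # Standard cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def association(word):
    # Positive = closer to "pleasant", negative = closer to "unpleasant".
    v = vectors[word]
    return cosine(v, vectors["pleasant"]) - cosine(v, vectors["unpleasant"])

for name in ["emily", "greg", "lakisha", "jamal"]:
    print(name, round(association(name), 2))
```

With real embeddings, a statistically significant gap between the two name groups' association scores is what signals the learned bias; here the gap is simply baked into the made-up vectors to show how the score behaves.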
Wailin: [00:13:42] And you’ve talked about how ingesting more data can sometimes result in even greater harm. I can kind of see how that happens. Even though it sounds counterintuitive. You think, oh, the more data they have, the smarter they can be. But it seems like in the example you just talked about, if they’re ingesting more and more data, but all of that data comes from that same place of bias, that it just then reinforces that bias.
Ruha: [00:14:07] Exactly. So I’m thinking about a case a few years ago in which the recruitment algorithm that the company Amazon used for hiring was found to discriminate against women. Now, even when they took explicit mention of gender off the resumes, the algorithm made more and more intelligent associations based on words that were coded in very gendered ways. So, for example, the way that women tend to talk about our work versus men, the algorithm learned this pattern of association and thereby excluded based on that rather than on explicit mention of gender.
[00:14:46] And that’s why, when people say we’re going to ignore gender or we’re going to ignore race, we’re going to take it out of the equation, that assumes that discrimination operates only through explicit mention of these categories, rather than on a much subtler level, encoded into various kinds of speech. And so, in some ways, as you say, the more intelligent these systems become, the more racist and sexist they’re likely to become.
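The proxy-leakage problem described here can be sketched in a few lines: even a model that never sees the protected attribute reproduces the historical bias through a correlated feature. The data below is entirely hypothetical, with the bias deliberately exaggerated; this is not Amazon's actual system:

```python
import random

random.seed(1)

# Hypothetical hiring data. "proxy" stands in for a gendered signal on a
# resume (e.g., membership in a women's organization); "hired" encodes a
# historical labeling process that strongly favored men.
rows = []
for _ in range(2000):
    gender = random.choice(["f", "m"])
    proxy = 1 if (gender == "f" and random.random() < 0.8) else 0
    hired = 1 if (gender == "m" or random.random() < 0.2) else 0
    rows.append((gender, proxy, hired))

# A "gender-blind" model: it never sees gender, only the proxy feature,
# and scores candidates by the historical hire rate for that feature.
def score(proxy_value):
    hires = [h for (_, p, h) in rows if p == proxy_value]
    return sum(hires) / len(hires)

print("hire rate, proxy=1 (mostly women):", round(score(1), 2))
print("hire rate, proxy=0 (mostly men):  ", round(score(0), 2))
```

Because the proxy is correlated with gender and the training labels carry the historical bias, the "blind" model still systematically scores the group with proxy=1 lower, which is the dynamic reported in the Amazon case.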
Wailin: [00:15:12] We’re really looking at retraining these algorithms, right? Are there some practical ways to start doing that?
Ruha: [00:15:20] There are a number of people who are working on various auditing methods. I have a whole list of resources on the resources tab of my personal website, RuhaBenjamin.com, that points you to organizations and initiatives in academia, in industry, and in the public sector, thinking critically about what it will look like to develop public interest technology. Because so long as the driving force behind tech development is the bottom line, other types of values are going to take a backseat or be given lip service. And so there’s a whole host of initiatives that are working not just on tweaking algorithms but on actually transforming the social infrastructure in which algorithms operate.
[00:16:07] Again, these automated systems aren’t creating a new problem, they’re extending a long-standing problem, and there are organizations that have been working on those problems that need to be part of the conversation. The issues are too important to just leave to the technologists to handle. And not even just people who think of themselves as ethicists; there are so many fields, and so many communities impacted by these systems, that need to be part of the debate and the conversation.
Wailin: [00:16:34] Does some of the conversation also revolve around the datasets that get fed into these systems? Is there talk about collecting different kinds of data, better data, and kind of starting at that level before you’re even piping them into the machines?
Ruha: [00:16:49] Certainly. My colleague Joy Buolamwini at MIT was one of the researchers who exposed the discriminatory processes that are part of facial recognition systems. She is one of those working to develop, or encourage, a diverse training set for facial recognition systems, because these systems are being used in so many different areas of our life and they have a hard time detecting individuals who have darker skin, especially black women. That means there’s a likelihood of false positives when it comes to a lot of these systems, so she’s one of many who is working on creating a more diverse training set.
[00:17:28] Now, there’s a caution here, because a few weeks ago Google, right before its Pixel 4 phone came out, hired contract workers to target homeless black people in Atlanta and get facial images from them in order to try to diversify its training data.
[00:17:45] And so it’s important to recognize that the ends don’t justify the means. You can’t try to work for an inclusive product through a coercive process. They were giving people $5 gift cards and not telling them what their facial images were going to be used for. And we only know about this because a contract worker spoke out because they thought something was wrong, and Google pulled the project. And so we can’t just sort of race for diversity and inclusion and still rest that on processes that really undermine people’s agency and autonomy in making decisions about whether they want their faces or other biometric info to be part of these systems.
Wailin: [00:18:25] Gosh, that just seems like some new twisted dystopian version of Tuskegee, you know, like preying on vulnerable populations in the name of science.
Ruha: [00:18:35] And I thought of that, and that’s sort of what I was hinting at. Communities that have had this history, whether it’s Tuskegee, whether it’s prison experiments, whether it’s enslaved black women whose bodies were used to hone gynecological techniques. J. Marion Sims. All of these examples from this country and abroad in which the bodies of the most vulnerable have been used to hone science and technology. It’s surprising you don’t have one person around that Google table to raise their hand and say, even if it’s just about optics, even if you really don’t care about people, that this is going to go sideways. It would be useful to have someone with even a little bit of historical or sociological literacy around the table to say that building your product on the backs of the most vulnerable through this process is not a good way to go. And it also speaks to the fact that they don’t have enough black employees at Google to do the same and offer their facial images. So it says something about the diversity of the tech workforce as well.
Wailin: [00:19:40] I mean, on the subject of facial recognition. I saw something that really, really just chilled me to the bone the other day. I believe someone on Twitter shared a screenshot from what I believe was marketing materials from a Chinese facial recognition technology company, where the screen showed two Chinese faces. One said Han Chinese with a green check mark, and the other was a different face that said Uighur with a red X. And I thought, well, that’s terrifying. And then when you were talking about facial recognition earlier, I was thinking, are there certain automated systems that, because we seem to only use them for ill ends, should not even exist? Like it’s not even worth the effort of retraining them. They should maybe just not exist.
Ruha: [00:20:34] Absolutely. And that’s the conclusion that many cities are coming to at this point. From San Francisco to Somerville, a number of municipalities around the country are banning or creating moratoria around facial recognition systems, especially use by the police. And so certainly, on one end of the spectrum, there are technologies that we either know are likely to do so much harm that they’re not viable, or that are simply faulty. In some cases it’s not even simply that they target and harm, in this case, racialized populations, but also that they often make incorrect decisions and incorrect identifications. And facial recognition has both of those things going on. So I would say let’s focus on that, but let’s also train our attention on technologies that are promised to do good.
[00:21:24] There’s a whole host of technologies that I describe in my book as technobenevolent. That is, they’re framed, and many people think of them, as having a benevolent impact on society. So for example, those AI recruitment technologies: a lot of those companies frame those technologies as skirting bias and getting around the human bias that’s often part of the employment decision. And yet we should look as critically at those technologies that are “for good” as we do at those that more obviously harm people. Because there’s a lot of ways in which the harms can come in through the back door of benevolence. And so we should train our attention on that as well.
Wailin: [00:22:02] Right. And I know you’ve also spoken before about how technology is often used as a way to address social problems, as kind of a short-term substitute for addressing the economic or racial or whatever kind of injustices that persist in society. And I think you’ve talked about algorithmic bias in the criminal justice arena and how the technology that judges use is very racially biased. And it got me thinking, because that technology was probably created because the criminal courts’ caseload was overwhelming, so they needed to bring in something to help them. But then the technology makes it worse. And it also is not a fix, because the technology doesn’t address all the things that are contributing to an overly criminalized society.
Ruha: [00:23:00] Absolutely. And so in many cases we need to ask whether we really need a technology for that, or an app for that, or a new software program for that. It’s not about being against technology, but about being more discerning about trying to employ technical fixes for what are really deep, long-lasting social problems. And too often we revert to technical fixes that not only don’t address the underlying problem but create new problems that we then have to deal with. In many cases, what we need are people who ask the obvious question: do we really need to develop it just because we can? Whether it’s in the context of criminal justice or health care or education.
[00:23:43] I’m thinking now about so many educational technologies that have been rolled out across the country and now we see students, families, teachers revolting against them. There was a walkout among students in Brooklyn last year and they were basically sick and tired of sitting behind a screen. They had about 15 to 20 minutes of time with their teacher a week. And for them, they didn’t regard that as real education. And so, I’m sure whoever thought to introduce that technology and that system into the schools had all kinds of great buzzwords, all kinds of great ways in which the education was going to be tailored to this group and that group. But the experience of the students was that it was a very poor substitute for true human connection in the context of education.
[00:24:29] I think we can say that was likely to be true in the context of health care and the context of other spaces in which these systems have a lot of promise in their framing, but when we come down to it, we know what to do in terms of creating a quality education. That’s not rocket science. It’s more about the political will to pay teachers what they deserve, to invest in schools no matter the socioeconomic demographics or the neighborhood. We know what to do to create quality education. We don’t need a new shiny system to promise us something that in fact it will not actually deliver.
Wailin: [00:25:05] That’s neat, the example you shared about the students in Brooklyn walking out in protest over the rollout of this technology. It reminded me of what you said earlier about the contract workers at Google saying something when they really objected to the data collection for the facial recognition technology. It seems like you’ve named a couple of ways where we’ve seen individuals or small groups of people opting out or resisting, and so it seems like there are ways to do it. We’re not just engaged in some hopeless struggle against faceless machines that have been set in motion and are continuing kind of inexorably towards some bad future.
Ruha: [00:25:42] Yeah. We need to move now from a kind of paranoia and paralysis, thinking that the problem is so big we can’t tackle it, to really engaging the forms of power that we know to work. Whether it’s community organizing, whether it’s organizing workers. There’s a movement happening within the tech industry, in communities, and at the national level. Again, so many different organizations and initiatives are underway, and so now it’s really just a matter of people getting involved, whether at the local or national level. And everyone, not just people who think of themselves as tech savvy or tech oriented, because these technologies are affecting all of our lives. There’s a wonderful organization called Data for Black Lives that’s building a network of technologists, community activists, and people working on policy to come together to build a social infrastructure that can produce technologies truly in service to the needs of communities as they define them, rather than as defined by companies whose driving motive is the profit imperative.
Wailin: [00:26:47] Have there been some interesting projects out of that initiative that you’ve already seen be deployed?
Ruha: [00:26:51] Yeah. There’s a great example out of St. Paul, Minnesota, in which the public school system and the police department created a controversial joint powers agreement they called the Innovation Project. What this data collection project was intended to do was gather data on young people in St. Paul in order to predict whether youth were at risk for dropping out of school or committing crimes, etc. Basically labeling youth at risk through predictive analytics. People in the Data for Black Lives network got wind of it, including those local to St. Paul. Over 20 local organizations joined together and, over the course of a year, held community gatherings and meetings, showed up to city council meetings, and eventually had the city dissolve this agreement in favor of a more community-led approach that catered to the needs of youth. Rather than labeling them at risk based on some abstract data, actually serving them and what they actually need to thrive and do well.
[00:27:55] And so this is an example of the existence of this network allowing for quick mobilization at the local level to deal with this particular initiative. And now the organizers in St. Paul are offering trainings and learnings that other locales can use in order to keep track of data tracking. In some ways, what we need is for communities to mobilize and to gain the capacity to see these systems and speak back to them before they’re actually rolled out. Because once a system is in place, it’s much harder to get rid of, for a variety of reasons: a lot of different people become invested in its continuance. And so it’s important, before these things are implemented, whether in the public sector, education, or healthcare, for communities to be empowered to actually question and resist them.
[00:28:44] I would just encourage people to think about the social inputs of technology, not just the social impacts, and to question the inevitability of technology. When you only talk about the social impact of X, Y, and Z technology, in some ways you’ve already given over your power. You’ve already assumed that the technology is a given. It’s inevitable. We can’t question it. Now, we just have to mitigate the harms. But what I would really like to encourage more people to be aware of is the story before the technology actually is developed. What are the social inputs? Who is around the table? What forms of knowledge and experience are going into the creation and the design of the digital and material infrastructure that impacts all of our lives?
[00:29:28] Anyone You Meet Normcore Remix by Clipart plays.
Shaun: [00:29:29] After the break, we’ll dive into the inner workings of one unjust system in particular, but first, let me tell you about a new offering from Basecamp called Basecamp Personal. Basecamp Personal is a completely free version of Basecamp designed with freelancers, students, families and personal projects in mind. Use it for hobbies, weddings, small events, side projects, or volunteer gigs. Learn more and sign up for free at Basecamp.com/personal.
Wailin: [00:29:58] You heard from Dr. Ruha Benjamin about how the automated systems we’ve created can perpetuate social inequalities. In the case of the Apple Card from Goldman Sachs and the disparity in credit limits between David and his wife, the algorithm was reinforcing barriers to capital that women and people of color have faced for generations. In part two of today’s episode, I sit down with Mara Zepeda. She’s a tech founder and CEO who’s helped start a number of initiatives around supporting women entrepreneurs and building more sustainable, ethical, and inclusive businesses. Here’s my conversation with Mara about our broken financial system and what we can do about it.
Mara: [00:30:36] I’m Mara Zepeda. I am the co-founder and CEO of a company called Switchboard. In the process of starting that company, I became the co-founder and founding board chair of three organizations: Business for a Better Portland, Zebras Unite, and the XXcelerate Fund.
Wailin: [00:30:53] I wanted to start our conversation with a tweet in response to David. You said, “If I could flip a switch and have everyone care about one issue, it is honestly access to capital. The environment, democracy, transportation, healthcare, affordable housing, education, every decision is made by people who have or do not have power and capital is power.”
[00:31:16] When did you get interested in this very core question of access to capital? Who has it? Who doesn’t? Was it pretty early on in your career? Did it spark a change in career?
Mara: I think I’ve been interested in this question my entire life in many regards. I came from a family of artists, and artists have a very particular way of allocating capital and resources. It’s very democratic. It’s a feast-and-famine mentality, and so you’re constantly watching capital cycle through community in creative ways. And then in my adulthood, I became an economic reporter for National Public Radio, and I started to do investigative economic reporting for shows like Planet Money and Marketplace.
[00:32:05] I was reporting out of Philadelphia and that is really where I started to see that every story is an economic story foundationally. So fast forward, once I started my business and became a woman entrepreneur through a long and winding path, I started to look into how to capitalize my own company. I needed financing and my investigative reporter hat led me to look at the financial systems that are the foundation of opportunity in our country. When I started to look under the hood of those systems, I started to see the massive inequality and lopsided distribution of those resources that really prevents people from imagining and creating a new future for all of us. And so, concurrent to running my company, I’ve started a lot of other initiatives to try to increase access to capital and engage local communities in this issue that I feel really passionately about.
Wailin: [00:33:06] Can you talk about some of the ways that you tried to look for capital in those early stages of getting your own initiative off the ground and what barriers you faced?
Mara: [00:33:15] Yeah, I mean, contrary to the popular misconception, I guess, we’re actually at historic lows for entrepreneurship. And that really comes down to a lack of access to capital. Why this has happened comes down to two forces—two among many. On the one hand, I’m sure that your podcast listeners are familiar with venture capital, and David and Jason have done such a great job of debunking that system. But you know, women receive about 2% of venture capital to start and grow their companies. Generally speaking, venture capitalists are looking for outsized, unicorn-sized returns, and they are pattern matchers. They’re really looking for the next Mark Zuckerberg. And if you don’t match that pattern, and if you don’t tell that story, you don’t have access to that capital.
[00:34:05] The recourse, then, is to look to the traditional commercial banking industry. People would say, okay, well, why not get a loan? But ever since the financial crisis, commercial lending has consolidated, so primarily it’s big banks that are doing commercial lending. Why that’s a problem is because it doesn’t make banks money to underwrite, service, and issue loans that are under $250,000. What that means is that in the state of Oregon, where I live, as an example, we are down $1 billion in small and medium-sized business lending. And for women and people of color, that’s really the size of loan they need—and it’s the size of loan that’s not out there for them to get. That’s a capital chasm: women receive 2% of venture capital and 5% of commercial bank loans.
[00:34:59] So it’s just a minuscule amount that they receive in the commercial lending industry. And that’s really where this inequality sits.
Wailin: [00:35:06] When you look at lending and the way that these banks determine who is creditworthy and who should get what size loan—this might apply more in the personal creditworthiness space, but there are the five C’s of credit, right? Can you talk about some of that criteria, and why, both historically and persisting today, women and people of color usually come out of that equation looking less creditworthy to the banks?
Mara: [00:35:36] The five C’s of credit are character, which is the person’s integrity; capacity, which is that they have sufficient cash flow to pay the loan back; capital, which is their net worth; collateral, which is the assets to secure the debt; and then the conditions of the borrower and the overall economy, including their credit score and other factors.
[00:35:58] At the end of the day, really, it’s the credit score and the collateral that are most important. So what you have, essentially, are these two metrics, these two qualifiers for getting access to capital—and both of them are inherently biased. On the one hand, women and people of color lack collateral. They do not have the accumulated generational wealth and assets that the status quo often possesses, so they are coming in with less collateral to begin with. Because of that, they may have received less access to credit overall in their lives, which means their credit scores might not be as high or as built up. And so you have this chicken-and-egg problem, where you can’t get access to capital without credit and collateral, and you can’t get credit and collateral without access to capital.
[00:36:49] And many people don’t know this, but it was only in 1974 that women could even get credit of their own without having their fathers or their husbands co-sign for those loans. So it’s really within our lifetimes that this has even changed. I think what David’s tweetstorm surfaced, what this whole issue has surfaced, and what I really want to drive home is that there are many forces at play here, but two of them are the inherent bias in the lending industry against women—because of these factors that we haven’t updated, and because of the lack of transparency—and that this is colliding with big data, AI, and algorithms, and that conversation as well. It’s really a collision of both of those forces. It’s not only algorithms, and it’s not only the credit issue. That’s just what I want to drive home.
Wailin: [00:37:44] Let’s say 40 to 50 years ago, you were a woman and you walked into a bank to get a loan, and you had your dad or your husband with you, because otherwise you definitely wouldn’t get one. The bank manager would probably take some personal information, right? And kind of make an assessment based on who he saw sitting before him in that conversation. And now we’ve outsourced that process to big data and to algorithms, right? And this is what David was complaining about—now it’s like, oh, it’s the algorithm. You feel like you took the human out of it, and so there’s even more of a sense of powerlessness.
Mara: [00:38:18] There are a couple of different challenges at play here. The first is that studies have shown that more diverse lenders make more diverse loans. If you’re just looking at humans that issue loans to humans, studies have shown that when more women and minorities are in positions of decision-making power, more loans will be issued to women and minorities. And we don’t have enough women and people of color in the financial lending industry to begin with. So that’s a problem on the human scale. On the algorithm scale, absolutely, you have all of this baked-in bias around the person’s creditworthiness. But again, I just want to underscore the bias. It’s not that you’re inputting your race and it is biasing because you are Black or because you are a woman. It is that your access to capital as that type of person in this society is inherently limiting your creditworthiness. The bias is the circumstances you face as that type of person to begin with, not that someone is reacting to that, if that makes sense.
[00:39:26] And so you’re coming in with a whole set of circumstances and the algorithm is mashing it up, you know, and creating different hierarchies around these scores. And so a lot of this is really wonky work. That’s the other thing that I want to underscore is, there was a really critical aspect of the Dodd-Frank bill that was never passed. And that bill would’ve made it possible for banks to report and be transparent about the total number of people who were applying for loans to begin with and what their demographics are. So right now, other than the government-funded SBA programs that require collection around demographics, traditional standard commercial lending does not require that type of data collection. We don’t have a sense of what the aggregate whole is, like how many people have applied so that we can figure out what the overall disparity is.
Wailin: [00:40:21] When you talked about all of the historical forces at play, I was just thinking about redlining. I was thinking about like the GI Bill. I was thinking about like the housing crisis of 2008, I don’t know. I feel like in order to fix some of these issues, like you’d have to go back in time and like not have redlining.
Mara: [00:40:39] Oh, it’s so true. Yeah. I mean, you bring up such a good point, which is that home ownership and having real estate is one of the primary, easiest ways of providing collateral to secure that loan. Well, if you don’t have a house or an asset like that, something tangible to collateralize, then it’s very difficult to get a loan. And that goes back to redlining. It goes back to the financial crisis. It goes back to communities of color that were extraordinarily disadvantaged and abused during the financial crisis with no-doc loans, which then led to a massive decrease in generational wealth for communities of color.
[00:41:17] And so, absolutely, it’s like this domino effect. Essentially, what you have is this gigantic system that is in place to make it very difficult to accumulate assets, to accumulate wealth. And it’s that wealth accumulation that then leads to an increase in your creditworthiness. So exactly to your point, it’s chicken-and-egg all the way down. And it can feel bleak at times, but there are such promising programs and initiatives and people that are working on trying to fix this at many levels. So I do want to say to people, do not despair—but we actually need everyone listening to play a part in this and to care about it. I just encourage everyone to become really curious and ask questions about the system itself and not get myopically focused on demonizing one company. It’s not a company issue. This is a broad issue.
[00:42:08] The people that have the privilege of accessing products and services from Apple and Goldman are honestly the least of our problems. It’s people that are applying at small community lenders and character-based loan programs to get microfinance. These are the people that really need the attention of David and people listening to this. Become curious about the entire system, and not just the elite products and services offered through Goldman and Apple, because that itself is a very tiny subset of the population.
Wailin: [00:42:43] Right. And you had mentioned some examples in your tweets over the weekend. You had pointed, I remember, at the example of a bank started and owned by women right here in Chicago, where I’m based, right?
Mara: Absolutely. Yeah. A former Obama Administration official has started a woman-owned bank. In fact, back in 1975, after the Equal Credit Opportunity Act was passed, the first thing to pop up was a bank by and for women.
[00:43:13] So a lot of what we’re seeing is initiatives that are started by and for women to solve these challenges. Here in Oregon, I started a fund called XXcelerate, and that is a character-based loan fund. What that means is you’re looking at the standing of the woman in the community and at the financial performance of the business—you’re not looking at credit and collateral as primary factors. And we’ve seen really huge opportunity for funds like this. We partnered with a fund out of Colorado called Colorado Lending Source. The reason why you need alternative capital and alternative financial instruments is a couple-fold. The first is, if you go into a tech company, you don’t have anything to collateralize, right? Other than your computer, you don’t have anything to put up for collateral. So tech companies that are not a good fit for venture capital or for small business lending need an alternative path.
[00:44:05] The second reason why is because a lot of entrepreneurs are Millennials and their student loan debt is considered in the credit worthiness overview when they are going for a loan. And so we really need to inspire and encourage Millennial entrepreneurs, younger entrepreneurs, to start those businesses, to take a chance. And they are absolutely unable to because they’re tethered to this unconscionable student loan debt.
So it’s really important that we create new financing mechanisms that are outside of the traditional commercial lending and venture capital industries to meet this massive need.
[00:44:45] So the other thing I want to drive home is this isn’t a charity thing. It’s not a handout. Women-owned businesses outperform men when it comes to revenue and growth. And yet, they receive such a fraction of the available credit that they’re not able to grow those businesses and contribute as much as they could to the economy.
[00:45:04] So this is a huge opportunity for our economy. And that’s another important aspect to understand about this equation.
Wailin: [00:45:11] The student loan issue is really remarkable. It’s one I had not thought about before, until you brought it up, because this collective student loan debt is really like a millstone around the neck of an entire generation. It’s like you’re dooming an entire generation to economic failure, I think, if you don’t both address the debt and give them ways to make a living that take that burden into account.
Mara: [00:45:38] Exactly. And that’s where jobs come from. You know, the majority of jobs come from small and medium-sized businesses. And so when you have this overexposure of overcapitalized businesses like WeWork that implode, you have these massive layoffs. Small and medium-sized businesses are the bulwark against that type of economic force in local communities, which is why they’re so critical.
[00:46:02] The Kauffman Foundation has done a ton of research on this, and the Millennial piece and the student loan piece are really important. With our partners at Colorado Lending Source, we’re seeing time and again that those younger entrepreneurs just couldn’t take a chance on accessing credit, and they weren’t creditworthy because of that indebtedness. So, exactly to your point, all of these ideas are not coming to market, are not being realized in our communities, because you have that stranglehold, this wall that’s preventing them from moving forward.
Wailin: [00:46:34] Yet for your fund, how do you get the word out that your fund exists, that you’re open for business and that you’re looking for people to give money to? Because I imagine that when the system is stacked so much against women, against people of color, that a lot of them who have business ideas might have kind of given up on maybe looking for capital a while ago or something.
Mara: [00:46:56] Totally, yeah. You’re spot on. The fund was born out of the lived experience of me and then three other women entrepreneurs in Oregon. We’d had a really hard time accessing capital to grow our own companies and so we went out into the ecosystem and we were told, well, if you don’t feel like there are enough women getting funded, you have to start your own fund. We were like, okay.
Wailin: [00:47:17] Like, more work for me.
Mara: [00:47:17] I know. Yeah. Thank you. Yeah. It was a huge gift, though. It’s been one of the most profound experiences of my life in terms of learning all of these intricacies of how to go about doing this, and then all of the barriers in the way. I feel so privileged. The group of women that started this, you know, we have extraordinary privilege and education to be able to push this forward.
[00:47:40] I cannot imagine someone else trying to do this. But in answer to your question, we started an education program called XXcelerate and the XXcelerator is a program for Oregon women entrepreneurs where they go through a three-month curriculum to get them capital ready so that when they’re ready to apply for that loan product in partnership with Colorado Lending Source, they have everything that they need. And what we’re seeing is that women have a lot of questions. To your point, if you have been on the venture circuit for the last year and you’ve been rejected by a hundred investors, your confidence is super low. If you’ve been focusing on making pitch decks, that’s really different from creating financial forecasts and profit and loss statements. If you have never really had to realistically interrogate what’s possible with your business, it requires a different skillset to develop a marketing strategy and sales strategy.
[00:48:34] And so our capital coach at the end of that program prepares women to be loan ready. And the goal of that program is really for as many women as possible to be profitable and sustainable, bankable or investible. And so our capital coaches are helping to prepare them to make it possible, but that is not an algorithm. It’s not a traditional bank. It’s really people and coaches taking a deep, sincere interest in the growth of their business.
Wailin: [00:48:58] Yeah, and it sounds like a different conversation around growth and sustainability and profit is also needed, right? Because it’s like to make a true alternative to let’s say, a current venture capital system, you also have to examine the whole incentive structure and what these founders are prodded to do in the name of satisfying their investors. Right?
Mara: [00:49:20] Totally. Yeah, and that’s where the Zebras Unite movement comes in and Jason was a guest on our podcast called Zebracast, so I encourage all of you to listen to that.
[00:49:30] The Zebras Unite movement is an international movement of founders that are advocating for exactly what you’re talking about. Which is, we have to create new capital systems, corporate structures, and community to advance the interests of companies that are trying to be profitable and sustainable, and that are about repair—not this massive disruption that we’re seeing all around us right now with the implosion and underperformance of unicorns.
[00:49:59] As you look under the hood of that challenge, you have to create new financing options and new corporate structures to accommodate more shareholders, more sustainable and sane growth, and founders of a different stripe—which is why it’s called the Zebras movement. We’re looking for profit and purpose. And these are companies that are very values-aligned with Basecamp and other companies that I’m sure are listening out there. But absolutely, you have to invent these things from scratch.
[00:50:27] And a lot of the systems that you’re talking about don’t exist. We have commercial lending, we have venture capital, we have, to whatever degree you have access to them, friends and family and crowdfunding—but that leaves so many people out of the entrepreneurial equation.
Wailin: [00:50:42] What are you looking for on the policy and regulatory side, whether it’s on your local level, state level, or all the way up to the federal level, kind of what’s on your wishlist? Like you’re doing all this work here on the ground. What are you hoping for from other folks who are looking at these issues?
Mara: [00:50:59] I’m so glad you asked. Yeah, thank you for asking because that’s really where the rubber hits the road and that’s really where we need to drive all of our energy. As David said on his Twitter thread, you know, yelling into the ether of Twitter is not going to solve any of these challenges.
Wailin: [00:51:15] Oh, it doesn’t?
Mara: [00:51:16] Unfortunately. Yeah. We actually need a plan, so I would love to share some resources. First of all, in your local city, you have an Economic Development Office. Find out who the Executive Director of that office is and ask them: what are you doing to increase access to capital for women and people of color? Become curious about the lending rates in your state. In Oregon, for example, we’ve done research about the number of women and people of color that have received SBA 7(a) bank loans. Ask your commercial lender, what is your lending rate? You’re not going to get an answer, because that piece of Dodd-Frank wasn’t passed, but it’s still worth people asking questions.
[00:51:53] Next, on the policy level, each one of your states has an Economic Development Office that is overseen by your legislators and lawmakers, treasurer, and governor. And each state has a policy and a strategy. Generally speaking, those policies and strategies are either incentivizing really big businesses to come in by giving them tax credits—and P.S., that has been shown not to work—or they are trying to figure out how to create broader economic prosperity and look at small and medium-sized businesses. So become curious about what that economic development policy is.
[00:52:31] And finally, at the federal level, it couldn’t have been timed better. Our senator here in Oregon, Senator Ron Wyden, just proposed the PROGRESS Act. That is an act focused on funding women entrepreneurs and making up that massive capital disparity. So federal legislation at that level is now in creation to advance those interests, also. Here in Oregon, our legislature, this legislative session, is taking up this issue, and they’re starting to become a lot more curious about how we can support more businesses that are owned by a more diverse population and increase that access to capital.
[00:53:09] So that comes down to really tactical things. For example, making it easier to have a loan loss reserve. A loan loss reserve is like the backstop that allows new capital innovation to be created. So with larger loan loss reserves, you’re able to take greater risks and start new initiatives. That might look like grants for entrepreneurs that have been left out of the ecosystem and it could look like policy research. And so on that one, I encourage all of you to check out the Kauffman Foundation. They just launched something called America’s New Business Plan. And there you’ll find a ton of tips and suggestions for how to get more involved in the access to capital conversation.
Wailin: [00:53:49] Well that’s fantastic. I love ending on a more hopeful note and with kind of like a list of resources and things to look into.
Mara: [00:53:56] The algorithm question is one that is so important to understand. It’s so big, it’s so, so big. And so I would really encourage everyone to investigate both the algorithm issue and everything that can be done. Senator Wyden has also proposed an Algorithmic Accountability Act, and that’s absolutely critical, especially for all of us working in tech. But we really can’t miss the forest for the trees here. We have to look at what can be done at our local level, on a very practical level, that is actually moving the ball forward and getting more capital in the hands of more people. And that comes down to a lot of policy work. So if anyone is interested in learning how to do some of this work, please feel free to reach out to me: mara, M-A-R-A, at switchboardhq.com. Visit our website, zebrasunite.com. A lot of this work actually needs values-based businesses like Basecamp to advance these issues through advocacy and policy work at the local level, and that’s where a model like Business for a Better Portland, the chamber of commerce I helped to start, comes in. We would love to see a Business for a Better Chicago. We would love to see a Business for a Better whatever-your-city-is, so please check that website out and learn how you can get involved on the policy and advocacy side.
[00:55:15] Broken By Design by Clipart plays.
Shaun: [00:55:18] Rework is produced by Wailin Wong and me, Shaun Hildner. Music for the show is by Clipart.
Wailin: [00:55:23] We packed a lot of information into this episode, so we’ll have supersized show notes for you, which you can find at rework.fm. I’ll link to news articles, the websites for Data for Black Lives and the XXcelerate Fund, and all the organizations mentioned.
[00:55:37] Ruha Benjamin’s book is Race After Technology, and her website is ruhabenjamin.com. You can find her on Twitter at @ruha9. Mara Zepeda is on Twitter at @MaraZepeda, that’s M-A-R-A Z-E-P-E-D-A. Her website is MaraZepeda.com, and you can learn more about the Zebras Unite movement at zebrasunite.com.
[00:56:02] Special thanks to Mark Grimes of NedSpace and also Avinoam Henig for their help with this episode.
Shaun: [00:56:09] Rework is a podcast by Basecamp. Projects are a struggle when stuff’s spread out across emails, file services, task managers, spreadsheets, chats and meetings. Things get lost. You don’t know where to look for stuff and people put the right information in the wrong place, not good. But when it’s all together in Basecamp, you’ll see where everything is. Understand what everyone’s working on and know exactly where to put the next thing everyone needs to know about. That’s great. It’s the modern way. The Basecamp way to work, learn more and sign up at Basecamp.com.
[00:56:50] [Audience laughter.]
Trevor Noah (TDS): [00:56:51] And if you’re going to be a credit card that discriminates, don’t do it based on gender. You should do it based on what people buy. Like if you’re using your credit card to buy a beanbag chair and a lava lamp, you shouldn’t have a credit card. That’s what it should be. But I do think that unlike other forms of discrimination, this one might actually be addressed. And I say that because the victims are the most millionaire white people I have ever seen in my life. Look at that couple, huh? The only thing whiter than them is a Panera gift card. That’s the only thing. So I’m excited they brought this out.