
Keep Me Posted: Standing up to anti-Blackness in tech

Public criticism of the biases baked into technology may only now be becoming commonplace, but public interest technologists have been making the connection between critical race theory and tech for decades.

“The idea that technology can change your life and in a way that you don't intend it to is something that has been a very old thought for me as a fan of Octavia Butler and other Afro-Futurists,” explains Mutale Nkonde, CEO of AI For The People, on the newest episode of Spitfire’s podcast Keep Me Posted.

Each episode of Keep Me Posted features short conversations with leading experts, advocates and organizers working at the intersection of race, rights, democracy and justice in the digital age.

As our country tries to navigate both a public health crisis and a police violence crisis, it is clearer than ever that discrimination encoded in our technology helps perpetuate anti-Blackness. The tech sector itself suffers from a gross underrepresentation of Black voices, Black ideas and Black leadership. But how can we address this in a way that actually overcomes these systemic issues instead of offering the usual lip service?

AI For The People, the organization Mutale founded, is committed to ending the underrepresentation of Black people in the American tech sector by 2030. By increasing the number of Black leaders in companies, the government and other policy creators, Mutale hopes to begin to address and prevent a whole swath of “-isms” from becoming encoded into systems. AI For The People is explicitly dedicated to changing false narratives that tech is bias-free and neutral and instead empowering communities to advocate for the development of anti-racist policies to govern the design and deployment of AI systems.

“While our mission is around Black technologists, it’s really about the practice of critical technologies,” Mutale says. “Our only barrier to working with us is, ‘Do you have a radical view of an equitable future?’ And if you do, then we want to work with you.”

Even though AI for the People is relatively new, Mutale is a well-known leader in this field. A former broadcast journalist who produced documentaries for BBC, CNN and ABC, Mutale now writes on race and tech and speaks about this intersection at conferences around the world. She is also currently a fellow at the Berkman Klein Center for Internet and Society at Harvard University.

You can learn more about Mutale and her work with AI for the People at mutale.tech.

A full transcript of our chat is below.

You can listen to Keep Me Posted right here or on any of your favorite podcast platforms. Follow us @KeepMePostedPod and stay tuned for more!

 

 


 

Transcript:

Keep Me Posted - EP 03 - Mutale Nkonde

[Music]

Jen Carnig: Welcome to Keep Me Posted – a podcast from Spitfire Strategies about the intersection of race, rights, democracy and justice in the digital age. I’m your host, Jen Carnig, Chief Advocacy Officer at Spitfire. You can follow the show on Twitter @KeepMePostedPod.

Each episode of Keep Me Posted will be a short conversation with leading experts and advocates in law, civil rights and technology.

In the wake of the police murders of George Floyd, Breonna Taylor, Tony McDade, Elijah McClain and so many other Black people, our society faces a reckoning - 400 years overdue - about anti-Black violence and white supremacy.

Tech companies are beginning to express their support for racial justice, but what effect does corporate rebranding have if Black people continue to be underrepresented in the tech world while overly impacted by anti-Blackness hard-wired into our algorithms and artificial intelligence?

This week, I am lucky to sit down with Mutale Nkonde, CEO of AI for the People, Fellow at the Berkman Klein Center for Internet and Society at Harvard University, and yet another bad-ass leader changing the tech world.

[Music]

Jen Carnig: With me now on Keep Me Posted is Mutale Nkonde, an expert on race and technology and the CEO of AI For The People, an organization committed to ending the underrepresentation of Black people in the American tech sector by 2030. Mutale, it is a pleasure to have you here with us today.

Mutale Nkonde: Jen, thank you for inviting me. I'm so, so delighted.

Jen Carnig: You know, you and I have known each other for a little while. We met just over a year ago when we were in the same small convening of tech and human rights organizers and researchers. And when the subject of race came up, it was like a record scratch, right? Folks looked at us like we were from another universe. And it's amazing to think that just over one year later, after the series of national uprisings, the tech industry has suddenly discovered race. When you look at the national political landscape, what are the spaces where you think we can actually make some big wins in terms of protecting civil rights and ending what you call anti-Black algorithmic bias, if we press hard enough right now?

Mutale Nkonde: Yeah. Um, great question. And you were right. When we met, people were so appalled that you were saying that white people should really stand in the gap and stand as allies and stand as kind of co-liberators in this space, because of the recognition that white interest in any issue is going to make it more compelling. And they were like, 'The world is perfect, Jen. Stop it with your nonsense!' And so, obviously, you and I didn't think it was so, and that's why we're friends, right?

Jen Carnig: [Laughs] Right.

Mutale Nkonde: In terms of your question, I think that anywhere the Movement for Black Lives goes provides really fertile ground for how we can think about technology and society through a critical race lens. So one of the things that happened after the unfortunate lynching of George Floyd - and not to forget Breonna Taylor, Ahmaud Arbery, Rayshard Brooks and others; there will be others, so this isn't going to end - but one of the things that I think the Black Lives Matter movement has done, specifically since 2014, was introduce this idea that advocates would go to the streets and make it very known that racial justice had to become part of the American Dream as we understood it. And we're now in a situation where, according to Pew, 67% of white Americans now agree that Black lives matter. And that was really getting to what you were saying in that initial meeting that you and I had: that white people have to agree. There is a responsibility here.

And so when we think about carceral technologies: at AI For The People we're really, really honored that the previous work we had done on regulation around the No Biometric Barriers Act, which was introduced in the House in 2019, was then being repurposed and reused, and those ideas revisited, for the Justice in Policing Act. So here we were: a year earlier we couldn't get co-sponsors, we couldn't get the bill out of committee, we were basically told that this was nonsense, and now that statute that had been written can go into this other push for justice, because Black Lives Matter - the multiracial, multiethnic, multigenerational movement it became after George Floyd - was saying 'Defund The Police.' So, whenever people ask me, you know, where should I put my money, I say that we owe so much to that movement and the people who chose that. As we push forward, we shouldn't be working in silos. We should be thinking about the great people that are doing work across the board and then how that work can be in conversation with each other.

However, the people that I really do see being left behind are school kids, because we're in a national conversation about what online learning does and should look like, whether people should be in school, whether tele-teaching is viable, and we're not having a conversation about access. We're not having some of these very, very "digital divide" conversations that have been going on since the 1990s. And we're not acknowledging that if you are an Indigenous kid, or if you are a white kid living in Appalachia, or a Black kid living in Brownsville, New York, you may not be able to take part in education. And so I would really encourage educational activists, as we know them traditionally, to be in conversation with folks like me and to figure out how we can point out those very specific harms. And that's another place for tech companies: here in New York, Apple gave the New York Department of Education a bunch of laptops right at the height of COVID - some huge number, something like 50,000 or 20,000, I don't know exactly - but what happens with those machines? What generation are they, where are they, what OS were they running? None of that is known. And that's something that I think is a national crisis.

Jen Carnig: Yeah, it absolutely is. And just having the machines doesn't actually fix anything, right? Because if you can't get access to broadband, you know, the point is kind of moot anyway.

Mutale Nkonde: Right, right. And that is a discussion that may not be as trendy as facial recognition. It may not be as new, it may not have the media attention, but that is something that we have not solved - and specifically not in Indigenous communities, on reservations, or in deeply rural areas, which are part of the United States as well.

Jen Carnig: Yeah, I mean, you're really hitting on something, which is kind of the flip side of what we're talking about, right? On the one hand we have these tech CEOs who are just discovering race, just learning about it, having their own, you know, whatever you want to call it, "awakening." But then we also have advocacy groups who, for the first time, are reckoning with the fact that technology exists. From your perspective, and the work that you do really in both worlds, what do you think advocacy groups and really those traditional rights organizations need to know as they begin doubling down on kind of what you're talking about?

Mutale Nkonde: So one of the many remarkable things that have happened for us as an organization is that, with this awakening about race and this awakening about race and technology, all of a sudden we're in conversation with the UN and we're asked to join multi-stakeholder meetings. And one of the best calls I think I've ever been on was with the UN Special Rapporteur for Race, Xenophobia and Hate, which sounds like a really fun job.

Jen Carnig: Easy!

Mutale Nkonde: Real easy. And she just released a first report called Techno-Racism, which went in front of the UN Human Rights Council in June of 2020. And that was a great moment, but it was also like, 'United Nations, hello! How could it possibly be that this is the first time this is a focus for you?' Particularly when you think about how electronic IDs are used in refugee settings, where people who are stateless also have their privacy taken from them. These issues are not new - I could go on and on and on - but the study of critical race in technology, which is sometimes described as Critical Race and Digital Studies, is about 15 years old. There have been canaries in the coal mine for a long time.

And that gets to the point of your question around nonprofits and the NGO world. You can go into a nonprofit that will host you, you do your project, they give you some money, and they give you institutional backing and access to a network and an email address that validates you, right? I've been in a number of these on my route through public interest technology, but within those environments there was immense hostility to speaking about anti-Blackness, hostility toward speaking about race. And I think we saw some of that in that meeting, where people just felt very uncomfortable because in their minds, they're white, they're liberal, they voted for Obama, they walked past a Black person, you know? And so they're fine. And it's like, no, we're talking about decisions about which research gets commissioned and which research doesn't. We're talking about - outside of Color Of Change, who have amazingly been doing this work for 15 years - whether a campaign can center on and be about Blackness. And the answer has often been 'no.'

Jen Carnig: So enter your organization, AI For The People. Walk us through what your organization does and what your perspective is on getting there, right? How do we turn that around and ensure there's greater representation of Black people in the tech sector? What will that help do?

Mutale Nkonde: I'm a journalist. I come from storytelling. I've always been passionate about narrative change, and I've always been passionate about the power of edutainment - entertainment for education. So our mission is to have Black leaders within public interest technology, but also within technology more broadly, because tech companies have ethics teams and responsible tech teams and we don't want to keep people from that. Government also needs people who are thinking about how sexism, racism, ableism and all the evil 'isms' are becoming coded into these systems that are being procured and used within public services. And then the NGO world: there has to be a place where people who are committed to nonprofits can come and do this work. And while our mission is around Black technologists, it's really about the practice of critical technologies. So anybody can apply to come work with us. Our only barrier to working with us is, 'Do you have a radical view of an equitable future?' And if you do, then we want to work with you. And if you don't, then we definitely don't want to work with you. And as long as we can all agree on that, then we're good.

Jen Carnig: [Laughing] I love it. When you talk about that radical view of an equitable future, give us a little insight, a sneak peek at what we're going to see when you're successful. What does that world look like?

Mutale Nkonde: So, a couple of things. In the very short term, we have a disinformation project that's looking at Black actors specifically, and why what happened in 2016 happened and will continue to happen if we don't get to this problem of race in America. I'm really excited about that because I think it really recasts that conversation at a time when folks are ready for the conversation too. And then in the long term, I would love to endow the organization and have it be a recognized training ground for those of us who do have that radical future. And that to me is the most radical thing of all: to say, 'We want to have a Black organization that is financially sustainable and whose only reason for existing is to make sure that this type of leadership is going to be in the world.'

Jen Carnig: What's the one thing that you would want folks to know about your work right now?

Mutale Nkonde: I think that this work is old. It sounds like it's really new, but as a journalist I've been looking at facial recognition for a long time - and thought it was a joke, actually. I remember the first time I ever saw a test of it was with the 2002 Super Bowl. It was a test of facial recognition, and it came across my desk - I was working at the BBC and somebody pitched it - and we were all in the meeting being like, 'You're crazy, you're never gonna make it in this world, give us some real news!' And, you know, obviously that came back to bite me. But the idea that technology can change your life in a way that you don't intend it to is something that has been a very old thought for me as a fan of Octavia Butler and other Afro-Futurists. I would also like folks to know that our work is saving our own lives, so we don't really have the luxury of switching off. I changed my whole living room around recently so that my couch could face my door, because of the no-knock warrant served on Breonna Taylor. And so these stories that we're thinking about also impact us.

Jen Carnig: I'm so grateful for you, that you exist, and for your time. You helped me imagine what liberation can look like for us. So, Mutale, thank you. That's a perfect place to leave it. Mutale Nkonde is the founder and CEO of AI for the People. Thank you so much for joining me today.

Mutale Nkonde: Thank you, Jen. Thank you so much.

[Music]

Jen Carnig: I want to again thank Mutale Nkonde of AI For The People for joining me today on Keep Me Posted. You can learn more about her work at mutale.tech.

That will do it for this episode of Keep Me Posted. Please follow us on Twitter @KeepMePostedPod and rate, review and subscribe to us on Apple Podcasts.

Keep Me Posted is produced by Spitfire Strategies. Our production team is Gabriel Rodriguez, Kristiana Jordan, Duncan Bartok, Aaron Zeiler, Hannah Berkman, Maggie Cass and Nima Shirazi. To learn more, visit us at spitfirestrategies.com. The music is by Ui. I’m your host, Jen Carnig. Thank you so much for listening. Stay well. 

***

This entry was posted on Monday, September 28, 2020 and is filed under Combating disinformation, Digital strategy and Ethical and visual storytelling.