Transcript:
Keep Me Posted - EP 02 - Hannah Sassaman
[Music]
Jen Carnig: Welcome to Keep Me Posted – a podcast from Spitfire Strategies about the intersection of race, rights, democracy and justice in the digital age. I’m your host, Jen Carnig, Chief Advocacy Officer at Spitfire. You can follow the show on Twitter @KeepMePostedPod.
Each episode of Keep Me Posted will be a short conversation with leading experts and advocates in law, civil rights and technology.
In the midst of COVID-19 and uprisings calling for the end of police violence across the U.S. and around the world, lawmakers and leaders are turning to technology for a cheap and decisive solution. But what should we do when these solutions increase surveillance, unjustly placing eyes on Black and Brown people?
Joining me to explore this is a bad-ass organizer, Hannah Sassaman, Policy Director at the Movement Alliance Project. Based in Philadelphia, MAP connects communities and builds power for working families at the intersection of race, technology and inequality.
Jen Carnig: With me now on Keep Me Posted is Hannah Sassaman, Policy Director for the Movement Alliance Project in Philadelphia. Hannah, thank you so much for joining me today.
Hannah Sassaman: Very nice to be here.
Jen Carnig: You and I are speaking in the middle of a global pandemic and, domestically, amidst an uprising or a series of uprisings. Movement Alliance Project works to build power on behalf of Black people, people of color, working families, poor families, the same people most impacted by both State-sponsored violence through policing and this public health crisis. Issues of surveillance are at the heart of our response to both. How has this moment impacted your work, and what do you think are key issues that privacy advocates, or really anyone who cares about basic civil rights, ought to be concerned with?
Hannah Sassaman: Sure. That's a really good question. So when the pandemic hit, we were facing a massive crisis in our jail population. Uh, the courts shut down and stopped processing any cases. So the normal sort of operations that would help incarcerated people be able to come home, like being able to get a hearing to reduce their bail or being able to move to some kind of community supervision, those stopped suddenly. And so we looked at the jails up on State Road here in the northeast part of our city as basically the worst cruise ships ever. Like a massive opportunity to spread coronavirus in a population of almost exclusively Black and Brown people, mostly with many underlying conditions caused by generations of divestment, generations of poverty and racial oppression, at major risk of getting sick. And then also having that illness spread deeply into the community, because we have both prison workers and social workers and other members of the city moving in and out of those prisons every single day and moving that infection into the community. So there were a number of major barriers to getting them out, like both trying to force the courts to open enough to process their applications, and forcing the District Attorney's office, and working with the defender association to do that. But we found also that within lockdown, within shutdown, access to the Internet and access to cell phones, especially smartphones, were a huge barrier. We have a huge opioid epidemic here in the city of Philadelphia, and so often a court will not release someone who is caught up in that epidemic, caught up in addiction, unless they have access to a treatment bed or access to some kind of support system or a therapist or someone else who can work with them on their addiction. But we were finding that of all the different institutions that would take people in who have addiction, most of them were not accepting new folks.
So we found ourselves scurrying to try to find a way to get smartphones available, so human beings could get out of jail, so they could have video access to a support circle or to a treatment counselor or to another kind of ability to have their addiction ameliorated; otherwise the courts wouldn't release them. And so we were finding that access to technology was hugely, hugely a piece of getting people out of jail. Another thing that we're finding, as Movement Alliance Project has been working furiously with many other organizations and with literally hundreds of thousands of people in the streets of Philadelphia and around the country to defund the police and to reduce the footprint, towards abolition, of police oppression in Black and Brown communities, is that there are a lot of reforms that different political leaders will put forward that they say are engaging with the problems of police brutality - deep over-policing in Black and Brown communities, deep racial oppression - that they say are a solution to that, but that actually end up being a form of surveillance or incarceration, pushing both more money into the police apparatus and also redoubling or tripling the way that police eyes on Black and Brown communities can exist. So, not long after the uprising started, we saw a whole bunch of elected officials come together to announce expansions of programs like Operation Pinpoint here in Philadelphia, which use data-driven, algorithmic predictive technologies to try to find “hotspots” to be able to, you know, deploy police in the right way. And then the sort of rebirth of programs called “focused deterrence”, which in Philadelphia is now called Group Violence Intervention, which also uses predictive tools meant to try to identify people most likely to either shoot or be shot, and then to, you know, threaten them with extraordinary amounts of punishment if they are caught doing, or being accused of doing, anything out of line.
But it's also supposed to offer resources, like, ‘Oh, these are folks who really have a lot of burdens on their back and that's why they're engaging or potentially engaging in these activities, according to the system, and so let's give them the resources they need.’ But in most iterations of focused deterrence, including the last one that came down here in Philadelphia, there was only like $150,000 ever allocated to actually provide for these community members - whom they were targeting, literally rounding them up and sending them to City Hall and saying, ‘We're watching you guys!’ Like literally that - and not actually providing them with the actual resources to ameliorate the challenges or whatever. So I think what we're seeing is the sort of twin ruptures caused by both the coronavirus pandemic, and how society is breaking in response to it, as well as this global uprising against police oppression and police brutality, and to defund and dismantle and abolish the police. What we're finding is that technology, in those moments of crazy austerity and crazy change, is often being pushed as a structural solution to those instances. And so I am terrified of a new iteration of that happening now, especially with the big sort of amplification of the economic struggle that we're facing and the sort of Fascist racial character of the particular government that we're looking at now.
Jen Carnig: I mean, you've really outlined just the dark and terrifying nature of technology and how it is impacting so many of our communities. And yet the line that we're sold all the time is that it is our savior: technology is going to be what frees us. It is going to end violent policing. You know, body cameras are the panacea. And it makes me wonder if there is such a thing as just technology. What would it look like for there to be just AI? What steps would we need to take to help ensure that bias really is checked, and that we can use these tools to cultivate the future of equity and justice that, you know, at least we are all dreaming of, and that Movement Alliance Project is actively working toward?
Hannah Sassaman: I think that I'm probably not the right person to answer that question. Like, I would take a look at folks like Joy Buolamwini and Mutale Nkonde and Timnit Gebru - Black women who have been at the absolute forefront of doing hard and rigorous science demonstrating that something like facial recognition can be extraordinarily biased, both in its internal design and in its application. But I think it's important to understand that communities that are deeply impacted by the deploying of these technologies into these live systems of incarceration and punishment aren't Luddites. I would just make sure that we really listen to and study the leadership, both of Black women and other rigorous data scientists who have worked to lift up the structural bias inside both the academy and the field that designs these tools and deploys these tools, and who can talk about various ways to ameliorate their harm, primarily by giving extraordinary power to the communities that are impacted by these tools or that, you know, have visions of technologies that they want to deploy. Like, there's been some conversation, and I think even some application, of, like, how do we use predictive algorithms to be able to know when a cop is going to kill somebody, right? That's certainly a thing that makes a lot of sense, but we also just have to be really, really nakedly honest about the imbalance of power in our societies. Like, most decisions to deploy technologies and predictive tools happen in the dark. They happen in closed conference rooms where mid-level bureaucrats working for court systems and sheriff's departments are like 'that one' and they've picked something and then start to sort of calibrate it.
And so a lot of the work at MAP has been about putting sunlight into those processes, building the popular education and accountable listening with communities who have been judged and sorted by tools like this for many generations, to be able to lift up their actual analysis and vision of transformation into those rooms and make sure that those rooms maybe shouldn't exist anymore, you know? And we look forward to continuing to take leadership from those communities and from the data scientists, primarily the Black women and other data scientists of color who have been extraordinarily clear in how the structure of the entire economy and academy that builds these technologies is not a bug, but a feature of why they're racist.
Jen Carnig: So in this moment of that twin rupture, what is MAP up to? What are you encouraging members of your community or where are you being led by the folks within your organization? What is most urgent at this time for y'all?
Hannah Sassaman: So we've always looked at, you know, technology, and organizing around, you know, racial and economic justice as it intersects with technology, as part of a conversation about human liberation, and in trying to answer the question of how can our people actually govern our society. Like, we don't want to make it a little better. We don't want to, you know, incrementally change at the edges. We want to radically transform how the world works, so the people most oppressed by it are actually the ones in charge of it. So we're really, really focused, to be honest, on defunding the police. We had been working and preparing for six months with base-building organizations and national groups that focus at the intersection of media, narrative and political economy and technology - groups like Free Press and the Mic Center - to be able to start to challenge how policing was told in the local press. And because of that, we were actually at a place where we were able to help people in the City of Philadelphia, as the uprising began to flower, understand that we had a live budget process and that we could defund the police right now. And while we weren't able to get as far as we wanted, we were able to basically keep the police flat-funded and to set ourselves up for the Fall. But at the same time, as I said earlier, we see reforms being put forward that are really about putting in predictive technologies and new surveillance and new positions for data analysis, which are sold as making policing better, smarter and less biased. But the structural facts on the ground, from when these have been deployed, are that they don't make police better, smarter, nicer, friendlier, cuter, more like puppies - they just make them more powerful. They give them more money. They make them better at brutality.
And so it's going to be trying to build a local campaign that both defunds the police and that makes sure that we don't replace the police with horrifying surveillance technologies that will be really hard to pull out. But it's also working nationally on this. And so we've been working closely with national networks of other city leaders and community leaders, focusing on what this sort of analysis is in their place around defunding, and how these technologies have been deployed. And I'm really, really grateful for, like, the networks that we have, not just with base-building organizations and humans who are extraordinarily profound storytellers about their own experience and the experience of their communities, but also, like, accountable legal scholars and practitioners who help to build arguments for what winning looks like in the current legal apparatus that we have, and policy experts as well at the national level. And, yeah, I'm really grateful to be a part of it.
Jen Carnig: Yeah, you do have this really unique kind of local and national view and perspective. If there's just one thing you want someone listening to take away, what is the one thing?
Hannah Sassaman: In order to try to transform the systems of oppression that we're facing, we have to be willing to be transformed in the work. And so the work of examining, in yourself, patterns of oppression that you perpetuate, and making that commitment to transform those patterns - like relinquishing power, if you're a white person, but not blindly, in the leadership and as part of an organization. Joining an organization that is focused on the liberation of our society, whatever color of skin you are, joining an organization and learning, with a large group of people, what your division of labor is and what you need to do to transform yourself and what your work will be in that larger plan. That's how revolutions have happened in our world. Revolutions have happened in our world when we transform despair, alienation and anger into a coordinated plan, with a target, that's based on an actual analysis of the actual terrain, to win. So I would encourage local people to find a movement organization that is focused on that question and where you can be one soldier in that army. And I promise it will transform you and your life forever.
Jen Carnig: Transformation. Thank you. That's a perfect place to leave it. I've been talking to Hannah Sassaman, policy director for the Movement Alliance Project in Philadelphia. Hannah, thank you so much for joining me today.
Hannah Sassaman: Thanks to you as well.
[Music]
Jen Carnig: I want to again thank Hannah Sassaman for joining me today on Keep Me Posted. You can learn more about the Movement Alliance Project at movementalliance.org.
Next time on Keep Me Posted, I’ll speak with another amazing woman leading the movement for just and equitable public interest technology. I’ll be joined by Mutale Nkonde, CEO of AI for the People and a fellow at the Berkman Klein Center for Internet and Society at Harvard University. I hope you’ll join us.
In the meantime, you can follow us on Twitter @KeepMePostedPod and rate, review and subscribe to us on Apple Podcasts.
Keep Me Posted is produced by Spitfire Strategies. To learn more, visit us at spitfirestrategies.com. The music is by Ui. And I’m your host, Jen Carnig. Thank you so much for listening. Stay well.
***