Introducing… Keep Me Posted: A New Podcast from Spitfire
It’s true that technology has the power to foster connection, community and learning, and to promote equity and justice. But it can just as easily be used as a tool for surveillance, division and discrimination, and to amplify inequality.
At Spitfire, we have the distinct pleasure of collaborating with some of the foremost experts, advocates and organizers working at the many intersections between technology, civil rights, democracy and justice. The knowledge they hold is crucial to navigating toward a future where technology actually serves the public interest – something that is perhaps more important than ever.
Now, we’re bringing these powerful voices for transformation, liberation and (dare I say) revolution directly to you in a new Spitfire podcast I’m honored to host called Keep Me Posted. Each episode of Keep Me Posted will feature a short conversation with leading experts and advocates in law, civil rights and technology.
We learn from these leaders every time we hear from them, and we know you will too.
At a time when we’re facing multiple, interconnected and compounding crises, it can be tempting to think of technology as a panacea. Trying to contain a pandemic? Many have suggested we institute high-tech yet ineffective, inaccurate and invasive contact-tracing apps. Trying to prevent police violence? Some have misguidedly recommended we replace police with citywide surveillance technologies that may stop short of physical brutality or murder but still deeply infringe on civil rights and discriminate along racial lines.
My conversations on Keep Me Posted dissect these issues and imagine a better world, one that’s more harmony than horror show.
And I’m thrilled that my first three guests on Keep Me Posted are all bad-ass women leading the worlds of tech, equity, justice and organizing.
On our first episode, I speak with Neema Singh Guliani, senior legislative counsel at the American Civil Liberties Union, where she focuses on surveillance, privacy, and national security issues. Follow her on Twitter @neemaguliani. A full transcript of our chat is below.
Prior to joining the ACLU, she worked in the Chief of Staff’s Office at the Department of Homeland Security under President Obama, concentrating on national security and civil rights issues. She has also worked as an adjudicator in the Office of the Assistant Secretary for Civil Rights in the Department of Agriculture and was an investigative counsel with the House Oversight and Government Reform Committee, where she conducted investigations related to the BP oil spill, contractors in Iraq and Afghanistan, and the Recovery Act.
You can listen to the first episode of Keep Me Posted right here or on any of your favorite podcast platforms. Follow us @KeepMePostedPod and stay tuned for more!
***
TRANSCRIPT
KEEP ME POSTED: EPISODE 1 with Neema Singh Guliani
[Music]
Jen Carnig: Welcome to Keep Me Posted – a podcast from Spitfire Strategies about the intersection of race, rights, democracy and justice in the digital age. I’m your host, Jen Carnig, Chief Advocacy Officer at Spitfire. You can follow the show on Twitter @KeepMePostedPod.
Technology has the power to foster connection, community and learning, and to promote equity and justice. But it can just as easily be used as a tool for surveillance, division and discrimination, and to amplify inequality.
We know that Amazon’s facial recognition software has difficulty identifying female and darker-skinned faces. Studies have shown that AI technology used for job recruiting often favors male candidates, because the underlying models and algorithms are developed and tested using men’s resumes.
In past moments of crisis, both real and manufactured, we’ve seen that policymakers too often put our basic civil rights and civil liberties to the side, with disastrous – and often deliberately destructive – consequences for immigrants, Indigenous people, people of color and those living in poverty.
Each episode of Keep Me Posted will be a short conversation with leading experts and advocates in law, civil rights and technology. And I’m thrilled that our first three guests are all bad-ass leaders in the worlds of tech, equity, justice and organizing. Unsurprisingly, they’re all women. So let’s get to it.
Today, I’m speaking with Neema Singh Guliani, Senior Legislative Counsel at the American Civil Liberties Union.
[Music]
Jen Carnig: With me now, our first ever guest on Keep Me Posted, is Neema Singh Guliani, Senior Legislative Counsel with the ACLU. Neema, thank you so much for joining me today.
Neema Singh Guliani: Thanks for having me.
Jen Carnig: To start off, let's set the terms of what we're talking about here. Technology and what we call data are often hailed as objective and agnostic: neutral things that are perhaps somewhat manipulated by bad actors, but are inherently unbiased. In your work at the ACLU, is that what you find?
Neema Singh Guliani: So what I've found is that, you know, technology often reflects inherent biases that exist in communities and in society, right? And this often happens because technology is obviously created by individuals, and those individuals hold their own biases. And sometimes it relies on data and data sets that also reflect those biases. Think, for example, of information around arrest data: that's going to reflect biases in policing, and the fact that many communities of color are more likely to be arrested regardless of their criminal activity. And so when we think about technology, I think what we see is technology that often itself can reflect biases. And it can also be used by actors that have their own biases, which can create secondary effects on the communities that are affected by it.
Jen Carnig: I'd love to pull on that thread a little bit more, because we've been told this story for years, this notion that technology is our savior and everything from contact tracing to telehealth and remote learning is going to save us right now. At the same time, we know that social bias is reinforced and amplified by the all-male culture that too often pervades big tech. When you hear all the promises being made by Silicon Valley, what steps should we be taking to help ensure bias is checked – that we are actually unleashing the power of tech to advance equity and not using it to accidentally usher in an era of even greater injustice?
Neema Singh Guliani: I mean, I think we have to start with the premise that technology is not perfect and it's not going to be a silver bullet. So whether we're talking about contact tracing to address concerns with COVID-19 or we're talking about education, technology in and of itself isn't going to resolve the problem; it isn't going to resolve deep inequities that already exist in society, which require much broader public and policy responses. Now, that's not to say that technology can't be helpful and can't assist in these matters. But I think we have to look at it at the front end to make sure the protections are built in, that we're using technology in a way that's responsible, and that we're actually using the right technology that's going to be effective.

So there are a couple of things that I think policymakers and technology developers have to keep in mind. The first is, at the outset, who's involved in making the decisions about the technology and how it will be used. One of the things that we've found is really important is the ability for the communities who are going to be most impacted to have a seat at the table – to highlight concerns with the technology being deployed, better ways that it can be developed, and how it can better suit the needs of the population. Without having impacted communities in the room, you risk losing valuable insight that can help shape exactly the technology you're using for these problems.

The second is we should really start talking about effectiveness on the front end, and what the benchmarks are for effectiveness. So, for example, some people have talked about using technology to assist with contact tracing, right? This idea that technology could help provide notifications to individuals if they might have come in contact with someone who's infected with COVID-19. At this point that technology does not have proven efficacy, and we know that it is likely to have problems – that it may not, for example, discern when two people are in close proximity but actually separated by a wall and unlikely to transmit the virus. And so we have to make sure that there are clear benchmarks and clear processes in place for measuring effectiveness. Those benchmarks also need to account for particular communities that may not have access to technology, thinking through the effect of leaving those individuals out of any solution.
Jen Carnig: Yeah. It really does feel like, for most individuals, we really have zero power. And so, without being naive, what kinds of protections or promises are essential to demand and extract from government and tech companies responsible for implementing these technologies?
Neema Singh Guliani: So I think when we're talking about the response to COVID-19, and technology specifically related to the pandemic, one of the things that should be made clear is that any information collected is used only by public health authorities and for public health purposes. This is necessary both, I think, from a privacy perspective, but also just to ensure that any public health effort is effective. If people are worried that the data they're providing to, you know, a private company working with a public health agency, or a public health agency alone, isn't safeguarded, they may not provide all the detail needed to help respond to the pandemic.

This idea of exposure notification or contact tracing is a really great example, because it relies on trust, right? What we're asking people to do, if they are found to test positive, is to share what could be really intimate details of their life, reveal who they might have come in contact with, and help us track and notify those other individuals. That whole system relies on trust, and for us to develop that trust and make sure people feel confident participating in it, we have to make sure they know that that information is going to be safeguarded and used only by public health authorities for public health purposes.

And so, as we look at different digital tools – putting aside for a second the question of whether those tools are effective – I think we have to know at the outset that one of the things we need, in order for people to feel comfortable using those tools, is a clear sense that they are in control of their data. It's not transmitted unless they provide consent, and they're using the tool voluntarily. And once it's transmitted, it's not going to be accessible to law enforcement or immigration enforcement or a company that wants to use it for advertising.
Jen Carnig: I mean, that's one thing the pandemic has, as a side effect, helped create an interesting space for: the fact that we are even having conversations like this now. You know, privacy advocates like you have been talking about these issues for years, and a lot of folks just haven't felt they were part of their daily lives. Now, suddenly, we all know terms like contact tracing and geolocation tracking. I'm curious, just on a human level for you at the ACLU: how has the pandemic impacted your work?
Neema Singh Guliani: One thing that has been a result of the pandemic is that it's really shined a light on, and exacerbated, existing problems in ways that I think were hard to anticipate. Prior to the pandemic, you know, we raised concerns about the digital divide, right? The reality that there were a lot of technology haves and have-nots – some people who lacked access to a smartphone or to high-speed internet – and that those discrepancies were actually quite notable. So for example, if you look at the data, even around something like a smartphone: 30% of individuals who make less than $30,000, according to a Pew Research Center paper, don't have a smartphone – something that I think a lot of us take for granted.

What the pandemic has highlighted is the real effect of that digital divide. Now we're asking: how do we actually do remote learning when there are so many children who live in homes where they don't have access to the technology or the internet necessary for it, and how will they catch up, now that they may have lost months of schooling? We're also asking, I think, important questions about even if we have some great technological solution – for example, an app that actually proves to have efficacy – how do we deal with the reality that many of our most high-risk populations simply don't have a smartphone and are going to be unlikely to use these tools? Nearly half of people over age 65 don't have a smartphone, and that's one of the most high-risk populations when it comes to COVID-19. Same thing with many low-income areas, right? So all of these equity issues that we were very aware of before the pandemic have now been magnified, and because of the nature of disease spread, we recognize that a failure to reach some of these most at-risk communities doesn't just hurt those communities – it's going to hurt all of us writ large. So that's certainly one issue that has been magnified by the current pandemic.

And then the second issue that has been magnified, as you rightly highlight, is the fact that large numbers of people are doing so much more in their homes and online than ever before. People are starting to think about privacy and its effects in different ways, whether it's communicating via Zoom or buying your groceries online or simply spending more time on the internet because you're more limited in what you can do outside your home. All of that is raising questions about the effects of the lack of a strong privacy regime, because we're not really equipped to assure people, as they're doing these essential things, that their privacy is safeguarded. Even as we develop tools that we hope may help combat the pandemic – tools we would hope people feel comfortable using – we are less able as a community to provide those assurances, because they're not preexisting in our laws. And so, in some cases, we're playing catch up, and you see this in efforts at the state and federal level, where there have been various proposals to deal with data privacy focused specifically on COVID-19 and the pandemic. But we're playing catch up; these protections should already have been in place.
And so all of these problems that existed before the pandemic have in some ways been magnified, and I think, in some cases, could potentially hinder our public health response in ways that I don't think people could have anticipated. You know, one of the stats I read was that in the state of Utah, a contact tracing app that was only used by 40,000 people cost several million dollars to develop. We should be asking: is that a good use of public health resources, especially considering some of the privacy and civil liberties concerns with that tool? So those are probably the two issues that I think have emerged most forcefully on the privacy front. Then separately, there are these questions of remote education, accessibility, how to make sure people have the tools to live life – especially considering that some of the changes wrought by the pandemic are probably not going to be short-term changes; they're going to be long-term changes. So how do we make sure our technology infrastructure serves all communities when it has such deep-rooted inequities? Those are probably some of the issues that we're certainly keeping an eye on.
Jen Carnig: Yeah. I mean, you just said some really wild, creepy things there – the surveillance and potential for digital discrimination involved in just going to the Key Food or the Publix is really eye-opening and makes the hair on my arms stand up. But even with that being part of the world we're living in now, what is giving you hope these days? What is sustaining you and making you think that change is possible now?
Neema Singh Guliani: I think what is giving me hope is that there are technologies that could help, right? Remote learning isn't necessarily a bad thing – we just have to learn how to use the tools effectively. I think the second thing is, now more than ever, people recognize that we all have to be in this together, and that if we leave behind an entire segment of society, there are not only health and economic consequences – we're not actually, as a community, dealing with the various challenges that the pandemic has brought. And so I think this sense that we have to address everybody's needs if we are going to move forward as a community and get past some of these challenges is helpful, because it really forces us to address the equity issues head on.

When I talked about contact tracing, one of the things I said was that a lot of people aren't going to be able to use an app, even if we have a successful exposure notification or contact tracing app – whatever term companies are using. I think people understand that, and they're trying to think through: 'Well, even if we had this technology, what other tools do we need to ensure equity? How do we either address the technology divide to make sure everybody can use the tool, or how do we develop other health resources to make sure that people are still not left behind?' And similarly, outside of technology, there are things like paid sick leave that all of a sudden have come to the forefront, because we recognize that if an individual doesn't have access to health resources or employment resources, that has downstream ripple effects on the overall health of the community.

So those are the things that give me hope: we are acknowledging these problems in a way that we have not always before, and there seems to be a commitment by many to address them – both because it's the right thing to do, and out of a sense that if we don't address the community-wide problems, we all will be impacted.
Jen Carnig: Neema, I am incredibly grateful. That is a great place to leave it, on a note of hope. I've been speaking with Neema Singh Guliani, Senior Legislative Counsel with the American Civil Liberties Union. Neema, it has been so great to talk to you today. Thank you, thank you. We appreciate you joining us on Keep Me Posted.
Neema Singh Guliani: Sure. No problem.
[Music]
Jen Carnig: I want to again thank Neema Singh Guliani for joining me today on Keep Me Posted. You can learn more about her work at aclu.org.
Next time I’ll continue my conversations on public interest tech with Hannah Sassaman, the Policy Director at the Movement Alliance Project in Philadelphia. I hope you’ll join us.
Please follow us on Twitter @KeepMePostedPod and rate, review and subscribe to us on Apple Podcasts.
Keep Me Posted is produced by Spitfire Strategies. To learn more, visit us at spitfirestrategies.com. The music is by Ui. And I’m your host, Jen Carnig. Thank you so much for listening. Stay well.