
Keep Me Posted - Episode 9: Azza Altiraifi


 

Transcript:

 

Jen Carnig: Welcome to Keep Me Posted, a podcast from Spitfire Strategies about the intersection of race, rights, democracy and justice in the digital age.

Each episode of Keep Me Posted has a short conversation with leading experts and advocates in law, civil rights and technology. I'm your host, Jen Carnig, Chief Advocacy Officer at Spitfire. You can follow the show on Twitter at @KeepMePostedPod.
 

As the use of surveillance technologies continues to rise, our day-to-day lives continue to be affected. From education to employment, web searches to doorbells, countless studies have shown that surveillance technologies are inherently biased and discriminatory, and that's especially true for people with disabilities. What are the consequences of deploying this technology in the workplace and in online learning, and how can advocates working to regulate the use of surveillance technologies better speak to its impact on people with disabilities? This week I'm joined by Azza Altiraifi.

 

Azza Altiraifi is a disabled organizer and researcher and senior program manager at a progressive economic messaging organization. Previously, Azza was a research and advocacy manager at the Center for American Progress's Disability Policy Initiative, where she spearheaded advocacy campaigns and researched and published articles on mental health policy, surveillance and advancing economic security for disabled people.

 

[Music]

 

Jen Carnig: I'm thrilled to be joined now by Azza Altiraifi. Thank you so much for joining me today on Keep Me Posted.

Azza Altiraifi: Thank you so much for having me. I'm excited to be here, Jen.

Jen Carnig: For a while now, anti-surveillance and privacy advocates have been sounding the alarm about how some of the country's largest companies, and really the world's largest companies like Amazon, are deploying bossware to monitor and supervise employees on the job. Can you talk about the implications of workplace surveillance on people with disabilities?

Azza Altiraifi: Surveillance of workers has really exploded over the past years, and it's part of a broader context in which, through laws that have been passed, policies that have been pushed and court decisions that have been handed down over the past years, worker power has been systematically eroded. And this shows up in a lot of different ways. Workers lack meaningful bargaining power, most workers in the U.S. aren't represented by a union, and they lack meaningful legal protections, particularly when it comes to this sphere of surveillance and this intrusive encroachment on workers' lives. And for disabled workers in particular, there are really steep consequences to this. Disabled workers already face a lot of barriers within the workplace. They already face various forms of discrimination. And because of occupational segregation, they tend to be concentrated in roles that pay less.

They tend to be in roles that expose them to harms, such as making it more likely that they become ill in the context of COVID-19. They tend to be in roles with fewer benefits and protections. They also tend to get ensnared in forms of work that don't offer protections, like gig employment. That kaleidoscope of labor conditions makes things much more dangerous for disabled workers as all of these different forms of surveillance become more and more ubiquitous and more and more technologically advanced. So, a lot of attention has rightly been given to the surveillance practices of giant corporations like Amazon and Walmart, and there are different forms of surveillance; I think examples might be helpful here. They range from the ways that drivers are surveilled, tracking how long it takes them to complete deliveries and how many deliveries can be crammed into a single stretch of time, to the surveillance that happens in warehouses, which determines how long someone might be sitting or how long they're away from their station.
 

And then there are also forms of surveillance happening in corporations that haven't gotten as much scrutiny, aimed at remote workers, of whom there are now many more because of the pandemic. They might be facing surveillance that tracks keystrokes, that tracks eye movements, that randomly takes screenshots of your computer to assess whether or not you're doing the right work. Those are all actual examples, and for disabled workers, who experience the world in a variety of different ways because there is a multitude of ways that disability manifests, this has real consequences. One example: people who have Crohn's disease, or other kinds of disabilities that require frequent bathroom breaks, are going to be measured against the norms and baselines that these systems create. And those norms and baselines are designed with the able-bodied, neurotypical, generally white, generally male worker in mind. That creates an ideal archetype of the most productive worker and therefore systematically disadvantages people who are not going to be able to meet those baselines. So something as simple as taking frequent bathroom breaks might result in a disabled worker being seen as less productive than their non-disabled counterpart. Someone who, like me, has disabilities that result in unusual eye movement might be flagged by eye-tracking software and thus be subject to greater scrutiny, potentially discipline. And while there are laws in place to protect workers from unlawful discrimination, they do not sufficiently address the ways that a ubiquitous and expansive surveillance regime can systematically and structurally disadvantage workers, particularly workers with disabilities. And I think it's also really important to note, within that context of eroded worker power, that the balance of power is so asymmetrical.

 

It is extraordinary the extent to which employers have all of this power and information at their disposal while workers have far less of it, right? Everything from at-will employment terms, which mean a worker can be terminated without just cause, systematically undermines workers' ability to speak up about mistreatment and about how these practices harm them, and therefore perpetuates these inequities. It also means that unionizing and bargaining become even more difficult, because if all of workers' activities are being tracked, that suppresses the kinds of organizing necessary to form and wield power the way that unions might allow.

Jen Carnig: Certainly the stories of the workers at the Amazon factory or the shipping plant really brought that home, just the impossibility of actually being able to organize. It really felt like a story from another time, right? It's so awful to imagine working under those conditions, and of course that is the situation for so many workers in the United States. Thinking about how this impacts students: throughout the pandemic, we have seen a massive shift to online learning, of course, and with it a growing backlash against the use of exam proctoring software. Advocates argue that this software violates privacy rights and is inherently discriminatory, and we've also heard stories of students facing repercussions for simply moving the wrong way or even just shifting in their seat during an exam. What does this say about barriers to access for students with disabilities? It was so helpful to hear the stories that you shared. Are there stories that have stood out to you from students who are struggling in this kind of environment?

Azza Altiraifi: Absolutely. And I think it's helpful to bear in mind, right, that these are broader structures of ableism, racism and anti-Blackness that don't exist in a vacuum, but really undergird every institutional arrangement in this country, whether it's in the labor context and the ways that workers are disempowered, or within the educational context and the ways that students have been disempowered and have faced these barriers. I think it's helpful to note that in the context of online monitoring and proctoring, these ed tech companies have existed for a long time. But much in the same way that the transition to remote work really accelerated the kind of scrutiny and questions that we're seeing now, the transition to remote learning galvanized this student movement to push for more equitable and more just processes to administer exams. And so these systems are doing something that is particularly pernicious and is rooted in ableism and eugenics.
 

What they are doing is encoding students' bodyminds, because bodies and minds are connected and not separate, and encoding them as either normative or deviant, either safe or threatening. And again, this is being done against baselines that inherently prioritize, and designate as preferential and as normal, the white, non-disabled, neurotypical, often male, straight person, right? That is the ideal, and thus anything that deviates from it is going to be flagged by these systems. And as I'm saying this, what becomes clear is that ableism, anti-Blackness and cis-hetero-patriarchy are structures that are being embedded into these algorithmic assessment tools. As a consequence, students who show up in ways that are different from that baseline, a baseline that encodes the past, are systematically being excluded and are being forced to learn in environments that are hostile to their very identities and the ways that they show up in the world.
 

And so there's a researcher, Shea Swauger, who notes that this promotes something called the eugenic gaze, right, where everything is seen through the lens of what is, and is not, the dominant and therefore preferred, societally constructed norm of excellence. Much in the same way this occurs in the labor context, in the student context, for students who already have trauma-based disabilities or mental health disabilities, the extreme stress and anxiety produced by systems that require you to behave and function in ways that are "normative" are really dire, and can actually be disabling themselves or exacerbate people's anxiety disorders and their symptoms related to trauma. That form of encroachment and surveillance is inherently harmful.
 

Students with disabilities are already navigating an environment that is structurally really burdensome for them. There are so many ways that ableism shows up, everything from the difficulty of accessing reasonable accommodations, to the high costs of learning in these environments, to the increased use of policing in those environments and the ways that exposes them to danger. All of these things make for an already burdensome and difficult situation for disabled students, and it has only been exacerbated by the use of these ed tech tools, which are codifying and re-entrenching these forms of discrimination. One thing I am heartened by is seeing the extent to which university students have successfully organized, created petitions and gotten various universities to agree to reevaluate the use of these tools or to develop new approaches that aren't based in punishment and in compulsory ways that people must show up in order to have access to learning.

Jen Carnig: I mean, the stories you share really do sound like something out of a science fiction novel, or out of another time. Just your eye movement, the way you move your body, and the punishment that comes with it. It's very difficult to fully appreciate how challenging and, frankly, unfair that kind of system is. When we think about surveillance technology with regard to policing, and when we talk about predictive policing through AI and surveillance, we know very much that it has a disproportionate impact on Black people and all communities of color, but the implications it has for those with disabilities are often left out of the narrative. What would you share with our listeners? What should we know about the intersection of these issues?

Azza Altiraifi: Yeah, I think within the context of policing, the prison industrial complex and the criminal legal system generally, perhaps here more than anywhere else, it is especially clear that race and disability cannot be analyzed or understood separately. These are structures that are codependent, and they work together to fuel the kinds of criminalization and violent forms of social control that Black and brown people, and particularly Black and brown disabled people, face. Within the context of predictive policing, one example that immediately comes to mind, because we were just talking about the school context, is that in Tampa in 2020, a report came out showing that the police department there had created a secret list of students who could "fall into a life of crime," using a whole set of indicators, including things like whether they scored a D in a particular class, which could supposedly be an indication that they were prone to criminality, right?
 

That is a disturbing example of where predictive policing takes you, and there are lots of examples of these place-based systems of predictive policing that use zip codes and other proxies to attempt to identify who might be affiliated with gangs, for example through the notorious gang databases, or who might be prone to violent activity. It's really important to say right here that studies have shown pretty conclusively that predictive policing does not actually correlate with crime rates at all. It correlates with arrest rates, because to the extent that there are higher arrest rates when predictive policing is applied, that is because there are just higher rates of contact between police and the communities they are subjecting to greater scrutiny and criminalization. And for disabled people, it's important to note that those proxies, whether of income or of race, are necessarily going to have a disproportionate impact on disabled folks.
 

This is because disability is overrepresented in every other marginalized community, right? Disability is overrepresented among communities of color because of environmental racism, because of the ways that segregation has pushed communities into spaces that result in adverse health outcomes, because of socioeconomic barriers, and because of all of the other structures that have collectively resulted in higher rates of disability among communities of color, along with higher rates of trauma due to the impacts of centuries of racial violence and terror. Thus, any predictive policing tools that disproportionately harm communities of color are necessarily also disproportionately harming disabled people. But if you peel it back even further and go another layer in, in much the same way we talked about in the labor and educational contexts, disabled people are showing up in the world in ways that are cast or categorized as non-normative, as deviant and potentially dangerous.
 

And as a result, disabled people are more likely to face fatal encounters with law enforcement, because, for example, they are not able to comply with an officer's orders in the way that the officer might expect, and because they encounter police more often than their non-disabled counterparts given the way these tools are deployed. One example that really speaks to these dynamics, and to the way these tools entrench racist and ableist assumptions, is the use of violence prevention programs in the Department of Homeland Security, something the Biden administration has actually increased funding for. It's based on the empirically disproven and debunked countering violent extremism program launched under the Obama administration. What it does is wrongly claim that there are identifiable markers that will predict who is going to commit an act of terror or violence, and included among those markers are things like feelings of loneliness and depression, and acts of religiosity, like wearing a scarf or going to the mosque. But it's really important to note the inclusion of feelings of isolation, depression and loneliness.
 

Those are things that this program describes as worthy of reporting, right, of keeping tabs on those folks as indicators of potential violence. But those are just symptoms of mental health disability. And so it is literally criminalizing disability, and it is inviting bias by codifying and encoding these markers that are in no way actually tied to violence at all.

Jen Carnig: The number of times you've said social control is really stunning to me. That's such an important term here, and truly terrifying, so I appreciate you walking us through this. As the movement against surveillance technologies grows and begins to gain the attention of state and federal policymakers, how can we ensure that disability rights are actually included, and even centered, in that message? Because it will certainly make things better for all of us. How do we center disability rights in this work?

Azza Altiraifi: Absolutely. I think one of the important starting points in any conversation about algorithmic rights and justice, about privacy protections and about the regulatory and legislative tools available to us to better protect marginalized communities, is to note that race and disability cannot be separated. So to the extent that there has been important and increasing attention to the ways these tools have harmed communities of color, if we don't simultaneously account for the ways these tools are impacting disabled people, we will fail both communities, and we will especially fail those who live at the intersection of those identities. One of the ways I like to think about this is that in order to craft solutions that will actually meet the needs of those most directly impacted, we need to understand that there's a historical throughline, right? You mentioned that a lot of these tools are applied in ways that sound like they're from another time, but they really are the legacies of centuries' worth of the systems of social control that I keep noting, including in the labor context.
 

For example, the modern workplace surveillance structure is rooted in a historic distrust of workers, but it also has explicitly racist foundations in slavery. Those kinds of throughlines are important not only to ensure that we craft tools that actually protect people sufficiently; they also better equip us to identify how racism and ableism operate jointly to cast certain groups as the norm, as exceptional, as more productive and as worthy of support, and other groups as the opposite of that: deviant, not worthy of investment, not worthy of support, not worthy of having access to safe learning environments, not worthy of bargaining and having power in their workplaces, not worthy of being free of criminalization and state violence.


By examining disability and race jointly, we will be better equipped to actually craft policy interventions and community-based interventions that address those who have been most harmed by these sorts of tools and these algorithmic systems.

Jen Carnig: That's such a great place to wrap up. Azza, you're a phenomenal speaker, and I'm incredibly grateful that you spent some time with us this week on Keep Me Posted. It is a real privilege to talk to you. Thank you so much.

Azza Altiraifi: Thank you, Jen. It's been such a pleasure talking to you.

Jen Carnig: I want to again thank Azza Altiraifi for joining me today on Keep Me Posted. We'll be back soon with more conversations with experts, organizers and movement leaders working at the intersections of tech rights, race and justice.

Until then, please follow the show on Twitter at @KeepMePostedPod and rate, review and subscribe to us on Apple Podcasts.
 

Keep Me Posted is produced by Spitfire Strategies. Trendel Lightburn is our senior editor. Our production team is Gabrielle Connor, Ben Christiason, Maggie Cass and Nima Shirazi. To learn more, visit us at spitfirestrategies.com.
 

The music is by UI. And I'm your host, Jen Carnig. Thank you so much for listening. Stay well.