Transcript:
Jen Carnig: Welcome to Keep Me Posted, a podcast from Spitfire Strategies about the intersection of race, rights, democracy and justice in the digital age.
Each episode of Keep Me Posted has a short conversation with leading experts and advocates in law, civil rights and technology. I'm your host, Jen Carnig, Chief Advocacy Officer at Spitfire. You can follow the show on Twitter at @KeepMePostedPod.
In recent years, more and more companies — big and small — have deployed AI-powered tools in the workplace. While these tools are ostensibly intended to make hiring and supervising workers easier for managers, there's tremendous risk of discrimination embedded within what is effectively automated surveillance technology. The harms of algorithmic bias, the systematic discrimination born of artificial intelligence software, are becoming better known. What is less familiar are the deep systemic harms AI can have on people with disabilities. How can tech-focused advocates ensure that the growing reliance on AI doesn't leave people with disabilities behind? What policies have to be in place to protect everyone? This week, I'm joined by Alex Givens.
Alex Givens is the president and CEO at the Center for Democracy and Technology, which works to promote democratic values by shaping technology policy and architecture. Alex is an advocate for using technology to increase equality, amplify voices and promote human rights. I'm thrilled to be joined now by someone I really admire. Alex, thanks so much for joining me today on Keep Me Posted.
Alex Givens: Thanks for having me.
Jen Carnig: There's been a massive increase in the use of algorithm-driven tools in the workplace, gaining even more steam throughout the pandemic, of course, but not all workers are considered or treated equitably by the various tools employers use. From hiring to on-the-job activities, what can you tell us about the impact these tools have on people with disabilities?
Alex Givens: There are a whole range of new tools being explored, and a lot of them, unfortunately, raise pretty significant concerns. One area that we've spent a lot of time looking at at CDT is the use of technology in hiring. Obviously that's hugely important because that's the onramp to economic opportunity. And what we're seeing is that as companies get more and more resumes, more and more candidates that they need to filter through, they're turning to automated tools to help speed up that process. That comes at a real cost, because a number of these tools essentially learn by studying existing employees within the company and looking for common traits amongst them, and then screening new applicants as to whether or not they have those same traits. That's a recipe for perpetuating exclusion and discrimination and existing inequities in the workplace that are already deeply ingrained within American society.
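To make the pattern Alex describes concrete, here is a minimal, hypothetical sketch of how a screening model trained on a company's current workforce ends up rewarding applicants who resemble that workforce. The feature names, data, and scoring below are illustrative assumptions, not any vendor's actual system.

```python
# Hypothetical sketch: a screener trained on "people we already hired" learns
# whatever traits dominate that group and scores down applicants who differ.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative features per candidate: [years_experience, attended_school_X, resume_gap]
current_employees = np.array([
    [5, 1, 0],
    [7, 1, 0],
    [4, 1, 0],
    [6, 1, 0],
])
past_rejects = np.array([
    [5, 0, 1],
    [3, 0, 1],
])

# Everyone already hired is treated as a "good" example; past rejects are the negatives.
X = np.vstack([current_employees, past_rejects])
y = np.array([1, 1, 1, 1, 0, 0])

model = LogisticRegression().fit(X, y)

# A new applicant with a resume gap (perhaps from medical leave) and a degree
# from a school the current workforce never attended gets a low score,
# regardless of whether either trait relates to the essential functions of the job.
applicant = np.array([[6, 0, 1]])
print(model.predict_proba(applicant)[:, 1])
```

Nothing in this sketch ever asks whether the learned traits relate to the essential functions of the job, which is exactly the gap the conversation turns to next.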
Jen Carnig: I mean, it's really stunning when you think about how it really can just sift out so many qualified candidates. And you've said that making hiring technology accessible means ensuring that a job candidate can use the technology and that the skills it measures don't unfairly exclude candidates with disabilities. Tell me about pymetrics and how AI hiring games — and it is called a game — can deeply discriminate against candidates with disabilities.
Alex Givens: Sure. So this is one of the types of tools that we're worried about, although I'm going to spend time telling you about other ones as well, like resume screening tools or recorded interviews. These games are increasingly being marketed as a way to look beyond the resume and judge candidates based on the traits or attributes they exhibit over the course of playing a game, but often there's a real problem from an accessibility perspective. I'll give you two examples. One popular company uses a game where they have somebody repeatedly hit the spacebar on their laptop to blow up a balloon, and what that is ultimately measuring, apparently, is your appetite for risk: you know, how inflated you let the balloon get before it risks popping. We can have a separate conversation about whether that's actually a real way of measuring someone's appetite for risk, and whether an appetite for risk is an attribute that we should be screening job candidates for.
Even before you get to those far bigger questions, there's the simple accessibility of the interface, right? What if I'm not able to hit the spacebar repeatedly? What if I have to do so slowly because of a mobility impairment, or because I'm visually impaired? Another one that's been written about in the literature, and that we've written about as well, shows candidates red and green dots as they surface on a screen, and the player is asked to press the spacebar only when they see the red circle. And the company says, well, this tests for focus and impulsivity and attention control. Again, we can stop to say, well, to what degree are those essential functions of the job, and is this the right way to test for them?
But the accessibility problem is obvious there: if you're colorblind, you can't see the difference between red and green. So there are some really fundamental questions about just, can people even work with the interface? Then there's the deeper question about what these games are actually measuring. Is the game truly a fair measure of that, and is that the right thing to be screening candidates for anyway? And all of that raises really complicated questions, and these tests end up excluding people who aren't able to properly engage with them.
Jen Carnig: So thanks for telling us about the games. What are some of the other tools being used?
Alex Givens: So some examples are resume screening tools. This is one of the tools that's gotten some of the most attention in popular culture. There are some famous examples of resume screening tools that looked at the common characteristics amongst current employees, and they found that the AI screening tool was punishing people if their name wasn't Jared and they didn't play high school lacrosse, because those had been the dominant characteristics in the main employee pool. So that has racial connotations and gender connotations, but you can also think about screening tools that punish people because there's a gap in their resume, perhaps because they took a very understandable medical leave at one point, or that don't recognize a university like Gallaudet, one of the most famous, excellent universities in the world, which happens to be accessible to people who are deaf. And so it might not be recognized as, you know, a prestigious institution that other employees of the company have gone to. So those are the resume screening tools. Separately, there's a suite of companies that emerged in recent years offering recorded video interviews as a way of saving time, so that you don't have an in-person interview.
And some of those companies purport to then do AI analysis of that video recording, presumably using natural language processing, looking at the phrasing and the words that you use. And some early iterations also very publicly said that they were looking at people's facial movements and speech patterns. That is, on its face, a massive risk of discrimination, not only against people with darker skin tones, where we know that facial recognition analysis does not work so well, but against anybody with a speech delay or facial paralysis, or who might be blind and therefore not making eye contact with the camera. A world in which somebody is getting judged on those attributes really is a problematic one, and one that we need to watch for. And then the final category is actually far more widespread and not particularly high tech, but very important, and this is the increasing use of personality tests, in particular for service-level, customer-facing jobs. There is plenty of evidence, at retail stores across the country, of potential employees being asked to rate statements like, on a scale of one to five, "I feel happy when I wake up in the morning."
Jen Carnig: Oh boy.
Alex Givens: Think for a minute about how that correlates to economic opportunity, to family burdens any person might have, let alone, of course, to mental health conditions that somebody may have. And again, what you're seeing in that instance is questions that probably don't actually relate to the essential functions of the job. Does a service employee need to wake up feeling happy every morning? No. They might have to act happy in the course of a customer interaction, a manager might be able to claim that, but that's very different from how you answer a questionnaire that asks your true feelings. So there really, again, are just so many hidden barriers and hidden assumptions embedded in these tests, and these are the types of things we're trying to raise awareness of.
Jen Carnig: So much to think about with that. And my background personally is with the ACLU, so I always go to the law. We have federal laws in place like the ADA, the Americans with Disabilities Act, and it sounds like from what you're sharing, that that is clearly just out of date. Can you just tell me a little bit about that and why the laws and protections we're supposed to have really aren't up to snuff right now and what you think needs to be done to ensure that we actually are protecting and promoting people's basic rights?
Alex Givens: Sure. So, you know, the funny thing about the ADA is that it has really powerful language. In fact, more powerful language, I think, in some respects than the Civil Rights Act and Title VII, which we typically think of in these scenarios of discrimination on the basis of race and gender. The Americans with Disabilities Act expressly says that you cannot use a hiring test that screens out or tends to screen out people with disabilities. So it looks very specifically at the design of the test, what you're testing for, and whether that's likely to have a disparate impact on a disabled person. It also says that hiring tests need to look for and assess someone's ability to perform the essential functions of the job. That's why I kept using that phrase in the previous examples. What the law is trying to do is make sure that in a job description you've really whittled down to what someone actually needs in order to succeed in the job, and then ask whether they have those traits.
And then the ADA also says, let's not just look at my assumption of what somebody who is blind or paralyzed can do; look at what they can do if their disability is reasonably accommodated. If they have a support animal, if they have an aide who is able to help them, if the pace of work is changed to better accommodate someone with ADHD. So all of those protections are actually already enshrined in the law, but at the same time, to your point, we're not seeing a ton of litigation over this. So clearly something is going wrong. I think one issue is that people may not even know that they're being discriminated against by a test. Oftentimes you engage with a test and you may not know exactly what is being assessed, or that it matters if you're hitting the spacebar at a slower rate than your non-disabled peer might.
So people might not even know that they're the victim of discrimination, to be raising concerns or asking for accommodations. And then, separately, vendors aren't thinking about what the off-ramp looks like. What happens to the person who is colorblind in this scenario, or who isn't able to engage? What does the plan B look like for them to be assessed in another way, which is what the law requires? But again, if people don't know they're being profiled, it's really hard. We know, just as a general matter, how hard it is to be a plaintiff filing a civil lawsuit against a massive employer. That is not an easy feat. So one of the areas that we focus on is really making sure that the Equal Employment Opportunity Commission is empowered in this space and looking closely at the types of concerns that can be raised. And then, of course, just reminding employers: it shouldn't take litigation or the EEOC pressing charges for employers to take these concerns really seriously, particularly at a time when employers are paying lip service to equity and inclusion and how important it is. Let's call them on that and say, if you mean it, then you are really going to give deep thought to what your onboarding and hiring processes look like, and you're going to think about the unintended consequences for marginalized communities.
Jen Carnig: It's so much to think about, and it really does, in many ways, just feel so big. How do you get a hold of it? You know, I know your organization, the Center for Democracy and Technology, is part of a really growing movement that's focused on the risks of bias embedded in algorithms. What can you tell us about the work you all are doing to ensure that people with disabilities are not overlooked within this movement?
Alex Givens: Well, as you say, we're part of a big community of civil society organizations, civil rights organizations, and others that have been focusing in on these questions. And we saw that work happening. We were part of it — CDT itself — but what we noticed is that the early academic scholarship on this, and the advocacy on it, focused overwhelmingly on racial and gender-based discrimination. That is a huge issue in this space, don't get me wrong. We need deep focus on it, but we can't focus on those forms of discrimination alone. We need to broaden the lens for the reasons that I gave, right, that these tests can equally be discriminating against people on the basis of disability. Think about discrimination on the basis of age, or about people with non-Western names or non-American accents, and how that might play out when software is analyzing your recorded video interview, for example.
So we need to take this broader frame. How do you do that in an authentic way? It is deeply important to me to honor the mantra of the disability rights community, which is "nothing about us without us." And so we, as a tech-first organization at CDT, have been very committed to partnering with leaders in the disability space to really raise awareness of these issues. We did it in part through hiring: we had a phenomenal disability justice advocate, who comes from that background and field and lives with disability themselves, join our staff to lead this work. We partner very closely with the American Association of People with Disabilities as joint thought partners, thinking about what the risks might be and trying to guide the advocacy on this. And then we spend a lot of time going out to educate other groups about it and to try to be good allies, particularly for the disability rights groups that are so busy fighting many other battles as well.
We tried to say, okay, well, in this space we want to be vocal. We want to help surface these stories and lift them up and raise broader awareness. And in terms of the types of things we're doing: fighting for change in the law is one thing that matters. I talked about urging the Equal Employment Opportunity Commission, the Justice Department, and other enforcers to pay attention to this, but we also need to spend a lot of time empowering candidates to know their rights. How might the software that is interviewing you actually be evaluating you? What questions should you ask? Do you need to ask for an accommodation? How do you go about doing that? Who do you go to if that wasn't an easy process for you? How do you report those types of concerns? And then we're spending a lot of time talking directly to the vendors, trying to have them make smarter design decisions, and to the employers, to say: ultimately, no matter what, the buck stops with you. You are running this hiring process. You need to make sure that you are treating your candidates fairly and equally, and thinking about what the unintended consequences might be.
Jen Carnig: Hearing about this movement that's growing, and that you all are an integral part of, can I ask: what is making you feel excited or hopeful right now? What do you see that lets you know that the collective working in this space is really on the right track?
Alex Givens: There's far more attention being paid to these issues than even a couple of years ago, and that is thanks to the large group of people, academic researchers and civil rights advocates, who began this work a long time ago and have been working to mainstream it. We're now seeing in Europe that there's major legislation on the use of AI moving through the system, and many of the examples it addresses are sensitive use cases like hiring. We see in the United States Congressmembers on both sides of the aisle and in both chambers asking real questions about this, perhaps considering legislation, but also talking about what the enforcement agencies can do. We've seen some really encouraging statements from commissioners at the Equal Employment Opportunity Commission about the work that they want to do in this space. So we're getting through to the right audience at the policymaking level.
Now, I'm a former Hill staffer. I've lived in the DC bubble for a long time. I am conscious that regulators talking about something doesn't actually mean that it is infiltrating the decision-making of individual businesses, and that's the part that needs to happen next. We are spending a lot of time with mainstream media outlets, trying to get them focused on these topics so that your average HR expert or CEO knows about the risks as well and can be an informed consumer when a vendor comes to them with a tool like this. They know what questions to ask, they know what the risks might be, and ultimately they know that they might be setting themselves up for legal liability as well.
Jen Carnig: I’d love to stop there. Alex Givens, it is such a pleasure. Thank you so much for joining me today on Keep Me Posted.
Alex Givens: Thanks for having me.
Jen Carnig: I want to again thank Alex Givens, President and CEO at the Center for Democracy and Technology. Next time on Keep Me Posted, I'll speak with Lydia X. Z. Brown. I hope you'll join us.
Until then, please follow the show on Twitter at @KeepMePostedPod and rate, review and subscribe to us on Apple Podcasts. Keep Me Posted is produced by Spitfire Strategies. Trendel Lightburn is our senior editor. Our production team is Gabrielle Connor, Ben Christiason, Maggie Cass and Nima Shirazi.
To learn more, visit us at spitfirestrategies.com.
The music is by UI and I'm your host, Jen Carnig. Thank you so much for listening. Stay well.