The emergence of generative artificial intelligence (AI) tools has captured our collective attention – dominating headlines, ad campaigns, legislative action and cultural commentary. The novelty and hype surrounding these tools have facilitated the increased integration of AI technology across many areas of our lives. Technology corporations and the billionaires backing them are talking out of both sides of their mouths – working to quickly advance these tools and marketing their potential to change the world, while also warning about the perceived, larger-than-life power of AI and the far-off existential risks the technology may pose to our society.
The truth is, AI is already causing real-world harm to people today, and those harms can be felt across nearly every aspect of our lives, in every region of the country and around the world.
When an issue such as the risks posed by AI is framed as rooted in the distant future or as impossible to solve (read: sentient robots and human extinction), it becomes difficult to hold accountable the people and companies responsible for current, real-world harms. If we allow Big Tech to keep distracting us with future robots and abstract AI issues, we miss opportunities to build urgency now for civil rights policy solutions and governance.
Advocates across the field of public interest technology who are calling for greater governance of AI, both generative and predictive, have named the importance of communicating its current, real-world harms. For leaders working to rein in the technology's use, Spitfire created a resource to help communicate the harms that AI is already facilitating throughout our society.
This fact sheet, “Stories and Talking Points About the Ways AI is Creating Harm, Today,” includes talking points and story examples of current, real-world harms from AI. It is a tool you can incorporate into your presentations, writing, interviews, briefings, hearings and other communication materials. Download the resource below.