CODING.CARE:
GUIDEBOOKS FOR INTERSECTIONAL AI
by
Sarah Ciston
A Dissertation Presented to the
FACULTY OF THE USC GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
CINEMATIC ARTS (MEDIA ARTS AND PRACTICE)
May 2024
Copyright 2024 Sarah Ciston
Acknowledgments
Community-based practice is the accretion, translation, and distillation of many voices across
communities—who have shown me how to do this work, what the work is, and how to be in
community more gracefully and more radically. For the last few years, I have been incredibly
lucky to have space to think, build, and scheme abundantly. The material traces of this space can
be found in the shared PhD workspace, SCI 211, where projects in progress join projects past:
crochet blankets, acoustic dishes, spray paint, soldering irons, origami kites, a table full of wood
shavings and collage supplies, cables upon cables upon cables. This space overflows in ways that
barely hint at the joys and struggles of creative research, scholarly rigor, personal and collective
trials, and the occasional faculty–PhD dance-off. Like SCI 211, an acknowledgments page may
appear to be a scattered collection, easily dismissed. But the acknowledgments are the
foundations of this text, the forces that make possible all the efforts described within.
To my dissertation committee, who affirmed this work in all its forms and pushed it
forward with sharp, specific feedback, I am incredibly honored to be part of your communities:
To Holly Willis, thank you for fourteen semesters as a steadfast mentor, expertly moving
from queer handmade praxis to generative AI to bookbinding and back again. I’m grateful to be
able to connect as writers, makers, and researchers. You continually teach me how to look and
listen and offer with more nuance and grace. Thank you for expanding my research practice and
describing it with such clarity, even when I’d call it a flailing tube man. Thank you for being a
subtle mountain-mover and fierce advocate for students and for weird art.
To Tara McPherson, thank you for showing me how to live my values as an academic and
citizen, how to create new tools for thinking differently, and how to teach and think with both
intellectual precision and absolute whole-heartedness. Your advice is always spot-on and
delivered with all necessary dry wit. Thank you for modeling polymathic, polyvalent feminism in
the software lab, in the classroom, in scholarly journals, and in the streets.
To Rita Raley, thank you for our in-depth discussions of publishing, practice, and
tactics—from technical details of natural language processing to poetic affordances of electronic
literature to forms of resistance. I am grateful for the chance to think deeply with you across all
these modes and for your guidance in combining them.
I am also grateful to Scott Fisher and to Alice Gambrell for your contributions and insights
as part of my qualifying exams committee, which laid the foundation that helped this work take
shape and grow. Thank you, Scott, for encouraging me to explore generative AI through MEML
projects and for making your classroom a space of broad experimentation. Thank you, Alice, for
inspiring discussions about analog–digital publishing and for seeing something in my work that
made you encourage me to go to grad school all those years before I did.
It is hard to explain the magic of Code Collective, which has been both the support system
for this work and simultaneously its inspiration. Thank you to everyone who has been a part of
Code Collective, including Katherine Yang, Elea Zhong, Ada Toydemir, Samir Ghosh, Garrett
Flynn, ender, Robby Feffer, Jay Borgwardt, Karen Abe, Amy Wu, Bita Tanavoli, Katie Luo, Katie
Liu, Johans Saldana Guadalupe, Nicole Carrara, Tim Zhai, Tia Kemp, Kyle Ang, Adena Park, Kris
Yuan, Remi Wedin, Dani Takahashi, Ilona Bodnar, Emily Fuesler, Carey Crooks, Lee Thiboudeau,
Nico Pizzati, Ethan Kurzrock, Sylvie Howton, Allison Ross, Bill Russell, Zachary Mann. Whether
or not your name is here, whether you joined as an in-person regular or as a curious online
visitor, thank you each for reminding me how valuable Code Collective is as a community we
make and keep making together.
Thank you also to Dr. Elizabeth Ramsey for your unflagging support of that wild idea. I
could not have made work about care without a caring place like MA+P, and you embody that
care. I am grateful for your presence at every event, whether supporting zine club, organizing the
affective computing reading group, or including me in your courses. Thank you for teaching me
the secret to fostering community (it’s snacks) and for helping me foster it more.
Thank you to Dave Lopez for making all our iMAP ideas materialize, to Sonia Seetharaman
for making those ideas organized and beautiful, and to Stacy Patterson for keeping us well-fed
and funded. Thank you each for navigating so much institutional bureaucracy on our behalf.
Abundant thanks go to the teachers who taught me about teaching, mentorship, and
community, among them Gabe Peters-Lazaro, Evan Hughes, Safiya Noble, Jeanie Jo, and Anna
Joy Springer. Thank you to Qianqian Ye, who welcomed me into the p5.js community by letting
me know I was already part of it. Thank you to Brett Stalbaum, who knew I could code before I
did and pointed out a scrappy path. Thank you to Vicki Callahan for generous dissertation
support, both emotional and editorial, and a zen approach to critical theory. Thank you to Steve
Anderson for so much heart-centered real-talk about academia. Thank you to Mark Marino,
co-author, collaborator, and friendtor, for introducing me with unlimited bravado and including
me in your brilliant schemes.
I also wish to acknowledge the additional years of PhD support I received from USC
Mellon Humanities in a Digital World Program, which allowed me to expand this research and to
design my own course, with special thanks to Amy Braden, Will Cowan, and Sean Fraga.
At the Knowing Machines research project, I offer warmest thanks to Kate Crawford for
the opportunity to think with you across disciplines, space-time dimensions, and formats. I am
grateful for your generosity, trust, encouragement, and discerning eye—as a mentor and editor,
and also as an artist and researcher. Thank you to Mike Ananny for rigorous, kind, and
considered perspectives on critical datasets and for your generosity of spirit in convening MASTS
and AIMS. Thank you to Vladan Joler and Olivia Solis for thoughtful, big-hearted conversations
and insights into visual storytelling. Thank you to every member of Knowing Machines for
sharing a friendly, dynamic, interdisciplinary intellectual community: Christo Buschek, Franny
Corry, Melodi Dincer, Annie Dorsen, Hannah Franklin, Ed Kang, Jake Karr, Sasha Luccioni, Will
Orr, Jason Schultz, Hamsini Sridharan, Jer Thorp, and Michael Weinberg.
I am so thankful for the generous support of the Akademie der Künste Berlin. Thank you
to Clara Herrmann and Nataša Vukajlovic for curatorial genius, creative support, and friendship;
to Maya Indira Ganesh and Nora Khan for AI with care and without BS; to my fellow AI Anarchies
Fellows D’Andrade, Walla Capelobo, Sara Culmann, Petja Ivanova, Sonder, Tin Wilke, Lara Fong
Prosper, and especially Pedro Oliveira and Aarti Sunder for countless process chats over coffee. I
am grateful for the hospitality of the staff and residents at Zentrum für Kunst und Urbanistik,
with particular thanks to Anita Rind, Mandy Seidler, Carmen Santesmases, Vaas de Wit, Ariane
Boucher, Noa Enghardt, and Agnes Schyberg for all your tireless work turning a transitory public
space into a creative flourishing home.
At the Humboldt Institute for Internet and Society, thank you to Daniela Dicks and
Theresa Züger, to the AI & Society Lab, and to everyone at HIIG for welcoming me as a Fellow
and for supporting the Intersectional AI Toolkit in its development and hosting its launch. Thank
you also to everyone who has participated in Intersectional AI Toolkit zine-making workshops
from 2021–2024 at HIIG, USC, Mozilla Festival, re:publica, transmediale Vorspiel (Halfsister),
Akademie der Künste, ZK/U Berlin, and Ars Electronica Founding Lab Summer School.
Thank you to the members of the USC Libraries’ “Visual Datasets for Inclusive Research”
project, including Mike Jones, Bill Dotson, Curtis Fletcher, Caroline Muglia, Hujefa Ali, and
Manasa Rajesh for sharing your knowledge, for encouraging intersectional perspectives, for
planting the seeds for future research projects, and for being the happiest people on Zoom.
To Alena Giardina, thank you for making the roof over my head a neighborly home and a
launchpad for new endeavors. I am humbled and continually inspired to pay it forward.
Where would I be without Fidelia Lam and Lisa Müller-Trede, my weirdos-in-arms, and
my fellow iMAP PhDs: Catherine Griffiths, Emilia Yang, Triton Mobley, Szilvia Ruszev, Noa
Kaplan, Romi Morrison, Ben Nicholson, Kumi Iman, Luke Fischbeck, Sultan Sharrief, Dana Dal
Bo, Chantal Eyong, Faye Yan Zhang, Michelle Salinas, Selwa Sweidan, Curtis Tam, EvB, Andrea
Kim, Zeynep Abes, Ellie Schmidt, Sichong Xie, Halo Starling, David de Rozas, David Kelley, Karl
Baumann, Biayna Bogosian, micha cárdenas, Brian Cantrell, Laura Cechanowicz, Behnaz Farahi,
Todd Furmanski, Aroussiak Gabrielian, Samantha Gorman, Juri Hwang, Geoff Long, Clea Waite,
Jeff Watson. Thank you for making SCI 211 a place to belong in transdisciplinary eccentricity.
I am grateful to many creative, intellectual connections across and beyond institutions,
whose perspectives enhanced this work: Ashley Dailey, Ariana Dongus, Katy Gero, Mario
Guzman, JH, Shawné Michaelain Holloway, Olivia Jack, Swantje Lichtenstein, Maurice Jones,
Christine Meinders, Miller Puckette, Lubna Rashid, Tiara Roxanne, Diana Serbanescu, Lena
Wegmann, and Xin Xin. For endless, uncategorizable, above and beyond support, deepest thanks
go to Åste Amundsen, cypress masso, Emily Martinez, Leigh Montavon, Jason Perez, Susan Ring,
Rob Shafer, Tavia Stewart, and Isabel Wanger (all the information is on the task). With profound
love to my family, especially Anthony Ciston, Tony Ciston, and Mary Jo Fisher.
Table of Contents

Acknowledgments
Abstract
1 Introduction
1.1 Hello, World => Another World Is Possible
1.1.1 Why AI Alone Can’t Create the Worlds We Want
1.1.2 AI Needs Critical, Intersectional Approaches
1.1.3 Critical, Intersectional AI Needs Creative, Care-full Approaches
1.2 An Invitation
1.2.1 Coding.Care: Field Notes for Making Friends with Code
1.2.2 Crafting Queer Trans*formative Systems, a Theory in Process
1.2.3 A Critical Field Guide for Working with Machine Learning Datasets and Inclusive Datasets Research Guide
1.2.4 The Intersectional AI Toolkit
1.2.5 Interstitial Portals
References
Abstract
Critical AI researchers appreciate the urgent need to understand and rethink how AI systems are
defined, developed, and regulated, and how their harms are mitigated. Coding.Care: Guidebooks for Intersectional AI
argues that integrating critical approaches into AI systems requires building inviting, inclusive
spaces. It requires space for more people to engage creatively and critically with each other, and
to engage creatively and critically with machine learning as a set of practices and malleable
materials. This project demonstrates craft-based, process-oriented approaches to AI that can
help meet this challenge.
The dissertation presents guides for fostering critical–creative coding communities as
spaces of radical belonging—activated by an ethos and a politics modeled by queer and trans*
communities that embrace radical difference. This creates a basis for deep interdisciplinary
thought, interrogation of formative principles, and an openness to co-creation and alternative
forms necessary to reimagine AI.
With these approaches, it becomes possible to reengage emergent technologies as craftable
materials, rather than unassailable forces, and to respond with impactful, sustainable,
intersectional interventions. Artists, activists, scholars, and technologists can recast our
relationships to emerging technologies by reframing them as crafts like crochet—deflating AI
hype, lowering barriers to learning, building community, and emphasizing sustainability and
process. Thinking craft-as-technology honors the inherited knowledge of many outsider
communities. Thinking technology-as-craft provides a framework to implement those theories,
ethics, and tactics as intersectional critical AI.
Coding.Care collects five publications that enact the strategies it theorizes. These works
address and unite a wide range of communities by relying upon different voices, forms, formats,
and media. They address various stages and processes involved, including machine learning
datasets, intersectional AI, coding communities, and embodied algorithmic art:
Coding.Care: Field Notes for Making Friends with Code gives readers courage to pick up
unfamiliar tools, find resources to kick off a new programming project, pose questions critically,
or solve problems creatively. Its pocket guide discusses how to build a cooperative,
interdisciplinary community for co-learning coding like the one I have facilitated since 2019.
Crafting Queer Trans*formative Systems: A Theory in Process grounds the surrounding works
in an alternative approach to AI systems. It details a theory for the handscale process-based
methods, the queer and trans* embodied ethos, and intersectional tactics I use for working
toward AI systems as crafted, in-process materials. The Intersectional AI Toolkit’s co-authored
zines are accessible guides to both AI and intersectionality, bringing together artists, activists,
academics, makers, technologists, and anyone who wants to understand the automated systems
that impact them. The work argues that incorporating established but marginalized tactics is
necessary to reimagine more critical and equitable machine learning. A Critical Field Guide for
Working with Machine Learning Datasets translates critical AI theories and data science
concepts into practical tips for dataset stewardship. Along with the Inclusive Datasets Research
Guide, it combines technical skillbuilding and critical thinking for scholars and practitioners
beginning to work with large datasets. Lastly, a collection of essays imagines dialogues with five
20th century artists, using the artists’ analog practices as pre-responses to the contemporary
digital concerns raised by living in an algorithmic age and across this collection.
Together, the guides in Coding.Care want to meet readers where they are—as
non-academics or those bridging into new fields, looking for a common vocabulary to engage
conscientiously with the urgent concerns amplified by AI systems. Because of this, they reach
beyond this dissertation format into continually evolving forms, and they can be found at
https://coding.care. Coding.Care offers an expansive invitation: to deepen interdisciplinary
conversation, to apply intersectional approaches, and to rework AI systems from
critical–creative–caring perspectives.
1 Introduction
“I’m stuck here inputting and outputting the data of a story I can’t change.”
–Italo Calvino, “The Burning of the Abominable House” (1976)
1.1 Hello, World => Another World Is Possible
I came late to both queerness and coding. Perhaps you can relate. Both delays were rooted in
scarcity and fear. I did not know, and could not access, the variety of ways to be queer or to be a
coder. Once I learned some ways, I still worried that my version (of queerness, of coding) was
“not enough” to count.
My interest in code and in AI systems grew from wanting to understand how I might tell
stories in new forms using and building digital tools, to write books that expanded books. I was
conscious of the growing impacts that automated systems were having on my voice and on others
who might also feel left out. I spent a long time struggling to learn in tech (and queer)
communities without feeling I belonged. I did not yet know how much each of these communities
would shape my understanding of the other.
Over the last decade, I have begun developing the approaches shared across the works
collected in Coding.Care: Guidebooks for Intersectional AI as practices of exploration and
investigation, experimentation and imagination, repair and resistance toward and with
technologies like machine learning. These are practices that ask how to better use technology
toward relationality and building the systems and worlds we want.
No intervention in disproportionately harmful algorithmic systems is effective without
critically aware approaches to technologies from deeply plural perspectives. Meanwhile, no such
proliferation of perspectives is possible without inviting spaces to understand, interrogate, and
reimagine the infrastructures that support those systems. Coding.Care argues for the essential
entanglement of critical, intersectional AI approaches and creative-critical coding communities,
demonstrating how each needs the other. It shows what intersectional, interdisciplinary,
creative–critical approaches to AI systems and other emergent technologies can look like. Its
multimodal guides apply these approaches as in-practice experiments—in different contexts, for
different audiences, for different aspects of these urgent issues.
The stakes are high: technologies like machine learning urgently require transformative
interventions that recalibrate these systems’ values and their stakeholders. Automated
decision-making systems disproportionately harm the marginalized majority (Benjamin, 2019;
Browne, 2015; Buolamwini & Gebru, 2018; Noble, 2018). The communities most impacted by
them, and best poised to intervene, currently go unheard as powerful actors profit from their data
and labor. Large companies warn of hyperbolic coming dangers in order to distract from the
clear, current dangers they perpetuate (Davies et al., 2023; Kapoor & Narayanan, 2023; Nedden
& Dongus, 2017). Already-toothless policy recommendations are watered down and ignored
(Heikkilä & Ryan-Mosley, 2023). Meanwhile, more and more data represent less and less
diversity (Bender et al., 2021), more and more processing power destroys more and more planet
(Dodge et al., 2022). Glib fun with AI on one side of the world relies on extractive labor for
pennies a day on the other (Perrigo, 2023). But why must it be this way?
“We must begin with the knowledge that new technologies will not simply redistribute
power equitably within already established hierarchies of difference. The idea that
they will is the stuff of utopian naivete and technological determinism that got us here
to begin with.” (Sharma, 2020)
We cannot expect technology to solve the problems of technology. To face these
challenges, those who already have access and aptitude with tech must embrace a wider range of
essential perspectives from the marginalized majority. In order to change systems to suit more
communities, these communities must be able to participate on their own terms. Working
together with critical engagement, we as users, makers, scholars, and arbiters of tech can reclaim
more collective agency and access by considering tech at tangible scales while also grappling with
its systemic impacts. Because learning programming and engaging with tech is often intimidating,
we must reclaim technology as a widely accessible craft. We can transform technologies
themselves rather than accepting their current shapes:
“Anyone who has ever woven or knitted knows that one can change patterns […] but,
more importantly, they know there are other patterns. The web of technology can
indeed be woven differently, but even to discuss such intentional changes of pattern
requires an examination of the features of the current pattern and an understanding
of the origins and the purpose of the present design.” (Franklin, 2004)
I argue that such changes in tech require fostering critical–creative coding communities as
spaces of radical belonging. I have seen and helped build spaces that effectively shift
conversations, implement cooperation, and produce innovative tools. What distinguishes such
spaces is their emphasis on process and materiality—a set of practices I trace back to technology
as handcraft—and care and relationality—an ethos of interdependence and radical difference I
find modeled in queer, trans*, and intersectional feminist spaces. Craft methods become
expressions of these theories, ethics, and tactics as enacted through technology. Together, these
practices help build approachable spaces that provide a basis for the deep interdisciplinary
thought, interrogation of formative principles, and openness to co-creation necessary to reshape
AI systems. Finding common vocabularies across diverse communities, using these combined
approaches, makes it feasible to reengage emergent technologies as craftable materials, rather
than unassailable forces, and to respond with impactful, sustainable interventions.
1.1.1 Why AI Alone Can’t Create the Worlds We Want
A large interdisciplinary community is pushing for understanding, critiquing, and rethinking how
we define, develop, deploy, regulate, use, and mitigate the effects of machine learning tasks,
datasets, models, algorithms, architectures, and agents, which we collectively and nebulously
understand as ‘artificial intelligence’ or ‘AI’ systems.1 Tech industry implementations of so-called
ethical AI reduce complex concepts into flattened ideas of fairness and representation (Ovalle et
al., 2023). Machine learning, as a form of mass production, produces a false sense of certainty out of
uncertainty, argues political geographer Louise Amoore (2020), describing the millions or
billions of small uncertainties that it reduces and presents as acquired knowledge. As artist and
researcher Mashinka Firunts Hakopian (2022) points out, these uncertainties are also claims
about “what we should know, how we should know what we know, and how that knowledge
should be deployed. Each exposure to a dataset occurs because someone concluded that the
information in that dataset should be used to determine a possible future.” In short, current
versions of AI systems, as well as current attempts to improve them, end up using reductionist
logics to limit both knowledge and the values that structure it.
1. I say ‘AI systems’ or ‘machine learning’ to refer collectively to an entire set of tasks and operations including and especially the human decision-making involved at every stage of this pipeline. Despite the term ‘pipeline’, it is a non-linear set of processes, conventions, and histories that feeds back into itself and is rarely straightforward.
While the need for AI oversight is clear, many are calling for a total overhaul that moves beyond audits and inert critique; computer scientist Joy Buolamwini (2017), for example, argues for what she terms algorithmic justice. Overhaul and even oversight can appear out of reach when
the scale of AI systems seems unfathomable and entangled. The valid criticism that algorithmic
systems are biased because their data are biased—often summed up as “garbage in, garbage
out”—sets up a quest already doomed to fail. Yes, in many cases it would be preferable to have
more, better data. But what would be better data? Or an optimized system? For what goal, and
for whom exactly? There is no ‘unbiased’ data or system; there is only ‘good enough’, and only for
some tasks, in particular settings. As digital media researcher Yanni Alexander Loukissas (2019)
says, “all data are local,” meaning they come from specific contexts and are shaped by human
processes into ‘data’ and produced as ‘datasets’. There are many ways to do any task, informed by
minute choices at every step. As these choices scale exponentially with computation, their
impacts magnify exponentially too.
Yet local data do not scale up in ways that are well-suited to the massive tools currently
being created by AI companies. These generative AI models now frequently rely on foundation
models, because models have become so large and because the processes for training new models
have become so slow and expensive. Foundation models are previously built models, usually
designed for different or ‘general’ tasks, which are then used as the building blocks for new
models. Like a sourdough starter, foundation models carry with them the histories of how their
datasets were designed and for what purpose. They retain marks of whose data was included or
excluded, and the choices their creators made when preprocessing them. The latest systems still
rely on often decades-old datasets that leave traces of debunked or erroneous information in their
outputs. Their errors are compounded by the computational speeds that allow thousands of
operations to run per second.
AI hype and automation hype help normalize, naturalize, and neutralize these very
subjective decisions. Critical AI researcher Kate Crawford (2021) argues, “We can easily forget
that the classifications that are casually chosen to shape a technical system can play a dynamic
role in shaping the social and material world.” These classification choices emerge from material
culture and feed back into it, and they cannot be solved with technical tweaks. Artist Hito Steyerl
(2023) argues, “the supposed elimination of bias within datasets creates more problems than it
solves. The process limits changes to parts of the output, making these more palatable for
Western liberal consumers, while leaving the structure of the industry and its modes of
production intact.” Thus, “fixing” inputs or outputs does little to address the structures of the
systems (both cultural and technological) which produced them. When the stakes are this high, no data could be ‘good enough’ to justify life-or-death decisions.
systems for infallibility and rationality, nor can we look uncritically to these technologies as
bandages for the problems they exacerbate.
Practical applications of these critiques and calls for change have remained incredibly
difficult to implement, and it is easy to feel like actively rebuilding the foundational structures of
AI systems is out of reach. How do we get there? We need hands-on, intersectional strategies!
1.1.2 AI Needs Critical, Intersectional Approaches
Intersectional lenses can reveal the tangible human and more-than-human costs entangled in
algorithmic systems: from proliferating data and its material infrastructures, to consolidated
power and its sociocultural infrastructures. In 1977 the Black feminist Combahee River Collective
called for “integrated analysis and practice based upon the fact that the major systems of
oppression are interlocking” (Moraga & Anzaldúa, 1981, 1983). Named intersectionality by
Kimberlé Crenshaw (1989), and emerging from centuries of work by women of color (Haschemi
Yekani et al., 2022), intersectional analysis of institutional power is often misinterpreted as
individual identity politics. In fact, intersectionality “critiques systems of power and how those
systems structure themselves to impact groups and individuals unequally” (Cooper, 2016).
Crenshaw (2021) argues that intersectionality is as useful for understanding privilege as it is for
understanding marginality. Intersectional analysis shows that power is differential by design; it
reveals the inequalities within inequalities and asks that “communities and movements are
inclusive of differences and work together towards equality” (What Is Intersectionality, n.d.).
Conversations about AI fairness, transparency, explainability, ethics, public good, and the hype
cycles of new technologies are grossly incomplete without intersectional analyses of power and
intersectional tactics of (and beyond) equity and inclusion. No change about us without us.
Intersectional AI (Ciston, 2019) calls for demystifying normative AI systems and learning
from a wide range of marginalized ethics and tactics, in order to fundamentally transform AI. It
requires multimodal, polyvocal, experimental approaches that cut through technological
solutionism. It requires slow, long-term investments in algorithmic justice, rather than
extractive, performative forms of inclusion that erase friction, context, and agency as they scale
up for machine learning tasks (Buolamwini & Gebru, 2018; Sloane et al., 2022). Intersectional AI
celebrates and documents the work done by many related efforts that call for diverse knowledge
systems to be incorporated toward reimagining machine learning as a more care-ful set of
practices, including but not limited to abolitionist (Earl, 2021), anticolonialist (Chakravartty &
Mills, 2018; Rights, 2020), antiracist (Abebe et al., 2022), crip (Hamraie & Fritsch, 2019),
Indigenous (CARE Principles of Indigenous Data Governance, n.d.; “INDIGENOUS AI,” n.d.;
Lewis et al., 2020), intersectional (Ciston, 2019; Klumbytė et al., 2022), feminist (Feminist.AI,
n.d.; Sinders, n.d.), neurodivergent (Goodman, n.d.), and queer and trans* ways of knowing
(Barnett et al., 2016; Keeling, 2014; Klipphahn-Karge & Koster, n.d.; Martinez & Ciston,
n.d.).
Simultaneously, a growing field of critical AI studies2 has been using interdisciplinary
techniques to analyze the pitfalls of existing AI methods and to argue these cannot be addressed
with technical improvements alone. Critical AI is distinguished from tech industry approaches
like “AI for Good” or “AI for Society,” which can lack critical perspectives on AI’s impacts, despite
an intended altruism. Critical AI researchers and professors Rita Raley and Jennifer Rhee (2023)
argue that AI makers and researchers need to engage these systems as sociotechnical objects
embedded in their historical, social context. They argue that we must be “situated in proximity to
the thing itself, cultivating some degree of participatory and embodied expertise, whether
archival, ethnographic, or applied.” This level of engagement requires interdisciplinary and
intersectional perspectives in order to permeate the entire AI pipeline, transforming it altogether.
Critical AI research is often paired with the urgent calls for alternative approaches and knowledge
systems to be applied to machine learning discussed above. Yet, importantly, none of these mixed
methods of analysis and intervention has yet been adopted widely into standard machine
learning practices, even as the use and awareness of AI escalates and its issues grow more urgent.
Throughout this project, I adopt the term ‘trans*formative’ because it attends to root
causes and radical alternatives, suggesting intersectional queer and trans* imaginaries that
embrace radical difference and radical belonging. Here this means seeking fundamentally
reworked alternatives to the computational logics that perpetuate harmful systemic inequities. It
2A field can have many names. What do fields owe to each other? Here I use Critical AI as an umbrella for much work
being done across Science and Technology Studies, Arts and Humanities, Computer and Data Science, Fine Arts,
and elsewhere, in relation to these questions. I acknowledge the critical perspectives that have long been situated
within computer science, such as Phil Agre’s (1998). I draw from critical data and dataset studies (Corry et al., n.d.)
and from critical algorithm studies (Gillespie & Seaver, 2015), from software studies (Chun, 2021; Cox & McLean,
2013; Wendy Hui Kyong Chun, 2008) and from critical code studies (M. Marino, 2006). I cannot promise to be
comprehensive, but I do hope to connect some important, related conversations happening across a wide range of
fields.
means beginning with “positive refusals” and radical alternatives that, as creative and social
computing scholar and activist Dan McQuillan (2022) suggests, “restructur[e] the conditions that
give rise to AI.” He and many others have shown these systems are direct products of the cultural
and historical conditions in which they are embedded. Therefore, transformation must go beyond
isolating AI’s problems, when it is itself an expression of broader problems. Rhetorician Adam
Banks (2006) argues for what he calls “transformative access” to digital technologies, saying that
access is more than owning or using tools, participating in processes or even critiquing their
failings. Transformative access is “always an attempt to both change the interfaces [where people
use that system] and fundamentally change the codes that determine how that system works.”
Banks highlights the important role Black people play in technology’s transformations, saying,
“Black people have hacked or jacked access to and transformed the technologies of American life
to serve the needs of Black people and all of its citizens.” Supporting the needs of people who are
pushed to the margins is essential in its own right and frequently leads to more effective support
for many others as well (Costanza-Chock, 2020). Access and agency with technology are not a
favor or a handout; more encounters across communities with different backgrounds and
knowledge, engaging and imagining different possibilities for technology, benefit the collective.
Hyped AI discourse explores limited questions about AI because it continues to draw from
limited perspectives, letting normalized narratives about humans and automated systems frame
the terms of debate. “Machine learning […] is an expression of that which has already been
categorized,” says digital culture researcher Ramon Amaro (2022). Failing to train on a
variety of faces, or failing to question training on faces at all, happens because there is
no variety of perspectives in the room when these very human decisions are being made:
before, during, and after the data are being collected; and before, during, and after the code is
being written and run. The spaces where technologies are discussed, designed, and implemented
are missing essential perspectives of those pushed to the margins, who are most capable of
addressing the concerns facing technology now. These concerns are not new, nor strictly digital.
Such questions—about representation, equity, ethics, and more—have been addressed by a
wide range of communities with different types of knowledge for centuries. Yet intimidating,
isolating cultures around the specialization of computation and programming practices have left
so many people out of these conversations. Down to the very language chosen to describe it, the
seemingly neutral choices about technology reinforce narrow cultural ideas about it. Specialized
terms exclude, and imprecise terms obscure.3 Curator and critic Nora Khan says of explainability
in AI:
“the brutal realities of algorithmic supremacy are often contingent upon its
mystification and its remove. We can map a growing hierarchy of computational
classes of ‘knowers’ versus those without knowledge, those with more access to the
workings of technology, those with partial access, and those with nearly none.” (2022)
With the goal to imagine different systems, code literacy should not be defined only on the
narrow terms of those creating existing systems (Vee, 2017). It is clear that bootcamps and hiring
initiatives, though useful, do not result in more variety of voices in effective positions of change
(Abbate, 2021; Dunbar-Hester, 2019; Hicks, 2021; Vee, 2017). Joining an ‘elite’ tech field is a
moving target, entangled with race, gender, and economic politics that rig the game. Diversity
quick-fixes do not acknowledge the many people already participating in the production of
technologies in the global majority, from those harvesting rare earth minerals and circuit board
3For example, a machine isn’t ‘learning’ like a person learns in the process of ‘machine learning’. In ‘natural language
understanding’ tasks, nothing is ‘understood’ the way you or I understand. Generative AI systems are now said to
‘suffer’ from ‘hallucinations’, but all these terms signify purely mathematical operations happening under the hood.
A lot of math is happening very fast, but it is still just math. In an example from cognitive robotics professor Murray
Shanahan (2023), generative systems like GPT do not ‘know’ that Neil Armstrong went to the moon, but only that the
characters representing the word ‘moon’ are highly likely to follow Armstrong’s name in a text. GPT’s results hinge
on next-word prediction, using simple mathematical functions that are decided on and adjusted by its designers.
Humanities scholar Francis Hunger (2023) suggests new terminology to replace these misnomers, offering instead
‘machine conditioning’ to suggest the active role of designers in adjusting, tuning, and producing these systems.
manufacturers (AI Now Institute, n.d.; Crawford, 2021; Nakamura, 2014) to the content
moderators (Roberts, 2016) to the crowd workers (Sunder, 2023). Communications scholar
Christina Dunbar-Hester (2019) calls for interventions that go deeper than training more people
in tech jobs, pointing out that this also does too little to examine the structures that organize and
value existing work sectors. She argues this calls for “a larger reevaluation and appropriation of
categories themselves—the boundaries of what is ‘social’ and what is ‘technical’ are flexible
categories.” Part of reworking inclusion involves acknowledging how many more people are
already engaged with sociotechnical practices and impacted by them, as user-practitioners, data
subjects and subjectees (Ciston, 2023), skilled crafters and critics. It requires rethinking access as
mutually beneficial connection across communities of practice, by finding shared spaces and
common vocabularies.
1.1.3 Critical, Intersectional AI Needs Creative, Care-full Approaches
How do we reconnect the communities of practice who are currently building technologies and
those who are equipped with the skills and knowledge necessary to transform them?
Coding.Care argues for craft-based, process-oriented, community-driven approaches to AI
that can better help meet this challenge. It demonstrates that implementing critical approaches
into AI systems more broadly requires building inviting, inclusive spaces where more people can
engage both creatively and critically with each other and with machine learning techniques as
malleable materials.
As important as the criticisms of current AI systems are, effective critique should be
embedded and actively practiced in order to produce meaningful alternatives. Abstract calls for
access are not enough; to create truly transformative alternatives, we need spaces that
acknowledge diverse capacities across backgrounds and disciplines, while connecting us in
shared goals. Caring, creative, and critical approaches must be combined in order to adapt these
conversations and these technologies to welcome different communities and the wider range of
knowledge necessary for such transformation.
‘Critical–creative coding’ is the term I use to describe combining the application of existing
critical AI approaches with creative coding, an existing set of artistic approaches to software and
the diverse communities surrounding these approaches. Creative coding has long lineages that
can be traced to 1990s net art, analog fine arts, and early computation. Like ‘tactical media’
(Raley, 2009) and ‘critical engineering’ (Oliver et al., 2011), critical–creative coding creates
software and other technological objects, not only for their aesthetic qualities, but in order to
investigate them as objects of study and critique. Like ‘critical making’ (Bogers & Chiappini, 2019)
and many hacklabs and makerspaces (Dunbar-Hester, 2019), critical–creative coding considers
the tools it takes up and the community practices which root it. For example, as an intervention
into deep learning algorithms, designer and researcher Catherine Griffiths argues for “reflexive
software development” that critically considers and interactively presents the circumstances of its
own production (Griffiths, 2022). Such research can produce tools that continue to probe their
research questions, both through the very processes of their creation and through their later use
by others. Critical–creative coding consolidates and combines a collection of practices from this
wide range of critical and creative spaces. Building on these inspirations, I find it essential in my
own practice that critical–creative coding also emphasizes care, co-learning, and
process—practices learned from crafting communities and from queer and trans* communities.
Process-oriented (creativity, craft) and radical belonging (care) practices are at the core of
this approach. Care and creativity are not additions, affectations, or antonyms to critical theory
or technical savvy; rather, they are its central fortifications. Importantly, critical–creative is not a
binary to straddle but a deeply integrated way of knowing. Critical methods activate creative
modes and root them in sociotechnical complexity. Creative methods in turn activate critical
modes and root them in care and connection, taking the critical out of the abstract and into
action. These strategies are interlocking and have found disciplinary grounding internationally,
sometimes called arts-based research, artistic research, design research, or research–creation
(Fournier, 2021; Loveless, 2019; Willis, 2016). I find that artistic research has the capacity to
combine rigorous scholarly investigation, deep community building and activism, material
artistic experimentation, and queer and creative play in ways that facilitate connections across
broader non-academic audiences. In short, the technical means (coding skills), the analytical
means (analytical, political, aesthetic, ethical contexts), and the material means (data, energy,
hardware, infrastructure, care) are inseparable and essential.
Code work is critical work is care work is creative work.
Code can do, mean, and be so much more—if we let it. Code is collaborative, says Mark C.
Marino (2020), who helped develop the practice of Critical Code Studies, which applies the close
reading attention of the humanities to the deeper understanding of software code. Code can be an
inviting, interpretive practice: “Code’s meaning is communal, subjective, opened through
discussion and mutual inquiry, to be contested and questioned, requiring creativity and
interdisciplinarity, enriched through the variety of its readers and their backgrounds,
intellectually and culturally.” Such collaborations are vital to the creation of hybrid communities
capable of applying the interdisciplinary, intersectional capacities of programming.
I theorize this overall approach as “crafting queer trans*formative systems” or simply
“coding care.” As these guides demonstrate, process-oriented, craft-focused practices and
intersectional queer care practices make room for in-betweenness—for rejoining the divisions
between theory and practice, and between user and programmer, which were artificially split
from the start (Artist, n.d.; Hoff, 2022; Nardi, 1993); for un-siloing domains and disciplines, the
artificial boundaries that divide technologists from activists and critics from creators; for finding
common language and common values that come with working knowledge of whole systems and
with openness to new systems. Craft, criticality, and care support more nuanced and more
mutual understanding. This fosters code collaborations that more fundamentally challenge and
change technologies.
“any theory that cannot be shared in everyday conversation cannot be used to educate
the public.” (hooks, 1994)
1.2 An Invitation
Coding.Care is a collection of public-facing resources that strive to put this thinking into action.
Read in any order you like. Read on the bias. Read in conversation with other texts. Read as
openings for discussion and expansion.
Produced in various non-academic formats like zines and wikis, and in contexts like public
workshops, these resources aim to support discussions across communities of practice—including
code creators, AI researchers, and marginalized outsiders. The works prioritize finding a common
language to connect across boundaries, by offering plainspoken translations of technical, critical,
aesthetic, and ethical concepts relevant to the questions raised around emerging tech. They are
written as jargon-free as possible to allow them to travel as broadly as possible.
The texts are not final, fixed, or authoritative versions. Rather, they are (and describe)
ongoing dynamic processes. They are caught moments of queer space-time, local data slices of
thinking in process. In the static, linear formats of bound books and archives, they are
temporarily constrained to an order, but please don’t let that constrain you in reading. We work
with the tools we have, and we work to build new ones.
Where possible I have tried to combine, misuse, and expand on the forms available. This
work has been drafted using a GitHub repository, which archives and makes public its
composition process and includes a detailed version history since May 2021. Its latest version is
hosted online at Coding.Care, which supports updates outside of and beyond the institutionally
archived edition. Unless created as part of another outside project, the writing is produced in
plain-text Markdown files, which allows for conversion to many output formats. I coded small
scripts that use these same text files to produce multiple outputs: the HTML files for the website,
the PDF files for the dissertation version, and the PDF files for printable zine booklets.4 I
developed this working method because I value the multimodal possibilities of digital systems
and the dynamic states of text. The minimalist structure and simple formatting, made without
complex software frameworks, means both the code and writing are more adaptable to other
complex forms and also more legible to other people.
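The single-source, multi-output step described above can be sketched minimally in Python. This is a hypothetical illustration, not the repository's actual build code (which uses Makefiles and Pandoc directly); the file names and the `build_commands` helper are invented for the example. Pandoc infers the output format from the output file's extension, so one Markdown source can fan out to HTML and PDF:

```python
# Sketch: fan one Markdown source out to multiple output formats via pandoc.
# Hypothetical file names; assumes pandoc is on the PATH if the commands
# are actually executed (e.g. with subprocess.run(cmd, check=True)).
from pathlib import Path

def build_commands(sources, extensions):
    """Build one pandoc command per (source file, output extension) pair."""
    commands = []
    for src in sources:
        for ext in extensions:
            out = str(Path(src).with_suffix(ext))
            commands.append(["pandoc", src, "--output", out])
    return commands

cmds = build_commands(["chapter1.md"], [".html", ".pdf"])
for cmd in cmds:
    print(" ".join(cmd))
```

Keeping the build as a list of small commands over plain-text files is what makes the same source adaptable to website, dissertation PDF, and zine layouts.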
I am excited by how code writing and text writing work together, in how they might work
together more, and in how form(s) inform content and distribution. I discuss this further in
Crafting Queer Trans*formative Systems and in Coding.Care: Field Notes for Making Friends
with Code. How a text is made might seem a small concern. But to me, these questions of how we
read, how we code, the forms these take, and how they travel all relate directly to the future of
sophisticated multimodal AI models as much as to fundamental concepts of first-year
programming courses and first-year writing classes. They will continue shaping emerging
technologies at every scale and every step.
Together, the range of resources collected by Coding.Care shows how different ways of
knowing are necessary to access and address the questions raised by automated, algorithmic, and
4I used Markdown, Pandoc, LaTeX, and Make files here. If you’re interested you can find examples in the GitHub repo
or ask me for more information. I don’t necessarily recommend my technique but I haven’t found a better one. It
was a silly, but fulfilling, process—much like life.
emerging technologies. They also show how different aspects of these urgent issues must be
addressed simultaneously.
And so, I invite you to find the resources and the ways of knowing that suit you. Find the
people with whom you want to read this, the conversations you want to have, the action you want
to take, and the world you want to create. I hope that this plurality of forms, tactics, topics, and
voices helps you find something you need, answers some questions, and raises more.
1.2.1 Coding.Care: Field Notes for Making Friends with Code
As a pocket guide to making and sustaining friendly coding communities, Coding.Care: Field
Notes for Making Friends with Code shows why we need these communities, how to build them,
and how to let them thrive. It draws on lessons I learned from Code Collective, the diverse hack
lab that I started in early 2019 when I yearned for the adaptable, encouraging environment I had
needed when I was first struggling to learn to program. I wanted to make a space where I
wouldn’t feel like an outsider for ‘not knowing everything’ about programming, and I suspected
others might feel the same. I wondered how to recreate the positive experiences that changed
coding for me, inspired by teachers like Brett Stalbaum, who had shown me code could feel
creative instead of prescriptive.
In Code Collective, a mix of media artists, activists, makers, scientists, scholars, and
engineers gather to co-work and co-learn, thinking critically with code in an inclusive,
interdisciplinary space that supports many kinds of learners. The Collective unites students who
may have zero technical experience with those who may have lots but perhaps lack a critical or
creative lens. We value their experiences equally, reinforcing the idea that, “We all have
something to teach each other.”
This guide looks at a variety of the strategies and tools we have explored and developed as
the community has grown. It discusses how we have adapted to meet our shifting needs—from
hosted workshops to hybrid-format meetups, from pandemic support to alumni programming.
Code Collective’s approaches draw on many existing methodologies and methods from
intersectional queer, feminist, anti-ableist, and anti-racist theories. The guide connects these
approaches to cooperative organizations like Varia and p5.js, to Critical Code Studies, and to
practices like working iteratively and breaking critically.
As a guide for making friends with code, Coding.Care discusses how practices such as
process-oriented skillbuilding, co-teaching and co-learning, and snacks (always snacks) embody
the Collective’s guiding values, such as “scrappy artistic strategies not perfect code.” The guide
shares projects and feedback from members of the Collective, who report how these values and
practices have shaped them as emerging makers and thinkers. Personally, I have found this
community to be the strongest influence on my own research, above and beyond my role as
facilitator. Code Collective has become a joyful space for creative risk-taking that nourishes my
practice. The guide offers practical advice for getting comfortable with code, while situating these
approaches and groups within an “ethics of coding care”—grounded in shared embodied
knowledge, embedded co-creation, and programming with and for community—as an antidote to
technocratic values and as an enactment of its ethos.
In her book Coding Literacy, Annette Vee (2017) argues that, “Changing ‘how the system
works’ would move beyond material access to education and into a critical examination of the
values and ideologies embedded in that education. […] Programming is a literacy practice with
many applications beyond a profession defined by a limited set of values.” Vee calls this kind of
programming access “transformative.” Through its intimidation-free, learner-led,
process-oriented approaches, Coding.Care both theorizes and models the creation of caring communities
and innovative spaces that can transfer knowledge across social strata and intellectual disciplines
in order to reshape technological systems.
OBJECTIVES: Through Coding.Care, understand how to approach programming with less
fear and more fun, with less constraint and more community support. Think creatively and
critically about the kinds of technologies you want to make and support. Learn to choose and use
tools, languages, and platforms that match your goals and ethics. Create or join communities of
practice that feel supportive and generative.
1.2.2 Crafting Queer Trans*formative Systems, a Theory in Process
As a guide to the theories and tactics, metaphors and materials, that ground the rest of this
collection, Crafting Queer Trans*formative Systems explores how artists, activists, scholars, and
technologists can recast our relationships to emerging technologies by reframing them as
handcraft processes (e.g. crochet, woodworking, printmaking), in order to embody the
intersectional approaches necessary to transform systems. It argues that these strategies help
deflate AI hype, lower barriers to learning, build community, and emphasize slowness and
sustainability. Thinking craft-as-technology honors the inherited knowledge of many outsider
communities. Thinking technology-as-craft provides a framework to implement those theories,
ethics, and tactics as intersectional critical AI.
This text describes a “theory in progress” using craft as a framework to surface
reconsiderations of scale and dimension, care as infrastructure, counterhistories of digitization
from marginalized legacies, the merging of theory and practice, creating access and drawing
boundaries, and material resistance. Rather than relying on ephemeral cloud-y metaphors,
attending to the material realities of even the most sophisticated technology brings it back within
a scope of human- and eco-relations. Material resistance considers both how materials push back
and how to (mis)use materials as modes for refusal, including extreme use, adversarial use,
handmaking, and making esoteric systems.
Central to this theory are queer and trans* becoming, radical belonging, and radical
difference, particularly as ways of (un)knowing that ground my perspective on
community-building and worldbuilding, art-making and research. These intersectional practices
are rooted in anticolonial and queer-of-color histories (Driskill, 2010; Muñoz, 2009), which
both make plain the urgent stakes of current AI systems and offer the fluidity of thinking
necessary to produce substantive change in those systems.
This thinking-guide works through these stakes by unpacking each of its key terms and
how they relate to machine learning: craft as both process and knowledge; queer futures and
queer enough; the trans*ness of transformer models and the formative as a shape, a scaffold, and
an origin story; even the asterisk as a pivot and a portal; and systems as both sociotechnical
systems and speculative systems for change.
OBJECTIVES: Through Crafting Queer Trans*formative Systems, understand some
examples of process-based and craft-based approaches to technologies and how they offer
alternative perspectives and opportunities for intervention. Consider queer and trans* lenses for
technologies and their essential role in reshaping and reimagining tech spaces and tech systems.
Understand the role of form as ingrained assumptions and structuring systems for sociotechnical
tools. Imagine how these combine at scale and could be reconfigured.
1.2.3 A Critical Field Guide for Working with Machine Learning Datasets
and Inclusive Datasets Research Guide
Datasets provide the foundation for all of the large-scale machine learning systems we encounter
today, and they are increasingly part of many other research fields and daily life. Many technical
guides exist for learning to work with datasets, and much scholarship has emerged to study
datasets critically (Corry et al., n.d.; Gillespie & Seaver, 2015). Still, no guides attempt to combine
technical and critical approaches comprehensively. Every dataset is partial, imperfect, and
historically and socially contingent—yet the abundance of problematic datasets and models
shows how little attention is given to these critical concerns in typical use.
A Critical Field Guide for Working with Machine Learning Datasets helps navigate the
complexity of working with giant datasets. Its accessible tone and zero assumed knowledge
support direct use by practitioners of all stripes—including activists, artists, journalists, scholars,
students—anyone who is interacting with datasets in the wild and wants to use them in their
work, while being mindful of their impacts. Developed with Kate Crawford and Mike Ananny, as
part of their research team Knowing Machines, the field guide discusses parts and types of
datasets, how they are transformed, why bias cannot be eliminated, and questions to ask at every
stage of the dataset lifecycle. Importantly, it shares some of the benefits of working critically with
datasets when (on the surface) it may seem just as easy not to take that care.
In a similar vein, the Inclusive Datasets Research Guide is an interactive digital guide that
supports academic researchers with an overview of key concepts and considerations for working
with datasets, as well as tools and software, books and tutorials,
tutorials, and recommendations for thinking inclusively. Like A Critical Field Guide, the
Inclusive Datasets Research Guide focuses on a blend of technical and critical decisions that arise
when working with datasets. Because this guide is aimed at students and teachers, the format is
brief collections of resources rather than conceptual deep-dives. The guide appears on the USC
Libraries’ website along with its other research guides on many topics.
Developed by a team at USC Libraries with the support of a grant from the USC Office of
Research, this research guide was written as part of a project to acquire core research datasets
that support inquiry by USC researchers into arts, humanities, and machine learning. I was
recruited to provide interdisciplinary perspective on inclusive approaches to machine learning,
and I joined a team including a chief library technologist, data science graduate students, special
collections librarians, a research communications specialist, and a multimedia digital humanities
specialist. We conducted 18 interviews with faculty across campus who worked with datasets in
order to develop an internal rubric to support collection development. Through this process, we
found less pressing need for dataset acquisition, because researchers do not yet look to libraries
for their datasets but access them elsewhere. Still, the rubric we developed was used to acquire
approximately 50 collections identified as more accessible, inclusive, ‘datafiable’, and
meaningfully engaged, with the aim of offering alternative options for researchers. Additionally, we
identified the need for more curated resources and more training on how to select and use
datasets critically, while remaining mindful of their origins and impacts. This led to the expanded
aim of the grant and the development of the Inclusive Datasets Research Guide.
Both A Critical Field Guide and the Inclusive Datasets Research Guide reflect on the
stakes of datasets and the human choices they rely on. Reframing this information in two
different forms shows how it can be made more effective depending on different audiences’
needs. Both works are examples of how concepts and processes researched in the Intersectional
AI Toolkit (below) can be reworked for new institutional contexts. Adapting the Toolkit to new
audiences in library science, data science, and the social sciences posed interesting challenges
that both expanded and refined the work. It required the ideas be scaled up and applied, and
sometimes renegotiated until their rewordings no longer felt watered down. In each project, I
learned how another field has addressed the problems of knowledge organization and bias,
historically and in the present. Library scientists have at least a century of practice considering
questions of how to categorize, curate, and archive. Social scientists have been asking how and
what to measure for just as long. None of this is perfect, either, but learning from each institution,
and combining these findings with the questions machine learning is trying to ask, helps me
better understand how we got where AI systems are today. This includes understanding each
field’s starting baselines and vocabularies. Combining these with other intersectional practices
helps me better understand what each domain might learn from the other.
OBJECTIVES: Through A Critical Field Guide for Working with Machine Learning
Datasets and the Inclusive Datasets Research Guide, understand the importance of working
critically with datasets as part of any machine learning practice. Identify the parts, types, and
functions of datasets as you encounter them. Determine whether a particular dataset is a good fit
for your project by understanding critical questions to ask at each phase of the dataset lifecycle.
Learn to collaborate with communities impacted by your research and to create strategies for
addressing potential harms in the datasets you utilize.
1.2.4 The Intersectional AI Toolkit
The Intersectional AI Toolkit argues that anyone should be able to understand AI and help shape
its futures. Through collaborative zine-making workshops, it aims to find common vocabularies
to connect diverse communities around AI’s urgent questions. It clarifies, without math or jargon,
the inner workings of AI systems and the ways in which they operate always as sociotechnical
systems. The Toolkit celebrates intersectional work done by many other researchers and artists
working to address these issues in interdisciplinary fields; and it gathers and synthesizes legacies
of anti-racist, queer, transfeminist, neurodiverse, anti-ableist theories, ethics, and tactics that can
contribute valuable perspective. Its three formats allow for multiple entry points: The digital wiki
offers a forum for others to discuss and expand upon its topics. The collection of printed zines
shares AI topics at a concise, approachable scale. And the in-person and hybrid-online workshops
invite multiple communities to participate directly in the systems that impact them.
Selecting the toolkit format was a key consideration of the development process for the
Intersectional AI Toolkit. The toolkit form taken up here was first modeled after Ahmed’s ‘Killjoy
Survival Kit’ (Ahmed, 2017).5 The term ‘toolkit’ was thoroughly contested (too instrumental, too
object-oriented, not quite compendium or catalogue or care package, not index or hub or
gazetteer, not manifesto or knapsack or portal), but settled upon after nothing else quite suited.
The technologies used went through many iterations, from git repo to wiki to self-hosted hybrid
back to repo again, in search of a platform that would facilitate guest user access without heavy
onboarding, that could track edits and adapt to multimedia print and digital zine forms. I am still
remaking the work and searching for the perfect form. I suspect I will have to create it, and it will
continue to change. In its various iterations, the Toolkit has grown into eleven workshops,
compiled into eight printed zines, plus a selection of focused topic pages online. Work on the
Toolkit also resulted in the two related datasets projects (above), which reflected back to inform
the Toolkit. Citizen data researcher Jennifer Gabrys (2019) says toolkits “provide instructions not
just for assembly and use but also for attending to the social and political ramifications of digital
devices.” She says they are spaces of “instruction, contingency, action, and alternative
engagement.” As such, the Intersectional AI Toolkit hopes to provide resources and access points
for engaging differently with machine learning systems in non-intimidating ways that connect
different audiences.
OBJECTIVES: Through the Intersectional AI Toolkit, appreciate the need for plural
perspectives on AI systems. Understand key terminology related to machine learning and to
5 Ahmed says in Living a Feminist Life that the killjoy survival kit should contain books, things, tools, time, life, permission notes, other killjoys, humor, feelings, bodies, and your own survival kit.
intersectionality. Share perspectives on the impact of emerging technology as it relates to you.
Choose critically which AI tools and resources you will engage and how.
1.2.5 Interstitial Portals
• (Un)Limiting: Rebecca Horn, constraint, and COVID-era art
• (Un)Raveling: Sonya Rapoport, fiber art, and computation
• (Un)Forming: VALIE EXPORT, glitch feminism, and broken machines
• (Un)Living: On Kawara, dailiness, death, and data
• (Un)Knowing: Pipilotti Rist, black boxes, and tech trauma
The essay form is a kind of embodied processing that moves the corpus (book) through the
corpus (body), a reckoning in throat and gut that pairs bodily processing with computational
processing. These interstitial essays serve, as queer scholar KJ Cerankowski (2021) writes, “to let
this book be the crisis rather than about the crisis or crises, rather, a plurality of traumas and
pains felt collectively and individually.” Long traditions of artistic and literary outliers have
maintained the need for such forms, like autotheory and lyric essay, which bridge aesthetic,
personal, and political concerns. Known as the ‘first’ essayist, Michel de Montaigne called the
essay form a ‘trial’ or ‘attempt’. After Trinh T. Minh-ha’s “speaking nearby” (Chen, 1992), these
interstitial essays are attempts to speak in the “nearbyness” of Coding.Care. They are trials, in the
sense of struggles, to get closer to the core of a strange creature by sneaking between its ribs. Like
an oblique strategy (Eno & Schmidt, 1975), they glance against logical modes of critical analysis or
direct address, in order to become, probe, and interrupt simultaneously. From these traditions, I
am interested both in the constitutive act of form-making (as prefiguration) and in the
reconstitution of critical forms into poetic, personalized, or approachable forms (Fournier, 2021),
into forms that can remake the forms we see around us as places we want to inhabit (abolitionist
forms).
Locating “a correspondence, not an assemblage,” the essays join together by “living with”
concepts (Ingold, 2015). They are portals to a co-existing “past-present-future” (Olufemi, 2021)
for exploring our relationships with systems differently and intimately, in which “the past is not
lost, however, but rather a space of potential” (Chun, 2021). The essays also use correspondence
as a form, relying on epistolary address to conjure up analogue antecedents to the digital media
discussed in other sections of Coding.Care. I read the works of five 20th century media artists as
pre-responses to automated systems. Their wide range of practices—from minimalist daily rituals
to queer feminist body art and performance—show how we have always-already been living in,
talking about, performing with the questions amplified by automated systems, classification, and
datafication. Their works offer a breadth of artistic possibilities for reconsidering our
relationships with computational systems—and these responses were already being established in
parallel to the development of those systems. They help me reimagine how I want to respond now.
“the future is not in front of us, it is everywhere simultaneously: multidirectional,
variant, spontaneous. We only have to turn around. Relational solidarities, even in
their failure, reveal the plurality of the future-present, help us to see through the
impasse, help temporarily eschew what is stagnant, help build and then prepare to
shatter the many windows of the here and now.” (Olufemi, 2021)
These are alternate takes on algorithmic “bias,” because bias cannot be “optimized” out of
systems. These are bias cuts moving diagonally or diffractively across the warp and weft of the
fabric of the other texts here. Cutting and sewing on the bias puts fabric in tension, making
garments that take the shape of the bodies they surround. Bias cuts are ways of working with and
against materials, acknowledging their limits and not resolving them to right angles. Thus, these
essays sustain the research tensions of Coding.Care, unfurling the questions in the materials
rather than folding them away.
OBJECTIVES: Engage the questions and concerns of this collection through artistic
practice and poetic language. Consider the personal, emotional, physical, social impacts of
automated systems through modes that might not be accessed through other texts. Expand the
timelines, media formats, and contexts with which you frame the algorithmic. Imagine how you
will choose to respond.
References
A Wishlist for Trans*feminist Servers. (n.d.). BAK. Retrieved September 26, 2023, from
https://www.bakonline.org/prospections/a-wishlist-for-transfeminist-servers/
Abbate, J. (2021). Coding Is Not Empowerment. In T. S. Mullaney, B. Peters, M. Hicks, & K.
Philip (Eds.), Your Computer Is on Fire (pp. 253–272). The MIT Press.
https://doi.org/10.7551/mitpress/10993.003.0018
Abebe, V., Amaryan, G., Beshai, M., Ilene, Gurgen, A. E., Ho, W., Hylton, N. R., Kim, D., Lee, C.,
Lewandowski, C., Miller, K. T., Moore, L. A., Sylwester, R., Thai, E., Tucker, F. N., Webb, T.,
Zhao, D., Zhao, H. C., & Vertesi, J. (2022). Anti-Racist HCI: notes on an emerging critical
technical practice. Extended Abstracts of the 2022 CHI Conference on Human Factors in
Computing Systems, 1–12. https://doi.org/10.1145/3491101.3516382
Abi-Karam, A., & Gabriel, K. (Eds.). (2020). We want it all: an anthology of radical trans
poetics. Nightboat Books.
Adema, J. (2021). Living Books. The MIT Press.
https://mitpress.mit.edu/books/living-books
Agre, P. E. (1998). Toward a Critical Technical Practice: Lessons Learned in Trying to Reform AI.
In Social Science, Technical Systems, and Cooperative Work. Psychology Press.
Ahmed, S. (2006). Queer phenomenology: orientations, objects, others. Duke University Press.
http://www.loc.gov/catdir/toc/ecip0612/2006012768.html
Ahmed, S. (2017). Living a feminist life. Duke University Press.
Ahmed, S. (2018, November 8). Queer Use. Feministkilljoys.
https://feministkilljoys.com/2018/11/08/queer-use/
AI Now Institute. (2016). The Labor that Makes AI “Magic” | Lilly Irani | AI Now 2016 [Video].
Retrieved September 2, 2018, from
https://www.youtube.com/watch?time_continue=68&v=5vXqpc2jCKs
Aliyu, Z. (2023). From the Sasha into the Zamani: Death as a Moment of Radical Continuity.
Logic(s) Magazine, 19. https://logicmag.io/supa-dupa-skies/
from-the-sasha-into-the-zamani-death-as-a-moment-of-radical-continuity/
Amaro, R. (2022). The Black technical object: on machine learning and the aspiration of Black
being. Sternberg Press.
Amoore, L. (2020). Cloud Ethics: Algorithms and the Attributes of Ourselves and Others.
https://doi.org/10.1215/9781478009276
Artist, A. (n.d.). Black Gooey Universe. Unbag. Retrieved February 19, 2021, from
https://unbag.net/end/black-gooey-universe
Ashby, W. R. (1956). An introduction to cybernetics (6th repr.). Chapman & Hall.
ATNOFS-radio-broadcast-spideralex-danae-tapia | Variapad! (n.d.). Retrieved March 23,
2022, from
https://pad.vvvvvvaria.org/ATNOFS-radio-broadcast-spideralex-danae-tapia
Banks, A. J. (2006). Race, Rhetoric, and Technology: Searching for Higher Ground. Routledge.
Baraka, I. A. (1966). Home; social essays. Morrow.
Barnett, F., Blas, Z., Cárdenas, M., Gaboury, J., Johnson, J. M., & Rhee, M. (2016). QueerOS: A
User’s Manual. In M. K. Gold & L. F. Klein (Eds.), Debates in the Digital Humanities 2016
(pp. 50–59). University of Minnesota Press. https://doi.org/10.5749/j.ctt1cn6thb.8
Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the Dangers of
Stochastic Parrots: Can Language Models Be Too Big? Proceedings of the 2021 ACM
Conference on Fairness, Accountability, and Transparency, 610–623.
https://doi.org/10.1145/3442188.3445922
Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code (1st ed.).
Polity.
Bennett, J. (2010). Vibrant Matter: A Political Ecology of Things. Duke University Press.
https://doi.org/10.1215/9780822391623
Benning, S. (Director). (2023, June 18). Girl Power (1992) [Video].
https://www.youtube.com/watch?v=tuZ4Bo9GYjk
Berlant, L. (2016). The commons: Infrastructures for troubling times*. Environment and
Planning D: Society and Space, 34(3), 393–419.
https://doi.org/10.1177/0263775816645989
Blackwell, A. F. (2024). Moral Codes. Moral Codes: Designing Alternatives to AI.
https://moralcodes.pubpub.org/
Bogers, L., & Chiappini, L. (Eds.). (2019). The critical makers reader: (un)learning technology.
Branson, S. (2022). Practical anarchism: a guide for daily life. Pluto Press.
Braybrooke, K. (2020). “Together, we dance alone”: building a collective toolkit for creatives in a
pandemic. Interactions, 27(4), 68–71. https://doi.org/10.1145/3406027
Britton, R. L., & Pritchard, H. (2022). For Careful Slugs: Caring for Unknowing in CS (Computer
Science). Catalyst: Feminism, Theory, Technoscience, 8(2), Article 2.
https://doi.org/10.28968/cftt.v8i2.37723
brown, adrienne maree. (2021). Holding Change. AK Press.
https://www.akpress.org/holding-change.html
Browne, S. (2015). Dark matters: on the surveillance of blackness. Duke University Press.
Brunton, F., & Nissenbaum, H. F. (2015). Obfuscation: a user’s guide for privacy and protest.
MIT Press.
Buolamwini, J. (Director). (2017, March 29). How I’m fighting bias in algorithms | Joy
Buolamwini. https://www.youtube.com/watch?v=UG_X_7g63rY
Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in
Commercial Gender Classification. Proceedings of the 1st Conference on Fairness,
Accountability and Transparency, 77–91.
https://proceedings.mlr.press/v81/buolamwini18a.html
Calvino, I. (1976). Numbers in the dark: and other stories (T. Parks, Ed.). Pantheon Books.
cárdenas, micha. (2016, July 20). Trans of Color Poetics: Stitching Bodies, Concepts, and
Algorithms. The Scholar & Feminist Online. https://sfonline.barnard.edu/
micha-cardenas-trans-of-color-poetics-stitching-bodies-concepts-and-algorithms/
CARE Principles of Indigenous Data Governance. (n.d.). Global Indigenous Data Alliance.
Retrieved March 14, 2022, from https://www.gida-global.org/care
Carson, A. (2014). Eros the Bittersweet: An Essay. Princeton University Press.
Cerankowski, K. J. (2021). Suture: trauma and trans becoming (1st ed.). Punctum Books.
Chakravartty, P., & Mills, M. (2018). Virtual Roundtable on “Decolonial Computing.” Catalyst:
Feminism, Theory, Technoscience, 4(2), Article 2.
https://doi.org/10.28968/cftt.v4i2.29588
Chen, A. (2021, September 4). How to write an image description. Medium.
https://uxdesign.cc/how-to-write-an-image-description-2f30d3bf5546
Chun, W. H. K. (2021). Discriminating Data: Correlation, Neighborhoods, and the New Politics
of Recognition. The MIT Press. https://doi.org/10.7551/mitpress/14050.001.0001
Ciston, S. (2019a). Intersectional AI Is Essential: Polyvocal, Multimodal, Experimental Methods
to Save Artificial Intelligence. Journal of Science and Technology of the Arts, 11(2), 3–8.
https://doi.org/10.7559/citarj.v11i2.665
Ciston, S. (2019b). ladymouth: Anti-Social-Media Art As Research. Ada: A Journal of Gender,
New Media & Technology, 15.
https://scholarsbank.uoregon.edu/xmlui/handle/1794/26793
Ciston, S. (2022). MISUNDERSTANDING AS POETICS: COLLABORATING WITH AI TO
REWRITE THE INNER CRITIC / SARAH CISTON. Leonardo Electronic Almanac.
https://www.leoalmanac.org/
misunderstanding-as-poetics-collaborating-with-ai-to-rewrite-the-inner-critic-sarah-ciston/
Ciston, S. (2023). A CRITICAL FIELD GUIDE FOR WORKING WITH MACHINE LEARNING
DATASETS (K. Crawford & M. Ananny, Eds.).
https://knowingmachines.org/critical-field-guide
Ciston, S., & Marino, M. C. (2021, August 19). How to Fork a Book: The Radical Transformation
of Publishing. Medium. https://markcmarino.medium.com/
how-to-fork-a-book-the-radical-transformation-of-publishing-3e1f4a39a66c
Coleman, G. (2016). Hackers. In B. Peters (Ed.), Digital Keywords.
https://press.princeton.edu/books/hardcover/9780691167336/digital-keywords
Cooper, B. (2016). Intersectionality. In L. Disch & M. Hawkesworth (Eds.), The Oxford
Handbook of Feminist Theory (Vol. 1). Oxford University Press.
https://doi.org/10.1093/oxfordhb/9780199328581.013.20
Corry, F., Kang, E. B., Sridharan, H., Luccioni, S., Ananny, M., & Crawford, K. (n.d.). Critical
Dataset Studies Reading List. Knowing Machines.
https://knowingmachines.org/reading-list
Costanza-Chock, S. (2020). Design Justice: Community-Led Practices to Build the Worlds We
Need. https://doi.org/10.7551/mitpress/12255.001.0001
Cox, G., & McLean, A. (2013). Speaking code: coding as aesthetic and political expression. The
MIT Press.
Cox, G., & Soon, W. (2020). Aesthetic Programming: A Handbook of Software Studies. Open
Humanities Press.
http://www.openhumanitiespress.org/books/titles/aesthetic-programming/
Crawford, K. (2021). Atlas of AI: power, politics, and the planetary costs of artificial
intelligence. Yale University Press.
Crawford, K., Ananny, M., Buschek, C., Sridharan, H., Luccioni, S., Orr, W., Schultz, J., & Thorp,
J. (n.d.). 9 ways to see a Dataset.
https://knowingmachines.org/publications/9-ways-to-see
Crawford, K., & Joler, V. (2023, November 22). Calculating Empires.
https://knowingmachines.org/publications/calculating-empires
Crawford, K., & Paglen, T. (2019, September 19). Excavating AI: The Politics of Images in
Machine Learning Training Sets. https://www.excavating.ai
Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist
Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University
of Chicago Legal Forum, 1989, 139–168.
https://heinonline.org/HOL/P?h=hein.journals/uchclf1989&i=143
Crenshaw, K. (2021, March 29). What Does Intersectionality Mean? : 1A.
https://www.npr.org/2021/03/29/982357959/what-does-intersectionality-mean
Crochet, Code, Craft. (n.d.). Mercedes Bernard. Retrieved March 10, 2023, from
http://www.mercedesbernard.com/blog/crochet-code-craft
Davies, H., McKernan, B., & Sabbagh, D. (2023, December 1). “The Gospel”: how Israel uses AI to
select bombing targets in Gaza. The Guardian. https://www.theguardian.com/world/2023/
dec/01/the-gospel-how-israel-uses-ai-to-select-bombing-targets
de la Bellacasa, M. P. (2011). Matters of care in technoscience: Assembling neglected things.
Social Studies of Science, 41(1), 85–106. https://doi.org/10.1177/0306312710380301
Dhaliwal, R. S. (2022). On Addressability, or What Even Is Computation? Critical Inquiry, 49(1),
1–27. https://doi.org/10.1086/721167
Dodge, J., Prewitt, T., Tachet des Combes, R., Odmark, E., Schwartz, R., Strubell, E., Luccioni, A.
S., Smith, N. A., DeCario, N., & Buchanan, W. (2022). Measuring the Carbon Intensity of AI in
Cloud Instances. Proceedings of the 2022 ACM Conference on Fairness, Accountability, and
Transparency, 1877–1894. https://doi.org/10.1145/3531146.3533234
Dog Park Dissidents. (n.d.). Queer As In Fuck You. Retrieved November 4, 2023, from
https://genius.com/Dog-park-dissidents-queer-as-in-fuck-you-lyrics
Driskill, Q.-L. (Cherokee). (2010). Doubleweaving Two-Spirit Critiques: Building Alliances
between Native and Queer Studies. GLQ: A Journal of Lesbian and Gay Studies, 16(1),
69–92. https://muse.jhu.edu/pub/4/article/372445
Dunbar-Hester, C. (2019). Hacking Diversity: The Politics of Inclusion in Open Technology
Cultures. Princeton University Press. https://doi.org/10.2307/j.ctvhrd181
Dziri, N., Milton, S., Yu, M., Zaiane, O., & Reddy, S. (2022). On the Origin of Hallucinations in
Conversational Models: Is it the Datasets or the Models? (arXiv:2204.07931; Version 1).
arXiv. http://arxiv.org/abs/2204.07931
Earl, C. C. (2021). Towards an Abolitionist AI: the role of Historically Black Colleges and
Universities (arXiv:2101.02011). arXiv. https://doi.org/10.48550/arXiv.2101.02011
EGP. (n.d.). Retrieved December 27, 2023, from http://www.elisagiardinapapa.org/
ender. (n.d.). You Are A Coder (zine) by enderverse. Itch.io. Retrieved October 24, 2023, from
https://enderversing.itch.io/uracoder
Eno, B., & Schmidt, P. (1975). Oblique Strategies.
Feminist.AI. (n.d.). Retrieved March 17, 2019, from https://www.feminist.ai/
Fountain, D. (2021, December 13). Survival of the Knittest: Craft and Queer-Feminist
Worldmaking. MAI: Feminism & Visual Culture. https://maifeminism.com/
survival-of-the-knittest-craft-and-queer-feminist-worldmaking/
Fournier, L. (2021). Autotheory as Feminist Practice in Art, Writing, and Criticism.
http://direct.mit.edu/books/book/5028/
Autotheory-as-Feminist-Practice-in-Art-Writing-and
Franklin, U. M. (2004). The real world of technology (Rev. ed.). House of Anansi Press.
Gaboury, J. (2013, February 19). A Queer History of Computing. Rhizome.
http://rhizome.org/editorial/2013/feb/19/queer-computing-1/
Gaboury, J. (2018). Becoming NULL: Queer relations in the excluded middle. Women &
Performance: A Journal of Feminist Theory, 28(2), 143–158.
https://doi.org/10.1080/0740770X.2018.1473986
Gabrys, J. (2019). How to Do Things with Sensors. University of Minnesota Press.
https://doi.org/10.5749/j.ctvpbnq7k
Gillespie, T., & Seaver, N. (2015, November 5). Critical Algorithm Studies: a Reading List. Social
Media Collective.
https://socialmediacollective.org/reading-lists/critical-algorithm-studies/
Gilmore, R. W. (2018, December 20). Making Abolition Geography in California’s Central
Valley. THE FUNAMBULIST MAGAZINE.
https://thefunambulist.net/magazine/21-space-activism/
interview-making-abolition-geography-california-central-valley-ruth-wilson-gilmore
Glissant, É. (2009). Poetics of relation (Reprint). University of Michigan Press.
Golding, J., Reinhart, M., & Paganelli, M. (Eds.). (2020). Data Loam. De Gruyter.
http://www.degruyter.com/document/doi/10.1515/9783110697841/html
Goodman, A. (n.d.). The Secret Life of Algorithms: speculation on queered futures of
neurodiverse analgorithmic feeling and consciousness. 22.
Griffiths, C. (2022). Toward counteralgorithms: the contestation of interpretability in machine
learning [Ph.D., University of Southern California]. https://digitallibrary.usc.edu/CS.
aspx?VP3=DamView&VBID=2A3BXZ8B021HS&SMLS=1&RW=1334&RH=859
Hakopian, M. F. (2022). The Institute for Other Intelligences (A. Vikram & A. Iwataki, Eds.). X
Artists’ Books.
Hamraie, A., & Fritsch, K. (2019). Crip Technoscience Manifesto. Catalyst: Feminism, Theory,
Technoscience, 5(1), Article 1. https://doi.org/10.28968/cftt.v5i1.29607
Harney, S., & Moten, F. (2021). All Incomplete. Minor Compositions.
Haschemi Yekani, E., Nowicka, M., & Roxanne, T. (2022). Revisualising Intersectionality.
Springer International Publishing. https://doi.org/10.1007/978-3-030-93209-1
Hedva, J. (2018, November 14). Belonging in the Mess. Processing Foundation.
https://medium.com/processing-foundation/belonging-in-the-mess-3d3ad0577499
Hedva, J. (2022, March 13). Why It’s Taking So Long. Topical Cream.
https://www.topicalcream.org/features/why-its-taking-so-long/
hedva, johanna. (n.d.). Johanna Hedva Care Syllabus Interview. CARE SYLLABUS. Retrieved
July 14, 2021, from
https://www.caresyllabus.org/johanna-hedva-care-syllabus-interview
Heikkilä, M., & Ryan-Mosley, T. (2023, October 30). Three things to know about the White
House’s executive order on AI. MIT Technology Review.
https://www.technologyreview.com/2023/10/30/1082678/
three-things-to-know-about-the-white-houses-executive-order-on-ai/
Hickman, L. (2019). Transcription Work and the Practices of Crip Technoscience. Catalyst:
Feminism, Theory, Technoscience, 5(1), Article 1.
https://doi.org/10.28968/cftt.v5i1.32081
Hicks, M. (2021). Sexism Is a Feature, Not a Bug. In T. S. Mullaney, B. Peters, M. Hicks, & K.
Philip (Eds.), Your Computer Is on Fire (pp. 135–158). The MIT Press.
https://doi.org/10.7551/mitpress/10993.003.0011
Hoff, M. (2022, October 17). always-already-programming.md. Gist.
https://gist.github.com/melaniehoff/95ca90df7ca47761dc3d3d58fead22d4
hooks, bell. (1994). Teaching to transgress: education as the practice of freedom. Routledge.
Hunger, F. (2023). Unhype Artificial “Intelligence”! A proposal to replace the deceiving
terminology of AI. Zenodo. https://doi.org/10.5281/zenodo.7524493
INDIGENOUS AI. (n.d.). INDIGENOUS AI. Retrieved April 25, 2022, from
https://www.indigenous-ai.net/
Ingold, T. (2015). The life of lines. Routledge.
Web Accessibility Initiative (WAI). (2024, March 26). Tips for Getting Started.
https://www.w3.org/WAI/tips/
Iraqi, A. (2024, April 3). “Lavender”: The AI machine directing Israel’s bombing spree in Gaza.
+972 Magazine. https://www.972mag.com/lavender-ai-israeli-army-gaza/
Irigaray, L. (1985). This sex which is not one. Cornell University Press.
Joque, J. (2022). Revolutionary mathematics: artificial intelligence, statistics and the logic of
capitalism. Verso.
Jordan, M. D. (2023). Queer Callings: Untimely Notes on Names and Desires (1st ed.). Fordham
University Press.
Kapoor, S., & Narayanan, A. (2023, March 29). A misleading open letter about sci-fi AI dangers
ignores the real risks [Substack newsletter]. AI Snake Oil.
https://aisnakeoil.substack.com/p/a-misleading-open-letter-about-sci
Kawara, O., Weiss, J., Wheeler, A., Buren, D., & Solomon R. Guggenheim Museum (Eds.). (2015).
On Kawara - silence: ... on the occasion of the exhibition “On Kawara - Silence” ... Salomon
R. Guggenheim Museum, New York, February 6 - May 3, 2015. Guggenheim Museum Publ.
Keeling, K. (2014). Queer OS. Cinema Journal, 53(2), 152–157.
https://doi.org/10.1353/cj.2014.0004
Khan, N. (2022). HOLO 3: This Isn’t Even My Final Form. HOLO.
https://www.holo.mg/dossiers/holo-3/
Khullar, D. (2023, February 27). Can A.I. Treat Mental Illness? The New Yorker.
https://www.newyorker.com/magazine/2023/03/06/can-ai-treat-mental-illness
Kleesattel, I. (2021). Situated Aesthetics for Relational Critique: On Messy Entanglements from
Maintenance Art to Feminist Server Art. In C. Sollfrank, F. Stalder, & S. Niederberger (Eds.),
Aesthetics of the Commons. Diaphanes.
Klipphahn-Karge, M., & Koster, A.-K. (n.d.). Queere KI - Zum Coming-out smarter Maschinen [Queer AI: On the coming-out of smart machines].
Klumbytė, G., Draude, C., & Taylor, A. S. (2022). Critical Tools for Machine Learning: Working
with Intersectional Critical Concepts in Machine Learning Systems Design. Proceedings of the
2022 ACM Conference on Fairness, Accountability, and Transparency, 1528–1541.
https://doi.org/10.1145/3531146.3533207
Knuth, D. E. (1974). Computer programming as an art. Communications of the ACM, 17(12),
667–673. https://doi.org/10.1145/361604.361612
Ko, A. J., Beitlers, A., Wortzman, B., Davidson, M., Oleson, A., Kirdani-Ryan, M., Druga, S., &
Everson, J. (2023). Critically Conscious Computing: Methods for Secondary Education.
https://bookish.press/undefined
Langlands, A. (2019). Craeft. Norton.
Leonard, Z. (1992). I want a president.
lesya Shats (Director). (2014, May 12). Ever is overall Pipiloti.
https://www.youtube.com/watch?v=-gd06ukX-rU
Lewis, J. E., Abdilla, A., Arista, N., Baker, K., Benesiinaabandan, S., Brown, M., Cheung, M.,
Coleman, M., Cordes, A., Davison, J., Duncan, K., Garzon, S., Harrell, D. F., Jones, P.-L.,
Kealiikanakaoleohaililani, K., Kelleher, M., Kite, S., Lagon, O., Leigh, J., … Whaanga, H.
(2020). Indigenous Protocol and Artificial Intelligence Position Paper [Monograph].
Indigenous Protocol and Artificial Intelligence Working Group and the Canadian Institute for
Advanced Research.
https://doi.org/10.11573/spectrum.library.concordia.ca.00986506
Lison, A. (2022). Minimal Computing. https://doi.org/10.18452/25607
Liu, H., Xue, W., Chen, Y., Chen, D., Zhao, X., Wang, K., Hou, L., Li, R., & Peng, W. (2024). A
Survey on Hallucination in Large Vision-Language Models (arXiv:2402.00253). arXiv.
https://doi.org/10.48550/arXiv.2402.00253
Local Contexts – Grounding Indigenous Rights. (n.d.). Retrieved June 15, 2022, from
https://localcontexts.org/
Loukissas, Y. A. (2019). All Data Are Local: Thinking Critically in a Data-Driven Society. The
MIT Press.
Loveless, N. (2019). How to Make Art at the End of the World: A Manifesto for
Research-Creation. Duke University Press Books.
Manning, E. (2013). Always more than one: individuation’s dance. Duke University Press.
http://public.eblib.com/choice/publicfullrecord.aspx?p=1173228
Marino, M. (2006). Critical Code Studies. Electronic Book Review.
https://electronicbookreview.com/essay/critical-code-studies/
Marino, M. C. (2020). Critical code studies. The MIT Press.
Marks, L. U. (2002). Touch: Sensuous Theory and Multisensory Media (New ed.). University of
Minnesota Press.
http://www.jstor.org/stable/10.5749/j.ctttv5n8
Martinez, E., & Ciston, S. (n.d.). Unsupervised Pleasures. Unsupervised Pleasures. Retrieved
January 29, 2023, from
https://unsupervisedpleasures.com/unsupervisedpleasures.com/
masso, cypress evelyn. (n.d.). How to contribute to open source software by sharing uncertainty.
In X. X. Shih & K. Moriwaki (Eds.), Critical Coding Cookbook. Retrieved February 5, 2024,
from https://criticalcode.recipes/contributions/
how-to-contribute-to-open-source-software-by-sharing-uncertainty
masso, evelyn. (n.d.). Handling Null. Evelyn Masso. Retrieved October 9, 2023, from
https://outofambit.format.com/handling-null
McConnell, S. (2004). Code complete (second edition). Microsoft Press.
McKinney, C. (2020). Information Activism: A Queer History of Lesbian Media Technologies.
Duke University Press. https://doi.org/10.1215/9781478009337
McPherson, T. (2018). Feminist in a Software Lab: Difference + Design. Harvard University
Press.
McQuillan, D. (2022). Resisting AI: An Anti-fascist Approach to Artificial Intelligence.
https://bristoluniversitypress.co.uk/resisting-ai
Milmo, D. (2023, November 23). OpenAI “was working on advanced model so powerful it
alarmed staff.” The Guardian.
https://www.theguardian.com/business/2023/nov/23/
openai-was-working-on-advanced-model-so-powerful-it-alarmed-staff
Minimum Viable Learning – Labs – SoC. (n.d.). Retrieved October 8, 2021, from
https://www.schoolofcommons.org/labs/minimum-viable-learning
Montfort, N. (2016). Exploratory Programming for the Arts and Humanities (1st ed.). The
MIT Press.
Montfort, N. (2021). Exploratory Programming for the Arts and Humanities (2nd ed.). MIT
Press.
Moraga, C., & Anzaldúa, G. (Eds.). (1983). This bridge called my back: writings by radical
women of color (2nd ed). Persephone Press.
Muñoz, J. E. (2009). Cruising Utopia: The Then and There of Queer Futurity. NYU Press.
https://www.jstor.org/stable/j.ctt9qg4nr
Munster, A. (2013). An aesthesia of networks: conjunctive experience in art and technology.
MIT Press.
Nakamura, L. (2014). Indigenous Circuits: Navajo Women and the Racialization of Early
Electronic Manufacture. American Quarterly, 66(4), 919–941.
https://muse.jhu.edu/pub/1/article/563663
Nake, F. (1971). There Should Be No Computer Art. Bulletin of the Computer Arts Society.
Nardi, B. A. (1993). A Small Matter of Programming: Perspectives on End User Computing.
https://doi.org/10.7551/mitpress/1020.001.0001
Nedden, C. zur, & Dongus, A. (2017, December 17). Biometrie: Getestet an Millionen
Unfreiwilligen [Biometrics: Tested on millions of involuntary subjects]. Die Zeit.
https://www.zeit.de/digital/datenschutz/2017-12/
biometrie-fluechtlinge-cpams-iris-erkennung-zwang
Nelson, T. H. (1974). Computer Lib/Dream Machines (9. print). The Distributors.
NET ART ANTHOLOGY: FloodNet. (2016, October 27). Rhizome.
https://anthology.rhizome.org/floodnet
Nimkulrat, N., Kane, F., & Walton, K. (Eds.). (2016). Crafting textiles in the digital age.
Bloomsbury Academic.
Noble, S. U. (2018). Algorithms of Oppression : How Search Engines Reinforce Racism. NYU
Press. http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=1497317&
authtype=sso&custid=s8983984
Nowviskie, B. (2013, January 5). resistance in the materials. Bethany Nowviskie.
http://nowviskie.org/2013/resistance-in-the-materials/
Oakley, B. (2023). Imperfect Archiving, Archiving as Practice: The Ethics of the Archive.
GenderFail.
Oliveira, P. (n.d.). To Become Undone. Ding Magazine. Retrieved March 10, 2024, from
https://dingdingding.org/issue-4/to-become-undone/
Oliver, J., Savičić, G., & Vasiliev, D. (2011). The Critical Engineering Manifesto.
https://criticalengineering.org/
Olufemi, L. (2021). Experiments in Imagining Otherwise. Hajar Press.
Ovalle, A., Subramonian, A., Gautam, V., Gee, G., & Chang, K.-W. (2023). Factoring the Matrix
of Domination: A Critical Review and Reimagination of Intersectionality in AI Fairness
(arXiv:2303.17555). arXiv. https://doi.org/10.48550/arXiv.2303.17555
p5.js/contributor_docs/access.md at main · processing/p5.js. (n.d.). GitHub. Retrieved March
26, 2024, from
https://github.com/processing/p5.js/blob/main/contributor_docs/access.md
Perrigo, B. (2023, January 18). Exclusive: The $2 Per Hour Workers Who Made ChatGPT Safer.
Time. https://time.com/6247678/openai-chatgpt-kenya-workers/
Pipilotti Rist: Big Heartedness, Be My Neighbor. (n.d.). Retrieved February 4, 2024, from
https://www.moca.org/exhibition/pipilotti-rist
Pipkin, E. (2020). On Lacework: watching an entire machine-learning dataset. The
Photographers’ Gallery: Unthinking Photography.
https://unthinking.photography/articles/on-lacework
Pirate Care. (n.d.). Pirate Care Syllabus. Retrieved April 18, 2022, from
https://syllabus.pirate.care/
Plant, S. (1998). Zeros + Ones. Fourth Estate.
Raley, R. (2009). Tactical media (Vol. 28). Univ. of Minnesota Press.
Raley, R., & Rhee, J. (2023). Critical AI: A Field in Formation. American Literature.
https://doi.org/10.1215/00029831-10575021
Reddy, A. (2021, December 22). Internet of Towels. Medium.
https://anu1905.medium.com/lenticular-crochet-1d5ec1b48d99
Reddy, A. (2023, July 13). Reading AI Family Resemblances using Cross-Stitch Patterns.
Formy. https://formy.xyz/en/artykul/
reading-ai-family-resemblances-using-cross-stitch-patterns/
Coding Rights. (2020, September 10). Decolonising AI: A transfeminist approach to data and social
justice. Medium. https://medium.com/codingrights/
decolonising-ai-a-transfeminist-approach-to-data-and-social-justice-a5e52ac72a96
Roberts, S. T. (2016). Commercial Content Moderation: Digital Laborers’ Dirty Work. In The
intersectional Internet: race, sex, class and culture online (pp. 147–160). Peter Lang
Publishing, Inc.
Robertson, L. (2012). Nilling: prose essays on noise, pornography, the codex, melancholy,
lucretius, folds, cities and related aporias (Second edition). BookThug.
Rodenbeck, J. F. (2014). Radical Prototypes: Allan Kaprow and the Invention of Happenings.
MIT Press. https://mitpress.mit.edu/9780262526128/radical-prototypes/
Russell, L. (2020). Glitch feminism: a manifesto. Verso.
Ryan-Mosley, T., & Strong, J. (2020). The activist dismantling racist police algorithms. MIT
Technology Review. https://www.technologyreview.com/2020/06/05/1002709/
the-activist-dismantling-racist-police-algorithms/
Schwarz, C. (2010). The Anarchist’s Tool Chest. Lost Art Press LLC.
Seaver, N. (2021). Care and Scale: Decorrelative Ethics in Algorithmic Recommendation.
Cultural Anthropology, 36(3), Article 3. https://doi.org/10.14506/ca36.3.11
Sedgwick, E. K. (2012). The weather in Proust (J. Goldberg, Ed.). Duke University Press.
http://catalog.hathitrust.org/api/volumes/oclc/741273595.html
Shanahan, M. (2023). Talking About Large Language Models (arXiv:2212.03551). arXiv.
https://doi.org/10.48550/arXiv.2212.03551
Shane, J. (2021). You look like a thing and I love you: how artificial intelligence works and why
it’s making the world a weirder place (First edition). Voracious/Little, Brown and Company.
Sharma, S. (2020). A Manifesto for the Broken Machine. Camera Obscura: Feminism, Culture,
and Media Studies, 35(2 (104)), 171–179. https://doi.org/10.1215/02705346-8359652
Shaw, M. (2020). Courting the wild twin. Chelsea Green Publishing.
Shih, X. X., & Moriwaki, K. (Eds.). (2022). Critical Coding Cookbook.
https://criticalcode.recipes
Sinders, C. (n.d.). Feminist Data Set. Retrieved January 29, 2023, from
https://carolinesinders.com/feminist-data-set/
Sit.Co.De Lab. (n.d.). Situated Computation + Design Lab. Retrieved March 21, 2022, from
https://sit-code.com/
Sloane, M., Moss, E., Awomolo, O., & Forlano, L. (2022). Participation Is not a Design Fix for
Machine Learning. Equity and Access in Algorithms, Mechanisms, and Optimization, 1–6.
https://doi.org/10.1145/3551624.3555285
Sollfrank, C. (Ed.). (2018). The Beautiful Warriors: Technofeminist Praxis in the 21st Century
[German, English]. https://monoskop.org/log/?p=20272
Sollfrank, C., Stalder, F., & Niederberger, S. (2021). Aesthetics of the Commons. Diaphanes.
Soulellis, P. (2023). Bad Archives. JCMS, 62(4), 181–187.
Springer, A. J. (2015). Syllabus ENGL 8A.
Star, S. L. (1999). The Ethnography of Infrastructure. American Behavioral Scientist, 43(3),
377–391. https://doi.org/10.1177/00027649921955326
Stark, L., & Crawford, K. (2019). The Work of Art in the Age of Artificial Intelligence: What
Artists Can Teach Us About the Ethics of Data Practice. Surveillance & Society, 17(3/4),
442–455. https://doi.org/10.24908/ss.v17i3/4.10821
Stein, G. (2014). Tender buttons (S. Perlow, Ed.; The corrected centennial edition. First City
Lights edition). City Lights Books.
Steyerl, H. (2023). Mean Images. New Left Review, 140/141, 82–97.
Suchman, L. A., & Trigg, R. H. (1993). Artificial intelligence as craftwork. In J. Lave & S. Chaiklin
(Eds.), Understanding Practice: Perspectives on Activity and Context (pp. 144–178).
Cambridge University Press. https://doi.org/10.1017/CBO9780511625510.007
Sunder, A. (Ed.). (2023). Platformisation: Around, Inbetween and Through. Singapore Art
Museum. https://aartisunder.com/2022/11/07/platformisation-around-in-between-and-through/
Tbakhi, F. (2023, December 8). Notes on Craft: Writing in the Hour of Genocide. Protean
Magazine. https://proteanmag.com/2023/12/08/notes-on-craft-writing-in-the-hour-of-genocide/
The surprising ancient history of the hedge apple. (2021, November 23). National Geographic.
https://www.nationalgeographic.com/environment/article/
hedge-apple-osage-orange-ghost-of-evolution
Tito, A. (2022, February 5). Week 3: p5.js and Access - Discussion Starter. CCS Working Group.
https://wg.criticalcodestudies.com/index.php?p=/discussion/120/
week-3-p5-js-and-access-discussion-starter
varia. (n.d.). A Traversal Network of Feminist Servers. Retrieved September 19, 2023, from
https://varia.zone/ATNOFS/
Vasconcelos, E. (n.d.). A visual introduction to AI. Retrieved August 21, 2022, from
https://kim.hfg-karlsruhe.de/visual-introduction-to-ai?name=visual-introduction-to-ai
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., &
Polosukhin, I. (2023). Attention Is All You Need (arXiv:1706.03762). arXiv.
https://doi.org/10.48550/arXiv.1706.03762
Vee, A. (2017). Coding Literacy: How Computer Programming Is Changing Writing. MIT Press.
http://direct.mit.edu/books/book/3543/Coding-LiteracyHow-Computer-Programming-Is
Vee, A., Laquintano, T., & Schnitzler, C. (Eds.). (2023). TextGenEd: Teaching with Text
Generation Technologies. The WAC Clearinghouse.
https://wac.colostate.edu/repository/collections/textgened/
Vicente, M. V. (2021). Transgender: A Useful Category?: Or, How the Historical Study of
“Transsexual” and “Transvestite” Can Help Us Rethink “Transgender” as a Category. TSQ:
Transgender Studies Quarterly, 8(4), 426–442.
https://doi.org/10.1215/23289252-9311032
Vis, D. (2021). Research for people who (think they) would rather create (First edition).
Onomatopee.
Von Busch, O. (2013). Zen and the Abstract Machine of Knitting. TEXTILE, 11(1), 6–19.
https://doi.org/10.2752/175183513X13588738654774
Weckert, S. (n.d.). Google Maps Hacks. https://simonweckert.com/googlemapshacks.html
Chun, W. H. K. (2008). On “Sourcery,” or Code as Fetish. Configurations, 16(3),
299–324. https://doi.org/10.1353/con.0.0064
Werkdetail | VALIE EXPORT. (n.d.). Retrieved January 27, 2024, from
https://www.valieexport.at/jart/prj3/valie_export_web/main.jart?rel=de&reserve-mode=active&content-id=1526555820281&tt_news_id=1995
What are AI hallucinations? (n.d.). IBM. Retrieved April 11, 2024, from
https://www.ibm.com/topics/ai-hallucinations
What is intersectionality. (n.d.). Center for Intersectional Justice. Retrieved February 16, 2024,
from https://www.intersectionaljustice.org/what-is-intersectionality
What is Transformative Justice? (2020, March 5). Barnard Center for Research on Women.
https://bcrw.barnard.edu/videos/what-is-transformative-justice/
White, R. (2022, June 2). Q&A with Teaching Award Winner Briana Bettin. Michigan
Technological University. https://www.mtu.edu/news/2022/06/qa-with-teaching-award-winner-briana-bettin.html
Willis, H. (2016). Fast Forward: The Future(s) of the Cinematic Arts. Wallflower Press.
Wilson, S. (2008). Research Is Ceremony: Indigenous Research Methods (First edition).
Fernwood Publishing.
Wittig, M. (1971). Les guérillères. Viking Press.
Yang, K. (n.d.). Coem. Katherine Yang. https://kayserifserif.place/work/coem/
Zilboorg, A. (2015). Knitting for anarchists: the what, why and how of knitting. Dover
Publications, Inc.