On the Moral Collapse of AI Ethics
I support Dr. Timnit Gebru. I’ve had the good fortune to become friends with Timnit over the last several weeks as we’ve spent hours discussing the spread of mis/disinformation and hate speech on social media in Ethiopia. Our collaboration began with a frank conversation about the limitations of the AI ethics community. I felt she sincerely engaged with the critiques I raised about the representation politics of predominantly white institutions interpellating a handful of African elites as ambassadors of the Black American experience. Out of the love I got for her and this community of computer scientists, data/tech policy analysts, and academics, I feel the need to be harsh and keep it real about the moral collapse of AI ethics.
If the demands for corporate transparency crystallized in the Standing with Dr. Timnit Gebru Petition define the horizon for tech worker resistance, we are doomed. It’s been more than two years since Google removed the “Don’t Be Evil” slogan, and the so-called resistance is circulating the first mediocre action item that surfaced instead of providing the moral leadership required to build meaningful grassroots resistance against surveillance capitalism as it’s embodied in Gebru’s firing. Even the concept of surveillance capitalism is insufficient to understand the ways these computational infrastructures are subsuming our human relationships. Notably, Shoshana Zuboff’s book of the same name mentions Africa only as the passive recipient of surveillance as a service. The larger point is that even our language is failing us, because we’re too often fighting to save a democracy that modernity has foreclosed. We need a radical reframing away from the lone researcher against Goliath, to reckon with the failure of the Fairness, Accountability, and Transparency and broader ethics frameworks that have allowed opportunists to build their brands, taking up the space required to address the core issues of Big Tech’s hegemonic violence as described in Timnit’s paper. We must support Timnit and acknowledge that support isn’t enough.
I appreciate the amount of public support Timnit Gebru has received from the professional community, but real friends tell each other that the representation politics of the urban elite are not adequate to address the many apocalypses we find ourselves in at this political flashpoint. Maybe you’re at Oxford, UCLA, Google or AI Now thinking “Who am I?” Maybe you’re assuming someone or somewhere else specializes in natural language processing, labor law, or organizing tech worker resistance. Maybe you’re assuming someone else is her friend and will point out that yes, corporations do something to people, it’s called late-stage racial capitalism, so you just RT and hashtag as you were instructed. Black in AI has never coupled its mission with a coherent political economic analysis, so maybe you feel blindsided now.
If you were shocked by the firing of Timnit, you haven’t been paying attention. We need to fucking take responsibility for the present, because your immobilization, debating whether you’re the one who should mention the corporate diversity, equity and inclusion (DEI) lexicon (inherently de-coupled from a political economic analysis), is half of the problem. The most opportunistic and/or mediocre are defining the discourse on a global stage. We’re ruminating about Jeff Dean’s feelings instead of building a cross-class labor movement that defines tech workers broadly, i.e. researchers, engineers, Uber drivers, Amazon warehouse workers, content moderators, etc. We should cry out not because Timnit is a brilliant scientist who has shaped her field, but because her firing is a symptom of the broken society we’ve constructed. She isn’t fucked, we all are.
The academics and computer scientists claiming they will no longer volunteer their free labor when it comes time to review Google academic submissions should be commended. However, they should ask themselves why the high-minded ideals of unimpeded academic freedom were so divorced from the praxis that the research was enabling. Saying “That phrenology was conducted rigorously” is insufficient. At issue isn’t Google or language models or Jeff Dean. Rather, it is a field that has sold itself to industries that are killing people.
I have no anger towards the Google Walkout group who developed the petition, but then, they haven’t been center stage for years on panels, in books and webinars declaring resistance against the monopolies of private tech datafying and monetizing our lives. However, I texted Timnit: please intervene, because the vultures are going to have Gebrugate collectibles with their logos on them by the time you finish driving across the country. I doubt my ability to make it through the thicket of virtue-signalling allies and I tried to exercise restraint, but the infographic demanding Google provide reparations to Timnit was the final straw. Reparations is a well-articulated, collective, and historical demand made by Black Americans in this country, and to project it onto an individual Eritrean-born engineer is a reminder of how broken we are as a movement. I’m going to need the ally starter pack to evolve beyond Race After Technology and Algorithms of Oppression cliff notes with a reference to Oscar Gandy because someone mentioned him at the FAT* (now FAccT) conference. For the record, I love Ruha Benjamin and Safiya Noble, but y’all substituted their back covers for entire fields and it shows.
Separately, all our skinfolk ain’t our kinfolk, and if you’re championing narcissists out of uncritical liberal guilt, you are complicit in this hot mess of a field that’s supposed to be working with society to envision and build a more just future. Some Black women represent a class viewpoint that is diametrically opposed to the interests of the vast majority of us. In lieu of performative solidarity, i.e. #IBelieveBlackWomen, I’m going to need y’all to actually take the time to tell us apart and to recognize that this system of knowledge production across industry and academia has created a process of “vetting minorities,” often silencing the most radical in favor of a Goldilocks token: someone who’s confrontational enough to ethics-wash the institution but not so much as to fundamentally challenge the status quo.
I got hella love for Timnit, so when she said she was driving across the country, I told her to watch out for the sundown towns. She was like, you know I just learned what sundown towns are the other day from so and so. I wasn’t surprised, because I’m first-generation Ethiopian, so I know firsthand that the scope of Black American history African-born students have exposure to, particularly in STEM, is limited. At the same time, how, Sway? How have we all been waterboarded into Zooming since March but nobody’s having conversations about the real? We need to talk about racial capitalism and all this identity politics merchandising us at every corner, and structural analysis doesn’t mean institutions, policies and economies aren’t, at the end of the day, people. We have agency; my call-out isn’t for some grand Afropessimist nihilism, it’s to shake you all and say: how we going to build this movement to change the world if you can’t hit up an engineer to say you going down the wrong path, or integrate the humanities beyond evidence for an a priori argument about bias?
Before this weekend, Timnit, El Mahdi El Mhamdi and I had been talking about potentially co-founding an institute focused on content moderation, misinformation, and surveillance in the Global South, with Morocco and Ethiopia as initial case studies. I’ve been so irritated with the celebrity surrounding facial recognition technologies over the years that I didn’t realize, until 40,000 Ethiopian refugees crossed the South Sudan border, that the UNHCR runs the largest multinational biometric program. I didn’t realize how many people would be displaced from their homes due to misinformation amplified by Google’s recommendation and search functions, because only 1 of the 83 languages spoken in Ethiopia has automated translation.
The stakes are high and the situation is grave, not for any individual engineer’s life but for those in the crosshairs, impacted by facial recognition technologies, automated decision-making systems, etc. The failure of the AI ethics initiatives and the fairness, accountability, and transparency framework has allowed this moment to happen, where codified institutional resistance is immobilized and only the opportunists feel agency.
You people who I love and respect, out here with your AI resistance headbands on, with access to capital but inhibited by your bourgeois anxiety: refugees, the homeless, the policed, etc. are, for better or worse, counting on you, and you’re out here talking about corporate diversity. If the room taken up with building individual brands lent itself to researching with those most impacted and developing tactical initiatives like a social justice war room, we’d be in a very different place.
Innovation and technological progress, in the context of racial capitalism, are predicated on profound extraction, so the next disaster is inevitable; our failure to respond and offer an alternative worldview is not. We can do better.