This Artwork Is a Satirical Siri to Check White People’s Racial Biases
American Artist and Rashida Richardson, Ally AI (still), 2019. Courtesy of the artists.
Becky, a white girl who lives in Brooklyn, might describe herself as “woke.” In her apartment, racial justice books are stacked alongside a foraging manual; a Carrie Mae Weems print hangs on the wall. Becky says that she “manifests the world around her,” and appears calm and collected. That is, until the rap music playing outside in her gentrified neighborhood gets too loud. She reflexively picks up her phone and holds it to her mouth—as if to ask Siri the weather. Instead, she asks Ally AI what to do about the noise.
Ally AI is a satirical voice assistant intended to help white people “be less problematic,” created by Rashida Richardson, director of policy research at AI Now, and American Artist, a multimedia artist whose work addresses the intersections of blackness and technology. The project, which recalls a cross between a Dear White People episode and a Saturday Night Live sketch, debuted at the New Museum’s Seven on Seven, the annual Rhizome event that pairs seven technologists with seven artists to collaborate on a project. Richardson and American Artist presented the voice assistant parody through a series of sketch videos.
In an ad for the voice assistant, American Artist asks: “Can’t tell if your [new product] is racist?” while pointing to Gucci’s infamous blackface sweater. “Ally AI can help.”
“Do you need answers to difficult questions like, ‘Can you say the N word while singing along to your favorite song?’” Richardson offers, “or ‘What’s the politically correct way to ask where someone is really from?’” Ally AI can help.
Richardson, the technologist half of the collaboration, has a background researching the social implications of artificial intelligence and recently published an article about dirty data and predictive policing. In their work, American Artist engages with surveillance and “networked life” as a black person, they explained. Their first foray into the overlap between technology and race was a chatbot inspired by Sandra Bland, and they are currently working on an exhibition about predictive policing software for the Queens Museum.
The pair found a shared interest in tech-solutionism—the sanguine idea that technology can be a kind of cure-all for societal ills. Research into hiring algorithms, predictive policing, and facial-recognition software has shown that even the most data-driven technologies are not exempt from human biases. Rather, those biases are legitimized and perpetuated under the guise of objectivity. “Surveillance and other technologies are just updated versions of the analog strategies used to oppress people before,” American Artist explained via email.
As the ad explains, Ally AI claims to help users navigate “tricky” situations related to race. But if the app were a real product, it would likely perpetuate complacency by giving the white user a quick fix through the voice assistant. Instead of questioning their role in oppression—such as, in Becky’s case, the implications of living in a gentrified neighborhood—the characters in the satirical videos appear totally devoid of self-awareness. “White people—or, more generally, dominant groups—don’t want to engage in the labor of figuring things out, and that’s why many of the issues we make fun of persist,” Richardson noted.
When Ally AI tells Becky to approach her neighbors about their loud rap music, she swears at it and instead instructs Siri to call the police—a nod to the reality that white people in America are treated more fairly by law enforcement. It’s a grim picture of the present moment—that it’s easier for the white characters to talk to a robot than to a person of color.
In one sketch, a white male tech worker insists that the company he works for should hire his cousin, who just went on a trip to Africa and knows five languages. When Ally AI chimes in to note that the company’s staff is 90 percent white and male, and recommends they switch up their hiring algorithm to account for this, the man nods absently and says, “HR would love that.”
Richardson approached the work keeping in mind the “general aversion of people in tech and art spaces to address or even speak explicitly about issues of race,” she said. “[It] is clearly a huge elephant in the room of both spaces and especially where they intersect.”
Richardson and American Artist unapologetically criticize so-called “politically correct” spaces. Virtue signaling, the public performance of moral correctness, runs through the sketches. We see it in Becky’s choice of books and the white male tech worker’s feigned enthusiasm for diversifying his company’s hiring process. Even the logo for Ally AI is a pair of safety pins—a reference to the post–2016 election symbol of “allyship.”
While Ally AI proposes a solution to some of these problems, it—like many technologies—only further alienates marginalized groups by allowing white people to hide behind their iPhone screens. Though we are more connected than ever, our devices leave us distanced from pressing issues.
Ultimately, as Richardson explained, real change requires direct engagement. “If people want to genuinely address their own racial biases or racism more broadly in society,” she said, “then they must engage with the issues, educate themselves, and work with other white people.”