Blame the Data

Book review: Race After Technology by Ruha Benjamin

This is a timely book. It hit the market just as three United States cities – San Francisco, Oakland and Somerville – voted to ban the use of facial recognition technology by city agencies. Race After Technology covers much more than that software’s tendency to classify Black faces as criminals. But this topic exemplifies Ruha Benjamin’s arguments.

Benjamin, a Black Associate Professor at Princeton, draws on a wealth of research: notes and references fill almost one-third of the book’s pages. So this is not a rant against racist robots. It is a reasoned exploration of what technology means for our concept of race.

On facial recognition, it is easy to chant the geeks’ mantra – “garbage in, garbage out” – and blame the data used to train the artificial intelligence for any racist outcomes. If the biggest available datasets of Black faces are the custody mugshots of suspects and prisoners, this will “teach” the AI to associate Black with crime. Weighting the algorithm to correct the imbalance could fix the problem, say some commentators – Stanford’s Dr James Zou, for example.
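To make concrete what “weighting the algorithm to correct the imbalance” might look like, here is a minimal sketch (not drawn from the book) in which samples from an under-represented group are up-weighted so that both groups carry equal weight in a classifier’s training loss. The dataset, group sizes and choice of model below are hypothetical placeholders.

```python
# Minimal sketch of re-weighting training data so an over-represented group
# does not dominate what a classifier "learns". Synthetic stand-in data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training set: 800 faces from group A, 200 from group B.
group = np.array([0] * 800 + [1] * 200)
X = rng.normal(size=(1000, 16))          # placeholder feature vectors
y = rng.integers(0, 2, size=1000)        # placeholder labels ("match" or not)

# Weight each sample inversely to its group's frequency, so both groups
# contribute equally to the overall training loss.
group_counts = np.bincount(group)                      # [800, 200]
sample_weight = 1.0 / group_counts[group]
sample_weight *= len(group) / sample_weight.sum()      # normalise to mean 1

clf = LogisticRegression(max_iter=1000)
clf.fit(X, y, sample_weight=sample_weight)
```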

Others, such as Joy Buolamwini of the Massachusetts Institute of Technology, argue it is wrong to rely on facial recognition at all, since it frequently makes mistakes – especially with Black people. She has founded a civil rights movement, the Algorithmic Justice League, to advocate for public scrutiny of how AI is used. Buolamwini insists that, because of this danger of false-positive misidentification, facial recognition should never be used in situations where lethal force might be deployed.

Not cited in the book, but significant for European readers, is a new report from the University of Essex: observing the London Metropolitan Police’s live trials of facial recognition, the researchers found that it frequently made mistakes. Professor Peter Fussey’s report cites research showing that “All classifiers performed best for lighter skinned individuals and males overall. The classifiers performed worst for darker females.”

Again, the data on which the AI was trained are blamed for this poor performance.

Benjamin rejects this narrowly technical framing. Automated decision-making is already selecting job applicants, university students, benefit claimants and prisoners for parole. Human oversight is required, she argues:

“Even when public agencies are employing such systems, private companies are the ones developing them, thereby acting like political entities but with none of the checks and balances … which means that the people whose lives are being shaped in ever more consequential ways by automated decisions have very little say in how they are governed.”

Here the clever play on words in the book’s title becomes clear. Humans are racing after technology, trying to catch up with its data-hungry evolution. Benjamin, Buolamwini and others want a mandatory PAUSE button, so that before a new AI product or service is rolled out there is an informed public debate – an Algorithmic Impact Study.

The title’s other meaning asks what our notion of race will be after technology has moulded it. Diving deeper into semantics (well, she is a sociologist after all!), the author contends that “data portability, like other forms of movement is already delimited by race as a technology (her emphasis) that constricts one’s ability to move freely.” Is race a technological construct? That seems like political correctness taken to a wild extreme. Yet if we regard racial classifications the way that Jaron Lanier describes the categories in online forms (You Are Not a Gadget, 2010), it makes sense. Personal data must be made to fit categories so that they can be cleaned, scraped and fed to the algorithms in a logical way. Machines make us fit their parameters.
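Lanier’s point about online-form categories can be illustrated with a tiny, entirely hypothetical sketch: whatever people say about themselves gets squeezed into the fixed boxes a database schema will accept. The categories and function below are invented for illustration, not taken from the book.

```python
# Hypothetical illustration only: an intake form that forces free-text
# self-description into the fixed categories a database will accept.
from enum import Enum

class RaceCategory(Enum):
    WHITE = "white"
    BLACK = "black"
    ASIAN = "asian"
    OTHER = "other"          # everything that doesn't fit is flattened here

def clean(self_description: str) -> RaceCategory:
    """Squeeze a person's own words into one of the allowed boxes."""
    text = self_description.lower()
    for category in RaceCategory:
        if category.value in text:
            return category
    return RaceCategory.OTHER

print(clean("Black British, mixed heritage"))   # RaceCategory.BLACK
print(clean("Romani"))                          # RaceCategory.OTHER
```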

And the current controversy about whether the US 2020 census should include a new question about citizenship shows that the official collection of data – far from being neutral – is a political act.

Equally thought-provoking is Race After Technology’s historical perspective. Kodak film did not render Black faces well because its chemistry was calibrated for lighter skin tones. Polaroid’s instant camera became reviled amongst Black South Africans during apartheid because it enabled the (White) police to take instant mugshots of suspects. These examples show the author’s wide sweep of archive material.

The book’s stellar array of quotes from Black and female academics makes it worth buying for commentators and companies seeking excellent, well-qualified BAME women to diversify their workforces or extend their range of experts.

And for all who wonder how AI will develop in the near future, Ruha Benjamin quotes a young woman who discovered that her social worker was using Electronic Benefit Transfer cards to automatically track her spending. “You should pay attention to what happens to us,” the woman said. “You’re next.”

Benjamin concludes: “We need to consider that the technology that might be working just fine for some of us now could harm or exclude others … a visionary ethos requires looking down the road to where things might be headed. We’re next.”

Race After Technology by Ruha Benjamin is published by Polity Press, July 2019.