THE HIDDEN RACISM in shiny new systems

Hazel Bryan and Elizabeth Eckford, Little Rock, Ark., September 1957.

Right now social media is overflowing with hate speech, racial hatred and discriminatory memes. Online race hate crime makes up the majority of all online hate offences. But the problem is not only hate speech. Rusty value judgements embedded in shiny new systems reinforce the discriminatory systems of a previous era, says Ruha Benjamin, author of Race After Technology.

“If we consider that institutional racism is an ongoing unnatural disaster, then crime prediction algorithms should more accurately be called crime production algorithms,” Ruha Benjamin states in Race After Technology.

Rusty value judgements embedded in shiny new systems could, says Benjamin, deepen the divides between haves and have-nots, between the deserving and the undeserving. In an era driven by the rush for the new, engineered inequity risks amplifying social hierarchies. In fact, it already does: in a recent audit of California’s gang database, Blacks and Latinxs constituted 87 per cent of those listed, and many of the names turned out to belong to babies under the age of one. As Benjamin states: “Once someone is added to the database, whether they know they are listed or not, they undergo even more surveillance and lose a number of rights.”

Another example of the hidden digital caste system, structured by existing hidden hierarchies and power structures, comes from a Princeton study. Researchers found that a popular algorithm trained on human writing online associated White-sounding names with “pleasant” words and Black-sounding names with “unpleasant” ones. This is what Benjamin calls “the New Jim Code”: the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.
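The mechanism behind the Princeton finding can be sketched as a simple association test: measure whether a name’s vector sits closer to “pleasant” words than to “unpleasant” ones. The vectors below are toy two-dimensional stand-ins arranged to make the pattern visible, not real embeddings; a real test would use vectors trained on large amounts of web text, which is where the bias comes from.

```python
import math

def cosine(u, v):
    # Cosine similarity: how closely two vectors point in the same direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def association(word_vec, pleasant, unpleasant):
    # Mean similarity to "pleasant" words minus mean similarity to
    # "unpleasant" words: a positive score means the word skews pleasant.
    p = sum(cosine(word_vec, v) for v in pleasant) / len(pleasant)
    u = sum(cosine(word_vec, v) for v in unpleasant) / len(unpleasant)
    return p - u

# Toy stand-in vectors (hypothetical, for illustration only):
pleasant = [(1.0, 0.1), (0.9, 0.2)]
unpleasant = [(0.1, 1.0), (0.2, 0.9)]
name_a = (0.95, 0.15)   # a name the toy embedding places near "pleasant"
name_b = (0.15, 0.95)   # a name the toy embedding places near "unpleasant"

print(association(name_a, pleasant, unpleasant) > 0)  # True: skews pleasant
print(association(name_b, pleasant, unpleasant) < 0)  # True: skews unpleasant
```

The point of the sketch is that no one programmed the skew explicitly: it falls out of where the training text placed the vectors, which is exactly why an algorithm “trained on human writing online” inherits human prejudice.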

The problem is not technology; the problem is that we are all expected to pay no attention to the man behind the screen. This is the billion-dollar marketing (and lobbying) trick that caused the sleepwalking into a new century: technology is neutral and can fix everything. As Benjamin puts it: “In that light, I think the dominant ethos in this arena is best expressed by Facebook’s original motto: ‘Move Fast and Break Things.’ To which we should ask: What about the people and places broken in the process? (…) I think we should stop calling ourselves ‘users’. Users get used.”

A new study on hate speech presented in The British Journal of Criminology examines how social media platforms in England and Europe have facilitated the propagation of extreme narratives often manifesting as hate speech targeting minority groups. Data show that 1,605 hate crimes were flagged as online offences between 2017 and 2018, representing 2 per cent of all hate offences. This represents a 40 per cent increase compared to the previous year. Online race hate crime makes up the majority of all online hate offences (52 per cent), followed by sexual orientation (20 per cent), disability (13 per cent), religion (12 per cent) and transgender online hate crime (4 per cent).  

As Ruha Benjamin puts it in Race After Technology, we must consider the machine-learning systems behind the screens, and especially how the algorithms are trained and by whom. “Even when public agencies are employing such systems, private companies are the ones developing them, thereby acting like political entities but with none of the checks and balances. They are, in the words of one observer, ‘governing without a mandate,’ which means that people whose lives are being shaped in ever more consequential ways by automated decisions have very little say in how they are governed.” Benjamin concludes: “Whenever we hear the promises of tech being extolled, our antennae should pop up to question what all that hype of ‘better, faster, fairer’ might be hiding and making us ignore. And, when bias and inequity come to light, ‘lack of attention’ to harm is not a viable alibi. One cannot reap the reward when things go right but downplay responsibility when they go wrong.”

Ruha Benjamin, Race After Technology (2019)

Matthew L. Williams, Pete Burnap, Amir Javed, Han Liu and Sefa Ozalp, “Hate in the Machine: Anti-Black and Anti-Muslim Social Media Posts as Predictors of Offline Racially and Religiously Aggravated Crime”, The British Journal of Criminology, Volume 60, Issue 1, January 2020, pp. 93–117, https://doi.org/10.1093/bjc/azz049

