With my PhD in English Literature at Edinburgh University about to begin, I will be reading a lot this year. Don't expect weekly reviews; I don't read quickly. But I will share with you anything interesting I do read, whether it's a novel that's in vogue or something from my course that broadened my horizons and seems worth passing on. I'll be reading a lot about transgender discourse, but hopefully plenty that isn't, as well.
Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble
Should the Internet be regulated? It's a question I'd never thought about until recently, when I read this book. Here is my review of a book on a topic I can barely discuss without resembling those aged politicians trying to grill Mark Zuckerberg in that Senate committee, like a retirement-home ensemble unable to work the remote control.
Algorithm: it's a word you often hear, its meaning quite innocuous, its impact on the world profound. To those like me, who use the Internet without really knowing how it works, the algorithm is an all-knowing cyber-oracle that takes us where we want to go, or that works out our profile and suggests things we might like. To be technical, quoting an online definition, an algorithm is:
'a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.'
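For readers like me who had only ever met the word, that definition can be made concrete with a tiny, hypothetical example of my own (nothing to do with Noble's book or Google): a few lines of Python that follow a fixed set of rules to solve a small problem.

```python
# A minimal illustration of an algorithm: a fixed set of rules a computer
# follows in a calculation -- here, finding the largest number in a list.
def largest(numbers):
    result = numbers[0]          # rule 1: start with the first value
    for n in numbers[1:]:        # rule 2: examine every remaining value
        if n > result:           # rule 3: keep whichever value is bigger
            result = n
    return result

print(largest([3, 41, 7, 12]))   # prints 41
```

The point is only that an algorithm is mechanical: given the same input and the same rules, it always produces the same output. What Noble questions is who writes the rules, and to what end.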
So far, so innocent. Yet in her study Algorithms of Oppression (2018), Safiya Umoja Noble investigates, with considerable technical expertise and experience in the field, the illusion of this 'neutral technology.' She exposes algorithms as reinforcing the neoliberal, patriarchal, white Western ideology of their creators, the Silicon Valley set of Zuckerbergs and Musks. Noble's book reveals that our Internet-funnelled choices are weighted towards particular cultural and commercial interests, yet presented with a veneer of omniscient impartiality.
To give one example: 'Google Search,' says Noble, is 'an advertising platform, not intended to solely serve as a public information resource in the way that, say, a library might. Google creates advertising algorithms, not information algorithms' (38).
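To see why that distinction matters, here is a toy sketch of my own, not Google's actual algorithm and not from Noble's book: a made-up ranking formula in which results are scored by relevance plus a commercial bonus. The page names, scores, and the `ad_weight` knob are all invented for illustration.

```python
# A hypothetical ranking: once the formula mixes relevance with advertising
# revenue, the "top" result is no longer a neutral measure of quality.
pages = [
    {"title": "Public library resource", "relevance": 0.9, "ad_revenue": 0.0},
    {"title": "Advertiser's page",       "relevance": 0.5, "ad_revenue": 0.8},
]

def rank(pages, ad_weight):
    # score = relevance + ad_weight * revenue; ad_weight is the commercial knob
    return sorted(pages,
                  key=lambda p: p["relevance"] + ad_weight * p["ad_revenue"],
                  reverse=True)

print(rank(pages, ad_weight=0.0)[0]["title"])  # relevance only: library first
print(rank(pages, ad_weight=1.0)[0]["title"])  # revenue counted: advertiser first
```

Turn the knob and the top result changes, while the page of results still looks, to the user, like an impartial answer. That, in miniature, is Noble's 'advertising algorithm.'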
The hegemony of the white male gaze, and the popularity and profitability of pornography, mean that particular representations of women will be favoured. More egregious are the consequences for people of colour. Noble's research began with an online search for 'black girls,' which returned a proliferation of sexualised and dehumanised imagery. Search engines prioritised commercially lucrative representations that distorted black women into a series of mainstream, male-driven caricatures: 'Women, particularly of color, are represented in search queries against the backdrop of a White male gaze that functions as the dominant paradigm on the Internet in the United States.' Noble's message throughout the book is that we must regulate search engines more effectively, to ensure that representations of the disempowered don't become the domain of the myths and caricatures of the empowered majority culture.
Related to this is arguably the most directly damaging impact of Google's algorithms that Noble discusses: the Charleston church shooting of 2015, in which white supremacist Dylann Roof entered a prayer service with a handgun and murdered nine people, all African Americans. As Noble says of Roof's movements before the killing: 'Roof allegedly typed "black on White crime" in a Google search . . . What Roof found was information that confirmed a patently false notion that Black violence on White Americans is an American crisis. Roof reportedly reached the Council of Conservative Citizens (CCC) when he searched Google for real information . . . For Roof, CCC was a legitimate information resource purporting to be a conservative news media organization.' The CCC is, in effect, a slightly more respectable-looking version of the KKK: a segregationist movement presenting itself with a veneer of objectivity.
Noble questions how Google could prioritise, for a search of 'black on white crime,' a site free of fact-checking run by white segregationists like the CCC, instead of a source of checked information like the FBI's crime statistics; the latter demonstrate 'how crime against White Americans is largely an intraracial phenomenon,' with white-on-white violence the commonality. The Charleston church shooting was arguably enabled by a search engine that guides inquiries towards skewed racist websites purporting to be factual.

What Noble's revelations provide is further evidence of the need to regulate the Internet, including its algorithms and search engines, with their dangerous presentation of white male mythologies on the front pages of our searches. With 'fake news' contaminating fact-checked information daily like never before, white supremacist ideology in particular appears to be shaping the news, and people's responding actions. How one regulates the Internet and its search engine algorithms is, of course, another issue entirely. What is clear is that the Internet is not a site of neutrality or liberation, and our searches are not just a trip to the library. Commercially driven manipulations are already occurring, and disempowered peoples are victimised as a result. Noble's book is an excellent point from which to begin discussing this complex area.