Algorithmic Oppression: A Concept

“In 2011, if you were to type anything about ‘Black Girls’ into Google, the first several search results you would see would have something to do with porn.”

The Power of Algorithms

“Briefly describe how Google searches about Black girls have changed over time.” This question was posed on one of our homework assignments this week. In response, as stated by Safiya Umoja Noble, author of Algorithms of Oppression, “if you were to type anything about ‘Black Girls’ into Google, among the first couple of search results you would see would have something to do with porn.” At first it may seem surreal, but based on our own research, Noble’s statement is far from false. When we searched “Black Girls” in our web browser, the first results we were met with were sites that read “sexy black girls” or “black girl dating.” Although not necessarily porn, these search results still carried subtly dehumanizing and discriminatory characteristics. Granted, as Noble notes, pornographic results have declined over time, but for the two years following her initial search they remained the same. Eventually they disappeared, and the erasure of these results was very abrupt.

How did that happen?

Discriminatory behavior on search engines isn’t a recent phenomenon. It has existed for as long as the engines themselves, and it is algorithms that produce and sustain it. An algorithm, a process or set of rules to be followed in calculations or other problem-solving operations, can be dangerous if left uncorrected, especially in digital systems. Although the idea of a set of rules directing a computer may seem like an easy concept to grasp, that is rarely the case: the inner workings of algorithms can be difficult to understand because they are built from intricate mathematical formulas.
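To make the abstract point concrete, here is a minimal, purely illustrative sketch in Python. It is not Google’s actual ranking system, and the site names and click counts are hypothetical: a ranker whose only rule is to sort results by historical clicks will faithfully reproduce whatever bias is baked into that engagement data.

```python
from dataclasses import dataclass

# Hypothetical toy example, not a real search engine's algorithm.
@dataclass
class Result:
    url: str
    clicks: int  # historical engagement, which may encode societal bias


def rank(results: list[Result]) -> list[Result]:
    # The "rule" is simple and looks mathematically neutral: sort by clicks.
    # Nothing in this rule checks whether the top result is dehumanizing.
    return sorted(results, key=lambda r: r.clicks, reverse=True)


# Hypothetical engagement data: if biased user behavior has driven clicks
# toward an objectifying page, the neutral-looking rule pushes it to the top.
results = [
    Result("blackgirlscode.example/about", clicks=1_200),
    Result("objectifying-site.example/page", clicks=9_500),
    Result("community-news.example/story", clicks=3_100),
]

for r in rank(results):
    print(r.clicks, r.url)
```

The point is not this particular rule but the general pattern: a procedure that looks neutral can still surface harmful results when the data it consumes reflects harmful patterns.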

As the examples above show, there are many instances of discriminatory behavior in algorithms. This is commonly referred to as algorithmic oppression: cases in which algorithmically driven data failures are specific to people of color and women, underscoring the structural ways in which racism and sexism are fundamental.

People across the nation are taking steps to stop algorithmic oppression. In 2013, UN Women launched a campaign created by the advertising agency Memac Ogilvy & Mather Dubai that used “genuine Google searches” to bring attention to the sexist and discriminatory ways in which women are regarded and denied human rights. The campaign revealed how far we still are from achieving gender equality. Not only did this project show the discriminatory nature of the searches, but it also showed just how powerful search engine results can be.
