
Google engineers don’t take sociology courses. And yet, with search engines guiding our every decision, they’re shaping society behind closed doors. Safiya U. Noble, associate professor of Information Studies and African American Studies at UCLA, is trying to unveil the bias behind the code.
On March 12, Kresge College and the UC Santa Cruz Humanities Institute hosted Noble at Kresge Town Hall to discuss her new book, “Algorithms of Oppression.” It’s about the subtle ways in which search engines reinforce harmful stereotypes about marginalized groups, a topic often overlooked in other books about big data.
“[Other books] were not centering vulnerable people,” Noble said. “They weren’t using frameworks of analysis that would allow them to kind of see the disparate impact search engines have on vulnerable communities, [like] women and girls of color.”
The event was a crossover between Kresge College’s Media and Society lecture series and the Humanities Institute’s Data and Democracy initiative. Over the course of the year, the two programs have chewed on opposite ends of similar questions, like how institutions will adapt to a changing technological landscape. They met in the middle at “Algorithms of Oppression.”
“The conversation about data is so present in our society right now,” said Rachel Deblinger, research program manager of the Humanities Institute. “When we were planning for the year’s theme last spring, there was revelation after revelation of data breaches and questions about the security of our voting systems.”
Although most people associate Google with objectivity and hard data, that view misses much of the picture.

“Algorithms of Oppression” takes readers behind the pixel curtains of Google search. As Noble explains, the platform is more auction house than library, with for-profit interests bidding for the attention of potential customers. To do this, companies circulate brand-associated keywords across the web, diverting search queries for those keywords to their sales pages.
This marketing strategy, called “search engine optimization” (SEO), is why Googling “love” links to eBay. But while SEO might be a Digital Age version of last century’s jingles, Noble said it’s one of many forces online that incentivize companies to traffic in bigotry.
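To make the auction-house metaphor concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the URLs, the relevance numbers, the bids and the additive scoring rule); real search advertising uses far more elaborate auctions. The point is only that once paid bids enter the ranking score, a less relevant commercial page can outrank a more relevant one.

```python
# Toy illustration of the "auction house" dynamic Noble describes.
# All URLs, relevance numbers and bids are invented; real search
# advertising uses far more elaborate mechanisms (quality scores,
# second-price rules, etc.).

pages = [
    {"url": "library.example/what-is-love",   "relevance": 0.9, "bid": 0.0},
    {"url": "shop.example/love-themed-gifts", "relevance": 0.4, "bid": 2.5},
    {"url": "blog.example/essays-on-love",    "relevance": 0.7, "bid": 0.0},
]

def organic_score(page):
    # Library-like ordering: relevance to the query is all that matters.
    return page["relevance"]

def auction_score(page):
    # Auction-like ordering: a paid bid can outweigh relevance.
    return page["relevance"] + page["bid"]

print([p["url"] for p in sorted(pages, key=organic_score, reverse=True)])
print([p["url"] for p in sorted(pages, key=auction_score, reverse=True)])
```

Run as written, the relevance-only ordering puts the most relevant page first, while the bid-weighted ordering promotes the commercial page to the top of the results.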
“Going all the way back to the first full-length film through contemporary Hollywood, we have stereotypes being incredibly profitable, because they reinforce negative depictions of African Americans, of Latinos, of [the] indigenous people of the Americas,” Noble said. “We wouldn’t have those types of images circulating in our society if they weren’t tied to products and services and films.”
Put simply, Google not only reflects but amplifies stereotypical views, creating a version of reality in which groups are reduced to the most sensational imagery associated with them.
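How does reflection turn into amplification? One commonly cited mechanism, sketched below in a deliberately simplified Python model (not Google’s actual algorithm; every number here is invented), is the click-feedback loop: the top-ranked result attracts the clicks, and clicks in turn boost rank, so a small initial edge for sensational content compounds.

```python
# Simplified click-feedback model (invented numbers, not Google's
# actual ranking). Position bias means the top result gets the clicks,
# and clicks feed back into rank, so a small head start compounds.

scores = {"sensational result": 1.05, "measured result": 1.00}

for impression in range(1000):
    top = max(scores, key=scores.get)  # whoever ranks first gets seen...
    scores[top] += 0.01                # ...and each click lifts its rank

# The 0.05 head start has grown into a roughly tenfold gap.
print({name: round(score, 2) for name, score in scores.items()})
```

In this toy model the sensational result ends up with about ten times the rank score while the measured one never gets seen at all, which is the compounding dynamic the word “amplifies” points to.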
In her talk, Noble ran through the gamut of recent cases of “algorithmic oppression.” One, of 2016 tweetstorm fame, compared the results of Googling “three white teenagers” and “three black teenagers.” Whereas the former displayed pictures of wholesome all-Americanism, the latter displayed an array of mugshots, lineups and arrests.
Since Google plays such an outsized role in informing how young people view the world, inaccuracy can kill. Twenty-one-year-old Dylann Roof’s shooting of nine Black churchgoers on June 17, 2015, was due in part to Google’s algorithms, Noble said, with Roof’s search queries about Black-on-white crime fueling his perceived need to retaliate.
“Dylann Roof was trying to make sense of the news media story. He was trying to fact check,” Noble said. “What you find when you search on Black-on-white crime is not, for example, FBI statistics that show that violent crime is actually an intra-racial phenomenon. […] White Americans are actually more likely to be murdered by other white Americans [than by Black Americans].”
To avoid similar tragedies of misinformation, Noble makes the case for an information society built on curated knowledge. Taking cues from libraries, the data centers of old, she said that search engines moderated by librarian-like professionals would curb the spread of misleading information. These databases would be maintained in the public interest.
“One of the things that I call for in the book is public interest search engines as a counterweight to commercial advertising platforms,” Noble said. “It doesn’t make sense that [we] have come to expect that all the world’s knowledge can be known in 0.03 seconds. […] We would be out of business at universities if that were the case.”