Member Reviews
I loved the thesis of the book but I found myself bored halfway through.
The data is stunning; we should strive for equal representation online, free of discrimination.
I subscribe to the list for the Progressive Librarians Guild, which recently established a group on Goodreads. They decided to select Algorithms of Oppression by Safiya Umoja Noble as their first book for discussion. I had downloaded a copy for review from NetGalley last year, so I prioritized it and started reading it as soon as I could. I finished it toward the end of January, and this is my review.
Safiya Umoja Noble is described in About The Author as an instructor in the University of California at Los Angeles School of Education and Information Studies. So UCLA combined their library school with the education department. Since librarians are educators, this does make a certain amount of sense. I should note that UCLA still does have a Master of Library and Information Science degree program.
The foundation of Noble's argument is her discussion of how search works. This was an eye opener for me. It shouldn't have been, but I hadn't examined the topic critically. She states that citation analysis is the basis for page ranking in search.
Citation analysis is used by scholars to determine the significance of a book or article in a particular field. Noble then tells us that the problem with citation analysis is that there is no consideration of whether the material in question is being cited approvingly or negatively. So the number of citations doesn't indicate whether other scholars thought it was well written or valid. Similarly, search is supposed to give us the most popular results first. The most obvious problem is that the most popular websites aren't necessarily the most relevant for any particular user. I was taught in library school that librarians need to be search professionals who know how to elicit relevant results with the most specific search terms.
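For readers curious about the mechanics behind the citation-analysis idea Noble describes, the classic formalization is PageRank: a page's score is built from the scores of the pages linking to it, with no notion of whether a link is approving or critical. Here is a minimal, illustrative sketch (the toy link graph, damping factor, and iteration count are my own assumptions for demonstration, not anything from the book):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration.

    links maps each page to the list of pages it links to. Note the
    algorithm only counts links -- it cannot tell whether a link
    endorses or condemns the page it points to, which is exactly the
    limitation Noble raises about citation analysis.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page gets a small base score, plus a share of the
        # rank of each page that links to it.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Illustrative three-page web: A and B both link to C; C links back to A.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # C ranks highest: it is the most "cited"
```

The point of the sketch is that C wins purely on link counts, regardless of why A and B linked to it, which mirrors the review's observation that popularity-driven ranking says nothing about quality or validity.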
Is this perception of search as a neutral process really an accurate description? Users of search tend to assume that the top results are indeed the most popular. Noble interrogates our assumptions about search. What is search engine optimization? It means that some have found ways to game the system. Advertising is also a factor. After all, users don't pay for their searches. Search engines need to be profitable in order to survive. So the top results in many cases are advertisers. As users, most of us would say that we are willing to tolerate advertising in return for free services. Are there limits to this tolerance? What if the advertisers are offensive to users? What if they promote racist or sexist attitudes?
Noble shows us numerous examples of racist and sexist top search results. She does admit that Google is eventually responsive to complaints, and makes changes to ameliorate the situation. This means that activist users must be vigilant, and refuse to allow advertisers or gamers of the system to demean entire groups of people.
Algorithms of Oppression is a significant book. Information professionals and students in the field should definitely read it, but I think it's also illuminating to anyone who uses search engines.
This was a rough read. It is depressing to see more of the bigger picture that was obscured by my ignorance about the business side of search engines, the internet in general, related corporate dominance, and how the algorithms work. That it is commonplace to say "google it" is more frightening now. I read pretty broadly on racism and other dynamics of social oppression, so that we have deeply rooted structural racism is not new to me, but Noble's book covered an area I was very unfamiliar with even though it is part of my daily life.
https://www.goodreads.com/review/show/2400804793
I had to edit this review severely to get it up on Amazon as my first version was bounced for using terms like black and porn. I was just explaining what the contents of the book were describing.
Still waiting to see if Amazon accepts it.
I have continually struggled to understand how algorithms work, so I am glad I was given the chance to read this amazing book! The author makes the subject look less complicated.
Safiya Umoja Noble has produced a fascinating study of data discrimination. This is highly recommended reading.
Collects a lot of criticisms of Google’s dominance, focused on the racist implications of taking the corpus as you find it, e.g., searching for “black girls” for a long time returned only porn on the first page. Only some people can live in a “filter bubble” where everything they see reinforces their own beliefs; racism and sexism come to people who don’t want them. Noble had to “take it as a given that any search I might perform using keywords connected to my physical self and identity could return pornographic and otherwise disturbing results,” and asks: “Why was this the bargain into which I had tacitly entered with digital information tools? And who among us did not have to bargain in this way?” Black Girls Code is nice, Noble argues, but it’s not black girls’ job to solve Silicon Valley’s racist exclusion and misrepresentation.
The problem is: what to do? Noble complains that Google directs searches to conglomerate news sources, but on YouTube that doesn’t happen and the results seem to be worse, leaning towards extremism and conspiracies, with a lot of racism. Past forms of information sorting were really bad too; Noble notes the history of racist Dewey Decimal System and Library of Congress classifications (not just history, though more contested now). She also discusses how Dylann Roof was radicalized by reading online, starting from Wikipedia and going from there—searches for “black on white crime” lead to white supremacist sites, rather than to neutral crime statistics that would reveal that most crime is intraracial. Could anything other than human moderation stop this pattern? I just don’t know; Noble suggests developing public search engines so that corporate motivations wouldn’t control the data collection/surveillance, but (1) they’d still confront the problems of dealing with a racist corpus, and (2) I’m not so hot on government surveillance either. Another suggestion is a black-friendly search engine, and there are some moves towards that, but I don’t think that solves the problem for people who don’t know to seek it out in the first place—or people like Roof.
The last chapter of the book focuses on a small business owner who cares for black hair, and whose business was harmed by two neoliberal blows—a decrease in the number of African-American students because of anti-diversity policies, and the rise of Yelp, which represented an increased cost—they’d only give her prominence/keep other hairdressers off her page if she paid, even if the other places didn’t specialize in black hair—and also presented particular difficulties for her reviews, inasmuch as she perceived that her customers were less likely to use Yelp in the first place than white people, so their reviews of her place might be the only reviews those customers left on Yelp and thus were more likely to look fake to Yelp. “Black people don’t ‘check in’ and let people know where they’re at when they sit in my chair. They already feel like they are being hunted; they aren’t going to tell The Man where they are. I have reviews from real clients that they put into a filter because it doesn’t meet their requirements of how they think someone should review.” Not that she was all that fond of all her customers—she also complained about people who came into her business to photograph the products she used, then order them online for less. Again, search engines aren’t the only problem she’s facing; it’s a constellation of economic and social changes of which search engines are only a part, perhaps a minor part, though it’s certainly worth pointing out that the small producers are the ones from whom wealth can still be extracted by these larger companies like Yelp.
Most conversations about racism in America focus on individual acts and people. Was LuAnn wearing blackface as Diana Ross racist? When white people use the n-word ironically is it racist? Is Steve King of Iowa racist? (Yes, yes, and yes.) Unfortunately, those conversations consume so much of the air in the room, there is little energy to focus on the structures that reinforce and sustain racism, systems that guarantee that racist oppression and disparities would continue even if not one single racist epithet is uttered ever again anywhere in the world.
The problem is that structural or systemic racism is usually invisible, which makes books like Algorithms of Oppression by Safiya Umoja Noble so critical to advancing justice. Noble happened to google “black girls,” and the autocomplete prediction that came up was appallingly pornographic. She wondered what kind of message that sends, how the imprimatur of search, which many people mistake for research, implicitly endorses the results no matter how problematic they may be. This motivated years of research on the algorithms that have so much power to define and shape our world. These are instructions, code that is private and proprietary, code that operates with a kind of impunity. It’s code; how can code be racist?
But these sorts of errors have been part of our information systems all along. Consider the ubiquitous Dewey Decimal System or the Library of Congress classifications. When activists lobbied successfully to get the Library of Congress to stop classifying immigrants without documents as illegal aliens, Congress hurriedly introduced a law to prohibit political influence in the classifications, as though the original classifications were not political already. In another example, although most people in the world belong to other faiths and most of the texts in the world are from other faiths, eighty percent of the classification labels in the system are related to Christianity. Even the geography of Asia and Africa gets the short end of the classification stick. No one sat down to create a racist classification system; it was simply a matter of attention to and awareness of white culture, and attention withheld from and ignorance of non-white cultures and sources.
White supremacy is often a matter of simply being the default. By being the default, nothing needs to be said, it just is. Noble makes the critical point that bad search results are the result of decisions made in creating the algorithms that govern what they produce. That they can be modified for better results is evident by Google’s decision to downgrade the rank of porn results. That result ranking can be manipulated is evidenced by the big Search Engine Optimization industry, by Dan Savage’s cyber-prank santorum, and by Google’s own actions.
Noble does an excellent job making her point. She builds her case carefully and expands beyond the readily obvious to look deeper, at card catalogs and at the Library of Congress, making the argument that if information retrieval is biased, the basic facts needed to redress inequity are not even available. When examining information classification, archival practices, and search algorithms through the lens of critical race theory, Noble strives to be dispassionate and analytical. It makes those chapters a bit of a struggle. Some might say she becomes “overly academic”, but that reminds me of this tweet.
ok hi what if for 2018 we agree to stop including “the prose is too academic” in reviews of academic books from academic presses by academics writing books that their fields require them to write
— free kellyanne (@DearSplenda) November 23, 2017
It’s actually not that there is too much academic jargon. There isn’t. It is that sometimes she writes dispassionately about something we know she cares about passionately. If we care for justice, we must be passionate about it, too. The thing is, when people write about race and gender and challenge the power structure, they will be attacked. Writing dispassionately, analytically, dare we say academically, is a necessary bulwark against those attacks.
Her passion comes through loud and clear in the final chapter. There is a riff on how technology will not save us that would make Gil Scott-Heron proud. She takes down many popular shibboleths of the technocrats: the idea that colorblindness is good, that racism has been solved, that the solution to racism is an individual practice, that we are postracial, that all we need to fix racism is to fix the feeder system to education and industry. She presents the evidence and has receipts. This is a myth-busting book, busting the comfortable myth that technology and computers can’t be racist. They are made by people; of course they can be.
I received an e-galley of Algorithms of Oppression from the publisher through NetGalley.
Algorithms of Oppression at NYU Press
Safiya Umoja Noble author site
An important book about an important topic. While I wanted more conclusions and even more information, I do find this to be a good and useful book that everyone should read to learn more about how algorithms are very much not neutral. What can we do about this? Well, the answer remains to be seen. Nevertheless, Noble lays out a vast amount of evidence for us to consider and come to terms with.
Full review to come on the blog.
In Algorithms of Oppression, Safiya Umoja Noble really gets to the bottom of the systemic problem of racism plaguing our society. Algorithms are flawed in general, but they can be and are used as tools that perpetuate social problems and underlying bias, whether or not they are designed to be.
Back in 2009, Safiya Umoja Noble googled the words “black girls.” To her horror, the search yielded mostly pornographic results. A similar quest for “white girls” returned far less demeaning content. The lewd results from that Google search are far less prominent nowadays, but this doesn’t mean that Noble’s inquiry into how race and gender are embedded in Google’s search engine has lost its purpose. Google, her book demonstrates, is still a world in which the white male gaze prevails.
The author sets the stage for her critique of corporate information control by debunking the many myths and illusions that surround the internet. She explains that, no, the Google search engine is neither neutral nor objective; yes, Google does bear some responsibility for its search results, which are not purely computer-generated; and no, Google is not a service or a public information resource like a library or a school.
Google is not the ideal candidate for the title of ‘greatest purveyor of critical information infused with a historical and contextual meaning.’ First, Google might claim that it is an inclusive company, but its diversity scorecard proves otherwise. While it is slowly improving, it’s still nothing worth shouting from the rooftops about. And it’s not just Google; a similar lack of diversity can be observed all over Silicon Valley.
Another reason why we shouldn’t trust Google to provide us with credible, accurate and neutral information is that its main concern is advertising, not informing. That’s why we should be very worried. While public institutions such as universities, schools, libraries, archives and other memory spaces are losing state funding (the book focuses on the USA, but Europe isn’t a paradise either in that respect), private corporations and their black-boxed information-sorting tools are taking over, gaining ever greater control over information and thus over the representation of cultural groups and individuals.
Noble ties these concerns about technology to a few observations regarding the sociopolitical atmosphere in her country: disingenuous ideologies of ‘colourblindness’, the rise of “journalism” that courts clicks and advertising traffic rather than quality in its reporting, a head of state known for his affinities with white supremacy and disinformation, and a climate characterized by hostility towards unions and movements such as Black Lives Matter.
What makes Algorithms of Oppression: How Search Engines Reinforce Racism particularly interesting is that its author doesn’t stop at criticism; she also suggests a few steps that we (the internet users), Google, its Silicon Valley ilk, and the government should take in order to achieve an information system that doesn’t reinforce current systems of domination over vulnerable communities.
Noble strongly calls for public policies that protect the right to fair representation online. This would start with regulation of tech giants like Google that would prevent them from holding a monopoly over information.
She also urges tech companies to hire more women, more black people or more Latinos to diversify their tech workforce, but also to bring in critically-minded people who are experts in black studies, ethnic studies, American Indian studies, gender and women’s studies and Asian American studies as well as other graduates who have a deep understanding of history and critical theory.
Noble also encourages internet users to ask themselves more often how the information they have found has emerged and what its social and historical context might be.
Finally, the author suggests that nonprofit and public research funding should be dedicated to exploring alternatives to commercial information platforms. These services wouldn’t be dependent on advertising and would pay closer attention to the circulation of patently false or harmful information.
Algorithms of Oppression is a powerful, passionate and thought-provoking publication. It builds on previous research (such as Cathy O’Neil’s book Weapons of Math Destruction), but it also asks new questions informed by a black feminist lens. And while Noble’s book focuses on Google, many of her observations and lessons could be applied to the other tech corporations that mediate our everyday hyper-connected life.
When it comes to Algorithms of Oppression, I am forced to sit on the fence. Because, even though I found the insights that the book offered to be unique, thought-provoking and completely relevant to our new lives in the online sphere, the writing was just not there for me - it was repetitive, tedious and dry, even despite the fascinating topic that the study was focused on. This was a huge barrier for me, because I wanted (by god, I wanted) to soak up as much as I could on a topic that absolutely captivates me, but the speed at which I could read the study - and keep my focus from wandering - was hindered by the writing.
But besides all of that, in Algorithms of Oppression, Safiya Umoja Noble examines the racial and sexual bias against black women and girls in online search engines, dissecting the dominant lens through which these algorithms expect internet users to view them: a white, heterosexual male one. Honestly, I had never thought that much about Google, despite it being such a huge part of online life, but Noble’s findings make me want to pay closer attention to it, and its search results, in the future.
Google, after all, has consistently signposted to sites that include ‘fake news’; revenge porn; racist, sexist, homophobic and anti-Semitic views; white supremacy; and porn sites that focus on the fetishisation of minorities. Noble covers all of these forms of oppression in her study, showing that the only group of people really gaining anything from search engines are straight, white men.
Shocking, right?
The speed at which society changes seems to have grown exponentially since the birth of the internet, so who knows how relevant this study will remain to internet culture in a few years’ time. With the election of Trump as President, maybe it will be even more relevant.
But, right now at least, Noble presents an accurate and disturbing picture of a website that forms one of the foundation blocks of online culture, and I urge everyone to pick up this book and learn more about an intrinsically biased part of our everyday lives.
This book shows how the Google search engine helps, in some ways, to propagate the oppression of certain groups of people based on race and gender. I find this book passionate about its topic, although I would have preferred to also read about what Google is doing to minimize the oppressive effects laid out in this book.
Informative and well-argued. The book focuses mostly on Google and how its search algorithms are not neutral but instead mimic society's biases and then serve to amplify them. The book focuses specifically on the ways that current and past search results are racist and sexist against black women and girls and how this can harm them, and it is also a call to action to do better. There are also several key examples of ways that search results are biased against other minority groups.
So you read So You Want to Talk About Race and now you have more questions. Specifically, you’re wondering how privilege affects your life online. Surely the Internet is the libertarian cyber-utopia we were all promised, right? It’s totally free of bias and discrimina—sorry, I can’t even write that with a straight face.
Of course the Internet is a flaming cesspool of racism and misogyny. We can’t have good things.
What Safiya Umoja Noble sets out to do in Algorithms of Oppression: How Search Engines Reinforce Racism is explore exactly what it is that Google and related companies are doing that does or does not reinforce discriminatory attitudes and perspectives in our society. Thanks to NetGalley and New York UP for the eARC (although the formatting was a bit messed up, argh). Noble eloquently lays out the argument for why technology, and in this case, the algorithms that determine what websites show up in your search results, is not a neutral force.
This is a topic that has interested me for quite some time. I took a Philosophy of the Internet course in university even—because I liked philosophy and I liked the Internet, so it seemed like a no-brainer. We are encouraged, especially those of us with white and/or male privilege, to view the Internet as this neutral, free, public space. But it’s not, really. It’s carved up by corporations. Think about how often you’re accessing the Internet mediated through a company: you read your email courtesy of Microsoft or Google or maybe Apple, and ditto for your device; your connection is controlled by an ISP, which is not a neutral player; the website you visit is perhaps owned by a corporation or serves ads from corporations trying to make money … this is a dirty, mucky pond we are playing around in, folks. The least we can do as a start is to recognize this.
Noble points out that the truly insidious perspective, however, is how we’ve normalized Google as this public search tool. It is a generic search term—just google it—and, yes, Google is my default search engine. I use it in Firefox, in Chrome, on my Android phone … I am really hooked into Google’s ecosystem—or should I say, it’s hooked into me. But Google’s search algorithms did not spring forth fully coded from the head of Zeus. They were designed (mostly by men), moderated (again by men), tweaked, on occasion, for the interests of the companies and shareholders who pay Google’s way. They can have biases. And that is the problem.
Noble, as a Black feminist and scholar, writes with a particular interest in how this affects Black women and girls. Her paradigm case is the search results she turned up, in 2010 and 2011, for “black girls”—mostly pornography or other sex-related hits, on the first page, for what should have been an innocuous term. Noble’s point is that the algorithms were influenced by society’s perceptions of black girls, but that in turn, our perceptions will be influenced by the results we see in search engines. It is a vicious cycle of racism, and it is no one person’s fault—there is no Chief Racist Officer at Google, cackling with glee as they rig the search results (James Damore got fired, remember). It’s a systemic problem and must therefore be addressed systemically, first by acknowledging it (see above) and now by acting on it.
It’s this last part that really makes Algorithms of Oppression a good read. I found parts of this book dry and somewhat repetitive. For example, Noble keeps returning to the “black girls” search example—returning to it is not a problem, mind you, but she keeps re-explaining it, as if we hadn’t already read the first chapter of the book. Aside from these stylistic quibbles, though, I love the message that she lays out here. She is not just trying to educate us about the perils of algorithms of oppression: she is advocating that we actively design algorithms with restorative and social justice frameworks in mind.
Let me say it louder for those in the back: there is no such thing as a neutral algorithm. If you read this book and walk away from it persuaded that we need to do better at designing so-called “objective” search algorithms, then you’ve read it wrong. Algorithms are products of human engineering, as much as science or medicine, and therefore they will always be biased. Hence, the question is not if the algorithm will be biased, but how can we bias it for the better? How can we put pressure on companies like Google to take responsibility for what their algorithms produce and ensure that they reflect the society we want, not the society we currently have? That’s what I took away from this book.
I’m having trouble critiquing or discussing more specific, salient parts of this book, simply because a lot of what Noble says is stuff I’ve already read, in slightly different ways, elsewhere—just because I’ve been reading and learning about this for a while. For a newcomer to this topic, I think this book is going to be an eye-opening boon. In particular, Noble just writes about it so well, and so clearly, and she has grounded her work in research and work of other feminists (and in particular, Black feminists). This book is so clearly a labour of academic love and research, built upon the work of other Black women, and that is something worth pointing out and celebrating. We shouldn’t point to books by Black women as if they are these rare unicorns, because Black women have always been here, writing science fiction and non-fiction, science and culture and prose and poetry, and it’s worthwhile considering why we aren’t constantly aware of this fact.
Algorithms of Oppression is a smart book about how colonialism and racism are not over. They aren’t even sleeping. They’ve just transformed, rebranded for the 21st century. They are no longer monsters under the bed or slave-owners on the plantation or schoolteachers; they are the assumptions we build into the algorithms and services and products that power every part of our digital lives. Just as we have for centuries before this, we continue to encode racism into the very structures of our society. Online is no different from offline in this respect. Noble demonstrates this emphatically, beyond the shadow of a doubt, and I encourage you to check out her work to understand how deep this goes and what we need to do to change it.
Really enjoyed a pre-release of this book. Dr. Noble highlights how search engines actually work, and their social consequences. What happens when a little girl googles for other black girls and finds pornography? What happens when a boy searches for civil rights leaders and finds white supremacy websites instead? And if Google is our shared public source of information, don't they have a social responsibility for the quality of results? To be fair, her final chapter calls for change in the biased categorization systems of libraries, as well. I'd give it 4 stars because a careful academic argument isn’t for everyone (on my wish list: a popularized version for my family and friends!), but 5 stars for the quality and range of her thought!
At once a primer on Black feminist thought *and* an examination of the mechanics of search engines (in particular, Google's search engine). Invaluable and essential for all individuals. We need more explorations of how Internet technologies uphold values that misrepresent, oppress, and further marginalize people who categorically have no financial, political, or social power. The truth is most people won't read this - though they should - but if they did, they'd get a great springboard into the diverse field of information science.
Noble began collecting information in 2010 after noticing the way Google Search and other internet sites collect and display information about non-white communities. Her results dovetail with other studies positing that algorithms are flawed by their very nature: choosing & weighting only some variables to define or capture a phenomenon will deliver a flawed result. Noble wrote this book to explore reasons why Google couldn’t, or wouldn’t, address concerns over search <i>results</i> that channel, shape, and distort the search itself, i.e., the search “black girls” yielded only pornographic results, beginning a cascade of increasingly disturbing and irrelevant options for further search.<br /><br />In her conclusion Noble tells us that she wrote an article about these observations in 2012 for a national women’s magazine, <i>Bitch</i>, and within six weeks the Google Search for “black girls” turned up an entire page of results like “Black Girls Code,” “Black Girls Rock,” “7-Year-Old Writes Book to Show Black Girls They Are Princesses.” While Noble declines to take credit for these changes, she continued her research into the way non-white communities are sidelined in the digital universe. <br /><br />We must keep several things in mind at once if the digital environment is to work for all of us. We must recognize the way the digital universe reflects and perpetuates the white male patriarchy from which it was developed. In order for the internet to live up to the promise of allowing unheard and disenfranchised populations some voice and access to information they can use to enhance their world, we must monitor the creation and use of the algorithms that control the processes by which we add to and search the internet. This is one reason it is so critical to have diversity in tech. Below find just a few of Noble's more salient points:
<blockquote>
※<b> We are the product that Google sells to advertisers.</b>
<br /><br />※<b>The digital interface is a material reality structuring a discourse, embedded with historical relations...Search does not merely present pages but structures knowledge...</b>
<br /><br />※<b> Google & other search engines have been enlisted to make decisions about the proper balance between personal privacy and access to information. The vast majority of these decisions face no public scrutiny, though they shape public discourse.</b>
<br /><br />※<b> Those who have the power to design systems--classification or technical [like library, museum, & information professionals]--hold the ability to prioritize hierarchical schemes that privilege certain types of information over others.</b>
<br /><br /> ※<b> The search arena is consolidated under the control of only a few companies.</b>
<br /><br />
※<b> Algorithms that rank & prioritize for profits compromise our ability to engage with complicated ideas. There is no counterposition, nor is there a disclaimer or framework for contextualizing what we get.</b>
<br /><br />※<b> Access to high quality information, from journalism to research, is essential to a healthy and viable democracy...In some cases, journalists are facing screens that deliver real-time analytics about the virality of their stories. Under these circumstances, journalists are encouraged to modify headlines and keywords within a news story to promote greater traction and sharing among readers.</b>
</blockquote>The early e-version of this manuscript obtained through Netgalley had formatting and linking issues that were a hindrance to understanding. Noble writes here for an academic audience, I presume, and as such her jargon and complicated sentences are appropriate for communicating the most precise information in the least space. However, for a general audience this book would be a slog, something not true if one listens to Noble (as in the attached TED talk linked below). Surely one of the best things this book offers is a collection of references to others who are working on these problems around the country. <br /><br />The other best thing about this book is an affecting story Noble includes in the final pages of her Epilogue about Kandis, a long-established black hairdresser in a college town trying to keep her business going by registering online with the ratings site, Yelp. Noble writes in the woman’s voice, simply and forthrightly, without jargon, and the clarity and moral force of the story is so hard-hitting that it is worth picking up the book for this story alone. At the very least I would recommend a TED talk on this story, and suggest placing the story closer to the front of this book in subsequent editions. <br /><br />Basically, the story is as follows: Kandis's shop became an established business in the 1980s, before the fall-off of black scholars attending the university, "when the campus stopped admitting so many Blacks." To keep the remaining students aware of her business providing an exclusive and necessary service, she spent many hours finding a way to have her business come up when “black hair” was typed in as a search term within a specified radius of the school. The difficulties she experienced illustrate the algorithm problems clearly. <blockquote> “To be a Black woman and to need hair care can be an isolating experience. The quality of service I provide touches more than just the external part of someone. 
It’s not just about their hair.”</blockquote>I do not want to get off the subject Noble has concentrated on with such eloquence in her treatise, but I can’t resist noting that we are talking about black women’s hair again…Readers of my reviews will know I am concerned that black women have internalized a kind of violence in their attitudes about their hair. If I am misinterpreting what I perceive to be hatred of something so integral to their beings, I would be happy to know it. If black hair were perceived instead as an extension of one’s personality and sexuality, without the almost universal animus for it when worn naturally, I would not worry about this obsession as much. But I think we also need to work on making black women feel their hair is beautiful. Period. <br /><br />By the time we get to Noble’s Epilogue, she has raised a huge number of discussion points and questions, which grew from her legitimate concerns that Google Search seemed to perpetuate the status quo or serve a select group rather than break new ground for enabling the previously disenfranchised. This is critically important, urgent, and complicated work, and Noble has the energy and intellectual fortitude needed to work with others to address these issues. This book would be especially useful for those looking for an area in the digital arena where they could build on her work to try and make a difference.
The scholarship in this book is important—internet search engines are not the unbiased information sources they are believed to be. Noble not only proves this, but lays out the practices at Google and other internet giants that result in searches producing racist results. Read it to learn what really determines how sites rise to the top of your searches, for examples of the cultural biases and racism these reinforce, and for the threats posed to our society. Technical, but worth the read.
Book Review
(Algorithms of Oppression, Safiya Noble, New York University Press, 2018)
The politics of technology design
The internet is not the neutral, unbiased warehouse of all things. Search, for example, is loaded with ingrained prejudices from our culture and history. Safiya Noble, a black feminist, was incensed when she searched “black girls” on Google and came up with listing after listing of porn. All she wanted was some activity suggestions for her stepdaughter and her visiting cousins. She had to close the laptop lid before they saw the results.
As if it weren’t bad enough that only porn resulted, look at Google’s autocomplete suggestions for a search on “women”:
-Women cannot: drive, be bishops, be trusted, speak in church
-Women should not: have rights, work, vote, box
-Women should: stay at home, be slaves, be in the kitchen, not speak in church
-Women need to: be put in their places, know their place, be controlled, be disciplined
The result is Algorithms of Oppression, a six-year project to determine the extent of this poison, how it came to be, and what should be done. Noble found western society itself at the heart of it.
Google, incredibly, denies any responsibility. It claims its algorithm operates on its own and cannot be trained. This is what we call a lie, as Google has managed to abide by all kinds of European directives against hate, against Nazi products for sale, and for the right to be forgotten. And magically, the black girls search results have been evolving too, as Noble shows in her many screenshots.
We like to believe that what rises to the top in search is whatever is most popular and most relevant. But we fool ourselves. There are classification systems at work, and Noble says blacks have been “contained and constrained” by them. The search “beautiful” results in an endless page of photos of white women. Not Starry Night, Niagara Falls or the Taj Mahal, but white women. A search for “professor” brings photos of only white men. And a search for “unprofessional hairstyles for work” shows only women of color. As you might guess, “professional hairstyles for work” shows only white women.
And it’s not as if Google has customized the results according to Noble’s search history. She has spent several years using Google in her pursuit of a doctorate in black feminist studies. And this is how Google profiles her.
Basically, Google’s search algorithm represents the white male view of the world, she says, and brings up results to fulfill that need. Black community or society is simply not part of the equation, and therefore not part of the algorithm. Same goes for women.
Noble has a chapter on libraries, because librarians classify everything. They must, of course, in order for anyone to do any sort of in-depth research. Yet the very act of classification is discriminatory. Irish Catholic, Korean American, black feminist – all are labels looking for homes. Everyone becomes an “objectified symbol” to someone else. Leo Buscaglia spent his life ranting against this. Because of these labels, we think we know something about a person, he used to say, but we don’t at all. This same built-in bias shows up in online search. It is not in any way neutral.
As exhaustive as she has tried to be, Noble made no effort to test workarounds. Her screenshots do not also show results when Google’s family filter is on, and she never tried searching with negative terms to exclude the sex listings (black girls -sex). It’s almost certainly true that most people can’t be bothered with either of these tactics, but Noble should have included their results.
Unfortunately, she concludes that Google search should be federally regulated. This despite her entire book demonstrating the embedded, if not innate, bias throughout every aspect of western society. It’s not an especially hopeful ending, and really just skirts the whole core issue.
We are nowhere near being postracial.
David Wineberg