Member Reviews
2.75 stars, rounded to 3 for the purposes of Goodreads.
Disclaimers: This review is based on an advance copy provided through NetGalley by the publisher. All opinions expressed are my own.
As someone interested in the disinformation and propaganda campaigns occurring online, I was very excited to read this book. Having finished it, I am glad that I read it, mostly because there were several moments where I felt I was given language to articulate thoughts I already had about social media and the internet (particularly regarding my own interactions with it). As a whole, though, the book wasn't a terribly enjoyable read for me.
For one thing, I feel the book format is actually quite ineffectual for this kind of discussion; my most common thought while reading was that it would be so much more effective as a piece of audio/visual media. The writing didn't help, either. I found the style largely unengaging. Strangely enough, despite coming into the book already interested and highly invested in the topic, I had to force myself to finish reading because it was just such a slog to get through. I think one of the primary reasons for this is the structure of the book, which, instead of focusing on specific actors or instances, focuses on different elements of propaganda and bounces between narratives as it tries to explain them.
The only parts I found genuinely interesting or engaging were the moments where DiResta recounts her own experiences or indulges in a bit of tongue-in-cheek dry humor. For example, the foreword, where she talks about her involvement in pro-vaccine movements in her hometown as a young mother, or when she pokes fun at Ali Alexander's involvement in Stop the Steal as "bravely live stream[ing] from a rooftop several blocks away once the violence started" (p. 169).
It's also interesting to see where the author's personal bias intersects with her criticism of certain figures. For example, while she did criticize Joe Rogan, her criticism of him was remarkably light considering... well, everything about him and his career.
Additionally, while she examined the propaganda efforts run by political organizations including the Russian government, Hamas, and Iran, she was curiously vague about other governments' propaganda efforts (most notably, given the ongoing genocide against Palestinians, a bizarrely careful skirting of Israel's internet propaganda efforts, both historical and present-day).
Finally, for a book subtitled "the people who turn lies into reality," there is remarkably little focus on actually analyzing or profiling specific propagandists (besides one or two DiResta has personal 'beef' with--'beef' that is perfectly understandable, considering it's mostly Taibbi). Still, it would have been nice if she had actually done deep dives into specific propagandists besides Taibbi.
These glaring moments where DiResta handles certain actors with kid gloves, paired with the difficult-to-get-through writing and the lack of in-depth analysis of specific actors, make Invisible Rulers a book I just can't say I would recommend. While certain parts were interesting, the reading experience as a whole wasn't enjoyable or engaging.
If you're interested in modes of propaganda, misinformation, and recruitment into bespoke realities online, I'd recommend Caelan Conrad's videos (including their "Gender Critical" series and "The School Litterbox: Modern Urban Legend") as personal favorites.
When I was a radio talkshow host, there was no internet (for the general public), and we were still in the age-old mode of shared experience and consensus reality. People tuned in to my shows to learn something, knowing that tens of thousands of others were listening to the same thing. They could talk about it later, agree or disagree, modify their positions and feel they had accomplished something. Shared experience was the way of life. Its most famous story is the tale of New York City Waterworks employees being able to tell when the commercials occurred in I Love Lucy on Tuesday evenings, because hundreds of thousands of toilets would all flush at the same time.
It was important for everyone to be up on the latest, and it was not easy. My show and others like it were the social media of the day. They brought people together. But you had to want to listen, and to call in to be heard. Today, there are so many choices and so much spare time that Americans check their phones well over 200 times a day, looking for anything to stimulate them, which includes hating.
As a radio talkshow host, I was what we now call an influencer. People either appreciated or despised me and my approach to everything. I found out years later that my reputation was that I went after guests with a scalpel. Listeners really appreciated me after the bland nonsense of most TV news and public affairs programs that pretended to high standards of journalism. But I didn’t just sit there and spew nonsense to earn ratings (as shock jocks learned to do, later). I brought in the actual experts, and grilled them. And opened the phone lines so the audience could too. And we were on a seven-second delay in case anyone on the phone got out of hand. So it stayed civil. Imagine that.
Today, the internet is about separating people into smaller and smaller groups, based on hatred, anger, and reinforcement of their own warped, uneducated conclusions. Facebook acknowledged (at least internally) six years ago that its methods were divisive, exploiting “the human brain’s attraction” to it. Divisiveness is what kept ‘em coming back. They’ve known it all along. That’s what social media exploits. Cancel Culture has replaced shared experience. Immediate doxxing has replaced watercooler talk the next day. Death threats go out faster than Valentine’s Day cards. It is not an improvement.
Today’s online influencers mold themselves to algorithms that drive traffic to them. They actively seek to be paid by the very people and firms I exposed and criticized. They do a lot of sucking up to extremists, whether of politics or fashion, religion or extreme conservatism. It is not a better world for it.
What brought on this rant was a new book from Renée DiResta called Invisible Rulers. In it, she examines the evolution of community and communications, and classifies the various online players in categories so readers can pigeonhole them. She is the best placed person to write such a book thanks to her work at the Stanford Internet Observatory and the Election Integrity Partnership there. Her job is to discover and understand all the factors that make social media the mess it is, understand the characters at the centers of the vortices, and understand the platforms’ inabilities to ensure a positive, rewarding experience that makes for a better world.
She notes that consensus reality has been usurped by bespoke reality, in which everyone lives in a bubble of their own values (as modified by influencers), and pushes others away. Inside their bespoke reality, all the lies are true, she says.
The proof can be astonishing. DiResta cites a 2022 study showing that 67% of respondents actually changed their lives thanks to an online influencer. Is it any wonder that corporations line up to pay them big bucks just to keep doing what they’re doing?
Her research on the political side revealed that the far right, despite all its moaning about being overwhelmed and silenced by the vicious left, is way ahead in the “repeat spreader” category. Of the 150 top spreaders, only 11 were Biden supporters. These spreaders are the invisible rulers who make outlandish rumors go viral. It is their bespoke reality that the 2020 election was stolen. No evidence necessary. It just was. Nearly half of Americans believe them, and their number continues to increase in 2024.
She found that all the noise over COVID-19 vaccines predated their actual development and was not at all a response to safety studies. It went back to 2015, when California was going through its measles inoculation fight in schools. New disease, same old arguments. And social media was right there, pushing it. As early as 2015, she says, Facebook was already offering anti-vaccine as an advertising category for those who wanted to target that segment.
It has come to the point where “The trust in the old top-down system of institutions, experts, authority figures, and mass media isn’t simply declining. Within a significant portion of the public it has been reallocated to the bottom-up system of influencers, algorithms, and crowds.”
The influencers know they can say and do what they want and no one will sue because it costs far too much and takes years to get to trial. So their universe keeps expanding. And they can attack anyone at any time, for no reason at all if they think it will attract followers and find favor with the algorithm promoting such comments.
Foreign nations come under her microscope too. They sport various tics that make them stand out as obvious attempts at influence and misinformation. Some copy and paste on massive scales. Their misplaced punctuation can identify them as being directed. Same for malapropisms. So the Chinese networks are different from the Russian or the Saudi, if you know how to look. And artificial intelligence has magnified and multiplied everything, raising the overall quality of all of them. It has come to the point where genuine videos are being dismissed as AI-generated. This is a total success for the disinformation crowd.
DiResta spends a good 250 pages cataloguing all these kinds of stereotypes, and illustrates them with real people and the stories of how they became influential, famous, and rich. She profiles the Rogans, MrBeasts, Taibbis and Pollys of cyberspace: who they were, who they are now, how they got there, and how many became what they most despised. She calls it influence-as-a-service.
DiResta is a confident professional researcher and writer. She is never amazed, dumbfounded or confused. She reports it all thoroughly, fairly and comprehensively. Possibly too much so. After 250 pages, it becomes more of the same, with an ever-expanding cast of characters readers will likely know of, but get tired of too. It becomes encyclopedic, or even textbook-y. I found myself longing for something to take it all to another level.
And right then and there, DiResta does just that. Because she herself was drawn into the maelstrom by shameless, selfish, greedy and amoral influencers. For all the good work she did (and does) in analyzing social media data, they claimed her (purely academic) projects were to silence the extreme right. That the government and the CIA were driving her efforts. That she had a mandate to somehow wipe 22 million extreme right messages from social media. The invisible rulers made these charges reality.
Suddenly, threats poured in to Stanford and DiResta personally. On just one day, she found herself having to block 6000 people from her social media accounts. And the lawsuits swarmed in from trolls supposedly wounded by the work she was doing, backed by the likes of Stephen Miller, the famous White House xenophobe under Trump. Gym Jordan handed Miller Stanford’s testimony from the closed-door, private sessions of his congressional subcommittee, allowing Miller to sue on behalf of some right-wing extremists, who of course were not only not harmed, but multiplied their subscribers on sites like Substack by slamming DiResta everywhere. Miller said it was “probably the largest mass-surveillance and mass-censorship program in American history,” in the usual Trumpian terms where everything is always the most it could be in history. Jordan demanded years of every e-mail ever sent between Stanford’s projects and government, for example, and then twisted lines out of context. Legal bills mounted, and she and her colleagues spent an enormous amount of time fighting off all of this. Both at work and at home.
The depths of it are so depraved that DiResta feels compelled to expose it for what it is – pure fantasy and lies. Michael Benz called her the “CIA Fellow” and the mastermind of the “Censorship Industrial Complex.” Michael Shellenberger called her “one of the most dangerous people in America right now.” Their followers ran with it, no questions asked.
What DiResta was doing to deserve these epithets was simply sharing the project’s findings with platforms that provided her the data to study. It was not funded, mandated or influenced by anyone in the federal government. It had no goal other than to understand what was going on in this new online society. Doesn’t matter. The death threats mounted.
With all her personal experience in the hell of being the Main Character in one of these scenarios, DiResta says “We now have an actively disinformed citizenry, spread across bespoke realities.”
From that, readers might think the Conclusion would be that people cannot deal rationally on social media, and that it should be closed off before it causes a total breakdown in civilization. But she doesn’t go there. Instead, her Conclusion is a collection of rational steps that will not be implemented as society slides unchecked towards the abyss. Nonetheless, her own damning experience clearly proves her case, making this a powerful book – and warning.
Meanwhile, back in my world, everyone seems to think emojis are an entirely new invention of social media. Growing up in the fifties and sixties, we had no emojis, but we had buttons. People sported buttons with icons, art, pithy sayings or political slogans. Whole chestfuls and hatfuls. There was Black Power instead of Black Lives Matter. This was soon followed by Flower Power and the joke Jewish Power. Peace signs were very big because of the US war in Viet Nam and the spread of nuclear weapons. Make love, not war was a hot item. So was Ban the Bomb. But so was You bet your sweet bippy from Laugh-In. Those were Americans’ emojis in the sixties.
Today by comparison, we walk around rather anonymously, saving all our firepower for attacks over social media. Hashtags tell us all we need to know, about topics we know nothing about. The emojis show up in our twitter handles and tweets and as responses when we can’t be bothered with words. The world might not be so different than it was in decades past. Only the medium – the technology – has changed. But it’s a killer change.
David Wineberg
The premise of the book is basically the same truthiness narrative that’s been repeated ad nauseam since 2016: social media creates "bubble realities…that operate with their own norms, media, trusted authorities and frameworks of facts,” which she calls “bespoke realities” (apparently “echo chambers” was becoming a trite term and needed replacing). Free from expert hall monitors wagging their fingers at rubes, disinformation, misinformation, polarization and conspiracies proliferate in these enclaves. Algorithms then help the fringe theories such as QAnon that develop in these unregulated ecosystems “to grow into an omniconspiracy, a singularity in which all manner of conspiracy theories melted together and appealed to far more adherents than any component part.” (No, that’s not from an episode of South Park — but it did make me laugh so hard that I peed a little.)
According to DiResta, this prevents society from operating within a “consensus reality” required for voters to make informed decisions, putting democracy itself at risk. “Reaching consensus is how societies make decisions and move forward, and steering that process can transform the future. Societies require consensus to function. Yet consensus today seems increasingly impossible,” DiResta writes. (Putting the creepy “steering that process can transform the future” part aside for a minute, a consensus on what? Was there a consensus before? Why is a consensus better than disagreement? Could a desire to seek consensus lead to social pressures that stifle truth? Once a consensus is formed, what’s to stop it from hardening into dogma? She never gives more than fleeting thoughts to any of these questions.)
DiResta supports these claims mostly with footnotes citing her own opinion pieces in outlets like The Atlantic and scattered anecdotes everyone has heard over and over (and she repeats them over and over some more): Pizzagate, Russiagate, ivermectin-gate, QAnon-gate, climate denial-gate, January 6-gate, and petty gossip about anyone who has criticized her (mostly Matt Taibbi and Michael Shellenberger, who apparently “harass” her…by filing FOIA requests). But are conspiracies, science denial, extremist politics, etc. actually increasing? Or has there just been more focus on these issues, making them appear more prevalent?
According to research published in 2022 by Joseph Uscinski, a political scientist who specializes in belief in conspiracies and misinformation, “Across four studies––including four distinct operationalizations of conspiracism, temporal comparisons spanning between seven months and 55 years, and tens of thousands of observations from seven nations––we find only scant evidence that conspiracism, however operationalized, has increased.”
In a peer-reviewed survey of over 150 academic experts on misinformation, “less than half of experts surveyed agreed that participants sincerely believe the misinformation they report to believe in surveys. This should motivate journalists to take alarmist survey results with a grain of salt.”
As far as science denialism goes, the nerds can rest easy knowing that, according to Pew, Americans have more confidence in scientists to act in the public interest than any other profession — rising from 76% in 2016 to 86% in 2019. Only 13% say they have “not too much” or “low confidence” in scientists. About 9 in 10 Americans also believe the benefits of vaccines outweigh the risks. And only 3% of Americans say science has had a mostly negative effect on society.
Even when it comes to ostensibly “divisive” political issues like climate change, the reality of public opinion is once again drowned out by hysterical disasterbation (which I propose adding to the growing mis/dis/malinformation family). DiResta uses the book Merchants of Doubt to claim private interests have spread a distrust of climate science among the public. Yet research conducted by Stanford University found that 82% of Americans believe “humans are at least partly responsible for global warming,” and 80% thought it was a “very” or “somewhat” serious problem for the US. The same article also coincidentally challenged Merchants of Doubt, in which “historian Naomi Oreskes asserted that the fossil fuel industry and its supporters had been engaged in efforts to reduce the certainty with which some Americans believed that global warming has been happening and to increase the certainty with which others believed that it has not been happening. Our surveys suggest that since 1997, there has been no systematic shifting of certainty in either of these ways. Among Americans who have believed that warming probably has been occurring, the proportion expressing this view with high certainty was quite consistent between 1997 and 2015, ranging from 44% to 58%. In 2020, it reached an all-time high of 63%.” Hm, sounds like a consensus to me. If only our institutions weren’t so dysfunctional something could actually be done about climate change.
(None of this should suggest that mis/dis/malinformation (MDM) isn't still a problem that deserves attention, or that we shouldn't look for ways to improve public literacy, but context matters. It’s important to figure out whether the effects being observed are caused by social media or by human nature, and whether social media is actually exacerbating negative aspects of human nature or simply making them more noticeable.)
Now comes DiResta’s “interesting” contribution to this moldy potluck narrative. As the authority of those institutions has waned and trust in the Expert Class erodes, DiResta argues, the “rumor mill” of human nature — gossip in small communities — has now collided with the “propaganda machine” of social media. In place of the wise experts guiding public discourse are “influencers” — QAnon moms, adolescent gamers, and distinguished journalists who have Substacks — who manipulate algorithms and forge hive-mind communities, where they use propaganda techniques to amass power and influence. They are now the new invisible rulers. And brave underdogs like Renee DiResta, multimillion dollar NGOs, and the federal government must stand up to these all-powerful podcasters/YouTubers/Substackers.
To perform these remarkable feats of mental gymnastics, DiResta uses the two-step flow theory of communication — which posits that certain community leaders have more influence over their group’s opinions than media or experts — to argue that it’s actually influencers who hold the real power. (The theory was developed in the 1950s by observing a small group of women and does not easily translate to the digital age.) “A handful of seemingly arbitrary people on social platforms, ‘influencers,’ now have a significant impact on what the public talks about and what the news media cover on any given day—particularly when it comes to culture war politics,” DiResta writes. The idea of peasants voicing their opinions online — and having people actually listen to them! — is just too absurd for her to handle. “A mom blogger, who got famous for her fun school lunch content, weighing in on Fed rate hikes? Why not.” Actually, that’s a good question. Why not? (Paul Krugman just exploded.)
While I completely agree that some influencers are having deleterious effects on social and political discourse, DiResta attempts to gerrymander definitions in order to tar independent journalists, activists and opinion leaders who’ve criticized her, such as Matt Taibbi, Michael Shellenberger, Bari Weiss, etc., as “influencers” or “propagandists,” and anyone who reads them as fanboys, stans and fanatics. The book attempts to distinguish an “influencer” from every other public figure based on their “access” to an audience. The influencer “has to be connected to a person somehow in order to ‘do something’ to him or her.” Using this definition, a punk band that interacts with its fans or a professor who socializes with students could be considered an influencer. But the vast majority of an audience never interacts with the supposed influencer, so would they still be considered to have the “access” required to persuade? This is where DiResta pulls an about-face. “Influencers have significant reach and access to audiences within their own follower communities; you might not know MrBeast personally, but his relatability and constant presence in your feed create a sense of connection, of some sort of relationship.” Ok…so do characters on sitcoms and nightly news anchors. And in a media environment where writers frequently freelance, publish books and interact on social media, this expansive definition only serves as a way to label those with whom you disagree with a word that has belittling connotations.
Ironically, she encourages readers to look for “the importance of being alert to words and symbols redefined by the propagandist to serve his needs—like censorship has been today.” She doesn’t cite anything showing who supposedly redefined censorship or how it changed, but in 2006 Oxford defined censorship as: “1. Any regime or context in which the content of what is publically expressed, exhibited, published, broadcast, or otherwise distributed is regulated or in which the circulation of information is controlled. 2. A regulatory system for vetting, editing, and prohibiting particular forms of public expression, presided over by a censor. 3. The practice and process of suppression or any particular instance of this. This may involve the partial or total suppression of any text or the entire output of an individual or organization on a limited or permanent basis.” She never gives her definition of censorship, but does concede that posts being removed from social media can be considered censorship, which is a much narrower definition than what’s commonly accepted.
Disinformation is another definition that has gotten a makeover recently — some might even say plastic surgery. When disinformation became the hot new craze in 2016, it was defined as “deliberately false content, designed to deceive,” usually deployed by foreign governments. It now includes everything from information that can be seen as misleading to anything that contains an “adversarial narrative,” which the Global Disinformation Index (GDI) defines as stories, whether true or not, which “inflame social tensions by exploiting and amplifying perceived grievances of individuals, groups and institutions,” with institutions defined as including “the current scientific or medical consensus.” The GDI is an NGO — which receives funding from and works with several governments including the U.S. — that evaluates news outlets for what it considers disinformation and “aims to disrupt, defund and down-rank” sites on their naughty list, particularly by starving these sites of ad revenue. While most reasonable people would assume the sites on this naughty list would be ones such as Russia Today or Nazi Daily, they are actually filled mostly with regular center or center-right outlets. Unherd, which is rated Center by AllSides and has a higher factuality score than The New York Times on NewsGuard, found itself on the blacklist because it published pieces by Kathleen Stock, who critiques some transgender care methods. Reason Magazine, a libertarian publication that has a perfect score on NewsGuard, was ranked as one of the 10 riskiest sites — even though the GDI admits Reason “did largely refrain from perpetuating in-group out-group narratives or unfairly targeting certain actors via its reporting,” it nonetheless posed “high risk” because it used “sensationalized, emotional language” and didn’t moderate its comments section.
Yet, DiResta insists throughout the book that none of this is actually happening. She will gladly inform you that any mention of free speech is totally irrelevant because the only censorship on social media that ever took place was because of private companies making all their own decisions with no help from government whatsoever. Also, there has never been any censorship on social media at all. Even if there was, censorship is no different than the algorithm platforms use to rank content, anyways. Besides, you should love censorship! It tells only people you disagree with to shut up and will never creep its way towards your beliefs! Unless you’re a Nazi or something! (Or have slight disagreements with US policy deemed too important for you to have an opinion about…which is whatever Disinformation Warriors say is too important.)
Thank you to the publisher and NetGalley for allowing me to read and review this book.
Renee DiResta’s Invisible Rulers delves into the shadowy realm of information warfare, disinformation, and the pervasive influence of unseen entities shaping public opinion. DiResta, a renowned researcher and expert in the field of digital disinformation, provides a comprehensive analysis of how hidden forces manipulate narratives to achieve their objectives.
DiResta's central thesis revolves around the concept of "invisible rulers"—entities, often state actors or well-organized groups, that exert influence over public discourse and opinion without being directly visible to the general populace. She meticulously examines the mechanisms through which these rulers operate, from the creation of fake social media accounts to the dissemination of false information through various digital platforms. The book shines a light on the sophisticated strategies employed to sow discord, manipulate beliefs, and ultimately control the narrative.
One of the book’s key strengths is its exploration of the complex network dynamics that enable disinformation to spread. She illustrates how these networks are constructed, maintained, and exploited, offering readers a detailed understanding of the underlying infrastructure of modern information warfare.
The author uses a series of well-documented case studies to illustrate her points. These include the 2016 U.S. presidential election, the rise of anti-vaccine movements, and various geopolitical conflicts. Each case study is dissected to reveal the tactics and impacts of disinformation campaigns, providing concrete examples of how invisible rulers operate.
The book delves into the psychological aspects of disinformation, explaining how cognitive biases and emotional triggers are exploited to make false information more believable and shareable. DiResta effectively demonstrates how these psychological levers are pulled to create widespread misinformation.
DiResta doesn’t shy away from discussing the ethical and societal implications of her findings. She raises important questions about the responsibilities of social media companies, the role of government regulation, and the need for public awareness and education in combating disinformation.
DiResta’s writing is clear, engaging, and accessible to a broad audience. She adeptly balances technical details with narrative storytelling, making complex concepts understandable without oversimplifying them. Her ability to weave together technical analysis with real-world examples ensures that readers are not only informed but also captivated.
Invisible Rulers is an essential read for anyone interested in understanding the hidden forces that shape public opinion in the digital age. DiResta’s thorough research and engaging writing make it a compelling and insightful work that sheds light on one of the most critical challenges of our time. Whether you are a policy maker, a tech enthusiast, or simply a concerned citizen, this book provides valuable insights into the invisible rulers of our modern world.
Oh yes! As a librarian I am trained to spot misinformation and propaganda. It still gets me sometimes. I will think something is a new trend and then I have to step back and question it. The internet will then make the thing an actual trend. Things like painting brick houses black, storming the Capitol, AND NOT VACCINATING OUR KIDS.
People just roll with it. I love a good algorithm, but we have to be honest: it does feed the disinformation monster. There are lines in this book that hit me right in my soul.
I love the research into influencers. That is ancient stuff. It falls into the entire idea that our reputation matters too. Imagine how a queen could influence manners and clothing 500 years ago.
The Sociologist in me is in love with this book.
Just read it! Trust me.