Las Vegas Sun

April 29, 2024

Where I Stand: Information pollution is choking our humanity

Mary Blankenship

Photo by Steve Marcus: Ukrainian-American researcher Mary Blankenship poses on the UNLV campus Tuesday, March 1, 2022. Blankenship researches misinformation regarding the war in Ukraine.

Editor’s note: As he traditionally does around this time every year, Brian Greenspun is turning over his Where I Stand column to others. Today’s guest is UNLV Brookings Mountain West graduate student researcher Mary Blankenship.

Is there any information online that I can trust?

After sifting through AI-generated images, deepfakes, bots, trolls, ads and disinformation campaigns just to casually peruse social media, I often think there isn’t.

In one way or another, we are consumed by information pollution — whether we fall victim to believing it or are trying to combat it. Information pollution, namely misinformation and disinformation, permeates every discussion.

Want to talk about the election? Well, it has already been rigged. Concerned about the war in Ukraine? Apparently, it is all a cover-up for Hunter Biden’s laptop controversy. Trying to follow the developments of a mass shooting? That’s an inside job used to take away our guns.

These are the softened versions of the many claims I have come across in my research, which draws on millions of tweets generated in response to major national and international events. Disinformation and misinformation not only derail the conversation; they also wipe away the human suffering and cost that result from these events, and with them, a part of our humanity. Disinformation works so well because it preys on ignorance and latches onto emotional responses of fear and anger, especially if it pits you against someone else. As a result, we keep doom-scrolling, arguing with one another and generating engagement for social media platforms. The exploitation of our natural responses and cognitive blind spots is profitable for many.

For-profit disinformation networks often latch onto far-right movements and conspiracy theories, as occurred with the Freedom Convoys, where a fake Facebook group collected $7 million in donations before the platform had the chance to take it down. Disinformation websites imitating news sites generated $235 million in ad revenue across 20,000 domains in 2019, according to a study by the Global Disinformation Index. “Anti-woke” brands, political candidates, policies and advocacy groups have been on the rise, and they often adopt false narratives for financial or political gain.

When false narratives become our identity and reality, the problem immediately evolves into a tragedy. 

The recent killing of Laura Ann Carleton, who was shot for displaying a gay pride flag at her store, is the latest example. Her killer had a long history of posting anti-LGBTQIA+ and antisemitic conspiracy theories on social media.

Other incidents include the Buffalo shooting in 2022, where the shooter targeted a predominantly Black community in New York, killing 10 people and injuring three. According to the state attorney general’s investigative report on the shooting, “it is hard to ignore the correlation between the rise in mass shootings perpetrated by young men and the prevalence of online platforms where racist ideology and hate speech flourish.” In this case, the shooter embraced the “great replacement” conspiracy theory, and social media played a pivotal role at each step of the crime: the formulation of his ideology, the planning of the attack, his learning to use an assault rifle and, finally, the livestreaming of the shooting.

Information pollution also plays out on a larger scale with conspiracies like the “big lie,” which instigated the insurrection at the U.S. Capitol, or COVID-19 misinformation that led to the deaths of 2,800 people in Canada alone, according to conservative estimates by the Council of Canadian Academies.

Dealing with disinformation is not just a matter of having the skills to sift the real from the fake, or of one’s level of education or political affiliation. The problem is far wider and more difficult to solve.

Research on happiness and well-being by Carol Graham and Emily Dodson at the Brookings Institution indicates that individuals and communities experiencing deep levels of despair are more vulnerable to misinformation. These same communities have disproportionately higher rates of premature death from drug overdoses, alcohol poisoning and suicide, as well as high levels of opioid addiction.

Claims that exploit existing tensions and grievances within a population are often the most widely spread on social media, as seen with Russian disinformation about the war in Ukraine on both a local and international scale. 

Common mechanisms employed to deal with online disinformation and misinformation, like adding warning labels to posts or restricting conspiracy theory accounts, are important but are only treating symptoms of much deeper wounds within our nation.

Wounds that, when combined with disinformation and misinformation, make us mistrust one another and degrade other humans into insidious entities or “sheep.”