Disinformation on Fossil Fuels Is a Threat to Your Business: What You Can Do to Inoculate Against Disinformation Programming
This article, written by SGA President & CEO Suzanne Ogle, was originally published in The Link, Q4, 2023.
Disinformation and the manipulation of knowledge have become an art form in America. Both legacy media and social media are powerful venues for the transmission of misinformation; however, misinformation spreads at different speeds and volumes within each system.
As the CEO of a natural gas trade association, what concerns me is the fairy dust of disinformation regarding fossil fuels that threatens our members’ license to operate, investment thesis, market capitalization, reputation, customer loyalty, and talent acquisition. Large-scale disinformation undermines informed decision-making, creates distrust among citizens, and exacerbates social and political polarization, ultimately impacting our energy security, foreign policy, and domestic politics. It is a risk to our business and to reaching our emission reduction goals, and, as such, it must be a priority for Southern Gas Association.
From Mainstream to Digital – Amplified Speech Compromises Trust and Informed Decision-Making
As an Accredited in Public Relations (APR) professional, I believe that information, communication, and their evolving digital technologies hold vast potential to enhance our lives and collective well-being.
However, the rising dominance of social media as an information source, combined with these platforms’ velocity, size, and ubiquity, has created fertile ground for disinformation. The result is increased polarization that threatens the ability of individuals, communities, and organizations to engage in informed decision-making. From climate change to COVID-19 to war, disinformation is supercharging speech that impedes real progress.
Multiple studies show that the vast majority of Americans view disinformation as a severe threat to democracy and our economy, saying it pollutes the information environment and undermines societal harmony. An NPR/Ipsos poll finds that 64% of Americans believe U.S. democracy is “in crisis and at risk of failing.”¹ A Pearson Institute/AP-NORC poll finds that 95% of American adults believe the spread of misinformation is a problem, and most blame social media companies, social media users, and U.S. politicians for its spread.²
But it’s not only the public that is concerned. In PEN America’s nationwide survey of reporters and editors,³ respondents say disinformation is disrupting the practice of journalism, “disrupting newsroom processes, draining the attention of editors and reporters, demanding new procedures and skills, and jeopardizing community trust in journalism.”
Despite the credibility gap, digital news has become an important part of Americans’ news media diets, with social media playing a crucial role in news consumption. Today, half of U.S. adults get news at least sometimes from social media.
A 2023 Pew Research Center study shows how news consumption varies across platforms.⁴ The use of each news platform differs by age, gender, race, ethnicity, educational attainment, and political leaning. Americans ages 50 and older are more likely than younger adults to turn to, and to prefer, television and print publications, while Americans ages 18 to 29 prefer digital platforms like social media.
More than Fake: Completely Factual, but Not Factually Complete
We like talking about “fake news,” meaning fabricated information, but fabrication isn’t the only source of disinformation. Much of it isn’t truly false; it is accurate information taken out of context or misleadingly phrased, which demonstrates that you don’t have to be a “fake news” outfit to propagate disinformation. Two examples follow:
A ‘healthy’ doctor dies two weeks after getting COVID-19 vaccine; CDC is investigating why.
The headline above was among the most-viewed Facebook content in the first quarter of 2021, seen by nearly 54 million Facebook accounts in the United States. While the article suggests the coronavirus vaccine was at fault for the Florida doctor’s death, the medical examiner’s report issued months later found there wasn’t enough evidence to say whether the vaccine contributed to it. The autopsy and investigation found no evidence connecting the vaccine to the doctor’s death, which was attributed to natural causes.
Let’s bring it a little closer to home. In his presentation at Colorado Mesa University and Garfield County’s Energy & Environment Symposium (available at switchon.org), Dr. Scott Tinker highlights the need to be both completely factual and factually complete. He discusses Winter Storm Uri and FERC’s subsequent report, which concludes that “natural gas accounted for the majority of outages,” a statement that is completely factual but not factually complete.
ERCOT set an all-time winter peak demand of 69,692 MW in February 2021, the month of Uri. As Dr. Tinker explains, “It was cold for six days going into Uri, and there were no blackouts. ERCOT’s normal consumption of about 40,000 MW went way up to 54,200 MW, and the system met that; then there were blackouts bringing generation down to 46,700 MW, finally returning to normal at 35,600 MW.” Dr. Tinker then examines performance fuel by fuel during the blackout; the short calculation sketch after the list below recomputes his percentages.
Performance Change by Fuel During the Blackout (vs. Normal Output)
- Nuclear (-14%): 5,100 MW dropped to 4,400 MW.
- Coal (-9%): 8,700 MW rose to 10,800 MW during the very cold days before the blackout, then dropped to 7,900 MW.
- Solar (-33%): 1,500 MW dropped to 600 MW during the very cold days before the blackout, then recovered to 1,000 MW.
- Wind (-55%): 12,000 MW dropped to 4,900 MW before the blackout, then rose slightly during the blackout to 5,400 MW.
- Gas (233% of normal): 12,000 MW rose to 32,900 MW during the cold days before the blackout, then fell to 28,000 MW during the blackout (still roughly 233% of its normal output) before eventually returning to about 11,700 MW.
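These percentages are easy to check for yourself. The following minimal Python sketch is my own illustration, not part of Dr. Tinker’s presentation; the MW values are simply transcribed from the list above, and each fuel’s blackout output is compared against its normal output:

```python
# Recompute each fuel's performance during the blackout relative to its
# normal output. Values (MW) are transcribed from the list above.
fuels = {
    "Nuclear": (5_100, 4_400),
    "Coal": (8_700, 7_900),
    "Solar": (1_500, 1_000),
    "Wind": (12_000, 5_400),
    "Gas": (12_000, 28_000),
}

for fuel, (normal, blackout) in fuels.items():
    pct_change = (blackout - normal) / normal * 100   # signed change vs. normal
    pct_of_normal = blackout / normal * 100           # output as a share of normal
    print(f"{fuel}: {normal:,} MW -> {blackout:,} MW "
          f"({pct_change:+.0f}% change, {pct_of_normal:.0f}% of normal)")
```

Run as written, this reproduces the drops above (nuclear -14%, coal -9%, solar -33%, wind -55%) and shows that gas, while down from its pre-blackout peak, still operated at roughly 233% of its normal output during the blackout.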
Who’s Most to Blame for Disinformation, and What Is the Solution?
Often, there is a tendency to blame a “side” or “group” for disinformation. The very nature of the question implies, “My side isn’t responsible.” We must be able to hold two ideas in our heads simultaneously. First, everyone is vulnerable to misinformation. Second, the incentives for misinformation are likely to cause both sides to promote and spread it if we aren’t careful. Echo chambers bind and isolate online communities with similar views, which aids the spread of falsehoods and impedes the spread of factual corrections. What if we could reshape the way we digest and verify information?
Cognitive Immunity to Manipulative Tactics
There is a range of interventions against misinformation at both the individual and the system level. It begins with maintaining knowledge infrastructure, teaching digital literacy, and incorporating techniques like fact-checking, prebunking, and nudging.
Combating Misinformation Through an Infrastructure of Knowledge
The preservation of knowledge has deep social importance, and knowledge in the digital world has materially changed. Public knowledge infrastructure such as libraries and archives has preserved facts and falsehoods and made them publicly available for five thousand years. From clay tablets in cuneiform script to the 20th-century Dewey Decimal System, libraries have democratized knowledge by providing access to information, allowing individuals to make informed decisions about their lives and the world around them.
The shift of public knowledge to an online sphere has created an enormous challenge for these institutions, which now must balance preserving the analog past with preserving the digital present. Our digital presence is often owned and controlled by private superpowers that have immense sway over the way people receive information. They can harvest your “like” of a tweet or a Facebook post and use it to shape the advertising you receive.
The Facebook-Cambridge Analytica scandal (a case study we use in SGA’s Introduction to ESG to demonstrate how governance and supplier relationships affect corporate reputation) is a perfect example of data manipulation. To date, the financial cost of archiving digital media in public institutions has been prohibitive. Because this digital knowledge, along with the manipulation and influence that travel with it, goes unarchived, it is lost to public access as a point of reference for facts and truth.
Navigating the new information environment can confuse both younger and older Americans. A first line of defense against disinformation is teaching media literacy and institutional information skills, equipping people to understand the provenance of information and how it flows; for example, understanding that the first results of a Google search may appear there because of advertising. That needs to be better understood among the general population.

Fact-checking (debunking) is valuable and tends to correct misperceptions effectively: when someone believes something untrue, verifying the factual accuracy of the questioned reporting or statement can set the record straight. Fact-checking is a learned skill, and individuals can harness technology to help. The Reporters’ Lab at Duke University maintains a database of fact-checking organizations, giving fact-checkers and researchers a tool to thwart misinformation. The database tracks more than 100 non-partisan organizations around the world. The Lab’s inclusion criteria are based on whether the organization:
- Examines all parties and sides
- Examines discrete claims and reaches conclusions
- Tracks political promises
- Is transparent about sources and methods
- Discloses funding and affiliations
- Has news and information as its primary mission
Prebunking is preventive: it aims to reduce the probability of being persuaded by misinformation before exposure occurs. It focuses on identifying manipulative techniques rather than on whether a given piece of information is true or false, which makes the approach more scalable than debunking individual sources. Prebunking rests on a psychological framework known as inoculation theory.⁵
The premise of prebunking is that exposure to a weakened version of an argument builds psychological resistance against future unwanted persuasion. The strategy is similar to medical vaccines, in which weakened or dead pathogens prompt the immune system to create antibodies that build resistance against future infection.
There are two parts to a prebunking message, illustrated in the sketch that follows this list:
- A warning (e.g., “People may try to manipulate you by saying…”)
- A statement that preemptively refutes the argument (e.g., “This is not true, because…”)
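To make the two-part structure concrete, here is a minimal Python sketch. The PrebunkMessage class and the sample wording are illustrative assumptions of mine, not SGA material or a published inoculation tool:

```python
from dataclasses import dataclass


@dataclass
class PrebunkMessage:
    """One inoculation 'dose': a forewarning plus a preemptive refutation."""
    warning: str     # alerts the reader that a manipulation attempt is coming
    refutation: str  # presents a weakened form of the claim and rebuts it

    def render(self) -> str:
        return f"{self.warning}\n{self.refutation}"


# Hypothetical natural-gas example in the two-part format:
msg = PrebunkMessage(
    warning=("People may try to manipulate you by claiming that a single "
             "fuel source caused the Winter Storm Uri blackouts."),
    refutation=("This is misleading, because every generation source "
                "underperformed during the blackout, and natural gas ran "
                "far above its normal output."),
)
print(msg.render())
```

The design point is simply that the warning and the refutation are distinct parts: the warning primes the reader to expect manipulation, and the refutation supplies the weakened “dose” of the argument together with its rebuttal.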
In the context of misinformation, there are two dominant types of inoculation interventions: issue-based interventions, which tackle individual false arguments or stories, and technique-based interventions, which address the common techniques that underlie misinformation.
Inoculation interventions have been shown to be effective at reducing susceptibility to both individual examples of misinformation and various manipulation techniques. However, in the words of Professor Snape, “Your defenses must be as flexible and inventive as the arts you seek to undo.”
There are five misinformation techniques that we know are (a) common and (b) epistemologically dubious:
- Emotionally manipulative language/fearmongering
- Incoherence
- False dichotomies
- Scapegoating
- Ad hominem attacks
Look for natural gas prebunking videos from the Southern Gas Association in 2024.
By Suzanne Ogle, President and CEO, Southern Gas Association
1. NPR/Ipsos poll: https://www.npr.org/2022/01/03/1069764164/american-democracy-poll-jan-6
2. Pearson Institute/AP-NORC poll: https://apnorc.org/wp-content/uploads/2021/10/misinformation_Formatted_v2-002.pdf
3. PEN America, “Hard News: Journalists and the Threat of Disinformation”: https://pen.org/report/hard-news-journalists-and-the-threat-of-disinformation/
4. Pew Research Center, “News Platform Fact Sheet”: https://www.pewresearch.org/journalism/fact-sheet/news-platform-fact-sheet/
5. Inoculation theory: McGuire, 1964; McGuire & Papageorgis, 1961; Compton et al., 2021; van der Linden, 2023.