Breaking the Breaking News

…Exploring the influence of Cognitive Biases in our human approach to Breaking News through 5 historical events.

In our era of constant, round-the-clock news updates, so-called “breaking news” holds a magnetic grip on our consciousness, both individually and collectively.

This model thoroughly molds our perceptions and decisions in profound ways. Yet within the superabundance of headlines and updates lies a hidden labyrinth of cognitive biases, waiting to trap our unwary and overcrowded minds.

As we embark on a journey to dissect the influence of breaking news, historical examples offer a compelling lens through which we will explore the intricate relationship between sensational headlines and the biases that mold our understanding of “facts” and the world.

Through the prism of confirmation bias, availability heuristic, bandwagon effect, anchoring bias, and the Dunning-Kruger effect, this article will uncover the hidden forces at play, shedding light on how breaking news can distort reality and increase misconceptions.

Let us now dive into our common history, breaking the spell of the “breaking news” and uncover the truth beneath the headlines.

1. The Salem Witch Trials (1692) and the Confirmation Bias

Confirmation bias describes people’s inherent inclination to grant more attention to “evidence” that supports their existing beliefs.

In this specific case, confirmation bias led the people of colonial Massachusetts to interpret coincidental events as evidence of “witchcraft,” reinforcing their existing belief in the “evil supernatural” (unlike religion, of course!).

Nobody asked whether the knowledge these accused people may have held, of plants for instance, could have been of any benefit. No, that would have required too much reasoning; calling people witches and hanging them costs less energy.

In other words, if not aware of said bias, we could be constructing our thoughts not from the facts and their rational weight, but rather, we could be selectively weighing facts that corroborate our preexisting beliefs.

1792 engraving of Matthew Hopkins, Witch Finder General (London: J. Caulfield, 1792). Hopkins is depicted with two witches who are calling out the names of their imp familiars.


For example, let’s go back to our modern world and say… Sabrina believes purple-car drivers are reckless, as she was taught her whole life! Whenever she sees one speeding, she tells her friend, “See? Purple-car drivers are reckless!” But whenever she sees a purple-car driver following the rules, she dismisses it as an exception, sticking to her belief.

In this example, Sabrina selectively interprets the available information to confirm her preexisting belief that people who drive purple cars are reckless, ignoring evidence to the contrary.
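Sabrina’s selective bookkeeping can even be put in numbers. Here is a toy Python simulation of it (the 10% rates are arbitrary assumptions chosen purely for illustration): an unbiased observer weighs every sighting equally, while a biased one remembers the speeding purple cars but waves away most of the careful ones as “exceptions.”

```python
import random

random.seed(1)  # fixed seed so the toy run is reproducible

TRUE_RECKLESS_RATE = 0.10  # assumption: purple-car drivers speed 10% of the time

# 1,000 sightings of purple cars: True = reckless, False = driving normally
sightings = [random.random() < TRUE_RECKLESS_RATE for _ in range(1000)]

# An unbiased observer counts every sighting.
unbiased_estimate = sum(sightings) / len(sightings)

# A biased observer keeps every reckless sighting, but dismisses
# roughly 90% of the law-abiding ones as "mere exceptions".
remembered = [s for s in sightings if s or random.random() < 0.10]
biased_estimate = sum(remembered) / len(remembered)

print(f"true rate ~{unbiased_estimate:.0%}, biased recollection ~{biased_estimate:.0%}")
```

The underlying behavior never changes; only the filter on what gets remembered does, and that filter alone is enough to make the belief look “confirmed.”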

This, ladies and gentlemen, is “confirmation bias”!

When the news of alleged witchcraft broke during that era, it sparked a tsunami of hysteria and paranoia.

As random accusations escalated, fueled by rumors and common hearsay, so did the witch hunt leading to the infamous trials.

Those who wholeheartedly believed in the existence of witchcraft selectively interpreted ambiguous and even casual events as evidence of supernatural activity, reinforcing their preconceived beliefs.

For example, when individuals fell ill or experienced misfortune, it was often attributed to the malevolent influence of “witches”, confirming the biases of the accusers.

Meanwhile, evidence contradicting the existence of witchcraft, such as rational explanations for the phenomena observed, was consistently dismissed or ignored.

In fact, the trials relied heavily on highly subjective and unreliable “evidence”: testimonies and “spectral evidence,” in which witnesses claimed to have seen the accused in the form of specters or apparitions.

This bias fueled a tragic cycle of scapegoating and persecution, resulting in the wrongful imprisonment and execution of 19 individuals for “witchcraft”.

Following these events, the irrational fear spread beyond Salem and more accusations ensued in other parts of Massachusetts.

2. The Black Death Pandemic and the Availability Heuristic

The Black Death pandemic of the 14th century represents a striking illustration of the availability heuristic in action.

This cognitive bias, identified by psychologists Amos Tversky and Daniel Kahneman, describes the tendency for people to rely on information that is easily accessible or instantly recalled when making decisions or assessing risks.

Let us remember here and now that the Black Death was one of the deadliest pandemics in human history, causing the deaths of an estimated 30 to 50% of Europe’s population.

The excerpt below is taken from The Decameron by Giovanni Boccaccio (1313–1375), a humanist, statesman, and son of a businessman. The Decameron, deemed one of the most significant works of European prose, portrays seven young women and three young men who seek refuge from plague-stricken Florence by retreating to a countryside villa.

“(…) they walked everywhere with odours and nosegays to smell to; as holding it best to corroborate the brain: for the whole atmosphere seemed to them tainted with the stench of dead bodies, arising partly from the distemper itself, and partly from the fermenting of the medicines within them. Others with less humanity, but perchance, as they supposed, with more security from danger, decided that the only remedy for the pestilence was to avoid it: persuaded, therefore, of this, and taking care for themselves only, men and women in great numbers left the city, their houses, relations, and effects, and fled into the country: as if the wrath of God had been restrained to visit those only within the walls of the city; or else concluding, that none ought to stay in a place thus doomed to destruction. Thus divided as they were in their views(…) ”

Back to the bias itself! Let’s consider a person who fears shark attacks after having just watched news reports about a recent shark attack at a beach.

Despite the statistically low probability of a shark attack compared with other dangers, such as car accidents or drowning in that very same ocean, the vivid and emotionally charged news coverage leads the person to overestimate the likelihood of encountering a shark while swimming.
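The shark example is, at bottom, a base-rate question, and a few lines of Python make the gap visible. The annual figures below are rough, order-of-magnitude assumptions for the sake of the comparison, not cited statistics (real estimates vary by source and year):

```python
# Rough, order-of-magnitude worldwide annual death tolls.
# These numbers are illustrative assumptions, not sourced data.
deaths_per_year = {
    "shark attacks": 10,
    "drowning": 236_000,
    "road accidents": 1_300_000,
}

baseline = deaths_per_year["shark attacks"]
for cause, n in sorted(deaths_per_year.items(), key=lambda kv: kv[1]):
    print(f"{cause:>14}: {n:>9,} deaths/yr (~{n // baseline:,}x shark attacks)")
```

Vivid coverage makes the rarest line of that table loom largest: the availability heuristic is precisely this substitution of ease of recall for actual frequency.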

As news of the plague spread across Europe, the graphic accounts of its devastating impact dominated public consciousness, leading to widespread panic and irrational behavior. For example, many individuals and families chose to flee from areas heavily affected by the plague in hopes of escaping its spread. However, this often led to the unintentional transmission of the disease to new areas.

Reports of cities decimated by the plague, mass graves overflowing with corpses, and the rapid spread of the disease created a sense of imminent danger that permeated society.

Faced with the terrifying prospect of contagion and death, communities resorted to extreme measures in an attempt to protect themselves, including isolating and ostracizing those suspected of being infected. Indeed fear and panic occasionally resulted in the scapegoating of specific groups, such as minorities or foreigners, who were wrongly blamed for spreading the disease.

This unjust discrimination and persecution further escalated the social tensions and turmoil of the period, which cannot be fully summarized here.

In short, the availability heuristic here induced distorted perceptions of risk and probability and exacerbated the fear and chaos unleashed by the pandemic.

3. The Rise of Totalitarian Regimes in the 20th century and the Bandwagon Effect

The 20th century witnessed the rise of totalitarian regimes across Europe, fueled in part by the bandwagon effect.

Imagine a crowded stadium during a sporting event where the home team’s supporters start doing “the wave.” Even though some spectators may not be particularly interested, they join in because they see everyone else participating: individuals hop on board with the crowd’s behavior to feel part of the excitement and camaraderie.

This illustrates the bandwagon effect.
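The stadium wave can be sketched with a classic threshold model of collective behavior (in the spirit of Mark Granovetter’s 1978 model; the threshold values below are made up for illustration): each spectator stands up once the share of the crowd already standing reaches their own personal threshold, and one enthusiast can tip the whole stadium.

```python
def wave_cascade(thresholds):
    """Granovetter-style threshold model: a spectator stands up once the
    fraction of the crowd already standing reaches their own threshold."""
    n = len(thresholds)
    standing = sum(1 for t in thresholds if t == 0)  # the enthusiasts start
    while True:
        new_count = sum(1 for t in thresholds if t <= standing / n)
        if new_count == standing:  # nobody else is tipped over
            return standing
        standing = new_count

# 100 spectators with evenly spread thresholds: 0.00, 0.01, ..., 0.99
crowd = [i / 100 for i in range(100)]
print(wave_cascade(crowd))  # -> 100: one enthusiast tips everyone, step by step

# Nudge a single threshold up and the cascade never gets going.
fragile = list(crowd)
fragile[1] = 0.02
print(wave_cascade(fragile))  # -> 1: only the lone enthusiast stands
```

The unsettling feature of the model is how little separates full conformity from no conformity at all; nothing about any individual’s preferences changed, only the chain of who moves first.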

In the aftermath of World War I and amidst economic turmoil, “charismatic” totalitarian politicians such as Adolf Hitler, Benito Mussolini or even Joseph Stalin exploited breaking news of societal unrest and disillusionment to advance their authoritarian agendas.

Through propaganda and manipulation, these leaders capitalized on the desire for stability and national renewal, tapping into the collective psyche of the masses.

The case of the Völkischer Beobachter is particularly interesting to study.

The NSDAP party organ reporting on the murder of 15-year-old Hitler Youth member Herbert Norkus in 1932: “We demand the immediate ban of the communist murder organization! Decent Germany demands protection against the blood frenzy of the red subhumans.”

The Völkischer Beobachter, also known as the “People’s Observer,” served as the Nazi Party’s official newspaper in Germany from the 1920s until 1945. Originally founded in 1887 as the Münchner Beobachter (“The Munich Observer”), it was transformed by Adolf Hitler into a daily anti-Semitic publication in 1923; its circulation exceeded 1.1 million by 1941, far beyond the 7,000 it printed before its acquisition by the future leader of Nazi Germany.

Facing suspensions in the early 1920s due to its anti-Semitic content, it resumed publication as a weekly in 1925 before becoming a daily again.


Under the editorship of Alfred Rosenberg, it remained a propaganda tool for Hitler and Joseph Goebbels. The newspaper expanded its reach with editions in Berlin, South Germany, and Vienna, attracting attention from foreign correspondents and diplomats seeking insights into Nazi policy shifts and propaganda strategies.

The bandwagon effect here, driven by a desire to belong and a fear of social isolation, led many individuals to align themselves with the prevailing narrative of nationalist fervor and the promise of order.

This conformity facilitated the consolidation of power by totalitarian regimes, despite the ominous warning signs of their authoritarian ambitions resulting in nothing less than the Holocaust in the case of Nazi Germany.

This era obviously cannot be fully explained through this sole prism, certainly not in this brief account.

4. The Cuban Missile Crisis (1962) and the Anchoring Bias

The discovery of Soviet missiles in Cuba in October 1962 was immediately framed as a showdown between the United States and the Soviet Union. It thus established a cognitive anchor that influenced both the perceptions and the reactions of the public, including decision-makers.

Indeed, despite all efforts to de-escalate tensions through backchannel negotiations and diplomatic routes, decision-makers on both sides found themselves anchored to the narrative of brinkmanship and Cold War rivalry, a dynamic extensively studied by historians and scholars.

For example, in his book “Thirteen Days: A Memoir of the Cuban Missile Crisis,” then Attorney General Robert F. Kennedy highlighted how the way the discovery of Soviet missiles in Cuba was revealed to the world set the stage for the crisis and conditioned the response of the U.S. leadership.

It undoubtedly fueled a perilous rise in both rhetoric and actions, as leaders wrestled with breaking away from the deeply ingrained narrative of mutual hostility and nuclear brinkmanship.

The front page of The New York Times on Oct. 23, 1962

The crisis was finally resolved through vigorous, reasoned diplomatic negotiations between the United States and the Soviet Union.

Soviet Premier Nikita Khrushchev’s decision to withdraw Soviet missiles from Cuba was contingent upon the United States’ commitment not to invade the island.

Moreover, in a secret agreement, the United States agreed to remove its missiles from Turkey.

This diplomatic compromise effectively prevented a direct military confrontation between the two superpowers and contributed to a temporary de-escalation of Cold War tensions which could have ended in a nuclear winter.

5. The Dunning-Kruger Effect and The Vietnam War (1955–1975)

The Vietnam War exemplifies the perils of the Dunning-Kruger effect in the realm of military strategy and decision-making.

But what is the Dunning-Kruger effect?

Every now and then, people with limited information, and therefore limited knowledge, on a certain topic will overestimate their skills in that topic. This is often discussed in the corporate world, but it is truly something one can observe in daily life!

We all have that friend who thinks they can fix your leaking shower better and faster than an actual plumber, without ever having researched the subject. Maybe we have all been that friend ourselves at some point?!

Well it has a name: The Dunning-Kruger effect. Now you know.

In contrast, some individuals who actually do possess a certain skill may wrongly assume that it comes just as easily to everyone else around them.

It is worth noting that upon hearing “breaking news,” we only ever have a partial picture of said “news.”

So when news of military engagements and escalations in Southeast Asia broke from the 1950s onward, it ignited a sense of overconfidence and hubris among political and military leaders.

Regardless of growing evidence highlighting the intricate nature of counterinsurgency warfare, decision-makers consistently underappreciated the determination and strategies employed by their adversaries, thus falling for the illusion of competency in the matter.

This overconfidence led to a series of strategic miscalculations, including a reliance on conventional military tactics ill-suited to the guerrilla warfare tactics employed by Viet Cong insurgents.

The failure to recognize their own limitations and biases contributed to prolonged conflict and awful human costs.

Demonstrators in Washington, D.C., April 1971 (image: Wikimedia Commons)

Conclusion

Unchecked bias and overconfidence in the face of breaking news events can lead to catastrophe. Our common human history is filled with horrific proof of that.

Let us all be wary of the weaponization of our common cognitive biases in the era of the “Breaking News”, let alone fake news…

It is up to every mind to consistently challenge its own biases in the face of breaking news. Easier said than done? Certainly. Yet is it THAT hard? Spoiler alert: no.

Before even trying to understand how cognitive biases work, one can and should always bring reason back to the thinking table and take things slow. In his “Discourse on the Method,” French philosopher René Descartes laid out four main guidelines for reasoning:

1. Accept nothing as true that is not self-evident

2. Divide problems into their simplest parts

3. Solve problems by proceeding from simple to complex

4. Recheck the reasoning.

Title page of the first edition of René Descartes’ Discourse on the Method

A purple car…

References:

  1. Heuristics and Biases: The Psychology of Intuitive Judgment by Thomas Gilovich, Dale Griffin, and Daniel Kahneman. 2002

  2. The Salem Witch Trials: A Day-by-Day Chronicle of a Community Under Siege by Marilynne K. Roach. 2002

  3. The Black Death: The World’s Most Devastating Plague by Dorsey Armstrong. 2016

  4. Thinking, Fast and Slow by Daniel Kahneman. 2011

  5. The Rise and Fall of the Third Reich: A History of Nazi Germany by William L. Shirer. 1960

  6. Thirteen Days: A Memoir of the Cuban Missile Crisis by Robert F. Kennedy. 1969

  7. The Cuban Missile Crisis: A Concise History by Don Munton and David A. Welch. 2011

  8. Essence of Decision: Explaining the Cuban Missile Crisis by Graham T. Allison. 1971

  9. The Vietnam War: A Concise International History by Mark Atwood Lawrence. 2008

  10. The Best and the Brightest by David Halberstam. 1972

  11. On Killing: The Psychological Cost of Learning to Kill in War and Society by Dave Grossman. 1995
