Creating a More Rigorous Belief Filter

Derek Meegan
6 min read · Dec 9, 2020

The democratization of mass information delivery has led to radical changes in public discourse. The internet allows everyday people (like me) to communicate their ideas through the same medium as traditional institutions of news and information. There are many immediate benefits to this arrangement, and the ranks of grassroots journalists and internet personalities have grown tremendously. Yet beneath this mass interconnectivity lie deeply adverse mechanisms that undermine stable public discourse, disseminate false and misleading information, and sow division and polarization between groups.

In traditional modes of information warfare, centralized entities restrict the type of information consumed. This sort of repression can be seen in extremist cults or in governments that starve their subjects of information that could divert their mindset from reinforcing the power of the central authority. A specific example of this phenomenon is the internet firewall in China, which blocks a number of websites including Google, YouTube, Facebook, and Wikipedia. By restricting the citizenry’s ability to discover and share new information, the Chinese government can more easily set the narrative surrounding historical or current events, guiding the general discourse of Chinese society.

While information suppression is particularly repressive, its negative effects pale in comparison to the main antagonist of this article: information flooding. The United States may not be subject to rigorous internet firewalls that block the dissemination of information, yet it suffers from the exact opposite problem. Instead of a central authority governing the flow of information, the US is subject to a flood of information from millions of internet “sources,” undermining concrete belief and sowing division over the very nature of reality.

The medium through which internet information is served blurs the differences between legitimate and illegitimate sources. Since both are presented on the same platforms in similar ways, it becomes increasingly difficult for people to differentiate truth from untruth. To be believed on the internet, one needs only the appearance of credibility, and even that faces lower and lower scrutiny. The problem is particularly aggravated by people’s ability to use technology to fabricate documents, pictures, and videos. Lowering the bar for credibility has legitimized online pundits of all types who would otherwise have enjoyed no platform at all. And as the power of these internet personalities and pundits grows, it is reasonable to see the broader rebellion against institutions as a correlated phenomenon.

Public trust in news outlets has declined considerably in recent years. Since 1976, trust in news outlets has fallen from 72% to just 40% of Americans, a particularly sharp decline in under 50 years. And as the popularity of major news outlets wanes, internet pundits have proved a ready replacement. Misinformation aside, an additional negative effect of the mass switch from major news to online pundits is the polarization of public discourse. Online pundits are held to a much lower level of scrutiny than figures on major news outlets. This allows online pundits greater flexibility in their presentation of opinions and choice of topics, opening the door for misleading or fabricated assertions with little threat of deplatforming (unless they engage in egregious misinformation or hate speech; see Alex Jones).

Compounding one’s simple interest in an online pundit is the echo chamber created by data-driven content algorithms. These algorithms, built by Google, Facebook, YouTube, Twitter, and the other big tech giants, were designed with one goal in mind: retain users. The algorithm assumes that when you watch a video or series of videos, you will want to view similar videos in the future. If your interest is bicycles or car repair, this may seem particularly handy; but if your interest lies in outlandish conspiracy theories, you may end up being served increasingly extremist media, leading to radicalization and inadvertent consumption of misinformation.
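
The dynamic above can be made concrete with a toy sketch. This is purely illustrative, not any platform’s actual algorithm: it assumes content falls on a single 0-to-1 “extremity” axis, that the recommender serves content slightly edgier than the user’s current taste (since stronger content is assumed to hold attention), and that watching an item nudges the user’s taste toward it.

```python
# Toy model of an engagement-driven feedback loop (illustrative only;
# real recommendation systems are far more complex and proprietary).
# Content sits on a hypothetical 0.0 (mild) to 1.0 (extreme) axis.

def recommend(interest, edge=0.1):
    """Serve content slightly more extreme than the user's current taste
    (an assumption standing in for engagement optimization)."""
    return min(1.0, interest + edge)

def watch(interest, item, pull=0.5):
    """Watching an item pulls the user's taste partway toward it."""
    return interest + pull * (item - interest)

interest = 0.2  # the user starts with fairly mild tastes
history = []
for _ in range(20):
    item = recommend(interest)
    interest = watch(interest, item)
    history.append(round(interest, 2))

print(history[0], "->", history[-1])  # e.g. 0.25 -> 1.0
```

Even with a mild starting taste, the small “edgier than you” bias compounds each step, and the model drifts toward the extreme end of the axis, the feedback loop the paragraph describes.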

Given these conditions, it’s clear that one can easily construct an insulated echo chamber that reaffirms one’s preexisting beliefs and removes all media that might dissent from or undermine them. Together, these factors greatly threaten people’s ability to think critically about the positions and beliefs they profess. A person should certainly be concerned about the reality of their beliefs and be open to testing them against hard evidence, logic, and reasoning. Yet when the internet cultivates an environment that leads one not only to engage with misinformation but also to be insulated from dissenting opinions, we arrive at a problem with considerable implications for the functioning of society as a whole.

Misinformation and insulation stunt critical thought. The mere awareness of fake information can even be used to mislabel and discredit credible sources. The internet as a platform greatly inhibits the population’s ability to discern truth and creates divisions that undermine realities once considered infallible. What is clear is that each person must undertake a duty to construct a rigorous and apolitical filter for their beliefs.

Allowing the wrong information or beliefs to pervade your mind can vastly alter your perception of reality and understanding of the world. The internet’s distinct talent for radicalization compounds the danger of letting adverse beliefs enter your lexicon of truths, since accepting one belief can induce further radicalization and acceptance of misinformation. It is more important now than ever to understand the exact mechanisms that lead you to adopt the beliefs and opinions you consume online, or even in everyday life.

There are several questions one might ask when presented with a piece of media or opinion:

  1. Does this make sense?
  2. Does this come from a credible source?
  3. Is this corroborated by other sources and/or evidence?
  4. Do I agree with this or does it reinforce my worldview?

Increasingly, it seems as though people are using the fourth question to guide their decisions about what is or is not true. In my opinion, this has much to do with the insulated echo chambers created by content algorithms. Lacking dissenting opinions, people are becoming more and more sensitive to information adverse to their worldview, often causing them to seek out or misinterpret information that helps them cling to their preexisting beliefs. This phenomenon is known as confirmation bias. Generally, confirmation bias is understood as a natural reaction for people faced with information that disproves their worldview, tapping into our innate instinct to avoid cognitive dissonance (the discomfort of holding contradictory beliefs).

Irrespective of the cause, it is important to override the natural instinct of confirmation bias in order to build a more rigorous and tested lexicon of beliefs. When examining whether something is truthful or credible, the first three questions matter far more than the last. In my opinion, the fourth question should be eliminated entirely when objectively examining the legitimacy of a claim or belief. Whether a piece of information agrees or disagrees with your worldview is simply irrelevant to whether it is true.

What is important is that:

  1. The information is logically sound and realistic; it lacks outlandish claims or claims without substantial evidence.
  2. The information is from a source that is credible and does not habitually disseminate false information.
  3. The information is corroborated by multiple credible sources as defined above.
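
The three criteria above amount to a simple conjunctive checklist: a claim passes only if every check holds. A minimal sketch, with field names of my own invention standing in for the article’s three questions:

```python
# Illustrative sketch of the three-part belief filter as a checklist.
# The criteria come from the article; the field names are hypothetical.

def passes_filter(claim):
    """Accept a claim only if all three checks hold. Note that
    'agrees with my worldview' is deliberately not a field."""
    return (
        claim["logically_sound"]      # 1. no outlandish or unevidenced assertions
        and claim["credible_source"]  # 2. source doesn't habitually spread falsehoods
        and claim["corroborated"]     # 3. backed by multiple credible sources
    )

claim = {"logically_sound": True, "credible_source": True, "corroborated": False}
print(passes_filter(claim))  # prints False: one failing check rejects the claim
```

The design point is that the checks are joined by `and`, so a claim that sounds reasonable and comes from a credible source still fails if no one else corroborates it.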

Without the institutional barriers that once prevented the mass spread of disinformation, it becomes everyone’s personal duty to prevent false or misleading information from being adopted and spread. Each person is responsible for constructing their own rigorous belief filter to cultivate a stable discourse and reestablish a more complete collection of agreed-upon truths. We as a society cannot allow the greatest invention of mankind to be the very thing that destroys our ability to operate as a society. The internet is one of the greatest enablers in the world, putting a functionally unlimited amount of information at your fingertips, yet not all information is created equal.

Recognize the sanctity of your lexicon of beliefs and safeguard it with a rigorous and systematic filter for misinformation; the very fate of our republic may depend on it.


Derek Meegan

Honors student at Temple University majoring in Management Information Systems and Economics and minoring in Political Science