If I give up my false beliefs, what’s in it for me?
What do we do when important ideas and concepts are being distorted in this way, when absurdity seems to take over, making serious discussion impossible? What do we do when we seem to be surrounded by warped doubles and imposters?
Klein, Naomi. Doppelganger: A Trip into the Mirror World (p. 71). Farrar, Straus and Giroux. Kindle Edition.
When faced with a double threatening to engulf you and your world (or an army of them), distance offers no protection. Far better to radically upend the table and become, in some sense, their impersonator, their shadow. That, at least, is how I rationalized listening to so much Steve Bannon.
Klein, Naomi. Doppelganger: A Trip into the Mirror World (p. 72). Farrar, Straus and Giroux. Kindle Edition.
Klein raises an interesting question about how to manage misinformation and disinformation. The answer is not straightforward. Other information is needed to answer it, such as whether the misinformation is intentional and known to be false, or unintentional and genuinely believed.
There is an interesting article in the April 22 & 29, 2024, issue of The New Yorker entitled “Don’t Believe What They Are Telling You About Misinformation,” in which Manvir Singh cites the work of the French philosopher Dan Sperber, who makes a distinction between factual beliefs and symbolic beliefs. Believing that gravity is real and that people will fall if they jump from high places is different from believing in the resurrection of Jesus after He was crucified.
Symbolic beliefs are often shared and serve as the ticket of admission to the groups that promulgate and subscribe to them. In fact, the stronger the belief, the more the member is accepted as a “true believer,” the higher their status in the group, and the more they benefit from the rewards of membership. Even if such a person starts to doubt, they are reluctant to share their doubts for fear of punishment from other group members, up to and including ostracism and death.
People who understand these group dynamics often counsel concerned out-group members about how to deprogram the beliefs of the ensconced person. There are many methods and tactics, of which perhaps the most important is to offer the doubter a “permission structure”: a new group to which the person can shift their identity and attachment and in which they can enjoy a sense of safety and belonging.
When the believer in the misinformation no longer identifies with the group spreading the disinformation, it is more likely that the mistaken beliefs will be set aside so that more appropriate and constructive beliefs can take their place.
The definition of a delusion in psychology is “a fixed false belief.” Presenting new information and rational argument does not minimize or eliminate delusional beliefs. A competent therapist knows this and does not argue with the person about delusional material. Rather, the person’s attention is redirected to more appropriate and constructive topics.
So I am concerned about Klein’s idea that mirroring Steve Bannon, becoming his impersonator and shadow, is an effective way of countering distorted ideas and concepts. Countering the falseness of the ideas backfires: they only become more visible and reinforced, while the holder of those beliefs becomes defensive, digs in their heels, and advocates for the misinformation more vociferously. The parties enter into a “pissing contest” that has great entertainment value for an audience emotionally aroused by the conflict and its reciprocating attacks, and those engaged become increasingly polarized and adamant.
When a young congressman once asked Lyndon Baines Johnson what advice he had for him as a young politician, LBJ is reported to have said that the most important thing he had learned in politics was “Never tell a man to go to hell unless you can make him.”
One of the interesting findings about changing people’s beliefs is that they are willing to do so when you offer them money for holding the correct belief. Here’s how Singh describes the research:
On the other hand, there’s research implying that many false beliefs are little more than cheap talk. Put money on the table, and people suddenly see the light. In an influential paper published in 2015, a team led by the political scientist John Bullock found sizable differences in how Democrats and Republicans thought about politicized topics, like the number of casualties in the Iraq War. Paying respondents to be accurate, which included rewarding “don’t know” responses over wrong ones, cut the differences by eighty per cent. A series of experiments published in 2023 by van der Linden and three colleagues replicated the well-established finding that conservatives deem false headlines to be true more often than liberals—but found that the difference drops by half when people are compensated for accuracy.
So while LBJ had good advice about not telling a person to go to hell unless you can make them, a person may be more likely to change their beliefs if there is something in it for them. What incentive can believers in false information be offered to be more accurate and truthful, one more valuable than membership in the reference group that supports their false belief system?