Opportune Arguments: Confirmation Bias Weaponised.

Most of us have heard of confirmation bias. Opportune arguments are a weaponisation of Chinese whispers, where the message is intentionally changed before being passed on, with the goal of using people's confirmation bias as a path through their defences. Start with a message that a person with one set of beliefs already accepts through confirmation bias, then add a twist designed to carry it across belief systems. But who is being manipulated: those promoting the new message, or those receiving it?

First, Reviewing Confirmation Bias.

The Typical, But Inadequate, Diagram and Explanation.

I looked for diagrams of confirmation bias on the web, and found many variations of the same basic diagram, with "facts" shown on the left.

For me, the problem here is with the word ‘facts’, as I normally assume “facts” to be “information which is objectively correct”.

I feel we receive a range of information which is not always factual, and our confirmation bias plays a huge role in deciding which information we treat as containing "facts". The key point is that confirmation bias can result in accepting as 'facts' information that perhaps should not be accepted as factual, and this is at least as significant a problem as rejecting information that does not fit with our existing beliefs.

The Challenge Of Determining Which Information Constitutes “Facts”.

I was surprised to find a dictionary definition of a fact that includes: "a piece of information presented as having objective reality", suggesting anything presented as objective reality could be considered a fact. This definition genuinely allows for "alternative facts". In reality, there is a long history of things thought to be factual being replaced with new, alternative facts. While recent events may appear to have made it even harder to determine whether information presented as factual is even believed by those presenting it, being able to determine what is fact and what is not has always been a problem.

René Descartes was reduced to "I think, therefore I am" as the one truth not dependent on existing beliefs; almost all other truths contain an element of doubt.

Probably the most we can require of a "fact" is that it is a genuine belief of its source, but problematically that would allow "flat Earth" to count as a fact, even though it is widely accepted as disproven.

Confirmation Bias Alone Does Not Feed Conspiracy Theories.

If all information either confirms existing beliefs, or is rejected, then existing beliefs will not change.

Logically, only a person who already believes the world is flat will, on the basis of confirmation bias, accept information supporting a flat Earth.

If you do not believe the world is flat, then that ‘information’ suggesting the world is flat will go directly to the bin. The whole principle of confirmation bias is that beliefs will be difficult to change. In order to change opinions, information must be presented as compatible with existing beliefs, or that information will be filtered out by the target audience and ignored.

Opportune Arguments: Weaponising Confirmation Bias.

The Pillars For Weaponising Confirmation Bias.

Engagement.

Continually feeding people information which exactly conforms to their existing beliefs can arguably make them more certain of what they already believe, but those people will still hold the exact same beliefs after hearing the new argument as they did before. They are, however, likely to listen. If your goal is purely to gain 'engagement', where people spend time on a platform to hear what is said, and potentially echo the views being promoted, then feeding information that exactly matches a person's pre-existing views will work. So all that is required to potentially generate engagement is the ability to feed people information that exactly matches their existing views.

Building Trust.

Continually feeding people information that fits their beliefs will, to most people, appear to be providing information "the way it is" and therefore without any bias. The safest way to appear unbiased may be to apply a bias that matches that of the audience. If the strategy of supplying only information with the 'correct' bias is successfully applied, the audience will come to trust the source of that information.

Mining Engagement And Trust: Persuasion.

Having an engaged and trusting audience is still of little value unless you persuade. Persuading is the process of getting people to adopt ideas from outside their existing beliefs. Even if you have someone's attention and trust, getting them to accept information outside their core beliefs is still a challenge, but it is exactly what persuasion requires.

Introducing Opportune Arguments: Modifying Beliefs.

The Principle.

Opportune arguments introduce ideas from outside a person's core beliefs, but link these new ideas to those existing core beliefs. By effectively embedding the new belief, not previously part of the audience's existing beliefs, as an argument in support of an existing belief, the new argument may alter a person's total set of beliefs. AI, as used in social media and other narrowcast media, can learn a person's core beliefs by observing which media generates engagement, and then select suggested content with embedded opportune arguments positioned as supporting those existing beliefs.
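
As a purely illustrative sketch of that mechanism, the toy Python below shows how a recommendation loop could infer a belief profile from engagement, and then rank candidate content so that items which confirm that profile, and also carry a linked new claim, float to the top. All names, scores and example content are hypothetical; this is not any platform's actual algorithm.

    # Toy model only: infer a belief profile from engagement, then prefer items
    # that confirm that profile while also carrying one extra, linked claim.
    from dataclasses import dataclass, field

    @dataclass
    class Item:
        text: str
        beliefs: set            # belief tags the item affirms
        linked_claim: str = ""  # the "opportune" addition, if any

    @dataclass
    class UserModel:
        profile: dict = field(default_factory=dict)  # belief tag -> engagement weight

        def observe_engagement(self, item: Item, engaged: bool):
            # Strengthen tags the user engages with, weaken tags they ignore.
            for tag in item.beliefs:
                self.profile[tag] = self.profile.get(tag, 0) + (1 if engaged else -1)

        def match_score(self, item: Item) -> int:
            # How strongly the item confirms what the user already appears to believe.
            return sum(self.profile.get(tag, 0) for tag in item.beliefs)

    def select_next(user: UserModel, candidates: list) -> Item:
        # Rank by confirmation first (holding engagement and trust), then prefer
        # items that also carry a linked new claim - the opportune argument.
        return max(candidates, key=lambda it: (user.match_score(it), bool(it.linked_claim)))

    # Usage: after engaging with content matching one belief, the top pick is the
    # item that confirms that belief *and* attaches the new, linked claim.
    user = UserModel()
    user.observe_engagement(Item("Deportations split families", {"pro-immigrant"}), engaged=True)
    feed = [
        Item("Undocumented immigrants enrich communities", {"pro-immigrant"}),
        Item("Undocumented immigrants enrich communities, and do essential jobs "
             "paying below minimum wage", {"pro-immigrant"},
             linked_claim="below-minimum-wage labour is needed"),
    ]
    print(select_next(user, feed).text)  # the item carrying the opportune argument wins

The key step is the last ranking rule: among equally confirming items, the one carrying the extra, linked claim is surfaced, so the audience never sees it as anything other than support for what they already believe.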

Undocumented Immigrant Example.

Consider a person who has a core belief that deporting undocumented immigrants is usually immoral. To this person, the argument that undocumented workers are essential to the US economy may be acceptable as a new belief, because while it is not the basis of their existing beliefs, if true it is opportune.

Conversely, a person who believes minimum wages are problematic for the economy may accept that allowing undocumented workers to remain provides a labour force unconstrained by minimum wages.

The person who wishes to support undocumented workers may adopt the argument that their low-cost work is important, even though they oppose exploitation of workers, and the person who feels mandated minimum wages are an interference with a free labour market may accept undocumented workers as important, but only because they can provide a low-cost workforce.

Opportune Arguments Work Both Ways.

An “opportune argument” naturally embraces and links two beliefs: an existing belief and a new aspect that becomes linked to that existing belief.

Consider the above undocumented immigrant example. For the person who considers deporting undocumented immigrants a problem:

  • existing belief: undocumented immigrants bring value to society.
  • new belief: there are jobs paying below minimum wages that need to be done.

For the person who feels the minimum wage interferes with a free labour market:

  • existing belief: there are jobs paying below minimum wages that need to be done.
  • new belief: undocumented immigrants can bring value to society, by working for less than minimum wages.

Although these people will typically still differ in many beliefs, in some ways each accepts the same combination of "facts" as fitting with their existing beliefs, and as bringing a new context where those original beliefs can be applied.

Wider Use Of Opportune Arguments.

The above immigration example is an extreme case, with very passionate beliefs at play, and with people adopting, and possibly even promoting, a point of view from outside their original core beliefs in a single step. More subtle examples involve multiple small steps that shift and adapt beliefs so gradually that, as with salami tactics, the individual steps go unnoticed.

The formula for using opportune arguments:

  1. Gain engagement and trust by appealing to confirmation bias.
  2. Introduce new beliefs through opportune arguments that link them to, and make them appear to confirm, existing beliefs.
  3. Repeat, modifying beliefs in small 'salami' steps to eventually create a large shift.

Applied successfully, this process has been demonstrated to be able to get otherwise rational people to accept conspiracy theories as implausible as "flat Earth".

The Risks Of Opportune Arguments.

The Devil Is In The Detail, And Is Easily Overlooked.

It is easy to quickly embrace 'opportune arguments' and then champion them, simply because of how opportune they are for promoting our existing beliefs. This often happens without thorough examination of the new 'belief' being introduced, provided the new opportune argument supports the application of the original belief.

The danger is that examination of the details, and of the wider implications, of the new aspect of the opportune argument is skipped, because confirmation bias is also applied in deciding which part of the overall story should be the focus. It is somewhat like searching for car keys: the search ends once you have found what you are looking for. In this case, scrutiny can end once a person has found what they are looking for.

Using the undocumented immigrant example, the first person gets to the point where there is a reason we need undocumented immigrants and glosses over the rest. The second person gets to the point where there is a way work can be done for less than the minimum wage and glosses over the rest.

Hubris: I Am The Smart One!

When you have a set of beliefs, it is natural to feel everyone should have those same beliefs, and perhaps even to think less of those who hold contrary beliefs.

This could lead to the first person in the undocumented immigrant example thinking "this logic will allow me to convince people who care more about money than immigrants", and the second person thinking "this will allow me to convince those who care more about immigrants than making money". Each believes they can have a victory over those who do not share their beliefs. The lure of a potential "victory" can make adopting the opportune argument even more compelling, thereby exacerbating the problem of failing to fully examine the new idea being embraced.

Further, this feeling that the opportune argument can win others over to what was always a core belief can lead to the new idea being repeated, and becoming ingrained as a belief, without ever being properly evaluated.

Conclusion

Great caution is required when adopting ‘opportune arguments’ to avoid the dangers listed.
