Luigi Corvaglia
1 - The paradigm of the cross-eyed gunslinger
There are two ways to hit the mark. One is to have good aim and hit the innermost circle of the target. The other is to shoot at random and then draw the target around the hole we have made. This second method is more effective, but only if no one is watching us shoot.
A group of sociologists who have monopolised the study of "new religious movements" is a good example of the second kind. These authors, gathered around the Centro Studi Nuove Religioni (CESNUR) in Turin, Italy, put forward a single, simple thesis in hundreds of mutually endorsing articles, in a kind of cross-peer review: namely, that mind manipulation is a myth. It follows that the "cults" that abuse their followers are nothing more than a "moral panic" created by a phantom anti-cult movement that is "bereft of scientific credibility". In short, people join destructive cults of their own free will, after a rational assessment, and stay there. This portrayal is made with complete indifference to the enormous body of research in experimental psychology, neuropsychology and social psychology on persuasion and social influence. In fact, it has been clear for decades that individual and collective decisions defy rationality, and that the human mind is susceptible to suggestion and to systematic errors that can be exploited by those who wish to direct it (Tversky & Kahneman, 1979; Cacioppo & Petty, 1984; Damasio, 1984; Zimbardo, 2002; Budzynska & Weger, 2011).
Someone even received a Nobel Prize for these studies on the manipulability of the mind: Daniel Kahneman. The power of social influence, and of the perception of oneself as part of a group (self-categorisation), in determining action is a consolidated legacy of scientific knowledge (Turner, 1987, 1991; Turner & Reynolds, 2012). The existence of persuasion techniques is the basis of marketing and political propaganda strategies (Cialdini, 2017; Sharot, 2018).
Despite this undeniable mass of data on persuasion compiled by the disciplines truly relevant to such studies, the aforementioned sociologists repeat in chorus that "science" has rejected the theory of "brainwashing".
What science? Theirs, i.e. studies based on data such as proselytism and retention rates in new religious movements.
Psychological and neurobiological studies simply do not count. The approach is akin to a group of boys who refuse to play football, fence off a new, smaller pitch, define the rules of a new game, decide who can and cannot play, and finally declare that those who play traditional football are not really playing football at all. It is drawing the target around the hole.
2 - The argumentative fallacies
At this point it is fair to ask what game those we have come to call "cult apologists" are playing on their new pitch. The answer is simple: it consists essentially in the use of argumentative fallacies.
There are three main ones:
a) the straw man argument;
b) poisoning the well;
c) petitio principii.
a) The straw man argument
The 'straw man argument' is a trick used by those who want to win an argument without addressing its content. It works by attributing to the other side a thesis they have never put forward. Of course, the thesis must not only be false, but also obviously absurd, grotesque or ridiculous, and therefore easy to refute. In the case of the apologists, the straw man is 'brainwashing'. Just as all psalms end in glory, all historical reconstructions of the concept of brainwashing made by cult apologists end with a citation of the old film The Manchurian Candidate, starring Frank Sinatra. The film tells of a Korean War veteran who, in response to a certain stimulus, is turned into an automaton controlled by others and programmed to kill a candidate for President of the United States. This cinematic, grotesque version of manipulation serves to highlight the absurdity of the idea and thus to protect gurus, demagogues and cult leaders from accusations of practising it. There is only one problem with this reasoning: nobody believes in the Manchurian Candidate, and no one has ever actually defended that version of the brainwashing thesis. What scholars mean when they speak of mind manipulation has nothing remotely to do with the Manchurian Candidate hypothesis.
To better understand the difference between undue persuasion and Hollywood, it is useful to read a book by the Japanese writer Haruki Murakami. In Underground (1997), he recounts the sarin gas attack in the Tokyo underground in 1995, in which thirteen people were killed and some 6,000 others poisoned. Murakami writes that the followers of the religious cult known as Aum Shinrikyo (The Supreme Truth) who carried out the attack "were not passive victims, but actively sought to be controlled". He describes how most Aum members "deposited all their valuable personal wealth of self-esteem" in the "spiritual bank" of cult leader Shoko Asahara. Their goal was to submit to a higher authority, to someone else's representation of reality. Perhaps what constitutes an abusive and totalitarian group is the premeditated construction of a system that selects and supports this escape from freedom, reinforcing it in slow, gradual steps, playing on guilt and shame. This may not be 'brainwashing', but it is certainly manipulation, certainly undue persuasion, because it is aimed at exploitation. We are talking here about mechanisms known to neuroscience, to social psychology, to the "behavioural economics" of Kahneman - who won a Nobel Prize for revealing the systematic errors (biases) and irrational heuristics of our brains exploited by marketing and propaganda - and to the cognitive linguistics of Lakoff (2004), which emphasises the persuasive nature of language: he clarified how the use of specific terms activates conceptual frames that guide the listener's perception. To deny this, one has to be very ignorant or very much in bad faith.
A significant mistake in the discussion of this subject has been to treat persuasion as a construct with a single dimension. If there is only one form of persuasion, it will always be lawful for some ("we all persuade and are persuaded"), while for others it may sometimes be malignant, though they cannot say where to draw the line separating it from lawful persuasion. It is therefore necessary to introduce an often-ignored dimension: the purpose of the persuader, that is, the dimension of interest.
This dimension can be outlined as an axis with egoism (interest in oneself) and altruism (interest in others) at its two poles. The introduction of this new dimension broadens the range of connotations and expressive typologies of persuasion, which can be represented spatially by intersecting two axes, in the tradition of the circumplex models used in psychology (fig. 1).
Two things can be deduced from this:
The first is that the focus must not be on brainwashing through specific methods, but on persuasion for the purpose of exploitation. That is manipulation. Attention must be directed at the "why", not the "how".
The second thing that can easily be deduced from the diagram is that the idea that anti-cultists want to censor persuasion tout court is false, since only one of the quadrants represents the area of mind control. It is basically another straw man argument.
b) Poisoning the well
The expression "poisoning the well" describes an argument in which what the opponent says is delegitimised in advance by questioning his or her credibility or good faith. In this way, anything they say can be ignored, or deemed false or irrelevant, by the public: "Since you're bad, what you say is not worthy of consideration." The constant defamation of activists, academics and associations that show concern about totalitarian groups is certainly not aimed at discussing their arguments, but at casting doubt on their credibility. Indeed, activists who oppose the actions of cults are routinely labelled as unscientific (because of the brainwashing myth), illiberal (because they are hostile to 'freedom of worship') or even complicit in despotism. Whatever the 'anti-cult movement' says is therefore unfounded.
c) Petitio principii (or the "begging the question" fallacy)
The most sophisticated technique, which can even be regarded as a genuine mind game, is the petitio principii, or "begging the question" fallacy. This is an error in which the premises already contain the claim that the conclusion is true; in other words, the conclusion is already taken for granted in the premise.
Massimo Introvigne (1993) gives us a wonderful example of this. He has found a most ingenious way of advancing the idea that anti-cultists believe in a non-scientific phenomenon: a division into a secular anti-cult movement and a religious counter-cult movement, combined with a further division into 'rationalist' and 'post-rationalist' movements.
Rationalists, according to the author, are those who believe that 'cults' attract their followers through fraud and deception. Deception is not supernatural; ergo, it is a rational explanation. Therefore, there will be both rationalist anti-cult movements and rationalist counter-cult movements.
Introvigne writes:
Anti-cultists will emphasize the secular features of the fraud (e.g. 'bogus' miracles) and the counter-cultists its religious elements (e.g. 'manipulating' the scriptures), but the fraud remains prominent.
Instead, movements that imagine superhuman or supernatural intervention to explain cult success are called post-rationalist. Post-rationalist counter-cult movements theorise the intervention of Satan. The devil is the supernatural explanation favoured by the religious. Referring to the secular critics he calls anti-cult movements, the author writes:
For their secular counterparts of the anti-cult movements, cultists have the more-than-human power of 'brainwashing' their victims; but, as it has been noted, 'brainwashing' in some anti-cult theories appears as something magical, the modern version of the evil eye.
An extraordinary coup de théâtre! First, we are presented with a dichotomy that is simplified but loaded with meaning. This is then articulated into a further subdivision that produces four boxes: two for the rationalists and two for the post-rationalists, as if they were two floors of a building. One floor is rationalist and the other post-rationalist; on each floor, one flat is occupied by the religious and one by the secularists. Introvigne describes the tenants of the first floor, the rationalists, as very similar, because they use explanations of the same kind: they share the same framework (rationality). But he claims to perform the same operation with the tenants of the second floor, the supposed post-rationalists, who are not similar in any way. Only a very low level of critical vigilance can let this analogy pass - a very low vigilance and an effective frame, that of absurdity ("evil eye", "post-rationalism" and so on). Let us take a look outside the box. Satan's intervention is indeed a supernatural idea; mind manipulation is a scientific theory. While it is true that neither hypothesis is universally accepted, the first is not accepted because it is not falsifiable, in Popper's sense, while the second is debated precisely because it is falsifiable; hence it is a scientific hypothesis. However, a well-designed frame, as George Lakoff teaches us, can create an illusion of similarity.
Most importantly, the normal logical process is reversed here. Instead of arriving at the conclusion that manipulation theory is irrational through a series of successive logical steps, the argument simply posits this irrationality as a premise! The result is a tautology that proves nothing.
"Since brainwashing is not rational, anti-cultists promote an unscientific concept"…
Nice try, Massimo!
It is precisely a "begging the question" fallacy, because the same idea is repeated in the premise and in the conclusion. Arguments that beg the question can be persuasive and can obscure the fact that a debatable claim is being presented as truth.
3 - From argumentative fallacies to smoke and mirrors
The first to take advantage of the mind's systematic errors, and to carry out a manipulation of their own, are precisely these authors.
a) Cult apologists as cultural parasites
Cult apologists and cult leaders invoke religious freedom, i.e. the principles of the open society that apply outside the cults - the very principles they deny within the cults. In other words, they claim to defend closed societies on the basis of the principles of the open society. I call this "Salvemini's paradox" (after an Italian liberal thinker).
Besides being a paradox, this is a form of cultural parasitism: they draw nourishment from the open society in order to feed closed societies.
b) Cult apologists as identitarians
The activism of the organisations defending ‘religious freedom’ associated with these authors is presented as a defence of rights, of freedom, of respect for free choice, in short, of democracy. It is anything but.
Where democracy means the universalisation of rights and respect for minorities, the cult apologists' proposal is not really motivated by respect for minorities. Rather, it is strongly reminiscent of the differentialism of identitarian and sovereigntist ideology, a far-right ideology that, on the contrary, values differences precisely in order to oppose the universalisation of rights.
Identitarians and cult apologists appeal to the "right to be different". Although this may seem like an affirmation of universalism and ecumenism, the identitarian is an enemy of the open society. Identitarians defend other closed groups against the claims of the open society so that it does not interfere with their own group.
If the Western citizen is horrified by the practice of infibulation or other female genital mutilation and calls for its abolition, it is because he or she believes that the universalisation of rights is a value that takes precedence over respect for a culture that degrades and inflicts violence on women.
The identitarian, on the other hand, believes that the customs and traditions of cultures in which individual rights are not respected must be protected, because the defence of identity precedes the defence of individual rights. Identities are superior to human rights.
Cult apologists work in the same way.
The identity of the cult is superior to the civil rights that exist outside it. The call for the defence of rights by cult apologists is thus a red herring, a smoke screen.
c) The final smoke screen
Finally, it takes minimal cognitive effort to escape the traps of these argumentative fallacies and to understand that "new religious movements" - the term we might ironically consider the "woke" term for cults - obviously have no need to be defended in the name of vaunted liberal principles, because in the liberal-democratic framework religious freedom is inviolable. Those who need to be defended are the abusive and totalitarian cults, i.e. groups where abuse and harassment take place. Such a defence is necessary for abusive cults precisely because they operate in a liberal-democratic system that condemns abuse and harassment. Anything else is drawing the target around the hole.