
Persuasion in High-Control Groups: A Game Theory Perspective

Updated: Sep 6

Luigi Corvaglia



Abstract


This paper examines the mechanisms of persuasion used in the recruitment and retention of members in high-control groups (commonly referred to as "sects" or "cults") through the lens of game theory. The analysis integrates empirical literature on psychological manipulation processes—particularly the Submission As Preference Shifting model (Corvaglia, 2025)—with formal models of strategic interaction, highlighting how recruitment dynamics can be conceptualized as games of asymmetric information, strategic signaling, and social influence. The proposed theoretical framework provides new insights into individual vulnerability factors, recruiters’ strategies, and possible preventive interventions.


1. Introduction


In January 2025, five years after its first appearance in several Italian academic journals, my model of Submission As Preference Shifting (SAPS) was finally published in English in IJCAM, enabling international dissemination. This work addresses a relatively unexplored position in the debate on undue persuasion: that of those who regard it as a real phenomenon but hold that manipulative practices and free choice are not mutually exclusive and, in fact, coexist.

In the model I propose, every choice made by those who join a high-control group is undertaken voluntarily; however, the outcome of subordination is produced by a choice architecture designed to lead, gradually, to the results desired by those who orchestrated it. In my conception, this outcome is made possible by mechanisms already well known in psychology:


the concept of salience—the relevance that a piece of information acquires in its surrounding environment (Borgida & Nisbett, 1977);


present bias—the tendency to focus on the immediate present rather than the future when making decisions;


the consequent procrastination of disobedience;


and framing—the ability of context to assign meaning to data (Kahneman & Tversky, 1979).


Manipulation, therefore, does not lie in unspecified techniques of suggestion or control, but rather in the intentional construction of a choice architecture, that is, in the persuader’s intent.

Here, I return to that model to formalize it in the language of game theory. Its adaptability to the phenomena I illustrate serves both as a validation of the theory’s robustness and as a clearer demonstration that permanence in high-control groups is simultaneously a rational outcome—since it results from decisions made under conditions of strategic uncertainty—and a plastic one, because the environment reshapes desires and evaluations. This reshaping, in turn, is guided by what we may call the “choice architect.”

To this end, it is useful to briefly revisit the model and describe what game theory entails.


1.1 The Submission As Preference Shifting (SAPS) Model


Entering a coercive group is never the result of a single decision but rather of a series of successive choices between two alternatives: collaborate or defect. Various studies have examined the recruitment of the “Moonies,” the followers of the Unification Church (Galanter, 1979; Barker, 1984). Recruitment traditionally occurs through incremental steps, such as invitations to lunches and workshops of increasing intensity. At each invitation, the subject must decide whether to continue or withdraw. Each small step thus functions as a crossroads: those who refuse drop out, those who accept continue.

The high defection rate observed in such contexts was interpreted by Eileen Barker as evidence of the ineffectiveness of manipulative techniques (Barker, 1984). In my view, however, this dynamic represents a process of self-selection: resistant and less interested subjects leave early, while those who remain are more engaged and gradually become more compliant, eventually forming a closed, unanimously aligned group. This homogeneity reproduces conditions akin to those demonstrated in Asch’s experiments (1951), where subjects, exposed to unanimous but patently false consensus, tended to conform rather than contradict the majority.


The SAPS model explains this trajectory through three main mechanisms:


Change of salience: Milgram's obedience experiments (Milgram, 1963) asked participants to administer what they believed were increasingly painful electric shocks to another person. Despite hearing apparent cries of pain, many continued under the experimenter's instructions, showing how ordinary people can comply with authority even against their conscience. As these experiments demonstrated, attention shifts from the final outcome (e.g., inflicting serious harm) to the immediate request (e.g., pressing a button, administering a slightly stronger shock). Each request appears isolated and manageable without revealing the broader picture.


Gradient of differentiation: the incremental difference between one request and the next is minimal. Each new request seems insignificant compared to prior compliance (“I’ve already agreed to something similar”), while defection appears increasingly costly in moral and social terms (“leaving now would be too costly”). It is the gradient of differentiation that makes the shift in salience possible, because the short distance between one demand and the next renders each new claim acceptable.

As Hassan (2015) shows, these demands span the BITE domains: Behavior, Information, Thought, and Emotion.


Procrastination of disobedience: defection is postponed to a hypothetical future point when demands become excessive. But since progression is gradual and differentiation is minimal, that moment never arrives. Each delay strengthens commitment, in line with the commitment and consistency principle (Staw, 1976; Cialdini, 2009), while perceived exit costs escalate.


In addition, Asch-like conformity stabilizes the trajectory: once the group is self-selected and unanimously favorable, dissent becomes nearly unthinkable. Here the process of self-categorization described by Turner (1987) comes into play: as involvement progresses, the individual increasingly perceives themselves as a member of the group, internalizing its norms and values. This depersonalization reduces the salience of individual judgments and makes conformity to collective expectations the most “natural” outcome. Within the SAPS framework, self-categorization functions as a multiplier that stabilizes the trajectory of collaboration and decreases the likelihood of interruption. The outcome is that subjects end up embracing beliefs and behaviors they would have rejected ex ante but that appear coherent within the gradual path that has transformed their preferences and evaluative criteria through framing.


Lalich (2004), with her concept of bounded choice, had already highlighted how the seemingly free choices of adherents are actually constrained by the cultic context, structured by the interplay of leader charisma, totalizing ideology, group pressure, and control systems. While the bounded choice model emphasizes structural constraints, the SAPS model provides a dynamic explanation, showing how submission emerges as a genuine shift in individual preferences.


This perspective demonstrates that individual choices can be shaped without resorting to either extreme coercion or mythical “brainwashing” techniques.


1.2 Game Theory


Game theory lies at the intersection of psychology, mathematics, biology, and economics, and studies strategic dynamics encountered in everyday life—from cooperation to competition and dilemmas—using the metaphor of “games” to describe players’ behavior when decisions are interdependent. It provides a useful framework for understanding phenomena characterized by information asymmetry, iterative dynamics, and conditional choices.

A game specifies: (i) the players, (ii) the strategies available, (iii) what each player knows (information/rules), and (iv) the payoffs (benefits/costs) associated with each combination of choices.
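As a minimal sketch (in Python, with purely illustrative players, strategies, and payoff numbers that are not drawn from the paper), these four elements can be written down directly as data:

# A hypothetical two-player, one-shot game expressed as data:
# players, strategies, and payoffs for every combination of choices.

players = ("Recruiter", "Prospect")
strategies = {"Recruiter": ["Request", "Wait"], "Prospect": ["Collaborate", "Defect"]}

# payoffs[(recruiter_move, prospect_move)] = (recruiter_payoff, prospect_payoff)
payoffs = {
    ("Request", "Collaborate"): (2, 1),
    ("Request", "Defect"): (0, 0),
    ("Wait", "Collaborate"): (1, 1),
    ("Wait", "Defect"): (0, 0),
}

for moves, (u_r, u_p) in payoffs.items():
    print(moves, "->", {"Recruiter": u_r, "Prospect": u_p})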

The advantage of game theory is that it translates intuitively familiar dynamics (courtship, cooperation, conflict, manipulation) into rigorously analyzable and predictable structures.

Game theory distinguishes several types of games:


Cooperative vs. non-cooperative: in the former, players can make binding agreements; in the latter, each acts independently without enforceable agreements.

Zero-sum vs. non-zero-sum: in zero-sum games, one player’s gain equals the other’s loss; in non-zero-sum games, both may win (win–win) or both may lose (lose–lose).

Simultaneous vs. sequential: in simultaneous games, players act at the same time; in sequential games, moves occur in sequence, with each decision informed by prior ones.

Complete vs. incomplete information: in the former, all rules, strategies, and payoffs are known; in the latter, some elements (e.g., intentions or resources) are hidden.


2. Recruitment in High-Control Groups through Game Theory


Applying game-theoretic categories, the first observation is that recruitment in high-control groups can be interpreted as a game of incomplete information. The recruit does not know the group’s true nature, future payoffs, or upcoming demands, while the recruiter holds an informational advantage and controls signals to guide decisions. Recruitment specifically belongs to the class of signaling games:


One player (sender) possesses private information about their state or the nature of a good.

The other (receiver) must make decisions based only on observable signals.

Signals may be truthful or deceptive, and the strategic problem lies in deciphering them.


The game’s equilibrium depends on signal quality. Costly signals are credible and produce a separating equilibrium, distinguishing honest from deceptive senders. Ambiguous signals produce a pooling equilibrium, where distinction is impossible.

A useful analogy is courtship: the suitor (sender) signals qualities not immediately observable (reliability, intentions). Costly signals (gifts, commitment, visible care) generate a separating equilibrium, while ambiguous ones generate a pooling equilibrium.
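As a rough sketch of this logic (in Python, with assumed probabilities rather than empirical values), Bayes' rule shows why a signal that both types can send cheaply leaves the receiver's belief unchanged, while a signal only an honest sender can afford is fully revealing:

# Illustrative sketch: why cheap, imitable signals pool sender types while
# costly signals separate them. All probabilities are assumptions.

def posterior_benign(prior_benign, p_signal_if_benign, p_signal_if_manipulative):
    """Receiver's belief that the sender is benign after seeing the signal (Bayes' rule)."""
    num = prior_benign * p_signal_if_benign
    den = num + (1 - prior_benign) * p_signal_if_manipulative
    return num / den

prior = 0.5

# Pooling: the introductory signal (warmth, testimonials) is cheap, so both types send it.
print("pooling belief:", posterior_benign(prior, 1.0, 1.0))     # 0.5 -> the signal tells the receiver nothing

# Separating: a signal too costly for the manipulative type to imitate,
# e.g. full disclosure of future demands before any commitment.
print("separating belief:", posterior_benign(prior, 1.0, 0.0))  # 1.0 -> the signal fully reveals the type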


In cult recruitment:


The recruiter is the sender, privy to the group’s true nature but concealing it.

The potential recruit is the receiver, deciding based on signals (emotional support, positive narratives, testimonials).

The difficulty of distinguishing legitimate from manipulative groups corresponds to pooling equilibrium: all groups use similar introductory techniques, preventing ex ante distinction.


Game theory thus frames recruitment and retention not as irrational coercion but as rational interactions under incomplete information.

At this initial “hooking” stage, manipulation resides not in mysterious techniques but in the recruiter’s dishonesty. The strength of game theory is its operational clarity, cutting through ideological narratives and sophistry.


Furthermore, these dynamics constitute a non-cooperative game: the recruiter maximizes their payoff (group expansion, resources, influence), while the recruit makes step-by-step decisions without binding agreements and under incomplete information.

Ultimately, the structure is asymmetric: one player controls the narrative and progressively reshapes the utility function of the other. Recruitment is therefore a sequential, non-cooperative game of incomplete information and power imbalance, appearing initially as win–win (non-zero-sum) but evolving into a distorted equilibrium where the group benefits at the individual’s expense (zero-sum).


2.1 SAPS as a Game-Theoretic Model


The SAPS model can thus be understood as a sequential game of incomplete information involving two players: the Recruiter (R), who holds private knowledge of the group’s true nature, and the Prospective Member (P), who at each stage must choose to Collaborate (C) or Defect (D).


Structure: Recruitment unfolds incrementally. R begins with minimal, seemingly harmless requests (attend a lunch, join a meeting). P chooses C or D. Refusal ends the game; acceptance triggers the next, slightly more demanding request. This generates an indefinite chain of subgames.


Payoffs: For R, payoffs increase with each act of compliance (more resources, greater control). For P, payoffs are dynamic:


  • Immediate benefits (bᵢ) rise early (attention, belonging).

  • Collaboration costs (cᵢ) gradually increase (restrictions, material demands, loss of autonomy).

  • Exit costs (C_exit(n)) accumulate (loss of investments, social rupture, cognitive dissonance).


Each step appears rational locally: compliance yields benefits, while defection forfeits investments. Yet the global outcome is suboptimal: the recruit is trapped in a game they would not have chosen ex ante.
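A minimal simulation of this trap (a sketch in Python; the payoff schedules for bᵢ, cᵢ and C_exit(n) are assumptions chosen only to reproduce the qualitative dynamic, not values from the model):

# Sequential SAPS game with a myopic decision rule: at each step the recruit
# compares the immediate payoff of collaborating once more (b_i - c_i) with the
# immediate payoff of defecting now (-C_exit(n)). All numbers are illustrative.

def b(i):        # immediate benefit: generous at first, then flattening
    return max(3.0 - 0.2 * i, 0.5)

def c(i):        # collaboration cost: rises slowly (small gradient of differentiation)
    return 0.3 + 0.25 * i

def C_exit(n):   # exit cost: grows with accumulated investment and social ties
    return 0.8 * n

U_total, n = 0.0, 0
while n < 20:                    # capped at 20 steps for the demonstration
    stay = b(n) - c(n)           # locally salient payoff of collaborating at step n
    leave = -C_exit(n)           # locally salient payoff of defecting now
    if stay <= leave:            # defect only if it looks better *right now*
        break
    U_total += stay
    n += 1

print(f"steps complied: {n}")
print(f"cumulative payoff U_p(n) = {U_total:.2f}   (never joining would have yielded 0.00)")

Under these assumed schedules the recruit never finds a step at which defecting looks better than complying one more time, yet the cumulative payoff ends up well below the zero payoff of never having started.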


From the perspective of game theory, SAPS can be seen as a sequence of strategically designed nudges: each request constitutes a micro-game in which the payoff structure is configured to make collaboration locally rational. This links the model to the behavioral theory of Thaler and Sunstein (2008), which shows how a choice that is “nudged” but not imposed maintains the appearance of decision-making freedom while still producing the outcome desired by the choice architect.


Mechanisms in game terms


  • Salience shift (Milgram-like): attention focuses on immediate bᵢ rather than future cᵢ.

  • Gradient of differentiation: minimal increments never trigger resistance.

  • Procrastination of defection: exit costs rise over time and, with attention held on immediate payoffs by the salience shift, defection is postponed indefinitely.

  • Conformity (Asch-like): once selected, unanimous groups make dissent highly costly.


Permanence as a Nash Equilibrium


The result of this dynamic can be described as a Nash equilibrium, a concept originating in the theory of non-cooperative games. It is a state of strategic stability: given the moves of the others, each player chooses the strategy that, even if it does not grant the optimal condition, still yields the best result available under the circumstances. The condition is also stable, because no one would gain anything by unilaterally changing their decision.

An example frequently occurring in everyday life is that of two friends who must decide which restaurant to go to. Both prefer being together rather than eating alone, but one prefers restaurant A and the other restaurant B. For each of the two, the optimal outcome would be for their own preferred restaurant to be chosen, but adapting to the restaurant preferred by the other is still better than eating alone. In a Nash equilibrium, the two coordinate on one restaurant, because neither has an interest in changing unilaterally, as doing so would mean being left alone.
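The same coordination game can be checked mechanically. The sketch below (Python, with illustrative payoff numbers) enumerates the four strategy profiles and tests the Nash condition, that no player gains by deviating alone:

# The two-friends restaurant game as a payoff matrix, with a brute-force test
# of which strategy profiles are Nash equilibria. Payoff numbers are assumptions.

from itertools import product

A, B = "Restaurant A", "Restaurant B"
# payoffs[(choice_1, choice_2)] = (utility of friend 1, utility of friend 2)
payoffs = {
    (A, A): (2, 1),   # together at A: friend 1's favourite
    (B, B): (1, 2),   # together at B: friend 2's favourite
    (A, B): (0, 0),   # apart: the worst outcome for both
    (B, A): (0, 0),
}

def is_nash(profile):
    c1, c2 = profile
    u1, u2 = payoffs[profile]
    no_gain_1 = all(payoffs[(alt, c2)][0] <= u1 for alt in (A, B))  # friend 1 cannot gain by switching
    no_gain_2 = all(payoffs[(c1, alt)][1] <= u2 for alt in (A, B))  # friend 2 cannot gain by switching
    return no_gain_1 and no_gain_2

for profile in product((A, B), repeat=2):
    print(profile, "-> Nash equilibrium" if is_nash(profile) else "-> not an equilibrium")

Both profiles in which the friends end up in the same restaurant pass the test; the two profiles in which they split do not.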

In the context of high-control groups, the situation is closely analogous: for the recruiter, the dominant and most rational strategy is to keep making requests; for the recruit, the apparently rational choice at each step is to collaborate, since defecting would entail a greater immediate cost.

Neither of the two actors has an incentive to deviate unilaterally from their own strategy: the recruiter because they are progressively obtaining what they want, the recruit because they perceive defection as too costly. It is precisely a Nash equilibrium. However, it is a distorted equilibrium: the result is suboptimal for the recruit, but it is stabilized by the manipulation of the payoff structure and by informational asymmetry.


Thus, sectarian recruitment requires neither brutal coercion nor mythical “brainwashing.” It suffices to design a choice architecture of credible signals, gradual escalation, and salience management, leading recruits to perceive collaboration as the most reasonable path.


Disorganized Attachment as an Amplifier of Perceived Payoffs


Alexandra Stein (2017) has shown that high-control groups induce a state of disorganized attachment in their members, even in the absence of preexisting vulnerabilities. The charismatic leader simultaneously becomes a source of threat and reassurance, creating an insoluble conflict that drives the subject to continually seek proximity to the very source of danger.

In the SAPS model, this mechanism explains why the subjective value of immediate benefits (bᵢ) is amplified and why the exit costs (C_exit) are perceived as unbearable: defecting would mean losing the figure who—although also a source of threat—is the only one providing protection. Disorganized attachment thus acts as an amplifier of perceived payoffs.

Game theory interprets this dynamic as a modification of P’s utility function: the payoff from cooperation increases not only for instrumental reasons but also because it reduces anxiety. The resulting Nash equilibrium is therefore even more stable and resistant to deviations.
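A toy illustration of this amplification (Python; the material payoffs and the anxiety-relief term are assumed numbers, added only to show how the comparison flips):

# Disorganized attachment as a payoff amplifier: the same material payoffs,
# plus an assumed anxiety-relief term that only proximity to the leader provides.

material_payoff = {"Collaborate": 0.5, "Defect": 1.0}   # on material terms alone, leaving looks better
anxiety_relief  = {"Collaborate": 2.0, "Defect": 0.0}   # proximity soothes the fear the leader himself creates

for action in ("Collaborate", "Defect"):
    print(action, "->", material_payoff[action] + anxiety_relief[action])

# With the attachment term included, Collaborate (2.5) dominates Defect (1.0),
# so the equilibrium becomes more stable even though material payoffs favour exit.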


Repeated Games and Escalation


The process evolves through three phases:


  1. Hooking: immediate low-cost benefits (support, belonging).

  2. Investment: gradually rising demands; defection procrastinated under Milgram-like conditions.

  3. Lock-in: exit costs outweigh benefits, producing a strategic trap stabilized in Nash equilibrium.


SAPS clarifies how escalation occurs: the recruiter reshapes the recruit’s utility function step by step, producing submission that appears as free choice.


3. Conclusions


The application of game theory to persuasion in high-control groups offers a perspective rich in theoretical and practical implications.

Key conclusions include:


  • Recruitment processes follow predictable strategic logics that can be formally modeled as signaling games under incomplete information.

  • Individual vulnerabilities correspond to incentive and constraint configurations that make compliance locally rational.

  • Control strategies operate by systematically modifying payoff structures rather than through direct coercion.


This theoretical perspective not only enriches academic understanding of the phenomenon but also provides conceptual tools for more effective preventive and therapeutic strategies.


The SAPS model serves as a theoretical bridge connecting three distinct lines of research.


First, it integrates the qualitative descriptions of Lifton (1961), Singer (1995), and Schein (1961), as well as the more dynamic approach of Lalich (2004), who demonstrates how individual choices become progressively constrained by the interaction among charismatic leadership, totalistic ideology, systems of control, and the member’s own commitment.


Second, it links these structural insights to the psychological contributions of Stein (2017) and Shaw (2014), who highlight the power of emotional bonds and relational trauma in sustaining submission.


Finally, it introduces a strategic formalization through game theory (Nash, 1950) and behavioral economics (Thaler & Sunstein, 2008), demonstrating that collaboration is the locally rational outcome of a decision architecture intentionally designed to guide choices.


Appendix: Mathematical Formalization of the SAPS Model


Sectarian recruitment can be described as a sequential game of incomplete information between two actors: the Recruiter (R) and the Prospective Member (P).


Structure: R proposes a sequence of requests {x₁, x₂, …, xₙ}, each slightly more demanding (xᵢ₊₁ ≈ xᵢ + δ, with small δ). At each stage i, P chooses Collaborate (C) or Defect (D). If D, the game ends.


Utility function of P: Uₚ(n) = Σ(bᵢ) – Σ(cᵢ) – C_exit(n), where bᵢ is the immediate benefit at step i, cᵢ the collaboration cost at step i, and C_exit(n) the exit cost after n steps.


Dynamics in payoff terms:


  • Salience shift: bᵢ outweighs future cᵢ → Uₚ(C|xᵢ) > Uₚ(D|xᵢ).

  • Gradient of differentiation: (cᵢ₊₁ – cᵢ) ≈ 0 → minimal difference, collaboration appears less costly.

  • Procrastination: C_exit(n+1) > C_exit(n), inducing continuous postponement.

  • Conformity: unanimous group raises payoff of C, reduces that of D.


Equilibrium


R’s dominant strategy: continue incremental requests.

P’s rational perceived strategy: collaborate at each step.

Formal result: ∀i, Uₚ(C|xᵢ) ≥ Uₚ(D|xᵢ) ⇒ P continues until lock-in.


Thus, SAPS maps sectarian recruitment as a sequential game of incomplete information stabilizing in a suboptimal Nash equilibrium.


In more specific terms:


  1. Choice rule


Define the possible actions at time t: conform C, deviate D, exit E.

The expected utility of an action a is:


U(a,t) = lambda_t V_G(a; mu_t) + (1 - lambda_t) V_0(a) + Psi_t(a) - k_t(a)


Where:


  • V_G: value according to the group (depends on beliefs mu_t).

  • V_0: value according to the individual's baseline values.

  • Psi_t: emotional/social payoff (praise, shame, belonging, fear, ...).

  • k_t: costs (time, money, sanctions, exit cost).

  • lambda_t in [0,1]: identity weight (how much the group's values count).


Choice rule: a_t = argmax_a U(a,t)


This means: at time t choose the action with the highest utility among C, D, E.
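The rule can be transcribed almost literally into code. In the sketch below (Python), V_G, V_0, Psi and k are dictionaries over the three actions; all the numerical values are assumptions chosen only to show how the chosen action changes with lambda_t:

# Choice rule a_t = argmax_a U(a,t), with
# U(a,t) = lambda_t*V_G(a) + (1 - lambda_t)*V_0(a) + Psi(a) - k(a).
# All numbers are illustrative assumptions.

ACTIONS = ("C", "D", "E")   # conform, deviate, exit

def U(a, lam, V_G, V_0, Psi, k):
    """Expected utility of action a given identity weight lam (lambda_t)."""
    return lam * V_G[a] + (1 - lam) * V_0[a] + Psi[a] - k[a]

def choose(lam, V_G, V_0, Psi, k):
    """The choice rule: pick the action with the highest utility."""
    return max(ACTIONS, key=lambda a: U(a, lam, V_G, V_0, Psi, k))

# Assumed values: the group prizes conformity, the baseline self prefers deviating.
V_G = {"C": 5.0, "D": 1.0, "E": 0.0}
V_0 = {"C": 0.0, "D": 2.0, "E": 1.0}
Psi = {"C": 0.5, "D": 0.0, "E": -1.0}   # praise vs. shame and loss of belonging
k   = {"C": 0.0, "D": 0.0, "E": 2.0}    # exit is the costliest move

for lam in (0.1, 0.3, 0.6):
    print(f"lambda_t = {lam}: chosen action = {choose(lam, V_G, V_0, Psi, k)}")

With these assumed numbers the same person deviates at lambda_t = 0.1 and conforms once lambda_t reaches 0.3: nothing about the situation changes except the identity weight.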


  2. When submission "seems voluntary"


Conformity C is "voluntary" when it truly maximizes U even without strong punishments. For every alternative A in {D, E}:


U(C) - U(A) = lambda_t Delta_VG + (1 - lambda_t) Delta_V0 + Delta_Psi_t - Delta_k_t >= 0.


If Delta_VG > Delta_V0 (the group values C more than A, beyond your baseline values), a threshold follows:


lambda_t >= lambda_star, where lambda_star = ( Delta_k_t - Delta_Psi_t - Delta_V0 ) / ( Delta_VG - Delta_V0 )


Explanation: when the identity weight lambda_t exceeds lambda_star, conforming truly maximizes your subjective utility. The choice appears free because the preference weights have been shifted (SAPS).


  3. Dynamics of the shift (core of SAPS)


SAPS says that lambda_t (identification) and the salience weights w_t over value-attributes move over time due to exposure, reinforcement, and isolation:


lambda_{t+1} = sigma( alpha lambda_t + beta_1 exposure_t + beta_2 reinforcement_t - beta_3 external_ties_t - beta_4 * falsifying_info_t )

w_{t+1} proportional_to w_t elementwise_multiply exp( eta * r_t ) (then normalized to sum to 1)


Thus, the group-valued component becomes: V_G(a; mu_t) = sum over j of ( w_t^j * v_j^(G)(a; mu_t) )
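A sketch of these update rules in Python, taking sigma to be the logistic function (so that lambda_t stays in [0, 1]) and using assumed coefficients and inputs; r_t is the reinforcement received on each value attribute:

import math

def sigma(x):
    """Logistic squashing function, keeping lambda_t in [0, 1]."""
    return 1.0 / (1.0 + math.exp(-x))

def update_lambda(lam, exposure, reinforcement, external_ties, falsifying_info,
                  alpha=1.0, b1=0.8, b2=0.6, b3=0.7, b4=0.9):
    # lambda_{t+1} = sigma(alpha*lambda_t + beta_1*exposure + beta_2*reinforcement
    #                      - beta_3*external_ties - beta_4*falsifying_info)
    return sigma(alpha * lam + b1 * exposure + b2 * reinforcement
                 - b3 * external_ties - b4 * falsifying_info)

def update_weights(w, r, eta=0.5):
    """w_{t+1} proportional to w_t * exp(eta * r_t), renormalized to sum to 1."""
    new = [wi * math.exp(eta * ri) for wi, ri in zip(w, r)]
    total = sum(new)
    return [x / total for x in new]

# Toy trajectory: steady exposure and reinforcement, shrinking outside ties, no falsifying information.
lam = 0.1
w = [0.5, 0.3, 0.2]    # salience weights over three value attributes
for t in range(5):
    lam = update_lambda(lam, exposure=0.6, reinforcement=0.5,
                        external_ties=max(0.5 - 0.1 * t, 0.0), falsifying_info=0.0)
    w = update_weights(w, r=[1.0, 0.2, -0.5])   # the group keeps reinforcing attribute 0
    print(f"t={t}: lambda = {lam:.2f}, w = {[round(x, 2) for x in w]}")

Under these assumptions lambda_t rises steadily within a few periods, while the salience weights drift toward the attribute the group rewards.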


  4. Mini numerical example (intuitive)


Two actions: C (conform), D (deviate).

Assume: V_G(C) = 5, V_G(D) = 1; V_0(C) = 0, V_0(D) = 2; Delta_Psi = Psi(C) - Psi(D) = +1; Delta_k ~ 0


Then: U(C) - U(D) = lambda*(5 - 1) + (1 - lambda)*(0 - 2) + 1 - 0 = 6*lambda - 1


If lambda = 0.10: 6*0.10 - 1 = -0.4 => better D

If lambda = 0.30: 6*0.30 - 1 = +0.8 => better C

Threshold: lambda_star = 1/6 ~ 0.167


Above that, C is optimal and looks voluntary because your preference weights have shifted toward the group.
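The same arithmetic, checked in a few lines of Python (same numbers as above, no new assumptions):

V_G = {"C": 5.0, "D": 1.0}
V_0 = {"C": 0.0, "D": 2.0}
d_Psi, d_k = 1.0, 0.0

def diff(lam):
    """U(C) - U(D) for a given identity weight lambda."""
    return lam * (V_G["C"] - V_G["D"]) + (1 - lam) * (V_0["C"] - V_0["D"]) + d_Psi - d_k

lam_star = (d_k - d_Psi - (V_0["C"] - V_0["D"])) / ((V_G["C"] - V_G["D"]) - (V_0["C"] - V_0["D"]))

print(f"lambda = 0.10 -> U(C) - U(D) = {diff(0.10):+.1f}")   # -0.4: deviating is better
print(f"lambda = 0.30 -> U(C) - U(D) = {diff(0.30):+.1f}")   # +0.8: conforming is better
print(f"lambda* = {lam_star:.3f}")                           # 0.167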


In one line: SAPS is a dynamic that shifts the preference weights (lambda_t, w_t) until conforming maximizes U even without strong coercion; submission therefore appears voluntary.


References


Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgments. In H. Guetzkow (Ed.), Groups, Leadership and Men (pp. 177–190). Pittsburgh, PA: Carnegie Press.


Barker, E. (1984). The Making of a Moonie: Choice or Brainwashing? Oxford: Basil Blackwell.


Borgida, E., & Nisbett, R. E. (1977). The Differential Impact of Abstract vs. Concrete Information on Decisions. Journal of Applied Social Psychology, 7(3), 258–271.


Cialdini, R. B. (2009). Influence: Science and Practice (5th ed.). Boston, MA: Pearson.


Corvaglia, L. (2025). Cults and persuasion: Submission as preference shifting. International Journal of Coercion, Abuse, and Manipulation, 8, 45–67.


Foerster, M. (2015). Strategic communication under persuasion bias in social networks. Social Science Research Network.


Franke, M., & van Rooij, R. (2015). Strategies of persuasion, manipulation and propaganda: Psychological and social aspects. In Persuasion and Propaganda (pp. 143–164). Springer.


Galanter, M. (1979). Cults: Faith, Healing, and Coercion. New York, NY: Oxford University Press.


González, I., Moyano, M., Lobato, R. M., & Trujillo, H. M. (2022). Evidence of psychological manipulation in the process of violent radicalization: An investigation of the 17-A cell. Frontiers in Psychiatry, 13, 789051.


Grandi, U., Lorini, E., & Perrussel, L. (2016). Strategic disclosure of opinions on a social network. In Adaptive Agents and Multi-Agent Systems (pp. 161–176). Springer.


Hassan, S. (2015). Combating cult mind control: The #1 bestselling guide to protection, rescue, and recovery from destructive cults (Updated ed.). Freedom of Mind Press.


Kahneman, D. (2011). Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux.


Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263–291.


Laibson, D. (1997). Golden eggs and hyperbolic discounting. Quarterly Journal of Economics, 112(2), 443–478.


Lalich, J. (2004). Bounded Choice: True Believers and Charismatic Cults. Berkeley: University of California Press.


Lifton, R. J. (1961). Thought reform and the psychology of totalism: A study of “brainwashing” in China. W. W. Norton.


Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67(4), 371–378.


Nash, J. (1950). Equilibrium points in n-person games. Proceedings of the National Academy of Sciences, 36(1), 48–49.


Shaw, D. (2014). Traumatic narcissism: Relational systems of subjugation. Routledge.


Schein, E. H. (1961). Coercive persuasion: A socio-psychological analysis of the “brainwashing” of American civilian prisoners by the Chinese Communists. W. W. Norton.


Schelling, T. C. (1960). The Strategy of Conflict. Cambridge, MA: Harvard University Press.


Singer, M. T. (1995). Cults in our midst. Jossey-Bass.


Staw, B. M. (1976). Knee-deep in the big muddy: A study of escalating commitment to a chosen course of action. Organizational Behavior and Human Performance, 16(1), 27–44.


Stein, A. (2017). Terror, love and brainwashing: Attachment in cults and totalitarian systems. Routledge.


Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving Decisions about Health, Wealth, and Happiness. New Haven, CT: Yale University Press.


Turner, J. C., Hogg, M. A., Oakes, P. J., Reicher, S. D., & Wetherell, M. S. (1987). Rediscovering the social group: A self-categorization theory. Basil Blackwell.


Von Neumann, J., & Morgenstern, O. (1944). Theory of Games and Economic Behavior. Princeton, NJ: Princeton University Press.

