Lessons from The Hunger Games 5C – Identifying Cults: Ultimate Vision, Reductionist Language, Ideological Conformity, and Social Ostracism

5. How Do We Discern Dystopian Dynamics and Totalitarian Tactics? POST SUMMARY: This post introduces and overviews Robert Jay Lifton’s eight criteria for totalitarian thought reform (“brainwashing”) systems. It also gives some learning exercises for two groups: survivors of spiritual abuse and their personal network, and organizational designers/leaders who want to develop healthy and sustainable ministries. Note: I have split this material into three parts so readers can receive the best benefit from it.

Part 5A prepares our thinking with a review of previous points in the series for discerning an abusive/dystopian system, thoughts on totalitarian tactics from The Hunger Games trilogy, and the “before” part of the learning exercise.

Part 5B summarizes Dr. Lifton’s system for identifying “cults” and how the various elements work together. It then explores the first four of his eight criteria, dealing with: communications, motivations, absolutism, and confession.

Part 5C explores the final four of Dr. Lifton’s eight criteria: ultimate vision, language, ideological conformity, and ostracism. It also gives the “after” part of the learning exercise, and draws out three key issues for putting “brainwashing” into perspective.

Thought Control, Toxic Churches, and Lessons from The Hunger Games Trilogy

5. How Do We Discern Dystopian Dynamics and Totalitarian Tactics? ~ Part C

Introduction and the “While You Read” Exercise

Before we get into the specifics of Dr. Lifton’s criteria, I’m going to suggest that you take a moment to review what you learned during the “Before” exercise that went with Part 5A. Then read through the exercise below. (It is the same for survivors of spiritual abuse as for designers and leaders of sustainable systems.) Follow what it says in the “While You Read” section as you work your way through the overviews of the eight criteria. At the end of the post, you’ll find an “Afterwards” exercise to help tie things together.

While You Read Dr. Lifton’s Definitions and Criteria

For survivors of spiritually abusive/toxic organizations and their family, friends, and advocates; and for organizational systems designers and/or leaders:

  • How do you respond to the definitions of key terms? Do any strike you as odd, unrealistic, amazing, … ?
  • Note items that come to mind to add to your own list of coercive/abusive experiences.
  • Note items that come to mind to add to your own list of ways your current church, ministry, or organization is potentially coercive, and ways it exhibits health.
  • Assuming you have read The Hunger Games trilogy, make a list of insights and questions you have about how various of Lifton’s eight criteria apply to Panem’s Capitol, to its outlying districts, and to District 13.

Reference List of Dr. Lifton’s Eight Criteria: Ideological Totalism and Thought Reform

  1. Milieu Control – restrict what communication modes are allowed.
  2. Mystical Manipulation – appeal to some higher purpose, as set by the leader or organization.
  3. The Demand for Purity – require purity of thinking, that is, with a black-and-white mentality where all our group thinks is absolutely correct.
  4. The Cult of Confession – use a radical level of personal confession to unburden people from their crimes (real or imagined) against the organization.
  5. The “Sacred Science” – promote our moral vision as ultimate: Our way of life is the only right one.
  6. Loading the Language – create code words and insider jargon that reduces complex problems to simplistic solutions, and condenses categories into judgmental labels.
  7. Doctrine Over Person – require people to conform to our perfect system of truth so that individuality is eradicated and sublime conformity is the sacred norm.
  8. The Dispensing of Existence – exercise the “right” to decide who has the right to exist in public and who needs to be isolated or excommunicated.

5. The “Sacred Science”

Leaders and followers in a cult organization present its ideology as the ultimate moral vision – as if theirs is the only right and righteous Way of life. They act as if it has an airtight logic and unassailable methodology for achieving that vision. So, since their system of precepts and practices is absolutely perfect, that elevates it to the status of dogma – “orthodox faith.” It constitutes “sacred science.”

Because the entire system is sacred and transcends “normal” and “worldly” wisdom, it is right for guiding every aspect of life. Adherence to it is a moral responsibility. Protecting it is a moral obligation – even if it means taking actions others do not understand, or may even find “immoral.” But protecting the organization, its leaders, and its ideology may require it. No one is allowed to “mess” with the sacred texts or to oppose the highest leaders in the hierarchy, because they mediate that truth to the rest of the congregation, community, or movement. Everyone must show reverence for the truth and never, ever question it or those who dispense it.

“This sacredness is evident in the prohibition (whether or not explicit) against the questioning of basic assumptions, and in the reverence which is demanded for the originators of the Word, the present bearers of the Word, and the Word itself” (pages 427-428).

The totalist system as sacred science offers comfort and security, because it makes no distinction between logical reasoning and mystical insights. By fusing the two, it offers a starting place of common ground for various kinds of people. Problems arise, however, when people's minds, feelings, or gut instincts tell them "something is off."

“Since the distinction between the logical and the mystical is, to begin with, artificial and man-made, an opportunity for transcending it can create an extremely intense feeling of truth. But the posture of unquestioning faith – both rationally and nonrationally derived – is not easy to sustain, especially if one discovers that the world of experience is not nearly as absolute as the sacred science claims it to be” (page 428, emphasis added).

As Brian Herbert and Kevin J. Anderson state in Dune: The Butlerian Jihad, "Assumptions are a transparent grid through which we view the universe, sometimes deluding ourselves that the grid is that universe." So, perhaps the presumptive claim that the cult's "sacred science" is a complete, perfect, closed system is eventually what leads to discontent for followers. It presents itself as everything, but at some point, problems will arise because reality is bigger than the cult's box. As with the cult of confession, the more that followers attempt to adhere to the fullness of the totalistic system, the greater the possibility that they will see how it does not fit. And so, the system itself creates the probability of internal resistance, which provides the fertilizer for a growth of rebellion.

6. Loading the Language

Because my professional training is in linguistics, a lot of things occurred to me while reading the section on Loading the Language that might not occur to others. For instance, the concept of simulacra came to mind as a metaphor for the problem of language in a totalist ideology cult. A simulacrum is what you get when you pass around a copy of a copy of a photocopy, ad infinitum, to the point where no one knows where the original is (if it even exists anymore), and every generation of duplication away from that original loses clarity of content. The blurrier the latest copies become (as they inevitably will, cloned from a mega-copy instead of taken freshly from a clear original), the less meaning remains decipherable.

Cult language is like a simulacrum. The originators of the jargon may have found a certain term meaningful, because they did the work to synthesize (or create from scratch) what it meant in the first place. But eventually, all anyone else can do is repeat the term. They never experienced the underlying origins, never processed it for themselves. For them, the meaning has collapsed. It becomes mere code for some dense point of orthodox dogma. It transforms into what Dr. Lifton calls an "interpretive shortcut" so followers don't have to (nor do they get to) think for themselves. Jargon just promotes passivity and mental laziness.

As an example, think about how Communist theory talked about “the bourgeois” – an over-class that oppresses “the proletarians.” Say a Communist revolution killed off the bourgeois, and within a few generations, no one grows up with a living memory of who those bourgeois were and what they did. But the code word bourgeois still gets used repeatedly. It no longer has any historical anchoring, no context, no real content anymore. And yet, young people who are born generations after the revolution still use the term. Is it out of following a charismatic leader? Or because they don’t have vocabulary for other possibilities? Or has the limited language actually constricted their worldview? Questions like these are why the peculiar use of language is crucial to identifying organizational cults.

That’s all backstory that I’m filling in. Dr. Lifton seems to assume that his readers know enough about general issues of language and communication that he dives right in. Here is how he opens this section of the book:

“The language of the totalist environment is characterized by the thought-terminating cliché. The most far-reaching and complex of human problems are compressed into brief, highly reductive, definitive-sounding phrases, easily memorized and easily expressed. These become the start and finish of any ideological analysis” (page 429).

By using loaded language, authoritarian leaders create dependence (the leaders know what these terms mean; I don't have to) and limit thought (the truth is encoded in a limited number of terms that tell what is good-pure-right and what is evil-impure-wrong; those are the only categories that matter).

Granted, every social group uses "insider language" to some degree. It helps us identify others as "one of us" and gives a common vocabulary for conversing about issues of interest. Over time, such group language should grow and change. Dr. Lifton notes, however, that totalistic organizations take labeling to the extreme, granting their code words sacred status – and therefore treating them as perfect and unchangeable. That is part of how they manipulate language usage to serve their purposes.

“Totalist language, then, is repetitiously centered on all-encompassing jargon, prematurely abstract, highly categorical, relentlessly judging, and to anyone but its most devoted advocate, deadly dull: in Lionel Trilling’s phrase, ‘the language of nonthought’” (page 429).

And, in the extreme cases, what happens with the followers? He says that “[I]magination becomes increasingly dissociated from … actual life experiences and may even tend to atrophy from disuse” (page 430). I find this particularly interesting because, in my understanding, our imagination is closely tied with hope. Imagination helps us consider positive horizons and constructive scenarios that are outside the realm of our experience. So, to capture the imagination – and not just mental cognition and emotions – brings more of people’s personhood into alignment with the authoritarian leader.

7. Doctrine Over Person

But what happens when people don’t align with the authority of the group’s sacred word or its leaders?

The basic thought in this section is: You modify people to fit the system, not the system to fit being human. So, when it comes right down to it, people AREN'T more important than things (i.e., the ideology). There is no room for challenge, growth, or variety. The system is perfectly absolute and absolutely perfect: "We have the truth, they don't. They need to change." (Therefore, it is not a huge leap to conclude that those who continue to reject The Truth and resist thought reform do not qualify as human – which is the essence of criterion #8.)

This absolutist language is sterile. Everything allowable – thoughts, feelings, concepts – gets clinically categorized and labeled. If the ideology does not name something, it does not exist. Accordingly, any experiences that “seem” outside the bounds of the catalog of dogma must be immediately reinterpreted “correctly” to fit within it, or otherwise their existence must be denied. People must conform to it, not vice versa.

Also, the catalog of recognized realities is complete. It is frozen in time and will not be expanded – unless, of course, the leaders have some great new revelation. So, instead of any humanistic broadening of the doctrinal/philosophical model to accommodate people's experiences or cultural changes it does not yet cover, the system keeps its dogmatic limitations in place. People who are trapped in it must find a way to ignore, isolate, or otherwise excise those experiences – if they wish to survive.

Thus, the “human” drama has a very limited set of stock scripts and roles to play out. Leader/follower. Hero/villain. Compliant/convict. There is no room for creativity either inside or outside those bounds, because difference constitutes deviance. So, the totalist system is mechanistic, not humanistic. It either turns people into automatons, or treats them like non-humans. Even the sacred nature of the system requires overlooking the variety of human experiences for the sake of maintaining the official sanctioned ideology, history, and psychology:

“[P]ast historical events are retrospectively altered, wholly rewritten, or ignored, to make them consistent with the doctrinal logic. … The same doctrinal primacy prevails in the totalist approach to changing people; the demand that character and identity be reshaped, not in accordance with one’s special nature or potentialities, but rather to fit the rigid contours of the doctrinal mold. The human is thus subjugated to the ahuman” (page 431).

Dr. Lifton refers to Benjamin Schwartz's description, in his 1951 book Chinese Communism and the Rise of Mao, of a will to orthodoxy, saying it "requires an elaborate façade of new rationalizations designed to demonstrate the unerring consistency of the doctrine and the unfailing foresight which it provides" (pages 431-432). This is exactly what we witness in Orwell's novel Nineteen Eighty-Four, in the work of Winston Smith. He serves the so-called Ministry of Truth by rewriting the newspapers and books of the past to make it appear as if Big Brother and the Ingsoc Party accurately predicted the realities of the present – and then destroys the old evidence that contradicts them.

How often these days do we find that authoritarian ministers rewrite their personal past to manage their current public persona? Or edit organizational documents and drop pages from websites in an effort to do damage control? In the internet era, however, digital documentation is often retrievable, which means old facts can re-find their way back to the light of day.

In such settings, identity and equilibrium become key struggles for people with a conscience and a consciousness of how they are being dehumanized by the dominating cult of ideological totalism and the "doctrine-dominated pressure to change." Such a person is "thrust into an intense struggle with his own sense of integrity, a struggle which takes place in relation to polarized feelings of sincerity and insincerity. In a totalist environment, absolute 'sincerity' is demanded; and the major criterion for sincerity is likely to be one's degree of doctrinal compliance – both in regard to belief and to direction of personal change" (page 432).

Those who fail to comply have their beliefs and behaviors labeled as “deviant.” Always, the person, not the system, is blamed for any failure to conform to The Truth. How very much like so many authoritarian, doctrinaire churches and ministries this is. If someone falls short of the exact doctrinal representation of biblical truth, or falls short of full submission and obedience to leaders as Hebrews 13 (supposedly) demands, it’s because of their own problems, not because of flaws in the ministry’s doctrinal statement, leaders, vision, mission, etc. The purity of these things is always more important than any of the people.

8. The Dispensing of Existence

Once an authoritarian “hive” organization has set its standard of absolute truth, it isn’t that big of a leap of logic to jump from allowed/disallowed doctrines to allowed/disallowed people, solely based on who carries or doesn’t carry those doctrines. The dogma and its mediators determine who deserves to survive and thrive, and who doesn’t. That’s because, as has been noted many times, the cult must protect its purity. And that requires isolating the impurities so they don’t contaminate the majority. Identify-isolate-remove is the core process to get rid of any kind of impurity.

So, what happens to people who refuse to bend their will and yield their mind to thought reform? It makes sense that the cult will progress from identifying such rebels and denouncing them, to either detention and imprisonment or expulsion and shunning. And, if need be, there is always execution and genocide for those individuals and groups who will never be like the obedient masses. Admittedly, the exact actions of a given cult state or organization may not go as far as genocide, but the underlying thinking is all of a piece. Jesus said that anger was akin to murder and lust was akin to adultery. The inner temptation can lead to the outer action, which may turn out to be quite extreme. There is a logic involved, even if there is no longer a conscience:

“Under conditions of ideological totalism, in China and elsewhere, nonpeople have often been put to death, their executioners then becoming guilty (in Camus’ phrase) of ‘crimes of logic.’ But the thought reform process is one means by which nonpeople are permitted, through a change in attitude and personal character, to make themselves over into people. The most literal example of such dispensing of existence and nonexistence is to be found in the sentence given to certain political criminals: execution in two years’ time, unless during that two-year period they have demonstrated genuine progress in their reform” (page 433).

If someone wants to ensure survival in such an environment, sadly, they must choose to sacrifice themselves to the system. "Existence comes to depend upon creed (I believe, therefore I am), upon submission (I obey, therefore I am) and beyond these, upon a sense of total merger with the ideological movement" (pages 434-435). The totality of one's being must conform and comply with the totalism of the cult. But authoritarian leaders who make such demands of others act as if they are gods:

“Are not men presumptuous to appoint themselves the dispensers of human existence? Surely this is a flagrant expression of what the Greeks called hubris, of arrogant man making himself God. Yet one underlying assumption makes this arrogance mandatory: the conviction that there is just one path to true existence, just one valid mode of being, and that all others are perforce invalid and false. Totalists thus feel themselves compelled to destroy all possibilities of false existence as a means of furthering the great plan of true existence to which they are committed” (page 434).

How many spiritually abusive leaders in how many toxic churches and malignant ministries show this kind of absolute arrogance? They defraud God and damage His people. And though we as survivors are not to treat them as they have treated us, we know that their day of reckoning will come. They will be accountable for their hubris and the harm they cause …

Conclusion – Criteria #5 through #8

Again, I find it intriguing to see how much space Dr. Lifton dedicated to each of these final four of his eight criteria for identifying ideological "cults." The amount of space given to each is some indicator of its importance and/or complexity. And he spent the most time in this group on doctrine over person (criterion #7, 81 lines of text), the second most on the dispensing of existence (#8, 80 lines), the third most on the "sacred science" (#5, 56 lines), and the least on loading the language (#6, 51 lines). Hopefully I've reflected those priorities in the details of my attempts to explain and translate this material.

If you want to read more online directly from this chapter, here is where you’ll find an edited excerpt of Chapter 22.

After You’ve Read Dr. Lifton’s Definitions and Criteria

For those who are survivors of spiritually abusive/toxic organizations and their family, friends, and advocates:

  • What particular criteria did you most notice, and how could you tell they were stirring you?
  • What personality factors, family background, cultural background, and life experiences do you think may have contributed to your being susceptible to spiritually abusive leaders? How were you “hooked” into their personality and/or their system?
  • From these criteria, would you say that the abusive leader(s) you experienced qualified to some degree as being “authoritarian”? How many of the criteria did their communications and actions demonstrate? Which seemed to come across the strongest?
  • How did you end up leaving the situation – or, if you have not left, do you think now that you might? What is sparking these decisions?
  • What are the biggest “take-aways” in what you read about Dr. Lifton’s criteria? If you also processed this series with other people, what are the best nuggets of wisdom you got from your conversations?
  • What areas do you sense you might want to (or need to) investigate more fully in your own life to safeguard you from abusive leaders and totalistic systems in the future?
  • Are there actions you are thinking of taking, because of reading this series? What do you plan to do about them?

For organizational systems designers and/or leaders:

  • What vulnerabilities toward authoritarianism and totalism within yourself, your organization, and/or your ministry plans did you notice while reading this series?
  • Does your organization exhibit emerging or ongoing issues with communications? Expectations? Guilt/shame? Valuing or not valuing yourself, other people, other organization members or former members?
  • What particular criteria surprised you as being identified as features of an ideological cult? Which ones did you already know about?
  • What are the biggest “take-aways” in what you read about Dr. Lifton’s criteria? If you also processed this series with other people, what are the best nuggets of wisdom you got from your conversations?
  • What areas do you sense you need to investigate more fully in your own life or organization, as a result of studying these eight criteria?
  • Are there specific strategies, infrastructures, processes, and/or procedures you sense need to be changed? How will you go about researching what to do and then changing them?
  • How do you plan on sharing what you’ve learned with those you serve with?

Final Thoughts: “Brainwashing” and the Big Picture

As I absorbed what I could of Dr. Lifton's approach and mulled it over in light of other academic works on thought reform and "cult studies," it gradually became apparent that there are at least three crucial big-picture issues we need to understand. Otherwise, we could miss the point and misuse his criteria.

(Sidenote on learning styles: I test out as “Very Analytic” in my “cognitive field orientation,” so I work best by building from details up to the themes. These three points emerged toward the end of my studies on Dr. Lifton’s system for identifying cults. So, in this case, I decided to put the points here instead of at the beginning, where they might have proven more helpful to those who are “Very Global” in their field orientation, as they work from themes to the details. However, remember that for the sake of the global-oriented student, I did put at the very beginning of the eight criteria that one long paragraph to overview themes interwoven from all eight parts. That synthesis emerged even later, after I had developed these three points and spent a significant amount of time analyzing the criteria already.)

First, “ideological totalism” and “thought reform” (more commonly called brainwashing) are not the same thing. Totalism refers to a complete paradigm, which is the content and goal of re-education. It includes core assumptions for processing information, as well as values, beliefs, behaviors, social organization strategies and infrastructures, cultures, collaborations, and lifestyles. Meanwhile, thought reform is about the means and methods used to achieve the goal of integrating the content of a specific paradigm into people. The point: People often get all caught up in the tactics of brainwashing, and perhaps even equate brainwashing to torture. But Lifton’s criteria cover the what, why, and how of both totalism as a system AND the tactics used to get people to conform to it. And most of those means for thought reform involve intense social and psychological pressure.

“The more clearly an environment expresses these eight psychological themes, the greater its resemblance to ideological totalism; and the more it utilizes such totalist devices to change people, the greater its resemblance to thought reform (or ‘brainwashing’)” (page 435).

Lifton goes on to note that totalism is perhaps most noticeable when thought reform is being carried out more openly and vigorously. Also, totalism tends to be more prominent at the beginnings of a mass movement, when enthusiasm, zeal, and hope are high. (Could this have any correlation with why church planting and multi-campus development and mergers have such prominence these days among ministry movements that are being identified as authoritarian?)

Second, the type/flavor of the ideological system at issue can differ widely. And thus, cult – a sociological term used of any kind of totalistic organization or totalitarian system – can be applied to totalistic systems that are political, philosophical, or religious. However, Christians typically think of a cult as one that holds to false doctrine.

So, we Christians need to be careful and intentional whenever we use the term cult, and specify whether we're using it in the sociological sense to refer to a totalistic organizational system, in the theological sense to refer to an organization with a doctrinal heresy, or a combination of both. (And to this, I would add a caution on the use of the term toxic. In church realms, it is generally being used these days to describe a ministry system that is definitely destructive but not necessarily an ideologically totalistic cult or doctrinally heretical cult.)

Third, not all forms of personal change lead to ideological totalism. Within the broader issue of thought reform, holistic personal "conversion" through an individual transformation process is not the same as totalistic paradigm "re-education" through a social conformity process. For instance, some people will make blanket comments like, "All religion is brainwashing. All churches are cults." But then, why isn't counseling likewise brainwashing? Or why doesn't atheism automatically equate to brainwashing? What are the distinct differences between personal growth (even if/when it involves "conversion" to another faith or philosophy) and complete thought reform?

This issue of conversion versus totalistic compliance is important in academic-level cult studies. Researchers seem to be far more careful than we Christians are when examining new religious sects and movements. These do not automatically qualify as sociological cults simply because they are new or because people convert to them – even if they do qualify as theological cults. Likewise, researchers tend to be more careful in distinguishing slow conversion/growth processes that allow for individual pace, identity, and diversity from those that conform people to a required social form at a set pace, with no room for individuality or creativity.

Such emerging studies that distinguish personal transformation from social conformation may prove very important. That is because totalistic cults using Christian language and structures often hide behind practices that they say are for "personal growth" or that "bring transformation" – but in actuality are conditioning people for conformity. Real growth lets God's unique design for each individual flourish, and genuine transformation promotes diversity, not uniformity. The counterfeit version squashes both identity and individuality.

Bonus Content: Dr. Lifton’s Psychology of Trauma Studies

Through the years, psychiatrist Robert Jay Lifton – now in his mid-80s – has conducted studies into some of the darkest sides of human nature and the most difficult of historical events. Through these, he has become a leader in "traumatic stress" research. He has done interviews, research, analysis, and publications on such intense topics as:

  • Mass killings – the Holocaust, Hiroshima.
  • Global issues and ethics – eugenics and medical experiments of the Nazi doctors, nuclear war or disarmament, capital punishment.
  • Totalitarian countries and doomsday cults – brainwashing in Communist China, Aum Shinrikyo (the Japanese apocalyptic cult that released poisonous gas).

So, when Dr. Lifton writes about psychological and social characteristics of cult organizations, he knows whereof he speaks. In fact, I will be referring later in this series to a book he co-edited with Dr. Jacob Lindy, Beyond Invisible Walls: The Psychological Legacy of Soviet Trauma. Published a decade into the post-Soviet era, it focuses on counseling for trauma and loss issues that were common among children and adults of the former Soviet Union and Eastern European Bloc.

But I do have to wonder what gives Dr. Lifton the fortitude to do this kind of deep research into such difficult topics. He has studied real-world events that are as horrific as the fictional world of Panem and the Hunger Games. Where does he find the strength to carry on? I don’t know what Dr. Lifton’s spiritual views are, but I have often pondered this quote from the foreword in his 1986 book, The Nazi Doctors: Medical Killing and the Psychology of Genocide.

“One cannot expect to emerge from a study of this kind spiritually unscathed, all the more so when one’s own self is the instrument for taking in forms of experience one would have preferred not to have known about. But the other side of the enterprise for me has been the nourishing human network, extending throughout much of the world, within which I worked. Survivors were at the heart of it, and they provided a kind of anchoring … [We are] capable of learning from carefully examined past evil. I undertook this study, and now offer it, in that spirit of hope” (page xiii).

Whatever his philosophical or religious beliefs, clearly he embodies the concept of “redemptive investment.” Somewhere, sometime, it really costs someone in order to provide a meaningful blessing to others – even if those so blessed remain completely unaware of those who served to their benefit. Dr. Lifton’s research work cost him personally and spiritually, calling forth sacrifice and transformation through engagement with suffering. Can we expect our efforts at discernment in the realm of toxic churches and malignant ministers to involve it any less?