Believers left jobs, colleges, and spouses to join the group that would be saved. They gave up their possessions and lives to prepare for the event. December 20th approaches, and they gather. The clock strikes midnight--the appointed time for salvation. There is no alien savior. The group waits--perhaps their clocks were fast. By five in the morning, though, it is clear no one is coming for them. But wait--the leader suddenly receives a new piece of automatic writing, declaring that the flood has been called off, thanks to the light spread by the group of believers.
The individuals here gave up their entire lives on the promise that they would be rescued from a flood at midnight of December 20th, 1954. You might expect that once they received clear and irrefutable disconfirmation of that belief, they would angrily reject it and demand compensation. Instead, psychologist Leon Festinger reports (in When Prophecy Fails, his book on the event), the crowd grew more attached to the belief.
This case study grew into a robust literature in social psychology on cognitive dissonance theory. Bluntly, the theory states that if someone holds two opposing cognitions--say, a belief and an awareness of acting against it--they will experience an unpleasant psychological tension. They will then resolve that tension somehow, most likely by changing the belief to accommodate the action. The individuals in the Clarion cult had invested too much behavior in the cult. When push came to shove, there was no way for them to reconcile their actions with a belief that the UFO-flood story wasn't true--so they adjusted their beliefs in line with their actions. (Achrei ha'peulot nimshachim halevavot, "the hearts are drawn after the actions," may be true, after all.) Participants in experimental psychology labs do the same thing all the time, albeit with lower stakes. For example, participants paid only one dollar to write an essay they disagree with end up agreeing with its position more than they used to--because if they don't agree with it, their unconscious reasoning goes, why are they defending it for such a small reward? Participants who are paid twenty dollars, on the other hand, do not need to rationalize their actions to themselves, and so do not change their beliefs.
The nimshal, the lesson? If you find yourself doubting your previous beliefs, you have a few options. You can look at your actions, and everything you have devoted to orthodoxy, and conclude that you simply must believe it after all--and come up with new justifications for that belief. Or your actions can eventually change in line with your new beliefs. Or you can become orthoprax and just live with constant intellectual dissonance. If you're lucky, you'll alleviate some of the dissonance by finding a justification for the actions you keep doing, such that they seem merited on their own (for example, you like the community). Then you end up like an experimental participant offered $20 to write an essay you disagree with. You don't have to change your beliefs to match your actions, or change your actions to match your beliefs. You know exactly why you wrote the essay. It's not because you believe in it; it's because they gave you twenty bucks.
And if you're not lucky...cognitive dissonance it is.
2 comments:
Nice analysis.
Chances are that anyone who's Orthoprax has come up with justifications for their continued observance despite their lack of belief. Those who haven't stay Orthodox.
The rabbonim were right when they said shelo lishma ba lishma ("from doing it not for its own sake, one comes to do it for its own sake").
Thanks.
Yup, I suppose they would have to, more or less, but I, for one, still experience a lot of fluctuating dissonance over it--I waver on whether my justifications really justify it.