Remember that party from college? That crazy time when we went to pick up the security guard from our apartment complex on the Strip at 2 am? And instead of picking him up, we ended up going out with him and walking down the Strip barefoot. Then the sorority! And what about when the police pulled us over later?* No? Well, no worries whether it’s true or not, as long as the story is good. Wired Science points out a study showing that people don’t let the truth get in the way of a good story:
The neuroscientists were interested in how the opinion of other people can alter our personal memories, even over a relatively short period of time. The experiment itself was straightforward. A few dozen people watched an eyewitness style documentary about a police arrest in groups of five. Three days later, the subjects returned to the lab and completed a memory test about the documentary. Four days after that, they were brought back once again and asked a variety of questions about the short movie while inside a brain scanner.
This time, though, the subjects were given a “lifeline”: they were shown the answers given by other people in their film-viewing group. Unbeknownst to the subjects, the lifeline was actually composed of false answers to the very questions that the subjects had previously answered correctly and confidently. Remarkably, this false feedback altered the responses of the participants, leading nearly 70 percent to conform to the group and give an incorrect answer. They had revised their stories in light of the social pressure.
The question, of course, is whether their memory of the film had actually undergone a change. (Previous studies have demonstrated that people will knowingly give a false answer just to conform to the group. We’re such wimps.) To find out, the researchers invited the subjects back to the lab one last time to take the memory test, telling them that the answers they had previously been given were not those of their fellow film watchers, but randomly generated by a computer. Some of the responses reverted back to the original, but more than 40 percent remained erroneous, implying that the subjects were relying on false memories implanted by the earlier session. They had come to believe their own bullshit.
We all tell ourselves, and each other, stories. It’s how we interpret the world. It’s all a giant narrative written by its participants. “All the world’s a stage,” as the Bard would say. But we don’t tell true stories. It’s impossible, really. We tell each other versions of a story from a certain perspective. And when we get together to share stories, well, we don’t let our own perspective interfere with the group’s general perspective if it makes for a better story.
The scientists speculate:
Altering memory in response to group influence may produce untoward effects. For example, social influence such as false propaganda can deleteriously affect individuals’ memory in political campaigns and commercial advertising and impede justice by influencing eyewitness testimony. However, memory conformity may also serve an adaptive purpose, because social learning is often more efficient and accurate than individual learning. For this reason, humans may be predisposed to trust the judgment of the group, even when it stands in opposition to their own original beliefs.
Human memory is unreliable at best, skewed toward some perceived version of events. Relying on outside social influence may be useful when we’re working together toward a common goal, but not so much when a witness’s testimony is on the line.
*Note: The above may or may not actually be a true story.