
Planting Misinformation in the Human Mind: A 30-Year Investigation of the Malleability of Memory

Postby admin » Thu Jun 12, 2014 8:11 am

PLANTING MISINFORMATION IN THE HUMAN MIND: A 30-YEAR INVESTIGATION OF THE MALLEABILITY OF MEMORY
by Elizabeth F. Loftus



Department of Psychology and Social Behavior, University of California–Irvine, Irvine, California 92697-7085, USA

Abstract

The misinformation effect refers to the impairment in memory for the past that arises after exposure to misleading information. The phenomenon has been investigated for at least 30 years, as investigators have addressed a number of issues. These include the conditions under which people are especially susceptible to the negative impact of misinformation and, conversely, the conditions under which they are resistant. Warnings about the potential for misinformation sometimes work to inhibit its damaging effects, but only under limited circumstances. The misinformation effect has been observed in humans and in a variety of nonhuman species, and some groups of individuals are more susceptible than others. At a more theoretical level, investigators have explored the fate of the original memory traces after exposure to misinformation appears to have made them inaccessible. This review of the field ends with a brief discussion of the newer work involving misinformation that has explored the processes by which people come to believe falsely that they experienced rich, complex events that never, in fact, occurred.

In 2005 the journal Learning & Memory published the first experimental work using neuroimaging to reveal the underlying mechanisms of the “misinformation effect,” a phenomenon that had captured the interest of memory researchers for over a quarter century (Okado and Stark 2005). These investigators used a variation of the standard three-stage procedure typical of studies of misinformation. Their subjects first saw several complex events, for example one involving a man stealing a girl's wallet. Next, some of the subjects received misinformation about the event, such as the suggestion that the girl's arm was hurt in the process (rather than her neck). Finally, the subjects were asked to remember what they saw in the original event. Many claimed that they saw the misinformation details in the original event; for example, they remembered seeing the arm being hurt, not the neck. Overall, the misinformation was remembered as being part of the original event about 47% of the time. So, as expected, exposure to misinformation produced a robust impairment of memory: the misinformation effect. But the new work had a twist: The researchers went on to show that the neural activity that occurred while the subjects processed the events, and later the misinformation, predicted whether a misinformation effect would occur.
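To make the three-stage logic concrete, here is a minimal sketch, in Python, of how such a paradigm can be scored. The event details, the 47%-style endorsement probability, and every other number are illustrative placeholders of my own, not Okado and Stark's materials or data; only the structure (original event, misleading detail, memory test) follows the description above.

    import random

    random.seed(1)

    def run_subject(misled):
        # Return the detail a simulated subject reports at the final memory test.
        # 0.47 is an illustrative endorsement probability, not an estimate taken
        # from Okado and Stark (2005).
        if misled and random.random() < 0.47:
            return "arm"    # the suggested (false) detail
        return "neck"       # the detail actually shown in the event

    misled_reports = [run_subject(True) for _ in range(1000)]
    control_reports = [run_subject(False) for _ in range(1000)]

    print("misled subjects reporting the suggested detail:",
          misled_reports.count("arm") / len(misled_reports))
    print("control subjects reporting the suggested detail:",
          control_reports.count("arm") / len(control_reports))

Comparing the misled and control rates is the basic way a misinformation effect is demonstrated in this kind of design.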

In an essay that accompanied the Okado and Stark findings, I placed their results within the context of 30 years of research on behavioral aspects of the misinformation effect (Loftus 2005). Their work received much publicity, and boosted public interest in the misinformation effect as a scientific phenomenon. For example, WebMD (Hitti 2005) touted the new findings showing that brain scans can predict whether the memories would be accurate or would be infected with misinformation. And the Canadian press applauded the study as being the first to investigate how the brain encodes misinformation (Toronto Star 2005).

So what do we know about the misinformation effect after 30 years? The degree of distortion in memory observed in the Okado and Stark neuroimaging study has been found in hundreds of studies involving a wide variety of materials. People have recalled nonexistent objects such as broken glass. They have been misled into remembering a yield sign as a stop sign, hammers as screwdrivers, and even something large, like a barn, that was not part of the bucolic landscape past which an automobile happened to be driving. Details have been planted into memory for simulated events that were witnessed (e.g., a filmed accident), but also into memory for real-world events, such as wounded animals (that were never actually seen) planted into memory for the scene of a terrorist bombing that had occurred in Russia a few years earlier (Nourkova et al. 2004). The misinformation effect is the name given to the change (usually for the worse) in reporting that arises after receipt of misleading information. Over its now substantial history, many questions about the misinformation effect have been addressed, and findings bearing on a few key ones are summarized here.

Under what conditions are people particularly susceptible to the negative impact of misinformation? (The When Question)

Can people be warned about misinformation, and successfully resist its damaging influence?

Are some types of people particularly susceptible? (The Who Question)

When misinformation has been embraced by individuals, what happens to their original memory?

What is the nature of misinformation memories?

How far can you go with people in terms of the misinformation you can plant in memory?

The When Question

Long ago, researchers showed that certain experimental conditions are associated with greater susceptibility to misinformation. So, for example, people are particularly prone to having their memories affected by misinformation when it is introduced after the passage of time has allowed the original event memory to fade (Loftus et al. 1978). One reason this may be true is that with the passage of time the event memory is weakened, and thus there is less likelihood that a discrepancy is noticed while the misinformation is being processed. In the extreme, with very long intervals between an event and subsequent misinformation, the event memory might be so weak that it is as if it had not been presented at all. No discrepancy between the misinformation and the original memory would be detected, and the subject might readily embrace the misinformation. These ideas led to the proposal of a fundamental principle for determining when changes in recollection after misinformation would occur: the Discrepancy Detection principle (Tousignant et al. 1986). It essentially states that recollections are more likely to change if a person does not immediately detect discrepancies between misinformation and memory for the original event. Of course, it should be kept in mind that false memories can still occur even if a discrepancy is noticed. The rememberer sometimes thinks, “Gee, I thought I saw a stop sign, but the new information mentions a yield sign; I guess I must be wrong and it was a yield sign” (Loftus and Hoffman 1989).
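The Discrepancy Detection reasoning can be written as a toy model: the longer the delay, the weaker the event memory, the lower the chance a discrepancy is noticed, and the higher the chance the misinformation is accepted. The sketch below is my own illustration of that logic, not a model proposed by Tousignant et al. (1986); every number in it is an assumption.

    def p_accept_misinformation(delay_days):
        # Illustrative linear decay of event-memory strength over about two weeks.
        memory_strength = max(0.0, 1.0 - 0.1 * delay_days)
        p_detect = memory_strength            # detection assumed to track memory strength
        p_accept_if_undetected = 0.6          # invented constant
        p_accept_if_detected = 0.1            # detection mostly (not always) protects the memory
        return (1 - p_detect) * p_accept_if_undetected + p_detect * p_accept_if_detected

    for d in (0, 3, 7, 14):
        print(f"delay {d:>2} days -> P(accept misinformation) ~ {p_accept_misinformation(d):.2f}")

Under these invented parameters, acceptance of the misinformation rises steadily with the delay, which is the qualitative pattern the principle predicts.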

The other important time interval is the period between the misinformation and the test. One study asked subjects to say whether a key item was part of the event only, part of the misinformation only, in both parts, or in neither. Misinformation effects occur when subjects say that the misled item was part of the event only, or that it was in both parts. Overall, subjects were slightly more likely to say “both” (22%) than “event only” (17%). But the timing of the test affected these ratios: With a short interval between the misinformation and the test, subjects were less likely to claim that the misinformation item was in the event only (Higham 1998). This makes sense. If subjects have recently read the misinformation, they may well remember doing so when tested, while at the same time incorrectly believing that they also saw the misinformation detail in the original event.
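A small sketch of how such a source test might be tallied follows. The “event only” (17%) and “both” (22%) figures come from the text above; the remaining counts are invented for illustration, and the scoring rule simply follows the statement that misinformation effects correspond to “event only” and “both” responses for the misled item.

    from collections import Counter

    # Illustrative tallies (per 100 responses) for the misled item; only the
    # 17 ("event only") and 22 ("both") values come from the text.
    responses = Counter({
        "event only": 17,
        "misinformation only": 35,
        "both": 22,
        "neither": 26,
    })

    total = sum(responses.values())
    misinfo_effect = responses["event only"] + responses["both"]
    print(f"misinformation effect: {misinfo_effect / total:.0%} of responses")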

Temporarily changing someone's state can increase misinformation effects. So for example, if people are led to believe that they have drunk alcohol, they are more susceptible (Assefi and Garry 2002), and when people are hypnotized, they are more susceptible (Scoboria et al. 2002). These temporary states may have the effect of disrupting the ability of subjects to detect discrepancies between the misinformation and what remains of their original memory.

Warnings

Long ago, researchers showed that warning people that they might in the future be exposed to misinformation sometimes helps them resist it. However, a warning given after the misinformation had been processed did not improve the ability to resist its damaging effects (Greene et al. 1982). The lack of effectiveness of post-misinformation warnings presumably occurred because the misinformation had already been incorporated into memory, and an altered memory now existed in the mind of the individual. The research on warnings fits well with the Discrepancy Detection principle: If people are warned, before reading post-event information, that the information might be misleading, they can better resist its influence, perhaps because the warning increases the likelihood that they scrutinize the post-event information for discrepancies.

More recent work suggests that warning people that they may have in the past been exposed to misinformation (post-misinformation warnings) may have some success, but only in limited circumstances. In one study, an immediate post-misinformation warning helped subjects resist the misinformation, but only when the misinformation was in a relatively low state of accessibility. With highly accessible misinformation, the immediate post-misinformation warnings didn't work at all. (The accessibility of misinformation can be enhanced by presenting it multiple times versus a single time). Moreover, it didn't seem to matter whether the warning was quite general or item-specific (Eakin et al. 2003). The general warning informed subjects that the narrative they had read referred to some objects and events from the slides in an inaccurate way. The specific warning explicitly mentioned the misleading details (e.g., they would be told the misinformation was about the tool). Eakin et al. explained these results with several hypotheses. They favored a suppression hypothesis, which states that when people get a warning, they suppress the misinformation and it has less ability to interfere with answering on the final test. Moreover they suggested that the entire context of the misinformation might be suppressed by the warning. Suppression might have more trouble working when misinformation is too accessible. Also, highly accessible misinformation might distract the subject from thinking to scrutinize the misinformation for discrepancies from some presumably overwhelmed original event memory.

The Who Question

Misinformation affects some people more than others. For one thing, age matters. In general young children are more susceptible to misinformation than are older children and adults (see Ceci and Bruck 1993). Moreover, the elderly are more susceptible than are younger adults (Karpel et al. 2001; Davis and Loftus 2005). These age effects may be telling us something about the role of cognitive resources, since we also know that misinformation effects are stronger when attentional resources are limited. In thinking about these age effects, it should probably be emphasized that suggestion-induced distortion in memory is a phenomenon that occurs with people of all ages, even if it is more pronounced with certain age groups.

In terms of personality variables, several have been shown to be associated with greater susceptibility to misinformation, such as empathy, absorption, and self-monitoring. In addition, the more one reports lapses in memory and attention, the more susceptible one is to misinformation effects. So, for example, Wright and Livingston-Raper (2002) showed that about 10% of the variance in susceptibility to misinformation is accounted for by dissociation scores, which measure the frequency of experiences such as not remembering whether one actually did something or only thought about doing it (see Davis and Loftus 2005 for a review of these personality variables).
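As a quick worked example of what “10% of the variance” means: if dissociation scores explain R^2 = 0.10 of the variability in susceptibility, the implied simple correlation is r = sqrt(0.10), roughly 0.32, assuming a single-predictor linear relationship (that assumption is mine, made only to unpack the figure).

    import math

    r_squared = 0.10          # ~10% of variance, per Wright and Livingston-Raper (2002)
    r = math.sqrt(r_squared)  # implied correlation under a single-predictor linear model
    print(f"R^2 = {r_squared:.2f} -> r ~ {r:.2f}")   # prints r ~ 0.32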

Interestingly, misinformation effects have also been obtained with some unusual subject samples, including three-month-old infants (Rovee-Collier et al. 1993), gorillas (Schwartz et al. 2004), and even pigeons and rats (Harper and Garry 2000; M. Garry and D.N. Harper, in prep.). One challenging aspect of these studies is finding ways to determine that misinformation has taken hold in species that are unable to say so explicitly. Take pigeons, for example. They have an amazing ability to remember pictures that they were shown as long as two years earlier (Vaughan and Greene 1983, 1984). But their otherwise good memory can be disrupted by misinformation. In two different studies, Harper and Garry examined misinformation effects in pigeons by using an entirely visual paradigm (see also M. Garry and D.N. Harper, in prep.). First, the pigeons saw a light (say, a red light). They had been trained over many trials to peck the light to show that they had paid attention to it. After they pecked the light, it turned off. After a delay, the pigeons were exposed to post-event information: they saw either the same colored light or a different colored light, and had to peck this light, too. Then came the test: The pigeons saw the original light and a novel colored light. If they pecked the originally correct color, they got food; if they pecked the novel color, they got no food. The pigeons were more accurate when the post-event experience did not mislead them. Moreover, like humans, pigeons are more susceptible to the misinformation if it occurs late in the interval between the original event and the final test than if it occurs early in that interval. M. Garry and D.N. Harper (in prep.) make the point that knowing that pigeons and humans respond the same way to misleading information provides more evidence that the misinformation effect is not just a simple matter of retrograde interference. Retrograde interference is a mere disruption in performance, not a biasing effect; it typically makes memory worse but does not pull for any particular wrong answer. Because pigeons, like humans, use the misinformation differentially depending on when they are exposed to it, the misinformation appears to have a specific biasing effect as well. The observation of a misinformation effect in nonverbal creatures also suggests that misinformation effects are not a product of mere demand characteristics. That is, they are not produced by “people” who give a response just to please the experimenter, even though it is not the response they actually believe.
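The trial structure of the pigeon paradigm can be summarized in a short simulation. The sketch below follows only the sequence described above (sample light, post-event light, two-choice test rewarded for the original color); the accuracy values are invented for illustration and are not Harper and Garry's results.

    import random

    random.seed(2)

    def trial(post_event_misleading):
        # True if the simulated pigeon pecks the original sample color at test.
        # The two accuracy levels are invented; only the structure follows the text.
        p_correct = 0.60 if post_event_misleading else 0.85
        return random.random() < p_correct

    consistent = sum(trial(False) for _ in range(500)) / 500
    misled = sum(trial(True) for _ in range(500)) / 500
    print(f"accuracy with consistent post-event cue: {consistent:.2f}")
    print(f"accuracy with misleading post-event cue: {misled:.2f}")

The misinformation effect in this design is simply the drop in accuracy when the post-event light conflicts with the sample.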

The fate of the original memory?

One of the most fundamental questions one can ask about memory concerns the permanence of our long-term memories. If information makes its way into our long-term memories, does it stay there permanently even when we can't retrieve it on demand? Or do memory traces, once stored, become susceptible to decay, damage, or alteration? In this context, we can pose the more specific question: When misinformation is accepted and incorporated into a person's recollection, what happens to the original memory? Does the misinformation impair the original memory, perhaps by altering the once-formed traces? Or does the misinformation cause retrieval impairment, possibly by making the original memory less accessible?

A lively debate developed in the 1980s when several investigators rejected the notion that misinformation causes any type of impairment of memory (McCloskey and Zaragoza 1985). Instead, they explicitly and boldly pressed the idea that misinformation had no effect on the original event memory. Misinformation, according to this view, merely influences the reports of subjects who never encoded (or for other reasons can't recall) the original event. Instead of guessing at what they saw, these subjects would be lured into producing the misinformation response. Alternatively, the investigators argued that misinformation effects could be arising because subjects remember both sources of information but select the misleading information because, after deliberation, they conclude it must be correct.

To support their position, McCloskey and Zaragoza (1985) devised a new type of test. Suppose the subjects saw a burglar pick up a hammer and received the misinformation that it was a screwdriver. The standard test would allow subjects to select between a hammer and a screwdriver. On the standard test, control subjects who had not received the misinformation would tend to select the hammer. Many subjects exposed to misinformation (called misled subjects) would, of course, select the screwdriver, producing the usual misinformation effect. In the new test, called the “Modified Test,” the misinformation option is excluded as a response alternative. That is, the subjects have to choose between a hammer and a totally novel item, a wrench. With the modified test, subjects were very good at selecting the original event item (the hammer, in this example), leading McCloskey and Zaragoza to argue that it was not necessary to assume any memory impairment at all: neither impairment of traces nor impairment of access to traces. Yet later analyses of a collection of studies using the modified test showed that small misinformation effects were obtained even when these unusual types of tests were employed (Ayers and Reder 1998), and even when nonverbal species were the subjects of the experiments.
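The logic of the Modified Test is easy to see in a toy simulation. The sketch below is my own illustration (not McCloskey and Zaragoza's analysis) of a “no impairment” model in which misinformation never touches the original trace; under that assumption, misled and control subjects should differ on the standard test but not on the modified test, which is the pattern the authors predicted. All probabilities are invented.

    import random

    random.seed(3)
    P_REMEMBER = 0.6   # illustrative probability that the original detail was encoded

    def answer(misled, test):
        # True means the subject picks the original item ("hammer").
        if random.random() < P_REMEMBER:
            return True                       # remembers the hammer
        if test == "standard" and misled:
            return False                      # lured to the suggested "screwdriver"
        return random.random() < 0.5          # otherwise guesses between the two options

    def accuracy(misled, test, n=2000):
        return sum(answer(misled, test) for _ in range(n)) / n

    for test in ("standard", "modified"):
        print(test,
              "control:", round(accuracy(False, test), 2),
              "misled:", round(accuracy(True, test), 2))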

While space is too limited to present the myriad paradigms that were devised by investigators wishing to explore the fate of the original memory (e.g., Wagenaar and Boer 1987; Belli 1989; Tversky and Tuchin 1989), suffice it to say that the entire debate heightened appreciation for the different ways by which people come to report a misinformation item as their memory. Sometimes this occurs because they have no original memory (it was never stored or it has faded). Sometimes this occurs because of deliberation. And sometimes it appears as if the original event memories have been impaired in the process of contemplating misinformation. Moreover, the idea that you can plant an item into someone's memory (apart from whether you have impaired any previous traces) was downright interesting in its own right.

The nature of misinformation memories

Subjectively, what are misinformation memories like? One attempt to explore this issue compared the memories of subjects who had actually seen a yield sign in a simulated traffic accident with the memories of subjects who had not seen the sign but had had it suggested to them (Schooler et al. 1986). The verbal descriptions of the “unreal” memories were longer, contained more verbal hedges (“I think I saw...”), more references to cognitive operations (“After seeing the sign the answer I gave was more of an immediate impression...”), and fewer sensory details. Thus, statistically, a group of real memories might be distinguished from a group of unreal ones. Of course, many of the unreal memory descriptions contained verbal hedges and sensory detail, making it extremely difficult to take a single memory report and reliably classify it as real or unreal. (Much later, neurophysiological work would attempt to distinguish real from unreal memories, a point we return to below.)

A different approach to the nature of misinformation memories came from the work of Zaragoza and Lane (1994), who asked this question: Do people confuse the misleading suggestions with their “real memories” of the witnessed event? They asked this question because of the real possibility that subjects could be reporting misinformation because they believed it was true, even if they had no specific memory of seeing it. After numerous experiments in which subjects were asked very specific questions about their memory for the source of the suggested items they were embracing, the investigators concluded that misled subjects definitely do sometimes come to remember seeing things that were merely suggested to them. They referred to the phenomenon as the “source misattribution effect.” But they also noted that the size of the effect can vary, and emphasized that source misattributions are not inevitable after exposure to suggestive misinformation.

How much misinformation can you plant in one mind?: Rich false memories

It is one thing to change a stop sign into a yield sign, to make a person believe that a crime victim was hurt in the arm instead of the neck, or to add a detail to an otherwise intact memory. But it is quite another thing to plant an entire memory for an event that never happened. Researchers in the mid-1990s devised a number of techniques for planting whole events, or what have been called “rich false memories.” One study used scenarios made up by relatives of subjects, and planted false memories of being lost for an extended time in a shopping mall at age 6 and rescued by an elderly person (Loftus 1993; Loftus and Pickrell 1995). Other studies used similar methods to plant a false memory that as a child the subject had had an accident at a family wedding (Hyman Jr. et al. 1995), had been a victim of a vicious animal attack (Porter et al. 1999), or that he or she had nearly drowned and had to be rescued by a lifeguard (Heaps and Nash 2001).

Sometimes subjects will start with very little memory, but after several suggestive interviews filled with misinformation they will recall the false events in quite a bit of detail. In one study, a subject received the suggestion that he or she went to the hospital at age 4 and was diagnosed as having low blood sugar (Ost et al. 2005). At first the subject remembered very little: “... I can't remember anything about the hospital or the place. It was the X general hospital where my mum used to work? She used to work in the baby ward there... but I can't... no. I know if I was put under hypnosis or something I'd be able to remember it better, but I honestly can't remember.” Yet in the final interview in week 3, the subject developed a more detailed memory and even incorporated thoughts at the time into the recollection: “... I don't remember much about the hospital except I know it was a massive, huge place. I was 5 years old at the time and I was like `oh my God I don't really want to go into this place, you know it's awful'... but I had no choice. They did a blood test on me and found out that I had a low blood sugar...”

Taken together these studies show the power of this strong form of suggestion. It has led many subjects to believe or even remember in detail events that did not happen, that were completely manufactured with the help of family members, and that would have been traumatic had they actually happened.

Some investigators have called this strong form of suggestion the “familial informant false narrative procedure” (Lindsay et al. 2004); others find the term awfully cumbersome and prefer to call the procedure simply the “lost-in-the-mall” technique, after the first study that used it. Across the many studies that have now used the “lost-in-the-mall” procedure, an average of ~30% of subjects have gone on to produce either partial or complete false memories (Lindsay et al. 2004). Other techniques, such as those involving guided imagination (see Libby 2003 for an example), suggestive dream interpretation, or exposure to doctored photographs, have also led subjects to believe falsely that they experienced events in their distant and even in their recent past (for review, see Loftus 2003).
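For readers who want to see how an “average of ~30%” across studies might be computed, here is a minimal sketch of pooling false-memory rates weighted by sample size. The study labels, sample sizes, and rates are hypothetical placeholders, not the actual studies summarized by Lindsay et al. (2004).

    # (label, number of subjects, proportion with a partial or complete false memory)
    studies = [
        ("study_A", 24, 0.25),
        ("study_B", 51, 0.33),
        ("study_C", 20, 0.30),
    ]

    pooled = sum(n * p for _, n, p in studies) / sum(n for _, n, _ in studies)
    print(f"pooled false-memory rate: {pooled:.0%}")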


Figure 1. Fake advertisements showing Bugs Bunny at a Disney resort, used to plant false beliefs in Braun et al. (2002) and Braun-LaTour et al. (2004).

A concern about the recent work showing the creation of very rich false beliefs and memories is that these might reflect true experiences that have been resurrected from memory by the suggestive misinformation. To counter that concern, some investigators have tried to plant implausible or impossible false memories. In several studies, subjects were led to believe that they had met Bugs Bunny at a Disney resort after exposure to fake ads for Disney that featured Bugs Bunny. An example of an ad containing the false Bugs Bunny information is shown in Figure 1; subjects simply evaluated the ad on a variety of characteristics. In one study, a single fake ad led 16% of subjects to later claim that they had met him (Braun et al. 2002), which could not have occurred because Bugs Bunny is a Warner Brothers character and would not be seen at a Disney resort. Later studies showed even higher rates of false belief, and that ads containing a picture of Bugs produced more false memories than ads containing only a verbal mention (Braun-LaTour et al. 2004). While obviously less complex, these studies dovetail nicely with real-world examples in which individuals have come to develop false beliefs or memories for experiences that are implausible or impossible (e.g., alien abduction memories, as studied by McNally et al. 2004).
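One way such differences in false-belief rates (e.g., picture ads versus verbal-only ads) could be compared is with a simple two-proportion z test. The sketch below uses the standard formula; the counts are hypothetical, with only the 16% single-ad figure taken from the text.

    from math import sqrt

    def two_proportion_z(x1, n1, x2, n2):
        # Standard two-proportion z statistic with a pooled standard error.
        p1, p2 = x1 / n1, x2 / n2
        p = (x1 + x2) / (n1 + n2)
        se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    # Hypothetical counts: 30/100 claim the meeting after a picture ad,
    # 16/100 after a verbal-only ad (the 16% figure is from Braun et al. 2002).
    print(f"z = {two_proportion_z(30, 100, 16, 100):.2f}")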

Concluding remarks

Misinformation can cause people to falsely believe that they saw details that were only suggested to them. Misinformation can even lead people to have very rich false memories. Once embraced, people can express these false memories with confidence and detail. There is a growing body of work using neuroimaging techniques to assist in locating parts of the brain that might be associated with true and false memories, and these reveal the similarities and differences in the neural signatures (e.g., Curran et al. 2001; Fabiani et al. 2000). Those with strong interests in neuroscience will find interesting the recent neuroimaging and electrophysiological studies suggesting that sensory activity is greater for true recognition than false recognition (Schacter and Slotnick 2004). These studies suggest, more explicitly, that the hippocampus and a few other cortical regions come into play when people claim to have seen things that they didn't see. But, keep in mind that for the most part these studies are done with relatively pallid sorts of true and false memories (e.g., large collections of words or simple pictures). With the Okado and Stark (2005) neuroimaging investigation of misinformation we are one step closer to developing some techniques that might enable us to use neural activity to tell whether a report about a complex event is probably based on a true experience or whether it is based on misinformation. We are still, however, a long way from a reliable assessment when all we have is a single memory report to judge.

In the real world, misinformation comes in many forms. When witnesses to an event talk with one another, when they are interrogated with leading questions or suggestive techniques, or when they see media coverage of an event, misinformation can enter consciousness and contaminate memory. These are not, of course, the only sources of distortion in memory. As we retrieve and reconstruct memories, distortions can creep in without explicit external influence, and these can become pieces of misinformation. This might be the result of inference-based processes or of some automatic process, and can perhaps help us understand the distortions we see in the absence of explicit misinformation (e.g., Schmolck et al.'s [2000] distortions in recollections of the O.J. Simpson trial verdict).

An obvious question arises as to why we would have evolved to have a memory system that is so malleable in its absorption of misinformation. One observation is that the “updating” seen in the misinformation studies is the same kind of “updating” that allows for correction of incorrect memories. Correct information can supplement or displace previously stored errors, and this, of course, is a good thing. Whatever the misinformation effect reveals about normal memory processes, one thing is clear: the practical implications are significant. The obvious relevance to legal disputes and other real-world activities makes it understandable why the public would want to understand more about the misinformation effect and what it tells us about our malleable memories.

________________

Notes:

Article published online ahead of print. Article and publication date are at http://www.learnmem.org/cgi/doi/10.1101/lm.94705.
Cold Spring Harbor Laboratory Press

References

Assefi, S.L. and Garry, M. 2002. Absolute memory distortions: Alcohol placebos influence the misinformation effect. Psychol. Sci. 14: 77-80.

Ayers, M.S. and Reder, L.M. 1998. A theoretical review of the misinformation effect: Predictions from an activation-based memory model. Psychon. Bull. Rev. 5: 1-21.

Belli, R.F. 1989. Influences of misleading postevent information: Misinformation interference and acceptance. J. Exp. Psychol. Gen. 118: 72-85.

Braun, K.A., Ellis, R., and Loftus, E.F. 2002. Make my memory: How advertising can change our memories of the past. Psychol. Marketing 19: 1-23.

Braun-LaTour, K.A., LaTour, M.S., Pickrell, J., and Loftus, E.F. 2004. How (and when) advertising can influence memory for consumer experience. J. Advertising 33: 7-25.

Ceci, S.J. and Bruck, M. 1993. The suggestibility of the child witness: A historical review and synthesis. Psychol. Bull. 113: 403-439.

Curran, T., Schacter, D.L., Johnson, M.K., and Spinks, R. 2001. Brain potentials reflect behavioral differences in true and false recognition. J. Cogn. Neurosci. 13: 201-216.

Davis, D. and Loftus, E.F. 2005. Age and functioning in the legal system: Perception, memory and judgment in victims, witnesses and jurors. In Handbook of Forensic Human Factors and Ergonomics (eds. I. Noy and W. Karwowski). Taylor and Francis, London.

Eakin, D.K., Schreiber, T.A., and Sergent-Marshall, S. 2003. Misinformation effects in eyewitness memory: The presence and absence of memory impairment as a function of warning and misinformation accessibility. J. Exp. Psychol. Learn. Mem. Cogn. 29: 813-825.

Fabiani, M., Stadler, M.A., and Wessels, P.M. 2000. True but not false memories produce a sensory signature in human lateralized brain potentials. J. Cogn. Neurosci. 12: 941-949.

Greene, E., Flynn, M.S., and Loftus, E.F. 1982. Inducing resistance to misleading information. J. Verbal Learn. Verbal Behav. 21: 207-219.

Harper, D.N. and Garry, M. 2000. Postevent cues bias recognition performance in pigeons. Animal Learn. Behav. 28: 59-67.

Heaps, C.M. and Nash, M. 2001. Comparing recollective experience in true and false autobiographical memories. J. Exp. Psychol. Learn. Mem. Cogn. 27: 920-930.

Higham, P.A. 1998. Believing details known to have been suggested. Br. J. Psychol. 89: 265-283.

Hitti, M. 2005. Brain doesn't always spot false memories. http://my.webmd.com/content/article/100/105579.htm

Hyman Jr., I.E., Husband, T.H., and Billings, F.J. 1995. False memories of childhood experiences. Appl. Cogn. Psychol. 9: 181-197.

Karpel, M.E., Hoyer, W.J., and Toglia, M.P. 2001. Accuracy and qualities of real and suggested memories: Nonspecific age differences. J. Gerontol. Psychol. Sci. 56B: 103-110.

Libby, L.K. 2003. Imagery perspective and source monitoring in imagination inflation. Mem. Cogn. 7: 1072-1081.

Lindsay, D.S., Hagen, L., Read, J.D., Wade, K.A., and Garry, M. 2004. True photographs and false memories. Psychol. Sci. 15: 149-154.

Loftus, E.F. 1993. The reality of repressed memories. Am. Psychol. 48: 518-537.
———. 2003. Make-believe memories. Am. Psychol. 58: 864-873.
———. 2005. Searching for the neurobiology of the misinformation effect. Learn. Mem. 12: 1-2.

Loftus, E.F. and Hoffman, H.G. 1989. Misinformation and memory: The creation of memory. J. Exp. Psychol. Gen. 118: 100-104.

Loftus, E.F. and Pickrell, J.E. 1995. The formation of false memories. Psychiatr. Ann. 25: 720-725.

Loftus, E.F., Miller, D.G., and Burns, H.J. 1978. Semantic integration of verbal information into a visual memory. J. Exp. Psychol. Hum. Learn. Mem. 4: 19-31.

McCloskey, M. and Zaragoza, M. 1985. Misleading postevent information and memory for events: Arguments and evidence against memory impairment hypotheses. J. Exp. Psychol. Gen. 114: 1-16.

McNally, R.J., Lasko, N.B., Clancy, S.A., Macklin, M.L., Pitman, R.K., and Orr, S.P. 2004. Psychophysiological responding during script-driven imagery in people reporting abduction by space aliens. Psychol. Sci. 15: 493-497.

Nourkova, V.V., Bernstein, D.M., and Loftus, E.F. 2004. Altering traumatic memories. Cognition and Emotion 18: 575-585.

Okado, Y. and Stark, C.E.L. 2005. Neural activity during encoding predicts false memories created by misinformation. Learn. Mem. 12: 3-11.

Ost, J., Foster, S., Costall, A., and Bull, R. 2005. False reports in appropriate interviews. Memory (in press).

Porter, S., Yuille, J.C., and Lehman, D.R. 1999. The nature of real, implanted, and fabricated memories for emotional childhood events: Implications for the recovered memory debate. Law Hum. Behav. 23: 517-537.

Rovee-Collier, C., Borza, M.A., Adler, S.A., and Boller, K. 1993. Infants' eyewitness testimony: Effects of postevent information on a prior memory representation. Mem. Cogn. 21: 267-279.

Schacter, D.L. and Slotnick, S.D. 2004. The cognitive neuroscience of memory distortion. Neuron 44: 149-160.

Schmolck, H., Buffalo, E.A., and Squire, L.R. 2000. Memory distortions develop over time: Recollections of the O.J. Simpson trial verdict after 15 and 32 months. Psychol. Sci. 11: 39-45.

Schooler, J.W., Gerhard, D., and Loftus, E.F. 1986. Qualities of the unreal. J. Exp. Psychol. Learn. Mem. Cogn. 12: 171-181.

Schwartz, B.L., Meissner, C.A., Hoffman, M., Evans, S., and Frazier, L.D. 2004. Event memory and misinformation effects in a gorilla. Anim. Cogn. 7: 93-100.

Scoboria, A., Mazzoni, G., Kirsch, I., and Milling, L.S. 2002. Immediate and persisting effects of misleading questions and hypnosis on memory reports. J. Exp. Psychol. Appl. 8: 26-32.

Toronto Star 2005. Imaging shows how brain can create false memories. Feb. 4, p. D3.

Tousignant, J.P., Hall, D., and Loftus, E.F. 1986. Discrepancy detection and vulnerability to misleading post-event information. Mem. Cogn. 14: 329-338.

Tversky, B. and Tuchin, M. 1989. A reconciliation of the evidence on eyewitness testimony: Comments on McCloskey and Zaragoza 1985. J. Exp. Psychol. Gen. 118: 86-91.

Vaughan, W. and Greene, S.L. 1983. Acquisition of absolute discriminations in pigeons. In Quantitative Analyses of Behavior, Vol. 4: Discrimination Processes (eds. M.L. Commons, A.R. Wagner, and R.J. Herrnstein). Ballinger, Cambridge, MA.

Vaughan, W. and Greene, S.L. 1984. Pigeon visual memory capacity. J. Exp. Psychol. Anim. Behav. Process. 10: 256-271.

Wagenaar, W.A. and Boer, H.P.A. 1987. Misleading postevent information: Testing parameterized models of integration in memory. Acta Psychologica 66: 291-306.

Wright, D.B. and Livingston-Raper, D. 2002. Memory distortion and dissociation: Exploring the relationship in a non-clinical sample. J. Trauma Dissociation 3: 97-109.

Zaragoza, M.S. and Lane, S.M. 1994. Source misattributions and the suggestibility of eyewitness memory. J. Exp. Psychol. Learn. Mem. Cogn. 20: 934-945.