
Abstract

This work takes a historical approach to discussing Brown’s (1958) paper, “Some Tests of the Decay Theory of Immediate Memory”. That paper was and continues to be extremely influential in the field of short-term forgetting. Its primary importance is in establishing a theoretical basis to consider a process of fundamental importance, memory decay. Brown (1958) established that time-based explanations of forgetting can account for both memory capacity and forgetting of information over short periods of time. We discuss this view both in the context of the intellectual climate at the time of the paper’s publication and in the context of the modern intellectual climate. The overarching theme we observe is that decay is as controversial now as it was in the 1950s and 1960s.

Brown (1958) was a landmark article that marked a shift in memory research during the early stages of the cognitive revolution. In this work, Brown proposed a theory of forgetting based upon memory traces that lose activation, or decay, with the passage of time. This theory was accompanied by experiments showing forgetting over a short interval, whereas previous work had shown only long-term forgetting. Brown’s account of memory was evidence-based and addressed more than simply a forgetting curve. While others had proposed that decay exists, Brown took the further step of incorporating the idea of memory decay into a larger theoretical framework that included limits on the capacity of memory and rules describing the conditions under which decay should and should not operate. This framework largely carries through to the present, although much work has been done to refine the theory and identify how it plays a role in human cognition more generally. Beyond this, Brown offered a spirited rebuke of those who had dismissed the first whisperings of decay as misinterpreted consequences of interfering information.

In an attempt to do justice to this seminal article and its legacy, our investigation of Brown (1958) begins with a consideration of its continuing importance for the field. We then move to a more in-depth account of the empirical and theoretical contributions of the article. Elaborating upon these contributions, for a fuller understanding and appreciation of the work, we ponder the possible meanings of memory decay and then consider the historical context in which Brown’s contribution was made. Moving from past to present and future, we consider some of the subsequent models that incorporate decay, the likely status of decay given recent research findings, and the future of decay and of Brown’s ideas.

Continuing Importance of Brown (1958)

The continuing importance of Brown (1958) is evident in that decay may be integral to the modern conceptualization of memory as two separable parts (e.g., Atkinson & Shiffrin, 1968; Broadbent, 1958; Miller, 1956): the large amount of information that we have memorized over a lifetime, or long-term memory, and the small amount of information that is temporarily in a state of heightened availability, or short-term (or working) memory. The fundamental difference between the two, if they are separable, would appear to be that only the contents of short-term memory are limited to a small number of items or to a short period of time, whereas the same limits do not apply to long-term memory. Short-term memory as a theoretical construct is therefore like a roof that stands on just two massive pillars, and decay is one of those pillars.

Brown (1958) opens by saying, “The hypothesis of decay of the memory trace as a cause of forgetting has been unpopular.” In many ways the ideas put forward by Brown (1958) are as controversial today as they were 60 years ago. Contemporaries of Brown such as Underwood (1957) and Melton (1963) claimed that all forgetting could be explained through processes involving interfering information. In the last decade several prominent researchers have made similar claims (Lewandowsky, Oberauer & Brown, 2009; Oberauer & Kliegl, 2006; Nairne, 2002). Nairne (2002) claims that, “appeals to either rehearsal or decay are unlikely to explain the particulars of short-term forgetting”. Similarly, Lewandowsky et al. (2009) assert that “reliance on decay is not justified by the data”. In their day, Brown and others (Conrad, 1957; Murdock, 1961; Peterson & Peterson, 1959) gave strong refutations of this approach to forgetting, just as some do today (Barrouillet, Bernardin, & Camos, 2004; McKeown & Mercer, 2012; Ricker & Cowan, 2010, 2013). Nonetheless, controversy continues.

Researching this paper has been an interesting experience. In discovering, and rediscovering, many papers from the opening days of experimental psychology we have been struck by the similarity of the arguments against decay in Brown’s day to those we receive today when discussing our research supporting decay theories of memory. An often-made complaint is that nothing can happen as a function of time, and an analogy to the accumulation of rust frequently follows. The analogy goes like this. Although rust accumulates as a function of time, its cause is the oxidation process, not the passage of time itself. Following this logic, trace decay must not be a true cause of forgetting but rather a simplistic proxy. This line of thought can be seen even before Brown and his contemporaries proposed theories of memory decay. Earlier in the 20th century, the Law of Disuse was used to refer to the idea that information was lost with the passage of time. Pratt (1936), in his defense of the Law of Disuse, felt the need to state (p. 91),

“The objection that a trace can not possibly disintegrate through disuse, that time in and of itself does nothing to any event in nature, must be regarded largely as a verbal quibble. Science often speaks of change as a function of time, meaning thereby that alterations are produced by processes internal to the event in question rather than by external forces acting upon the event. Let A and B represent two neural traces. Either one may change as the result of the action of the other upon it. A change may also occur in A by way of metabolic processes within its own organization which have nothing to do with B, and vice versa. There is therefore no reason on these grounds to dismiss the principle of disuse.”

Clearly the same critiques occurred then as now. What Pratt elegantly states is that, for our purposes, time is the important variable for understanding behavior.

As we compose this article, Google Scholar indicates that Brown (1958) has been cited over 1153 times. Nearly any paper dealing with decay cites Brown directly; those that do not almost always cite many papers that do. It is probably safe to assume that all mainstream theories of memory decay draw on Brown (1958) as an influential predecessor in their formation. When it comes to time-based forgetting, the impact and influence of Brown (1958) is unquestionable. A second reason for the high citation rate of Brown (1958) is that the paper was among the first to propose a procedure in which information has to be maintained over a short interval of time that is filled with a demanding task. This short-term procedure is often referred to as the Brown-Peterson procedure (e.g., Baddeley & Warrington, 1970), after Brown (1958) and Peterson and Peterson (1959), who investigated complementary aspects of memory over short time intervals. While Brown focused on the effects of rehearsal and interference, Peterson and Peterson examined the effects of the duration of short-term retention.

Brown (1958), Some Tests of the Decay Theory of Immediate Memory: Findings and Theory

The thesis of Brown (1958) is that memory traces decay over a brief time period, until some threshold is reached and the memory becomes unreliable. Brown argues that this theory offers a simple explanation of both why we forget and why we have a capacity limit in memory. In this approach memory spans are the direct result of decay because items take time to be perceived and recalled and decay of items already in memory occurs throughout perception and recall. This process results in a cap on the maximum number of items that can be perceived or recalled before forgetting of other items occurs.
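Brown stated this theory verbally rather than formally, but its core logic is easy to make concrete. The following minimal sketch is our own illustration, not Brown’s formalism: the decay rate, threshold, and output time are arbitrary assumed values, chosen only to show that a span limit falls out of decay plus slow output.

```python
import math

DECAY_RATE = 0.35    # assumed exponential decay constant (per second); not from Brown (1958)
THRESHOLD = 0.3      # assumed activation floor below which a trace becomes unreliable
OUTPUT_TIME = 0.5    # assumed time (s) needed to output each recalled item

def items_recalled(list_length):
    """Count items whose traces are still above threshold when their turn to be
    output arrives, assuming traces start at activation 1.0 and decay while
    earlier items are being recalled (rehearsal prevented)."""
    recalled = 0
    for position in range(list_length):
        elapsed = position * OUTPUT_TIME              # time spent outputting earlier items
        if math.exp(-DECAY_RATE * elapsed) >= THRESHOLD:
            recalled += 1
    return recalled

for n in range(2, 12):
    print(n, items_recalled(n))   # recall stops growing near seven items
```

With these illustrative values, no matter how long the list, only about seven items can be output before the remaining traces fall below threshold: a capacity limit produced by decay alone, with no interference anywhere in the sketch.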

Brown goes beyond simply stating his theory and providing data to give a firm rebuke of the previous literature dismissing decay of immediate memory. Broadbent (1957) offered an earlier decay model of immediate memory, but his theoretical work did not provide the spirited defense of decay offered by Brown (1958) and was in part inspired by the work of Brown himself. Work by others at the time provided tantalizing evidence in favor of decay (Conrad, 1957; Murdock, 1961; Peterson & Peterson, 1959), but none gave a powerful defense of the theory as a whole. In a sense, Brown provided the planks on which other evidence favoring decay could stand.

The heart of Brown’s argument against the evidence used to discredit decay was a rejection of the assumption that interference always caused forgetting. Rather, Brown argued it could be that supposed interference effects happen once forgetting has already occurred. For example, it was well known that when remembering a precise value for later judgments, such as a weight or a length, presenting interpolated stimulation of the same type tended to influence memory for the value in the direction of the interpolated stimulation (Guilford & Park, 1931). Brown countered this critique by pointing to his earlier work which argued that distortions in remembering can be due to constructive and inferential processes at recall which occur naturally once forgetting has already occurred (Brown, 1956). Likewise, Brown notes that similarity effects in which greater distracter similarity to memoranda leads to a greater rate of distracter response intrusions could also be due to processes which occur after forgetting has already taken place. Specifically, once forgetting of the memory items has occurred it may be that the distracters present in memory are mistaken for the memory items and then reported. If gradual decay is assumed and the competing response strength of the distracter item is high, it may only require modest forgetting for the distracter to outcompete the memory item at recall. To round out his argument, Brown asserts that in the case of discrimination errors there must be some forgetting of the discriminating information in order to allow for an erroneous response. The two items being discriminated are not initially identical in all respects, so discriminating information which is absent at the time of recall must have existed at some point.

A key factor in Brown’s research was that it examined memory over short time intervals while most other researchers were using long delays between study and test. The idea that decay might occur only over a brief time immediately after item perception was novel and proved key to the observation of decay. Across a series of three experiments, Brown demonstrated a number of important phenomena. The findings demonstrated that trace decay could explain a range of results that other theoretical approaches would find difficult.

Brown’s (1958) first experiment (see Figure 1, upper panel) showed that even a short, filled delay could result in forgetting. Forgetting occurred after a short delay provided that consonant pairs could not be rehearsed and more than one pair of consonants was to be recalled. Rehearsal was prevented in some conditions by presenting digits to be read aloud during the retention interval. When an unfilled delay was used the items could be rehearsed and performance was much better, falling below ceiling only at a set size of four (see Figure 1, lower panel). It is important to note that at the time it was thought that the categorical difference between digits and letters would mean that the perception and processing of digits would not interfere with memory for the consonants. Thus, while current theorists favoring decay theories would admit the possibility of interference in this procedure, most researchers would not have believed this at the time. Given the beliefs of the time, the lack of perfect performance with a short delay, even when the span length was below capacity, seemed to provide strong evidence in favor of decay occurring during the retention interval (see Figure 1, lower panel). This result demonstrated that it was theoretically viable to propose that decay could account for not only forgetting across a period of time but also capacity limits in immediate memory.

Figure 1. Method (upper panel) and Results (lower panel) of Experiment 1 reported in Brown (1958). In the upper panel, each row represents a single trial condition. The consonants shown in boxes represent the memory stimuli, whereas the floating text details the secondary task timing. In this experiment, set sizes of two and three pairs were also used, but are not depicted graphically in this figure.

In a second experiment, Brown (1958) explored how distracter items could lead to forgetting. This experiment was primarily a control to refute simple interference accounts of the first experiment, which were common at the time, but it produced data that remain relevant, though often overlooked, today. In this experiment participants remembered four consonant pairs over a brief retention period. Distracter stimuli, consisting of either digits or consonants to be read aloud, were presented either before or after memory item presentation (see Figure 2, upper panel). A control condition with no distracter stimuli was also used. This resulted in four conditions of interest (digit distracters before, digit distracters after, consonant distracters before, consonant distracters after) and a baseline control condition (no distracting stimuli). We have provided a graph of the results from this experiment in Figure 2 (lower panel). Performance was similar in the digit and consonant distracter conditions, with only a very small additional deficit when consonants were used as distracters, regardless of whether they were presented before or after the memory items. This was in conflict with interference accounts of the time, which claimed that a large amount of interference should be produced by the presentation of similar items, such as the consonants used here, but not dissimilar items, such as the digits used here. In these interference accounts, disruptions from similar items should occur no matter whether these items are presented before the items to be remembered (proactive interference) or after those items (retroactive interference).

Figure 2. Method (upper panel) and Results (lower panel) of Experiment 2 reported in Brown (1958). In the upper panel, each row represents a single trial condition. The consonants shown in boxes represent the memory stimuli, whereas the floating text details the secondary task timing. In the lower panel, the labels along the x-axis represent when distracter stimuli occurred relative to presentation of the memory stimuli. The control condition contained no distracter stimuli.

The other important finding relates specifically to the presentation of distracting stimuli before memory item presentation. Relative to the baseline control condition, there was only a small amount of forgetting when consonant distracters were presented before the memory items and no observed forgetting when numerical distracter items were presented before the memory items (see Figure 2, lower panel). Further examination of the data revealed that when the distracter stimuli were consonants and presented before the memory items, error responses did not contain the distracter items beyond chance levels. This finding indicates that proactive interference does not replace the memory items being held during the retention interval, as would be predicted by proactive interference accounts such as that of Keppel and Underwood (1962). When distracter consonants were presented after the memory items, error responses were composed of the distracter items at levels only slightly above chance. This aspect of the results of Brown (1958) is problematic for many proactive interference accounts, including modern ones, which posit that distracter items outcompete the target memory representation and create a response that is strongly influenced by the distracter identity (Brown, Neath, & Chater, 2007; Keppel & Underwood, 1962). Some current interference accounts of forgetting that seem better able to handle this result include superposition, keeping multiple items in the same mental space where they unfortunately interfere with one another (Farrell & Lewandowsky, 2002), and feature overwriting, in which features of the newer items can displace similar features of an older item (Nairne, 1990; Oberauer & Kliegl, 2006). Decay accounts are also well-suited to handling these results (Barrouillet et al., 2004; Cowan, 1995; Ricker & Cowan, 2010) because they do not require that the distracter items influence recall, but at least some small interference element is clearly needed to account for the slight decrease in accuracy when consonant distracters were used.

The third experiment in Brown (1958) is interesting but provides a mixed result when compared to Experiments 1 and 2. In this experiment participants remembered three consonant pairs over a short delay period, the second portion of which was filled with distracters. The amount of time between offset of the memory items and onset of the distracter items was varied while the number and presentation time of the distracters were held constant. See Figure 3 (upper panel) for a graphical example of this experimental design. According to Brown’s decay theory there should be little additional effect of lengthening the delay between memory item offset and distracter item onset. Rehearsal should prevent decay during the initial unfilled delay, while decay should occur during the later filled portion of the delay. Interference theories of the day often predicted that consolidation of the trace should occur during the initial unfilled period, leading to protection against any potential forgetting, and that there should otherwise be no effect of the distracting stimuli due to their dissimilarity from the memory items. Performance increased as the delay between memory item offset and distracter item onset increased (see Figure 3, lower panel), as predicted by interference theory, but there was still a large effect of the distracting stimuli, as predicted by decay theory, with performance below 60 percent correct at the longest delay. This can be compared to performance in the first experiment, which was better than 90 percent correct with a similar delay but no distracters. Brown attributes the increased accuracy with longer delays to a short-term learning effect brought on by rehearsal or by chunking strategies (Miller, 1956). Chunking is associating multiple items to form a single meaningful unit, such as a known acronym (e.g., CIA = Central Intelligence Agency). This is one way to ease the memory demand. Another strategy of this sort is elaborative encoding (Craik & Tulving, 1975), in which meanings from long-term memory are used to help memory in a short-term task (e.g., GL FB NR = Good Luck Finding Bears North of the River).

Figure 3. Method (upper panel) and Results (lower panel) of Experiment 3 reported in Brown (1958). In the upper panel, each row represents a single trial condition. The consonants shown in boxes represent the memory stimuli, whereas the floating text details the secondary task timing.

Brown (1958) provided not only an argument in favor of the existence of decay in immediate memory, but also an integrated theory of memory performance built upon decay. While his contemporaries demonstrated decay by varying the length of retention itself (Murdock, 1961; Peterson & Peterson, 1959), Brown (1958) provided strong arguments in favor of decay as a mechanism of forgetting and well-thought-out counterarguments to those dismissing decay as improbable. These theoretical points were backed up by innovative experiments that are still cited today. In the following pages we will discuss decay itself in more detail, the historical context of Brown (1958), and the legacy of Brown’s decay theory.

The Different Meanings of Decay

Brown’s central thesis was that items in immediate memory decay, meaning that they are forgotten as a function of the passage of time, with the cause being some unidentified internal process. Although he gives a theory of how decay produces forgetting in a general sense, there are many possible interpretations of what the process of decay actually means. In the years following Brown’s work, a plethora of theories have offered different explanations of how decay functions. These differing meanings of decay often lead to fundamentally different predictions about the stability of memory over brief periods of time. In this section we seek to explain the various forms decay may take. By this we do not mean that all or some of these forms exist; we merely wish to state some of the interpretations of the concept of decay that seem plausible. We believe that this inquiry into the meaning of decay is in the spirit of Brown’s approach and will put the reader in a better position to appreciate the interchanges that are described subsequently.

Cowan, Saults, and Nugent (1997) offered a hierarchy of three definitions of decay, going from the least restrictive, easiest definition to satisfy down to increasingly restrictive definitions that would be harder to satisfy. The first definition of decay was the loss of the ability to recall the target item across a period of time, not caused by interference. It was allowed that interference could also occur and could increase the rate of forgetting observed over time; decay and interference could have effects that summate. Cowan et al. believed that they had evidence for this kind of decay in auditory memory for a single tone over some seconds, in a situation in which (1) retroactive interference was minimized, in that there was no stimulus presented between the studied item and test; and (2) proactive interference was also minimized, in that the critical comparison across retention intervals was based on trials in which a key ratio between two durations was kept constant (specifically, the time between the previous and current trial, divided by the current retention interval). A second definition of decay added that the loss over time could not be due to some controllable mental process during the retention interval which can be altered by instructions, but rather by some uncontrollable mental process, such as neuronal fatigue. Given that Keller, Cowan, and Saults (1995) manipulated attention in the absence of retroactive interference and found only a small effect of attention, and found forgetting even with attention directed to the tone to be remembered, Cowan et al. suggested that this definition was met for auditory memory. The third, most restrictive definition given by Cowan et al. added to the previous definitions the supposition that decay proceeds at a constant rate which cannot be changed by any intervening process or control, whether internal or external. It was suggested that this definition was not met, and was not intended by any theory; but bringing it up is important because some investigators have used it as a convenient straw man to be knocked down.

In its most straightforward form a decaying memory trace is simply a gradually blurring mental representation that eventually becomes unidentifiably vague. Although this is a common understanding of decay, it is not necessary that the trace itself become less precise as a function of time for all reasonable definitions of decay to be met. It may be that an underlying activation of some sort that is needed to maintain a memory trace decays, but that the trace itself does not degrade until a critical amount of activation is lost and the trace is forgotten completely. An illustration of this process would be a gradual weakening of the electrical difference gradient across a cellular membrane that supports recurrent neural firing to maintain a memory. The gradient could be lost gradually, but the neural assembly is either firing or it is not. A more liberal interpretation of the meaning of decay could also encompass a process that unfolds across time but is not directly a function of the trace at all. For example, it may be that memories have a certain probability of being forgotten at any given moment in time due to the failure of some necessary internal process such as maintained attention. Increasing retention intervals would then result in a gradually increasing probability of memory failure at some point during retention. Although neither the memory trace itself nor the activation supporting the trace decays, the probability of successfully maintaining the item across a retention interval would decrease with retention time, meeting various definitions of decay.
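The three readings sketched in this paragraph imply different retention functions, which a toy comparison can make explicit. The functional forms and rates below are our own assumptions for illustration, drawn from none of the cited models.

```python
import math

def gradual_blur(t, blur_rate=0.2):
    """Reading 1: the trace itself loses precision continuously with time."""
    return 1.0 / (1.0 + blur_rate * t)

def thresholded_activation(t, decay_rate=0.3, floor=0.3):
    """Reading 2: the trace stays intact until supporting activation crosses a floor."""
    return 1.0 if math.exp(-decay_rate * t) >= floor else 0.0

def stochastic_failure(t, hazard=0.15):
    """Reading 3: the trace is all-or-none; only its survival probability declines."""
    return math.exp(-hazard * t)   # P(trace still maintained at time t)

for t in (0, 2, 4, 6, 8):          # seconds of unfilled retention
    print(t, round(gradual_blur(t), 2), thresholded_activation(t),
          round(stochastic_failure(t), 2))
```

All three functions decline with time in the aggregate, which is why each can satisfy a definition of decay even though they describe very different underlying processes.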

Although decay models posit that the passage of time leads to forgetting, it is not strictly necessary for a decay model to argue that time is the ultimate cause of the observed forgetting. There are many processes in the world for which we use time as a useful measurement tool and shorthand despite knowing that time itself is not at work. Take baking as an example. When we bake, we know that the amount of time in the oven is not the causal factor behind bread rising, yet we still refer to time in the oven as leading the bread to rise. Although the process of yeast consuming sugars and producing carbon dioxide in a high-heat environment is the more reductionist approach to describing the rising of the bread, it is generally not useful to discuss the fine points of yeast-related chemistry. Similarly, if internal noise leads to forgetting, but occurs at a constant or predictable rate under known conditions, then we can use time as a proxy through which to discuss forgetting. Using time as a proxy has the advantage of making the process measurable, falsifiable, and predictable, whether or not an internally-generated process is later identified as the basis of this decay. Although Pratt (1936) made this point decades ago, it is conveniently forgotten all too often.

We do not mean the preceding statement to suggest that decay cannot be denied based on reasonable evidence. Several interference-only theories of forgetting argue that there is no decay because the authors do not observe time-based loss of information in their experimentation. Although we generally disagree with the resulting conclusions, this approach is a valid attempt to disprove decay theories. What we wish to emphasize here is that a theory of decay can be useful even if it is not literally correct on the level of individual neurons or difficult to observe internal brain functions.

The Historical Context of Brown (1958)

During the decades leading up to Brown (1958), behaviorism dominated psychological investigations. Given the behaviorist focus on stimulus and response, the very ideas of memory and forgetting were fundamentally different from our conceptualization today. Memory in the behaviorist period meant successfully reproducing a trained response in the appropriate context, whereas forgetting meant failure to reproduce a trained response in the appropriate context. The experimental procedures used by researchers primarily interested in changes in the behavioral response focused largely on learning and how it interacted with memory. As such, the memory paradigms were almost exclusively long-term in nature. This makes a lot of practical sense given the views of the time. From the point of view of research on learning, who would care about what happens in the very short term? If you know the information 1 s after its presentation but you have forgotten it by 10 s after its presentation, then for all practical purposes you have not learned anything useful. What was not considered at the time was the possibility that cognitive information processing could depend on a limited-capacity processor (Miller, 1956) to comprehend sentences, form new thoughts, and so on. When considering this second possibility, which is largely accepted as true today, we need to know what the limits of processing are over the short term.

In the intellectual atmosphere prevalent during the 1920s and 1930s, a consensus began to emerge that memory was not affected by the passage of time but instead by the presence of interpolated stimuli (Guilford & Park, 1931; Jenkins & Dallenbach, 1924; McGeoch, 1932; Robinson, 1927). This was called retroactive inhibition, retroactive interference in today’s language, and is in many ways similar to some current theories such as Oberauer, Lewandowsky, Farrell, Jarrold, and Greaves’ (2012) version of the model with the moniker Serial Order in a Box. The basic idea was that the more different the interfering material was from the memorized material the more it would interfere with memory, so long as it was not too dissimilar (Cheng, 1929; Robinson, 1927). At a certain point, though, it was thought that dissimilarity becomes too great and the materials cease to interfere with one another and are instead considered separately, according to the Laws of Similarity and Assimilation (Yum, 1931).

Before this consensus emerged, Thorndike’s (1913) Law of Disuse enjoyed much success and went largely unquestioned due to the focus on behavior rather than cognitive processes. The Law of Disuse stated that memory traces which were not used weakened with time. This was a good description of the external behavioral consequences of the passage of time which melded neatly with the experimental approaches of the period. Note that this theory referred to the weakening of a memory as if it were an extended period of unlearning, a fundamentally different process from decay despite some shared concepts. As researchers started thinking more about the causes of forgetting, the simple description given by the Law of Disuse failed to satisfy. While most began to see forgetting as an interference-based process, some authors instead tried to elaborate on the Law of Disuse, beginning the move toward decay theories of memory. Pratt (1936) argued for the concept of disintegration in the presence of interaction between stimuli. While interaction referred to the tendency of a sensory memory to become influenced in intensity toward the direction of any similar stimuli heard later, disintegration referred to the constant decrease in intensity of the memory trace.

It is clear that the behaviorist focus on long-term learning as memory provided a strong drive away from time-based approaches to forgetting. In McGeoch’s (1932) very influential argument against the Law of Disuse, much of the evidence consisted of long-term trends in memory performance or behavioral characteristics, in which performance actually improved rather than diminished over sufficiently long time periods. Examples of this evidence are better performance on a memory task after a delay of several days (Ballard, 1913; Williams, 1926) and memory performance increases dependent upon the length of time spent sleeping between learning and recall (Jenkins & Dallenbach, 1924). In retrospect, although these findings provide exciting evidence about long-term memory retrieval, it is hard to see how they could provide information about memory performance in short-term situations.

This was the general state of forgetting theory entering the 1950s. However, an important and influential theory surfaced in the years just before Brown (1958) which would later be invoked to challenge the findings of his decay theory. Forgetting through proactive interference was observed by Greenberg and Underwood (1950) and formally proposed as a theory by Underwood (1957). As we mentioned earlier, this term refers to a potential phenomenon through which previous similar information interferes with the recall of target memory information. Thus the number of previous memory trials is viewed as the critical determinant of performance.

In reaction to findings in support of decay by a number of researchers (Brown, 1958; Conrad, 1957; Murdock, 1961; Peterson & Peterson, 1959), some claimed that the results supposedly demonstrating decay occurred instead because of the presence of interference (Keppel & Underwood, 1962; Melton, 1963; Waugh & Norman, 1965). The most cited evidence against decay from this initial reaction came from Keppel and Underwood (1962), who claimed to demonstrate a lack of forgetting during the first trial of an experiment and increased rates of forgetting as the number of previous trials increased. Their explanation was that as the length of the retention interval increases so too does the probability of proactively interfering traces spontaneously recovering the strength to compete for expression during response execution. Although many view this study as extremely convincing in the present day, much as their contemporaries did, we find both the data and the explanation provided by Keppel and Underwood (1962) unconvincing. Although the study shows a clear decrease in performance from trial one to trial two of an experiment, the data contain serious issues with ceiling effects and noise, making further conclusions questionable. In Experiment 1, where there was no ceiling effect, trial 1 showed the same slope of time-based forgetting as did trials 2 and 3. This finding is in clear agreement with Brown’s (1958) theory of decaying memory traces.

Dismissing decay on the basis of findings demonstrating support for proactive interference frequently occurs today, just as it did after the publication of Keppel and Underwood (1962). This conclusion against decay seems premature given that decay and proactive interference could coexist. The idea that there could be two memory mechanisms functioning during the same procedure is often overlooked. It could easily be that when there is little or no proactive interference (i.e., on trial 1) long-term memory retrieval is used, which is not subject to decay, because it requires less sustained effort across the retention interval than does short-term retention. Once proactive interference becomes strong (i.e., on the subsequent trials), short-term mechanisms subject to decay are required because long-term retrieval becomes unreliable. Further, Keppel and Underwood’s (1962) explanation of spontaneous recovery of competing memory traces during the retention interval seems to lack an apt mechanism responsible for this spontaneous recovery. Although Keppel and Underwood show clear evidence for proactive interference, it is difficult to argue that it logically follows that this interference can account for any observed time-based forgetting.

Those who argued in favor of proactive inhibition, and not trace decay, as the mechanism leading to forgetting across a retention interval also seem to have overlooked the most crucial evidence of proactive interference itself. If items on previous trials lead to interference with responses on current trials by outcompeting them at recall, then we should expect participants to recall these previous-trial items as responses quite frequently on subsequent trials. However, Keppel and Underwood (1962) did not report these intrusion error rates. Greenberg and Underwood (1950) support proactive inhibition as the time-based mechanism of forgetting but paradoxically state that the observed intrusion rates from items on previous trials are unrelated to forgetting. Greenberg and Underwood even demonstrate that, by their logic, proactive interference does not occur for retention intervals of less than ten minutes. Brown (1958) analyzed his Experiment 2 in order to look for proactive intrusion errors and found an intrusion rate that was below chance levels. When the proactive inhibition approach to time-based forgetting was analyzed in detail, it clearly did not hold up to scrutiny. Instead of leading to time-based effects, proactive interference may work non-specifically through an initial drop in accuracy once the context for an experimental trial changes from memory for a single event on the first trial or two, to a stereotypical task to be repeated as of the second, third, or fourth trials. This change from event to task may lead to a more automated mental process and a general increase in errors resulting from this automation. There are a number of similar alternative versions of proactive interference effects in which proactive interference leads to poorer recall without directly causing forgetting. Many of these explanations would not logically prohibit trace decay or even be able to account for time-based effects.

Another challenge to the decay of immediate memory proposed by Brown came from Melton (1963), who argued that intra-item interference between the memoranda caused what seemed to be decay. Melton showed that when more items were maintained, the rate of loss over time was greater. Although he argued that the ability to modify the rate of loss shows passive decay to be untenable, this interpretation can be challenged. As we noted before, there are different definitions of decay (Cowan et al., 1997). Some of Melton’s most important data were assessed with an all-or-none accuracy measure, which requires correct recall of all items in a list for the response to be judged correct. With that scoring method, even if each item were forgotten at exactly the same rate, a list length effect in the observed decay rate could be expected because fresher representations may be required to allow repetition of more items while overcoming output interference. Moreover, decay might well include the concept of variability. One cannot predict the exact moment when a particular radioactive atom will disintegrate, and it could be the same for individual words in a list. If the probability of each item being lost by the time of test is p, then for a two-item list the probability of a correct response is (1-p)², whereas for a five-item list it is (1-p)⁵. If p = .2, for example, the expected levels of correct list recall are .64 versus .33, respectively. By such a mechanism, the list length effect would increase as a function of time as p increased.
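The arithmetic in the preceding sentences is easy to verify, and extending it shows the predicted time dependence. The snippet below reproduces the .64 versus .33 example and lets the per-item loss probability p grow, as it would with a longer delay; the particular values of p are illustrative.

```python
# Whole-list scoring: a response counts as correct only if every item survives.
for p in (0.1, 0.2, 0.3):   # per-item loss probability, growing with delay
    two_item = (1 - p) ** 2
    five_item = (1 - p) ** 5
    print(f"p={p}: 2-item={two_item:.2f}, 5-item={five_item:.2f}, "
          f"gap={two_item - five_item:.2f}")
# p=0.2 reproduces the .64 versus .33 from the text; the gap between list
# lengths widens as p grows, a list length effect that increases with time.
```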

Some Models of Memory Incorporating Decay

The existence or absence of decay in immediate memory was not resolved in the years immediately following Brown’s (1958) landmark paper. Debate continues to the present day. For the sake of brevity we skip forward to the status of trace decay in experimental psychology today. In this section we begin the discussion by presenting several of the most popular models of immediate memory that incorporate trace decay, as well as the critical evidence used to support them. In the next section we will discuss some recent arguments for and against decay.

Broadbent’s (1958) Model

Broadbent wrote a quite influential book that helped to begin the cognitive revolution. The book was written basically in behaviorist terminology, but it envisioned the field of information processing. Sketched in a footnote is a simple model of processing in which information we now term sensory memory is fed into an attentional filtering mechanism that determines which information will go on to become part of what we now would term working memory. This information also feeds into long-term memory. Within this model, it is clear from the literature reviewed (including a great deal of research on auditory memory) that the sensory component was very short-lived. Information that did not make it through the filter was lost. It seems less clear from the model what the status of the working memory component was; decay was not discussed in the context of that component. A similar relation between short-lived sensory processes and more stable, categorical, working-memory processes can be gleaned from the work of Sperling (1960) on visual memory. The later work of Atkinson and Shiffrin (1968) added recurrent processes and focused on how those strategic processes might operate, but the implied reliance on decay was similar. None of these models explicitly used decay beyond sensory information.

The Multi-Component Model

Baddeley’s (1986) Multi-Component Model of Working Memory has been the most successful model of working memory in terms of adoption by clinicians and researchers in other fields. Its simple modular form makes for good diagrams and allows for a quick understanding of the basics of a complicated system by those not involved in memory research. The model’s success also convinced a large number of researchers of the existence of decay in immediate memory. It is still very widely used, typically in an amended form (Baddeley, 2000).

The Multi-Component Model was originally proposed by Baddeley and Hitch (1974) as a response to the Atkinson-Shiffrin (1968) model, in which information was proposed to flow from the environment through sensory memory into short-term memory. This latter system was thought of as responsible for encoding information into and retrieving information from long-term memory. Importantly, Atkinson and Shiffrin proposed that the short-term store was unitary in nature and capable of both processing and storage of information. Baddeley and Hitch (1974) replaced this unitary short-term store with a multi-component working memory system in which domain-specific maintenance resources were proposed that are not available for processing. The simplest form of this Multi-Component system was laid out by Baddeley (1986). In this version, there are two short-term buffers, one for verbal information called the phonological store and one for visual information called the visuo-spatial sketch pad. Information held in these buffers was said to decay with the passage of time. Information in the phonological store can be maintained indefinitely, however, through the use of a supplementary mechanism, covert verbal rehearsal. Storage plus rehearsal together is termed the phonological loop. Covert rehearsal restores the activation of decaying phonological traces, which begin to decay again as soon as an iteration of rehearsal ends. Successful recall depends on whether the information is still active at the time when it is needed. More tentatively, a corresponding mechanism is proposed for internal mental scanning and refreshing of nonverbal, visuo-spatial information (e.g., Baddeley & Logie, 1999; Logie, 2009).

A different faculty called the central executive is responsible for a number of higher level functions including managing what information enters each of the buffers, processing of stored information, and formulating output based upon information stored in the short-term buffers. This faculty is proposed to be capacity limited, but in exactly what manner is left open to research and speculation. In a more recent instantiation of the model, information that is neither phonological nor visuo-spatial in nature, including bindings between different kinds of information, is held in an episodic buffer with qualities that are still under investigation (Baddeley, 2000).

Thus, in contrast to the proposal of Atkinson and Shiffrin, who conceptualized working memory as a unitary resource that could be used for both processing and storage, the multi-component model proposed that processing is handled by the central executive, while storage is taken care of by two independent domain-specific mechanisms. This model leads to several key predictions. First, it predicts that visual and verbal information should not interfere with one another when stored in short-term memory simultaneously because information of each modality is maintained in separate buffers. When information is from a common domain the opposite is true. When the information load reaches buffer capacity, verbal information should interfere strongly with other verbal information, and visual with visual, due to the limited storage capacity of each buffer. Second, it predicts that processing and storage activities should not interfere with one another, especially when they involve information that pertains to different domains. Several studies have shown that verbal short-term memory is more disrupted by concurrent verbal activities than by visuo-spatial activities, while the inverse has been observed for visuo-spatial short-term memory (e.g., Farmer, Berman, & Fletcher, 1986; Logie, 1986; Logie, Zucco, & Baddeley, 1990). Clinical neuropsychology studies have supported the proposed dissociation between verbal and visuo-spatial material in working memory because there are patients with selective lesions who are impaired in one domain but not the other (e.g., Basso, Spinnler, Vallar, & Zanobio, 1982; De Renzi & Nichelli, 1975; Vallar & Baddeley, 1984a, 1984b).

The model’s other prominent feature is the one with which we are most concerned at present, the inclusion of decay. Years before formulation of this model, Baddeley and Scott (1971) provided a strong rebuke of the proactive inhibition account of time-based forgetting by demonstrating that time-based loss can be observed even on the first trial performed in a memory experiment (based on the procedure of Peterson & Peterson, 1959, described in more detail later). The finding was soon followed by the discovery of the word length effect (Baddeley, Thomson, & Buchanan, 1975). This effect refers to the finding that more short words than long words can be remembered if rehearsal is allowed (Cowan et al., 1992; Cowan, Nugent, Elliott, & Geer, 2000; Mueller, Seymour, Kieras, & Meyer, 2003). Baddeley et al. (1975) found that participants could remember the same number of words as they could articulate in about two seconds. This time limit, rather than a word limit, led to the inclusion of the phonological loop in the Multi-Component Model as an internal rehearsal mechanism, along with an estimate of about two seconds for the time it takes an item to decay. While decay was explicitly included to explain the functioning of the phonological loop and to account for forgetting of verbal material, this was less explicitly the case for visuo-spatial material. Indeed, even though the visuo-spatial sketchpad is often presented as a system that is analogous to the phonological loop, with a visual cache for storage and a spatial rehearsal mechanism, decay is only implicitly called upon to explain forgetting of visuo-spatial material.
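Baddeley et al.’s (1975) result amounts to a simple quantitative rule: span is roughly whatever can be articulated in about two seconds. A minimal sketch of that rule follows; the articulation times are illustrative assumptions, not values from the original study.

```python
LOOP_DURATION = 2.0   # approximate seconds of speech the loop sustains (Baddeley et al., 1975)

def predicted_span(seconds_per_word):
    """Span ~ number of words articulable within the loop duration."""
    return LOOP_DURATION / seconds_per_word

print(predicted_span(0.3))   # short words (~0.3 s each) -> ~6.7 items
print(predicted_span(0.6))   # long words  (~0.6 s each) -> ~3.3 items
```

On this reading the word length effect follows directly: longer words take longer to re-articulate, so fewer of them can be refreshed before their traces decay.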

Whereas decay was limited to sensory information in the model of Broadbent (1958), Baddeley’s model makes little reference to sensory information. The reason is that Baddeley heeded strong evidence that information is recoded from a sensory level to a more abstract level. The key background for that conclusion is the finding of Conrad (1964) that confusions in memory among printed letters occurred much more often among letters that sound similar (B, D, C, P, T, etc., which rhyme) than among letters that look similar (e.g., R and P, or b and d). This suggested that a phonological code was used even for non-acoustic verbal information. The model ignores, however, other evidence that modality-specific sensory coding makes a huge difference in memory for recent list items, with superior ordered recall of spoken items (e.g., Penney, 1989). There appear to be both sensory and abstract codes used together. In any case, it is noteworthy that Baddeley’s model clearly applied the concept of decay to abstract or amodal information, not just sensory information.

Despite its popularity, or perhaps because of it, the assumptions of the Multi-Component Model have been subject to much criticism. Questions have been raised about the legitimacy of segregated storage systems (Morey & Cowan, 2004, 2005; Ricker, Cowan, & Morey, 2010; Vergauwe, Barrouillet, & Camos, 2009, 2010; Vergauwe, Dewaele, Langerock, & Barrouillet, 2012). Doubts have also been raised about the need to posit decay to handle findings of the word length effect. Several researchers have shown that item complexity effects can easily account for most findings in support of the word length effect (Caplan, Rochon, & Waters, 1992; Lovatt, Avons, & Masterson, 2000; Service, 1998). While Mueller et al. (2003) make a strong defense of the word length effect as a decay-based phenomenon, Lewandowsky and Oberauer (2008) have pointed out some important unknowns that may undermine the logical foundation of the decay interpretation offered by Mueller et al. (2003). At present the issue remains unresolved, but the evidence seems to point to the word length effect as an interference-based phenomenon that occurs due to word complexity rather than as a result of articulation duration. Although the word length effect may not stand as proof of decay, this does not negate the idea of immediate memory decay proposed by Brown (1958) and others over the years. Later in this paper we will discuss the state of decay as presented in the immediate memory literature of the present.

Over the decades since the creation of the Multi-Component Model there have been a number of reimaginings of the basic model (Baddeley, 2000; Baddeley & Logie, 1999; Repovš & Baddeley, 2006), sometimes including substantial changes to address new findings in the literature. For example, Baddeley (2000) added the episodic buffer, a third memory buffer that deals with cross-modal storage, to accommodate emerging findings showing that not all information is strictly modality segregated. Several authors have also proposed mathematical models outlining a short-term memory system influenced by the Multi-Component Model (Burgess & Hitch, 1999; Henson, 1998; Page & Norris, 1998). Despite the challenges to this model it has maintained a strong following and continues to evolve.

Without decay, at least the verbal components of the model would have to change dramatically. To explain the word length effect, continual interference of list items with each other as they are rehearsed in a repeating loop might serve a role similar to the role that decay is said to play in the model. A difficulty for that approach, however, would be to find an alternative explanation for why individuals can recall about as much as they can recite in 2 s (Baddeley et al., 1975). Individuals who can recite more quickly also would cause themselves interference at a commensurate rate, so the amount recalled theoretically would not be influenced by the speed of recital according to the current account. A non-decay account of the word length effect might have to attribute the speech rate-span correlation to a third variable. For example, individuals who can recite more quickly may do so because they have better-developed phonological or verbal systems, which would also assist in finding chunks within the lists to be recalled.

The Embedded Process Model

This model, or modeling framework actually, can be traced to a re-examination (Cowan, 1988) of how we use graphical models to represent human information processing and how we might do so most productively. It seemed to Cowan that previous modeling did not produce a logically consistent result. Broadbent (1958) showed modeling components as if they were stages in series, but they could not truly be so. A change in physical properties of the environment, such as a sudden lightning flash or thunderclap, can disrupt one’s deliberate focus of attention, bypassing the attentional filter’s current setting and entering the changed stimulus into working memory and consciousness. Information from long-term memory is used to determine the way in which information is organized in working memory; for example, the acronym USA quickly is stored in working memory as a single item, not three random letters. That is, one does not use the steps in the model of Broadbent in a stepwise, sequential fashion. Baddeley (1986) used a different convention, showing information shuttled back and forth between separate modules. His model, however, showed working memory as if phonological and visuo-spatial information could adequately describe its contents. This seemed unlikely. For example, a printed word might concurrently activate phonological, orthographic, lexical, and semantic features in memory. Other modalities and codes also exist and make a simple modular structure seem incomplete or inaccurate. Cowan set about to see if core beliefs about processing could be integrated into a diagram in a manner that was exhaustive (including such things as touch stimuli and associations between very different types of codes, such as name-face associations), but that left plenty of room for subsequent clarification based on future evidence.

One proposal of Cowan (1988) was to conceive of the memory system as embedded. Some features in memory are in a current state of activation, and this activation plays the role taken by storage modules in Baddeley’s model. Two different stimuli will most likely have overlapping features, and the degree of overlap can account for the basic finding seeming to support modularity, namely the finding that items with more similar features interfere with one another more within working memory tasks.

Within the activated portion of working memory, a handful of items were said to reside in the focus of attention. These items are presumably represented as integrated wholes, not loose features as is presumably found outside of the focus of attention. This focus is said to be aimed by a combination of voluntary processes (a central executive like that of Baddeley, 1986) and involuntary processes. The brain presumably forms a neural model of the perceived aspects of the environment, with more detail present for items in the focus of attention. When a new stimulus is processed and found to differ in a dramatic way from the current neural model, an orienting response occurs (Sokolov, 1963) and it draws attention involuntarily toward the new aspect of the environment. This can be caused by a physical change in the environment, or by a semantic change if it has been attentively processed. This focus of attention was in line with older thinking about working memory (e.g., Atkinson & Shiffrin, 1968; Broadbent, 1958; James, 1890) but was less similar to Baddeley’s conception. In Cowan’s conception, working memory could be divided into attentive storage as in the earlier conceptions (in the focus of attention) and non-attentive storage (in the remaining activated portion of long-term memory).

Within the model, information in the activated portion of long-term memory was said to decay over a few seconds. The purpose of this postulate was to account for the same types of information that Baddeley (1986) took into consideration, and also sensory memory information and other information, such as semantic information in short-term memory. (The old literature, carefully considered, did show the need for these aspects, such as studies showing semantic confusions in working memory as reviewed by Cowan, 1988.) It seemed plausible, at least, that the decay of all sorts of features occurred at the same rate. The literature review in favor of this idea was then expanded upon and updated by Cowan (1995, 1999).

Without decay, the modeling framework is likely to run aground. There is no known basis for distinguishing between the activated portion of long-term memory and the inactivated portion, except that the activated portion decays and becomes inactivated. After all, there is plenty of evidence suggesting that there is no strict limit on the number of features that can be activated at once, at least not sensory features (e.g., Sperling, 1960). Persistent sensory information must be avoided in order to find a strict limit in capacity of just a few items (Cowan, 2001). It might be possible to have a model in which features stay active until subsequent items with the same features overwrite them (Nairne, 1990). If this proposal proves to be inadequate, however, it is for reasons similar to those facing Baddeley’s (1986) model; there appear to be phenomena that are time-based rather than item-based. The next model elaborates upon this point with newer information relevant to decay.

The Time-Based Resource Sharing Model

The Time-Based Resource-Sharing (TBRS) Model of Barrouillet et al. (2004; see also Barrouillet, Bernardin, Portrat, Vergauwe, & Camos, 2007; Barrouillet, Portrat, & Camos, 2011) is different from other models of decay in that, even though time-based decay plays a central role in the model, it does not predict the stereotypical forgetting curve over time. This is because the model proposes that attention can be used to prevent decay by rapidly cycling through the items in memory and refreshing them. Refreshing an item refers to a process by which the decayed activity of the trace is restored by focusing attention on the decaying memory trace. This refreshing process is assumed to function with material of any modality or combination of modalities, i.e., verbal, visual, or spatial. According to the model, recall performance depends on a balance between the time during which memory traces decay while attention is occupied by other activities and the time during which attention is available for the refreshing of memory traces. The model was originally based on procedures in which there were lists of items (e.g., letters) to be recalled, with interfering activities (e.g., reading numbers) between the presentation of these items as distractions. The model proposes that the effect of an intervening activity on recall performance can be understood through its cognitive load, i.e., the ratio between the time taken to perform the intervening task (i.e., processing time) and the free time available to reactivate memory traces (i.e., free time). Increasing this ratio, either by increasing processing time while keeping free time constant or by decreasing free time while keeping processing time constant, is assumed to result in poorer recall performance. In this way, the TBRS model does not necessarily predict that longer retention intervals will result in poorer recall performance. Instead, the effect of lengthening the duration of retention intervals will depend on how it changes the balance between decay and refreshing. In some ways this model holds remarkably true to the heart of Brown's (1958) theory of decay. Most notably, the balance between decay and refreshing dictates the capacity limit in TBRS, thereby upholding Brown's assertion that decay itself can account for capacity limits in short-term memory.
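
The cognitive-load ratio just described can be stated compactly. The sketch below, in Python, simply encodes the ratio as given in the text; note that published formulations of TBRS sometimes express load as processing time over total time, so this is one reading rather than the definitive formula, and the helper function is hypothetical.

```python
# Cognitive load as described above: processing time divided by the free
# time available for refreshing. A hypothetical helper for illustration.

def cognitive_load(processing_time: float, free_time: float) -> float:
    """Ratio of attention-capturing time to refreshing opportunity."""
    return processing_time / free_time

# Same total interval (1.2 s per distractor burst), different balance:
print(cognitive_load(0.8, 0.4))  # 2.0: high load, poorer expected recall
print(cognitive_load(0.4, 0.8))  # 0.5: low load, better expected recall
```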

Some unique predictions have been derived from this model that would not be expected from time-based forgetting alone: (1) If the duration of retention intervals is increased while the pace of the intervening activity is kept unchanged, then the balance between decay and refreshing remains unaffected and no additional forgetting is expected. (2) Longer retention intervals are even expected to improve recall performance when total processing time is kept constant because, in that case, there is more free time during which decaying memory traces can be refreshed. (The boundary condition is that this rule applies only to individuals who use refreshing, which, for example, 5-year-old children typically do not; see Camos & Barrouillet, 2011.) (3) The balance between decay and refreshing can be changed while keeping the duration of the retention interval constant, in such a way that different amounts of forgetting can be observed for any given retention interval. Each of these predictions has been confirmed using a specialized serial recall task (e.g., Barrouillet et al., 2004, 2007). Additionally, the importance of the balance between decay and refreshing has been shown to hold within and across different modalities and domains of working memory (Vergauwe et al., 2009, 2010, 2012).
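
A toy simulation can make predictions (1) and (2) concrete. Everything below is an illustrative assumption of ours, not the TBRS implementation: we let traces decay exponentially while attention is occupied and recover linearly during free time, with arbitrary rates.

```python
# Toy TBRS-style dynamics: exponential decay during processing bursts,
# linear restoration during free time. All parameter values are
# illustrative assumptions, not fitted or published values.
import math

def final_strength(n_bursts: int, processing: float, free: float,
                   decay_rate: float = 0.5, refresh_rate: float = 0.3) -> float:
    strength = 1.0
    for _ in range(n_bursts):
        strength *= math.exp(-decay_rate * processing)        # decay while attention is occupied
        strength = min(1.0, strength + refresh_rate * free)   # refreshing during free time
    return strength

# Prediction 1: a longer interval at the same pace (more bursts, same
# processing/free ratio) converges to the same equilibrium strength.
print(final_strength(8, 0.8, 0.4), final_strength(16, 0.8, 0.4))

# Prediction 2: same number of processing bursts, but more free time per
# burst, yields a stronger trace at the end of a *longer* interval.
print(final_strength(4, 0.8, 0.4), final_strength(4, 0.8, 1.0))
```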

Oberauer and Kliegl (2006) offered a different explanation of the effect of cognitive load, in which recall performance depends on a balance between interference and repair instead of a balance between decay and refreshing. These authors proposed that the intervening activity produces interference and that free time is used to repair the damage of interference. In such an account, only the amount of free time is critical; the duration of processing is not. To test this alternative account, Portrat, Barrouillet, and Camos (2008) used a complex span task in which participants were asked to maintain series of letters while judging the spatial location of a square on screen. Processing time was manipulated by using spatial locations that were either easy or hard to discriminate, the latter condition taking longer. Importantly, each of these squares was followed by a constant period of free time. The results showed that processing time is crucial for recall performance, even though free time was kept constant; longer processing times resulted in poorer recall performance. However, Lewandowsky and Oberauer (2009) suggested that harder discriminations might result in more errors which, in turn, might result in a portion of the free time being used for post-error processes instead of refreshing or repairing. Recently, Barrouillet, De Paepe, and Langerock (2012) showed that, even after equalizing error rates and while keeping free time constant, longer processing times resulted in more forgetting, thereby providing what we believe to be strong evidence for the idea that memory traces decay over time. Oberauer and Lewandowsky (2013) subsequently agreed that secondary task post-error processes cannot account for Portrat et al.'s (2008) results, although they still maintain that decay is not necessary within their account.

Even with this bevy of research in support of the TBRS model, there are some circumstances it cannot readily explain. Ricker and Cowan (2010) found evidence for decay, but it did not completely match the expectations of the TBRS model. They presented arrays of unfamiliar characters and then, during a retention interval, easier or harder distractor tasks before a probe to examine recognition of the array items. The TBRS expectation is that there should be no loss of memory across retention intervals within a particular distractor condition, because of its prediction of equilibrium between decay and refreshing. In contrast to this prediction, there was decreased accuracy with longer retention intervals within each of the distractor conditions, though the concept of equilibrium was needed to explain the different overall performance levels for each level of distractor-task difficulty. Modification of the model may be necessary to accommodate Ricker and Cowan's (2010) results and similar findings (see McKeown & Mercer, 2012; Zhang & Luck, 2009). In the next section we address this and other topics in our search for the ideal model of decay.

The Status of Decay Theory in Current Thought about Forgetting from Immediate Memory

The popularity of decay as a theory of forgetting has waxed and waned over the years since Brown proposed his theory of immediate memory. It is a major challenge to assess the existence of decay because doing so requires eliminating a plethora of alternative explanations. Likewise, assessing the existence of any one non-decay mechanism would be difficult because it, too, would be pitted against a plethora of alternatives. The best one can do is to pursue what appear to be the most parsimonious explanatory principles, and we believe that decay is among them.

At present, the influence of Baddeley's (1986) Multi-Component Model has spread acceptance of decay widely among those psychological research fields not involved in work on memory. The community of memory researchers within cognitive psychology, however, has largely viewed decay skeptically over the past decade, although a vocal minority of memory researchers advocate for its existence. Given the topic of the present work, it is clear that we stand in the latter camp (although at this point, not necessarily for the main reason described by Baddeley and colleagues previously, i.e., as the means to explain the word length effect). In this final section of the paper we relate the recent arguments for and against decay and provide our own view on how they can be reconciled. For an overview of alternative views on forgetting in short-term memory, see Table 1.

Table 1

Explanation of some of the major principles of forgetting

Trace Decay: Information is stored as memory traces whose activation decreases with the passage of time. Once activation drops below a threshold level, the memory is lost from short-term memory and must be retrieved from long-term memory for use.

Proactive Interference: Relevant information must be successfully retrieved prior to use in any memory-related task. Rather than true forgetting, erroneous memory comes from retrieval of incorrect information from previous similar sources.

Temporal Distinctiveness: The strength of disruption from proactive interference is determined by a ratio of temporal distances: the distance between the encoding time of the distracting event and the encoding time of the target memory, compared with the distance between the present time and the encoding time of the target memory.

Retroactive Interference: Information in memory is damaged or removed through the perception, encoding, or processing of other information. This disruption is caused by information or events that occur after the memory items are already in memory.

Similarity-Based Interference: Proactive or retroactive interference in which the amount of memory disruption is determined by the similarity of the memory information to the distracting information.

Novelty-Based Interference: All information is assumed to be encoded into a common memory space. New information distorts old information by utilizing the same storage space. The strength and amount of influence on the common memory space of newly encoded items, whether target memory information or distractor information, is determined by the amount of habituation that the perceptual or encoding mechanisms have undergone. Information similar to previous information will be encoded with less strength, leading to less disruption of previous information.

Historically, the biggest hurdle to providing evidence for or against the existence of decay has been preventing verbal rehearsal of memoranda while at the same time avoiding the introduction of interference. In the last decade, though, new approaches seem to have been fairly successful in this aim. In a series of well-designed studies, several researchers have used minimally disruptive tasks to prevent verbal rehearsal, such as articulatory suppression (continuous articulation of an irrelevant word), combined with mathematical modeling of the predictions from different approaches to forgetting, to argue against decay (Lewandowsky, Duncan, & Brown, 2004; Oberauer et al., 2012; Oberauer & Lewandowsky, 2008; Oberauer & Lewandowsky, 2011). As a simple example of this work, Experiment 2 of Lewandowsky et al. (2004) required participants to remember lists of six letters presented sequentially and to recall them promptly in the correct serial order. In between recall of each item, participants were to speak a suppressing word one, two, or three times. Increasing the number of times the suppressing word was spoken increased the amount of time participants had to wait between the recall of each item, but had only a very small effect on the accuracy of recall. The authors argued that this result is not as predicted by the decay hypothesis, although the data did appear to show a consistent but non-significant effect of time. This finding fits well with the interference model of forgetting favored by the authors, Serial Order in a Box (Farrell & Lewandowsky, 2002), because this model does not predict additional interference from the repeated presentation of the same distractor, due to system habituation. Indeed, it is a compelling argument if one cannot explain why decay would not function in this context. We believe recent work can explain why decay will not be observed in this context (Ricker & Cowan, 2013) and will detail this explanation later. For now, though, it should be noted that this study represents a strong case against the existence of decay, which must be addressed.

Oberauer and Lewandowsky (2013) complemented this argument against decay by focusing on the evidence in support of the TBRS model of Barrouillet et al. (2011), in which cognitive load determines the amount of decay. In a series of experiments impressive for their breadth and detail, Oberauer and Lewandowsky showed that the level of cognitive load during memory retention did not determine performance, and argued that it is instead time pressure that causes forgetting in procedures manipulating cognitive load. Although they were committed to no specific theory of how time pressure leads to forgetting, these authors suggested that it may be due to interference caused by the recruitment of executive control processes and the memory representations this recruitment may entail.

Time pressure is an intriguing possible mechanism that would not seem to implicate decay. Nevertheless, we are not convinced of the role of time pressure as opposed to cognitive load with decay. The nature of such a time-pressure mechanism and how it would affect memory remains somewhat vague. It seems to us that there are two possible interpretations of each part of the study of Oberauer and Lewandowsky (2013). (1) In the first few experiments, a dual task was used with an auditory-vocal response first and visual search second, and the results suggested that at least 75% of the response time for visual search was taken up by attention, in that it could not proceed in parallel with Task 1. Our concern is that although it is impossible to coordinate these two externally-determined tasks in parallel, it may still be possible to coordinate visual search with a second task that is internal and can be carried out at the most convenient moments rather than when the experimenter demands it (i.e., intermittent refreshing). (2) Next, using visual search and also a visual distraction task from Vergauwe et al. (in Experiment 5), but with no time pressure for the distracting task, it was found that there was no effect of cognitive load. The authors attributed this finding to the absence of interference between memory and the distracting task, but we believe it may be possible for intermittent refreshing to take place during the distracting task, potentially wiping out the differences between conditions in cognitive load. (3) Finally, in the last few experiments, time pressure was manipulated using visual search as the distracting task, with cognitive load held constant. The authors say (p. 19), "Effectively, time pressure squeezes all time components, thereby leaving their ratio largely unchanged." An effect of time pressure was found. The alternative interpretation of this result, however, is that time pressure reduces the possibility of intermittent refreshing during the distracting task. The notion that covert refreshing may occur intermittently during visual search underscores an ongoing debate about whether visual search relies on controlled attention for its execution (e.g., Kane, Poole, Tuholski, & Engle, 2006; Woodman, Vogel, & Luck, 2001). If future research addresses these remaining issues, this approach could prove to be a promising one.

In certain circumstances, at least, we and others have produced results that clearly indicate decay as predicted by Brown. Using difficult-to-verbalize materials has turned out to be a fruitful way to observe decay: both Ricker and Cowan (2010), using arrays of unfamiliar characters, and McKeown and Mercer (2012), using tones with complex timbre, observed clear decay over a retention interval with no secondary task to produce interference. The lack of a secondary task is notable because researchers since Brown's day have used secondary tasks to prevent rehearsal, while at the same time introducing potential retroactive interference. In some circumstances, other researchers have also found the signatures of decay when using simple visual stimuli, such as colors and shapes, with only articulatory suppression as a secondary task (Woodman, Vogel, & Luck, 2012; Zhang & Luck, 2009) or even without any secondary task at all (Cowan & AuBuchon, 2008; Morey & Bieler, 2013). Data favoring decay of information in short-term memory outside of the verbal domain have accumulated to a point where some of the staunchest opponents of decay have conceded that there is a good case at least for decay of visual materials (e.g., Oberauer & Lewandowsky, 2013).

Additionally, forgetting has been shown as a function of the cognitive load of a concurrent task that involves information with little to no feature overlap with the information to be maintained. Specifically, Vergauwe et al. (2010) used a procedure with letters or locations to be recalled and distraction from either a verbal or a spatial task. Recall performance for series of letters decreased as a function of the cognitive load involved in a concurrent spatial task, and recall performance for series of locations decreased as a function of the cognitive load involved in a concurrent verbal task. Moreover, the effect of cognitive load on spatial recall performance was not influenced by the nature of the concurrent task: judging words impaired spatial recall to the same extent as judging spatial configurations did (see Barrouillet, Portrat, Vergauwe, Diependaele, & Camos, 2011). Further reduction of feature overlap, by preventing overlap in input modality (visual vs. auditory) and in output modality (manual vs. vocal), does not appear to affect the relationship between recall performance and cognitive load (Vergauwe et al., 2012). This set of findings seems to us very difficult to account for without the notion of time-based forgetting, presenting difficulties for views proposing that forgetting in short-term memory is exclusively caused by interference. One possible exception is if future evidence supports the time-pressure argument of Oberauer and Lewandowsky (2013).

It is important to note that many arguments in favor of decay do not deny the existence of some interference-based forgetting in short-term memory tasks. Our own view follows this line of thinking. Even though we advocate in favor of time-based explanations here, we agree that forgetting due to interference is likely to occur as well. Where we disagree with interference-based explanations is often in the extent to which interference can account for forgetting, not in the basic principle. In the next section we move away from the contrast of decay with interference-based forgetting and instead finish by focusing on the remaining questions about decay that we find most interesting.

The Future of Decay over the Next Sixty Years

Many of the remaining questions about decay concern the precise relationship between time and the amount of memory loss. Most prominently, there is wide variability in belief about the time scale on which decay occurs. Brown himself did not only propose that memory traces in immediate memory decay; he proposed that they decay rapidly. Despite this specification, no estimate of the appropriate time scale was provided. The use of short retention intervals that did not exceed 7 seconds suggests, however, that memory traces were expected to decay on a time scale of several seconds. Brown did not manipulate the length of the retention interval, but the results of Experiment 1 show that about three pairs of letters are lost from short-term memory after reading digits for just under 5 seconds. This suggests a rate of loss of about 1 letter per second if one assumes a linear decay rate. A much slower rate can be derived from the first experiment of Peterson and Peterson (1959), who showed that three letters are almost completely forgotten after 18 seconds of backwards counting, suggesting a rate of loss of about 1 letter per 6 seconds, i.e., .167 letters per second. It should be noted, however, that although a rate of loss, taken literally, implies a linear loss function over time, the shape of the Peterson and Peterson loss function appears to take an exponential form. Exponential loss assumptions characterize forgetting as loss of a fixed proportion of the currently held information, rather than loss of a static amount of information per unit of time. Here we simply use a linear metric for ease of communication, given the limited attention to the topic in the current work. We do not mean to suggest that the linear metric is correct.
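
For concreteness, the arithmetic behind these rates, along with the exponential alternative, can be written out as follows. The 95% figure for "almost completely forgotten" is our assumption purely for illustration.

```python
# Back-of-the-envelope decay rates from the studies discussed above.
import math

# Brown (1958), Experiment 1: about 3 letter pairs (6 letters) lost in
# just under 5 s -> roughly 1 letter per second on a linear metric.
brown_rate = 6 / 5                 # ~1.2 letters per second

# Peterson and Peterson (1959): 3 letters lost over 18 s.
peterson_rate = 3 / 18             # ~0.167 letters per second

# Exponential alternative: a fixed *proportion* lost per unit time,
# m(t) = m0 * exp(-k * t). Total loss is never reached under this
# assumption, so we take "almost completely forgotten" as ~95% loss.
k = -math.log(0.05) / 18           # ~0.17 proportion lost per second

print(brown_rate, peterson_rate, k)
```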

Theorists have remained surprisingly silent on the rate of decay in short-term memory. Except for Baddeley (1986), who proposed that, in the absence of any rehearsal, the entire contents of the phonological loop become unusable after about 2 seconds, we know of no other claims about the rate at which information is lost over time. However, the idea that individuals differ in the rate at which information is lost over time, and that this rate of decay plays a role in developmental changes in short-term memory span, can be found within several theoretical frameworks (Towse, Hitch, & Hutton, 1999; Hitch, Towse, & Hutton, 2001; Barrouillet & Camos, 2012; Barrouillet, Gavens, Vergauwe, Gaillard, & Camos, 2009; Cowan, Nugent, Elliott, & Saults, 2000; Saults & Cowan, 1996). Demonstrating the existence of time-based decay in short-term memory and pinpointing its rate is especially hard because our cognitive system has at least two mechanisms at its disposal to fight against forgetting: attentional refreshing and articulatory rehearsal (Camos, Lagner, & Barrouillet, 2009; Hudjetz & Oberauer, 2007). Hence, forgetting due to the pure passage of time can only be shown in conditions that prevent the use of both maintenance mechanisms during that specific period of time, something that has proven to be extremely difficult.

In a recent attempt to demonstrate forgetting due to temporal decay, Barrouillet et al. (2012) designed two complex span tasks in which participants were asked to maintain series of either letters or spatial locations while verifying multiplication solutions. The duration of multiplication verification was manipulated by presenting the multiplication either in digit format (e.g., 3 × 4 = 12) or in word format (e.g., three × four = twelve), the latter condition taking longer to solve. In this way, the time during which memory traces fade while attention is diverted by concurrent activities was varied. Importantly, this was done while keeping constant the time during which refreshing and/or rehearsal can take place, in line with the TBRS approach to thinking about trace decay. From this study we can derive a decay rate of 1 letter per 8 seconds, i.e., .125 letters per second (Experiment 1), and a decay rate of 0.63 locations per 3.5 seconds, i.e., .180 locations per second (Experiment 2). These estimated rates of decay (.125 letters/second and .180 locations/second) are remarkably close to the rate observed by Peterson and Peterson (1959) but suggest a slower rate of information loss than the one derived from Brown (1958).

Although evidence is accumulating in favor of decay, there is the question of why some researchers have failed to find the signature of decay in their data, despite reasonable attempts to do so, and why the decay rates observed by different researchers seem to differ. In our recent work we may have found an answer in the amount of time given for working memory consolidation (Ricker & Cowan, 2013). We define working memory consolidation separately from encoding into working memory, as did Jolicoeur and Dell'Acqua (1998). Whereas encoding into working memory is defined as the time during which the perception of the stimulus is transferred into memory and is stopped by the presentation of a masking stimulus (Turvey, 1973; Vogel, Woodman, & Luck, 2006), working memory consolidation continues even after a masking stimulus is presented and refers to a process that makes the encoded memory trace more resistant to forgetting. We believe that studies which tend to find little or no effect of decay on memory do so because they allow for relatively long periods of consolidation, compared to studies that do tend to find an effect of decay on memory. Ricker and Cowan (2013) demonstrated that, when all other things were held constant, the amount of time allowed for working memory consolidation seemed to be the crucial factor in determining the rate of decay. Although the exact mechanisms through which working memory consolidation occurs are as yet unclear, and its effect on memory has not yet been replicated, its discovery could prove pivotal in uniting disparate findings on a long-contested topic.

Thus far we have primarily focused on cognitive approaches to thinking about decay, but it can also be helpful to think about decay from a neuroscience perspective. There are a number of brain mechanisms which could underlie the decay of memories. We wish to mention them briefly here in order to demonstrate that a reasonable neural model of decay is not as far-fetched as some have suggested. Perhaps the most obvious model can be derived from Lisman and Idiart's (1995) synchronized neural firing proposal, brought up to date by Lisman and Jensen (2013). They propose that memories are maintained by concurrent firing of the neurons that make up the memory trace. In this account, several items are maintained concurrently by firing each memory representation at a different point within a neural oscillation lasting roughly 100-200 ms. It could be that representations decay by gradually or probabilistically losing their place in this cycle, beginning to fire at times that overlap with another representation and thereby disrupting both memories. Alternatively, it could be that decay is the process by which the individual groups of neurons representing a memory trace gradually desynchronize from one another, causing the loss of information (or by which other, unrelated neurons probabilistically synchronize with the trace group, adding noise to the representation). A third approach would be to assume that memory traces require a top-down attentional signal to perpetuate their firing cycles. Removal of this signal would lead to reduced excitatory input and allow neuronal activation to drop, leading to quick failure of the trace as a whole.
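
As a proof of concept for the second possibility, the desynchronization account, here is a toy simulation; the group size, drift probability, readability threshold, and cycle duration are all illustrative assumptions of ours, not parameters from Lisman and colleagues.

```python
# Toy sketch of decay as gradual desynchronization. Each oscillatory
# cycle, every neuron in a trace's group independently drifts out of
# phase with some small probability; the trace is lost once too few
# neurons still fire together. All parameters are illustrative.
import random

def trace_survival_time(n_neurons: int = 50, p_drift: float = 0.05,
                        threshold: float = 0.5, cycle_ms: float = 150.0) -> float:
    in_phase = n_neurons
    elapsed = 0.0
    while in_phase / n_neurons >= threshold:
        in_phase -= sum(random.random() < p_drift for _ in range(in_phase))
        elapsed += cycle_ms
    return elapsed / 1000.0  # seconds until the trace becomes unreadable

random.seed(1)
times = [trace_survival_time() for _ in range(1000)]
print(sum(times) / len(times))  # mean survival on the order of a few seconds
```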

As suggested above, it may be that, on an individual-trial basis, decay results from the sudden death of individual neural representations rather than a gradual degradation of representations (Winkler, Schröger, & Cowan, 2001; Zhang & Luck, 2009). If patterns of activation representing items in working memory collapsed at different time points for different items in a trial, or at different time points across trials, the result would be, on average, a gradually declining performance level over time, that is, decay.
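
A brief simulation shows why sudden death at variable times looks like gradual decay in averaged data; the exponential distribution of collapse times is an arbitrary illustrative choice.

```python
# All-or-none ("sudden death") loss at random times produces a smooth,
# gradually declining average accuracy curve. The exponential
# distribution of collapse times is an illustrative assumption.
import random

random.seed(2)
N_TRIALS = 10_000
collapse_times = [random.expovariate(1 / 4.0) for _ in range(N_TRIALS)]  # mean 4 s

for t in [0, 2, 4, 6, 8]:  # retention intervals in seconds
    accuracy = sum(ct > t for ct in collapse_times) / N_TRIALS
    print(f"{t} s: {accuracy:.2f}")  # e.g., 1.00, 0.61, 0.37, 0.22, 0.14
```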

In sum, we believe that the next 60 years of research on decay may well clarify when decay will be observed. Beyond this, existing lines of research are likely to reveal the rate at which items decay from memory. This is a complex question that will take some time to answer, given that differences in materials and conditions may lead to varying rates of loss. We find consideration of basic neural concepts helpful in communicating the idea of decay as a viable theoretical approach to forgetting over the short term, though we have described them only briefly here. The novel use of existing neuroscience techniques and the emergence of new ones in the coming years may allow these and other neural models to be tested and the neural basis of decay to be discovered.

Concluding Remarks on Brown (1958) and Trace Decay Theory

Many of the phenomena predicted by Brown (1958) are still heavily debated, including the existence of decay over short periods of time. Nevertheless, the paper was and continues to be a source of inspiration for many experimental psychologists. Here we have argued that decay most likely does exist, as indicated by numerous recent findings which are not confounded by retroactive interference during the retention interval (Morey & Bieler, 2013; McKeown & Mercer, 2012; Ricker & Cowan, 2010, 2013). It also seems likely that the capacity limit in working memory is at least partially determined by decay. The findings in support of the TBRS model are difficult to explain without recourse to decay, especially those of Portrat et al. (2008) and Barrouillet et al. (2012), although some do propose alternative accounts (Oberauer et al., 2012; Oberauer & Lewandowsky, 2011, 2013).

Aside from decay, another premise of Brown (1958) is strongly in accord with more recent findings. In particular, Brown (1958) asserted that proactive-interference accounts of forgetting come up short. This point is upheld in modern findings (Barrouillet et al., 2012; Lewandowsky et al., 2004; Oberauer & Lewandowsky, 2008; Ricker, Spiegel, & Cowan, submitted). On the point that a period of unfilled time immediately following presentation of the memoranda will not increase the strength of the trace, however, Brown appears to be incorrect. In recent work we have shown that increased free time immediately following memory presentation allows for increased working memory consolidation which, in turn, increases the robustness of the memory trace against decay (Ricker & Cowan, 2013).

It appears that the existence of decay is a question so basic to the function of human memory that extraordinary evidence is needed to reach a decisive conclusion upon which a diversity of researchers can agree. Time will tell! What is beyond question is that the way in which Brown (1958) posed the question and discussed it has inspired years of fruitful research into the nature of human memory.

Acknowledgments

Funding for this project was provided by NICHD Grant #2R01HD021338 to Nelson Cowan and by Swiss National Science Foundation Grant # PA00P1_139604 to Evie Vergauwe. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

References

  • Atkinson RC, Shiffrin RM. Human memory: A proposed system and its control processes. In: Spence KW, Spence JT, editors. The psychology of learning and motivation: Advances in research and theory. Vol. 2. New York: Academic Press; 1968. pp. 89–195. [Google Scholar]
  • Baddeley AD. Working memory. Oxford University Press; New York: 1986. [Google Scholar]
  • Baddeley A. The episodic buffer: A new component of working memory? Trends in Cognitive Sciences. 2000;4:417–423. doi: 10.1016/S1364-6613(00)01538-2. [PubMed] [CrossRef] [Google Scholar]
  • Baddeley AD, Hitch G. Working memory. In: Bower GH, editor. The psychology of learning and motivation. Vol. 8. New York, NY: Academic Press; 1974. pp. 47–89. [Google Scholar]
  • Baddeley AD, Logie RH. Working memory: The multiplecomponent model. In: Miyake A, Shah P, editors. Models of working memory: Mechanisms of active maintenance and executive control. Cambridge, England: Cambridge University Press; 1999. pp. 28–61. [Google Scholar]
  • Baddeley AD, Scott D. Short-term forgetting in the absence of proactive inhibition. Quarterly Journal of Experimental Psychology. 1971;23:275–283. doi: 10.1080/14640746908401822. [CrossRef] [Google Scholar]
  • Baddeley AD, Thomson N, Buchanan M. Word length and the structure of short-term memory. Journal of Verbal Learning and Verbal Behavior. 1975;14:575–589. doi: 10.1016/S0022-5371(75)80045-4. [CrossRef] [Google Scholar]
  • Baddeley AD, Warrington EK. Amnesia and the distinction between long- and short-term memory. Journal of Verbal Learning and Verbal Behavior. 1970;9:176–189. doi: 10.1016/S0022-5371(70)80048-2. [CrossRef] [Google Scholar]
  • Ballard PB. Obliviscence and Reminiscence. British Journal of Psychology. 1913;1:82. [Google Scholar]
  • Barrouillet P, Bernardin S, Camos V. Time constraints and resource sharing in adults’ working memory spans. Journal of Experimental Psychology: General. 2004;133:83–100. doi: 10.1037/0096-3445.133.1.83. [PubMed] [CrossRef] [Google Scholar]
  • Barrouillet P, Bernardin S, Portrat S, Vergauwe E, Camos V. Time and cognitive load in working memory. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2007;33:570–585. doi: 10.1037/0278-7393.33.3.570. [PubMed] [CrossRef] [Google Scholar]
  • Barrouillet P, Portrat S, Camos V. On the law relating processing to storage in working memory. Psychological Review. 2011;118:175–192. doi: 10.1037/a0022324. [PubMed] [CrossRef] [Google Scholar]
  • Barrouillet P, Camos V. As time goes by: Temporal constraints in working memory. Current Directions in Psychological Science. 2012;21:413–419. doi: 10.1177/0963721412459513. [CrossRef] [Google Scholar]
  • Barrouillet P, De Paepe A, Langerock N. Time causes forgetting from working memory. Psychonomic Bulletin & Review. 2012;19:87–92. doi: 10.3758/s13423-011-0192-8. [PubMed] [CrossRef] [Google Scholar]
  • Barrouillet P, Gavens N, Vergauwe E, Gaillard V, Camos V. Working memory span development: A Time-Based Resource-Sharing Model account. Developmental Psychology. 2009;45:477–490. doi: 10.1037/a0014615. [PubMed] [CrossRef] [Google Scholar]
  • Barrouillet P, Portrat S, Vergauwe E, Diependaele K, Camos V. Further evidence for temporal decay in working memory. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2011;37:1302–1317. doi: 10.1037/a0022933. [PubMed] [CrossRef] [Google Scholar]
  • Basso A, Spinnler H, Vallar G, Zanobio M. Left hemisphere damage and selective impairment of auditory verbal short-term memory. A case study. Neuropsychologia. 1982;20:263–274. doi: 10.1016/0028-3932(82)90101-4. [PubMed] [CrossRef] [Google Scholar]
  • Broadbent DE. Immediate memory and simultaneous stimuli. Quarterly Journal of Experimental Psychology. 1957;9:1–11. doi: 10.1080/17470215708416214. [CrossRef] [Google Scholar]
  • Broadbent DE. Perception and communication. New York: Pergamon Press; 1958. [Google Scholar]
  • Brown GDA, Neath I, Chater N. A temporal ratio model of memory. Psychological Review. 2007;114:539–576. doi: 10.1037/0033-295X.114.3.539. [PubMed] [CrossRef] [Google Scholar]
  • Brown J. Distortions in immediate memory. Quarterly Journal of Experimental Psychology. 1956;8:134–139. doi: 10.1080/17470215608416812. [CrossRef] [Google Scholar]
  • Brown J. Some tests of the decay theory of immediate memory. The Quarterly Journal of Experimental Psychology. 1958;10:12–21. doi: 10.1080/17470215808416249. [CrossRef] [Google Scholar]
  • Burgess N, Hitch GJ. Memory for serial order: A network model of the phonological loop and its timing. Psychological Review. 1999;106:551–581. doi: 10.1037/0033-295X.106.3.551. [CrossRef] [Google Scholar]
  • Camos V, Barrouillet P. Developmental change in working memory strategies: From passive maintenance to active refreshing. Developmental Psychology. 2011;47:898–904. doi: 10.1037/a0023193. [PubMed] [CrossRef] [Google Scholar]
  • Camos V, Lagner P, Barrouillet P. Two maintenance mechanisms of verbal information in working memory. Journal of Memory and Language. 2009;61:457–469. doi: 10.1016/j.jml.2009.06.002. [CrossRef] [Google Scholar]
  • Caplan D, Rochon E, Waters GS. Articulatory and phonological determinants of word length effects in span tasks. The Quarterly Journal of Experimental Psychology. 1992;45:177–192. doi: 10.1080/14640749208401323. [PubMed] [CrossRef] [Google Scholar]
  • Cheng NY. Retroactive effect and degree of similarity. Journal of Experimental Psychology. 1929;12:444–449. doi: 10.1037/h0071397. [CrossRef] [Google Scholar]
  • Conrad R. Decay theory of immediate memory. Nature. 1957;179:831–832. doi: 10.1038/179831b0. [PubMed] [CrossRef] [Google Scholar]
  • Conrad R. Acoustic confusion in immediate memory. British Journal of Psychology. 1964;55:75–84. doi: 10.1111/j.2044-8295.1964.tb00899.x. [CrossRef] [Google Scholar]
  • Cowan N. Evolving conceptions of memory storage, selective attention, and their mutual constraints within the human information processing system. Psychological Bulletin. 1988;104:163–191. doi: 10.1037/0033-2909.104.2.163. [PubMed] [CrossRef] [Google Scholar]
  • Cowan N. Attention and memory: An integrated framework. Oxford, England: Oxford University Press; 1995. [Google Scholar]
  • Cowan N. An embedded-processes model of working memory. In: Miyake A, Shah P, editors. Models of working memory: Mechanisms of active maintenance and executive control. Cambridge, U.K.: Cambridge University Press; 1999. pp. 62–101. [Google Scholar]
  • Cowan N. The magic number 4 in short-term memory: A reconsideration of mental storage capacity. Behavioral and Brain Sciences. 2001;24:87–114. doi: 10.1017/S0140525X01003922. [PubMed] [CrossRef] [Google Scholar]
  • Cowan N, AuBuchon AM. Short-term memory loss over time without retroactive stimulus interference. Psychonomic Bulletin & Review. 2008;15(1):230–235. doi: 10.3758/PBR.15.1.230. [PMC free article] [PubMed] [CrossRef] [Google Scholar]
  • Cowan N, Day L, Saults JS, Keller TA, Johnson T, Flores L. The role of verbal output time in the effects of word length on immediate memory. Journal of Memory and Language. 1992;31:1–17. doi: 10.1016/0749-596X(92)90002-F. [CrossRef] [Google Scholar]
  • Cowan N, Nugent LD, Elliott EM, Geer T. Is there a temporal basis of the word length effect? A response to Service (1998) Quarterly Journal of Experimental Psychology. 2000;53A:647–660. doi: 10.1080/713755905. [PubMed] [CrossRef] [Google Scholar]
  • Cowan N, Saults JS, Nugent LD. The role of absolute and relative amounts of time in forgetting within immediate memory: The case of tone pitch comparisons. Psychonomic Bulletin & Review. 1997;4:393–397. doi: 10.3758/BF03210799. [CrossRef] [Google Scholar]
  • Craik FI, Tulving E. Depth of processing and the retention of words in episodic memory. Journal of Experimental Psychology: General. 1975;104:268–294. doi: 10.1037/0096-3445.104.3.268. [CrossRef] [Google Scholar]
  • De Renzi E, Nichelli P. Verbal and non-verbal short-term memory impairment following hemispheric damage. Cortex. 1975;11:341–354. [PubMed] [Google Scholar]
  • Farmer EW, Berman JV, Fletcher YL. Evidence for a visuo-spatial scratch-pad in working memory. The Quarterly Journal of Experimental Psychology. 1986;38:675–688. doi: 10.1080/14640748608401620. [CrossRef] [Google Scholar]
  • Farrell S, Lewandowsky S. An endogenous distributed model of ordering in serial recall. Psychonomic Bulletin & Review. 2002;9:59–79. doi: 10.3758/BF03196257. [PubMed] [CrossRef] [Google Scholar]
  • Greenberg R, Underwood BJ. Retention as a function of stage of practice. Journal of Experimental Psychology. 1950;40:452–457. doi: 10.1037/h0062147. [PubMed] [CrossRef] [Google Scholar]
  • Guilford JP, Park DG. The effect of interpolated weights upon comparative judgments. The American Journal of Psychology. 1931;43:589–599. doi: 10.2307/1415160. [CrossRef] [Google Scholar]
  • Henson RNA. Short-term memory for serial order: The start-end model. Cognitive Psychology. 1998;36:73–137. doi: 10.1006/cogp.1998.0685. [PubMed] [CrossRef] [Google Scholar]
  • Hitch GJ, Towse JN, Hutton U. What limits children’s working memory span? Theoretical accounts and applications for scholastic development. Journal of Experimental Psychology: General. 2001;130:184–198. doi: 10.1037/0096-3445.130.2.184. [PubMed] [CrossRef] [Google Scholar]
  • Hudjetz A, Oberauer K. The effects of processing time and processing rate on forgetting in working memory: Testing four models of the complex span paradigm. Memory & Cognition. 2007;35:1675–1684. doi: 10.3758/BF03193501. [PubMed] [CrossRef] [Google Scholar]
  • James W. The principles of psychology. NY: Henry Holt; 1890. [Google Scholar]
  • Jenkins JG, Dallenbach KM. Obliviscence during sleep and waking. The American Journal of Psychology. 1924;35:605–612. doi: 10.2307/1414040. [CrossRef] [Google Scholar]
  • Jolicoeur P, Dell’Acqua R. The demonstration of short-term consolidation. Cognitive Psychology. 1998;36:138–202. doi: 10.1006/cogp.1998.0684. [PubMed] [CrossRef] [Google Scholar]
  • Kane MJ, Poole BJ, Tuholski SW, Engle RW. Working memory capacity and the top-down control of visual search: Exploring the boundaries of "executive attention". Journal of Experimental Psychology: Learning, Memory, and Cognition. 2006;32:749–777. doi: 10.1037/0278-7393.32.4.749. [PubMed] [CrossRef] [Google Scholar]
  • Keppel G, Underwood BJ. Proactive inhibition in short term retention of single items. Journal of Verbal Learning and Verbal Behavior. 1962;1:153–161. doi: 10.1016/S0022-5371(62)80023-1. [CrossRef] [Google Scholar]
  • Keller TA, Cowan N, Saults JS. Can auditory memory for tone pitch be rehearsed? Journal of Experimental Psychology: Learning, Memory, and Cognition. 1995;21:635–645. doi: 10.1037/0278-7393.21.3.635. [PubMed] [CrossRef] [Google Scholar]
  • Lewandowsky S, Duncan M, Brown GDA. Time does not cause forgetting in short term serial recall. Psychonomic Bulletin & Review. 2004;11:771–790. doi: 10.3758/BF03196705. [PubMed] [CrossRef] [Google Scholar]
  • Lewandowsky S, Oberauer K. The word-length effect provides no evidence for decay in short-term memory. Psychonomic Bulletin & Review. 2008;15(5):875–888. doi: 10.3758/PBR.15.5.875. [PubMed] [CrossRef] [Google Scholar]
  • Lewandowsky S, Oberauer K. No evidence for temporal decay in working memory. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2009;35:1545–1551. doi: 10.1037/a0017010. [PubMed] [CrossRef] [Google Scholar]
  • Lewandowsky S, Oberauer K, Brown GDA. No temporal decay in verbal short term memory. Trends in Cognitive Science. 2009;13:120–126. doi: 10.1016/j.tics.2008.12.003. [PubMed] [CrossRef] [Google Scholar]
  • Lisman JE, Idiart MA. Storage of 7+/-2 short-term memories in oscillatory subcycles. Science. 1995;267(5203):1512–1515. doi: 10.1126/science.7878473. [PubMed] [CrossRef] [Google Scholar]
  • Lisman JE, Jensen O. The theta-gamma neural code. Neuron. 2013;77(6):1002–1016. doi: 10.1016/j.neuron.2013.03.007. [PMC free article] [PubMed] [CrossRef] [Google Scholar]
  • Logie RH. Visuo-spatial processing in working memory. The Quarterly Journal of Experimental Psychology. 1986;38:229–247. [PubMed] [Google Scholar]
  • Logie RH. Working memory. In: Bayne T, Cleeremans A, Wilken P, editors. The Oxford companion to consciousness. Oxford, U.K.: 2009. pp. 667–670. [Google Scholar]
  • Logie RH, Zucco GM, Baddeley AD. Interference with visual short-term memory. Acta Psychologica. 1990;75:55–74. doi: 10.1016/0001-6918(90)90066-O. [PubMed] [CrossRef] [Google Scholar]
  • Lovatt P, Avons SE, Masterson J. The word-length effect and disyllabic words. The Quarterly Journal of Experimental Psychology. 2000;53:1–22. doi: 10.1080/713755877. [PubMed] [CrossRef] [Google Scholar]
  • McGeoch JA. Forgetting and the law of disuse. Psychological Review. 1932;39:352–370. doi: 10.1037/h0069819. [CrossRef] [Google Scholar]
  • McKeown D, Mercer T. Short-term forgetting without interference. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2012 doi: 10.1037/a0027749. Advance online publication. [PubMed] [CrossRef] [Google Scholar]
  • Melton AW. Implications of short-term memory for a general theory of memory. Journal of Verbal Learning and Verbal Behavior. 1963;2:1–21. doi: 10.1016/S0022-5371(63)80063-8. [CrossRef] [Google Scholar]
  • Miller GA. The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychological Review. 1956;63:81–97. doi: 10.1037/h0043158. [PubMed] [CrossRef] [Google Scholar]
  • Morey CC, Bieler M. Visual short-term memory always requires general attention. Psychonomic Bulletin & Review. 2013;20:163–170. doi: 10.3758/s13423-012-0312-z. [PubMed] [CrossRef] [Google Scholar]
  • Morey CC, Cowan N. When visual and verbal memories compete: Evidence of cross-domain limits in working memory. Psychonomic Bulletin & Review. 2004;11:296–301. doi: 10.3758/BF03196573. [PubMed] [CrossRef] [Google Scholar]
  • Morey CC, Cowan N. When do visual and verbal memories conflict? The importance of working-memory load and retrieval. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2005;31:703–713. doi: 10.1037/0278-7393.31.4.703. [PMC free article] [PubMed] [CrossRef] [Google Scholar]
  • Mueller ST, Seymour TL, Kieras DE, Meyer DE. Theoretical implications of articulatory duration, phonological similarity and phonological complexity in verbal working memory. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2003;29:1353–1380. doi: 10.1037/0278-7393.29.6.1353. [PubMed] [CrossRef] [Google Scholar]
  • Murdock BB. The retention of individual items. Journal of Experimental Psychology. 1961;62:618–625. doi: 10.1037/h0043657. [PubMed] [CrossRef] [Google Scholar]
  • Nairne JS. A feature model of immediate memory. Memory & Cognition. 1990;18:251–269. doi: 10.3758/BF03213879. [PubMed] [CrossRef] [Google Scholar]
  • Nairne JS. Remembering over the short term: the case against the standard model. Annual Review of Psychology. 2002;53:53–81. doi: 10.1146/annurev.psych.53.100901.135131. [PubMed] [CrossRef] [Google Scholar]
  • Oberauer K, Kliegl R. A formal model of capacity limits in working memory. Journal of Memory and Language. 2006;55:601–626. doi: 10.1016/j.jml.2006.08.009. [CrossRef] [Google Scholar]
  • Oberauer K, Lewandowsky S. Forgetting in immediate serial recall: Decay, temporal distinctiveness, or interference? Psychological Review. 2008;115:544–576. doi: 10.1037/0033-295X.115.3.544. [PubMed] [CrossRef] [Google Scholar]
  • Oberauer K, Lewandowsky S. Modeling working memory: A computational implementation of the Time-Based Resource-Sharing theory. Psychonomic Bulletin & Review. 2011;18:10–45. doi: 10.3758/s13423-010-0020-6. [PubMed] [CrossRef] [Google Scholar]
  • Oberauer K, Lewandowsky S. Evidence against decay in verbal working memory. Journal of Experimental Psychology: General. 2013;142:380–411. doi: 10.1037/a0029588. [PubMed] [CrossRef] [Google Scholar]
  • Oberauer K, Lewandowsky S, Farrell S, Jarrold C, Greaves M. Modeling working memory: An interference model of complex span. Psychonomic Bulletin & Review. 2012;19:779–819. doi: 10.3758/s13423-012-0272-4. [PubMed] [CrossRef] [Google Scholar]
  • Page MPA, Norris DG. The primacy model: A new model of immediate serial recall. Psychological Review. 1998;105:761–781. doi: 10.1037/0033-295X.105.4.761-781. [PubMed] [CrossRef] [Google Scholar]
  • Penney CG. Modality effects and the structure of short term verbal memory. Memory & Cognition. 1989;17:398–422. [PubMed] [Google Scholar]
  • Peterson L, Peterson MJ. Short-term retention of individual verbal items. Journal of Experimental Psychology. 1959;58:193–198. doi: 10.1037/h0049234. [PubMed] [CrossRef] [Google Scholar]
  • Pratt CC. The law of disuse. Psychological Review. 1936;43:83–93. doi: 10.1037/h0059321. [CrossRef] [Google Scholar]
  • Portrat S, Barrouillet P, Camos V. Time-related decay or interference-based forgetting in working memory? Journal of Experimental Psychology: Learning, Memory, and Cognition. 2008;34:1561–1564. doi: 10.1037/a0013356. [PubMed] [CrossRef] [Google Scholar]
  • Repovš G, Baddeley A. The multi-component model of working memory: explorations in experimental cognitive psychology. Neuroscience. 2006;139:5–21. doi: 10.1016/j.neuroscience.2005.12.061. [PubMed] [CrossRef] [Google Scholar]
  • Ricker TJ, Cowan N. Loss of visual working memory within seconds: The combined use of refreshable and non-refreshable features. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2010;36:1355–1368. doi: 10.1037/a0020356. [PMC free article] [PubMed] [CrossRef] [Google Scholar]
  • Ricker TJ, Cowan N. Differences in presentation methods in working memory procedures: A matter of working memory consolidation. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2013 online in advance of print. [PMC free article] [PubMed] [Google Scholar]
  • Ricker TJ, Cowan N, Morey CC. Visual working memory is disrupted by covert verbal retrieval. Psychonomic Bulletin & Review. 2010;17:516–521. doi: 10.3758/PBR.17.4.516. [PMC free article] [PubMed] [CrossRef] [Google Scholar]
  • Ricker TJ, Spiegel LR, Cowan N. Time-based loss in short-term visual memory is not from temporal distinctiveness. submitted. [PMC free article] [PubMed] [Google Scholar]
  • Robinson ES. The ’similarity’ factor in retroaction. The American Journal of Psychology. 1927;39:297–312. doi: 10.2307/1415419. [CrossRef] [Google Scholar]
  • Service E. The effect of word length on immediate serial recall depends on phonological complexity, not articulatory duration. Quarterly Journal of Experimental Psychology. 1998;51A:283–304. doi: 10.1080/713755759. [CrossRef] [Google Scholar]
  • Sokolov EN. Perception and the conditioned reflex. NY: Pergamon Press; 1963. [Google Scholar]
  • Sperling G. The information available in brief visual presentations. Psychological Monographs. 1960;74(Whole No. 498) [Google Scholar]
  • Saults JS, Cowan N. The development of memory for ignored speech. Journal of Experimental Child Psychology. 1996;63:239–261. doi: 10.1006/jecp.1996.0049. [PubMed] [CrossRef] [Google Scholar]
  • Thorndike EL. Educational psychology: The psychology of learning. Teachers College, Columbia University; 1913. [Google Scholar]
  • Towse JN, Hitch GJ, Hutton U. The resource king is dead! Long live the resource king! Behavioral and Brain Sciences. 1999;22:111. doi: 10.1017/S0140525X99401785. [CrossRef] [Google Scholar]
  • Turvey MT. On peripheral and central processes in vision: inferences from an information processing analysis of masking with patterned stimuli. Psychological Review. 1973;80:1–52. doi: 10.1080/00335557043000078. [PubMed] [CrossRef] [Google Scholar]
  • Underwood BJ. Interference and forgetting. Psychological Review. 1957;64:49–60. doi: 10.1037/h0044616. [PubMed] [CrossRef] [Google Scholar]
  • Vallar G, Baddeley AD. Fractionation of working memory - neuropsychological evidence for a phonological short-term store. Journal of Verbal Learning and Verbal Behavior. 1984a;23:151–161. doi: 10.1016/S0022-5371(84)90104-X. [CrossRef] [Google Scholar]
  • Vallar G, Baddeley AD. Phonological short-term store, phonological processing and sentence comprehension - a neuropsychological case-study. Cognitive Neuropsychology. 1984b;1:121–141. doi: 10.1080/02643298408252018. [CrossRef] [Google Scholar]
  • Vergauwe E, Barrouillet P, Camos V. Visual and spatial working memory are not that dissociated after all: A time-based resource-sharing account. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2009;35:1012–1028. doi: 10.1037/a0015859. [PubMed] [CrossRef] [Google Scholar]
  • Vergauwe E, Barrouillet P, Camos V. Do mental processes share a domain-general resource? Psychological Science. 2010;21:384–390. doi: 10.1177/0956797610361340. [PubMed] [CrossRef] [Google Scholar]
  • Vergauwe E, Dewaele N, Langerock N, Barrouillet P. Evidence for a central pool of general resources in working memory. Journal of Cognitive Psychology. 2012;24:359–366. doi: 10.1080/20445911.2011.640625. [CrossRef] [Google Scholar]
  • Vogel EK, Woodman GF, Luck SJ. The time course of consolidation in visual working memory. Journal of Experimental Psychology: Human Perception and Performance. 2006;32:1436–1451. doi: 10.1037/0096-1523.32.6.1436. [PubMed] [CrossRef] [Google Scholar]
  • Waugh NC, Norman DA. Primary memory. Psychological Review. 1965;72:89–104. doi: 10.1037/h0021797. [PubMed] [CrossRef] [Google Scholar]
  • Winkler I, Schröger E, Cowan N. The role of large-scale memory organization in the mismatch negativity event-related brain potential. Journal of Cognitive Neuroscience. 2001;13:59–71. doi: 10.1162/089892901564171. [PubMed] [CrossRef] [Google Scholar]
  • Williams O. A study of the phenomenon of reminiscence. Journal of Experimental Psychology. 1926;9:368–387. doi: 10.1037/h0069781. [CrossRef] [Google Scholar]
  • Woodman GF, Vogel EK, Luck SJ. Visual search remains efficient when visual working memory is full. Psychological Science. 2001;12:219–224. doi: 10.1111/1467-9280.00339. [PubMed] [CrossRef] [Google Scholar]
  • Woodman GF, Vogel EK, Luck SJ. Flexibility in visual working memory: Accurate change detection in the face of irrelevant variations in position. Visual Cognition. 2012;20:1–28. doi: 10.1080/13506285.2011.630694. [PMC free article] [PubMed] [CrossRef] [Google Scholar]
  • Yum KS. An experimental test of the law of assimilation. Journal of Experimental Psychology. 1931;14:68–82. doi: 10.1037/h0071335. [CrossRef] [Google Scholar]
  • Zhang W, Luck SJ. Sudden death and gradual decay in visual working memory. Psychological Science. 2009;20:423–428. doi: 10.1111/j.1467-9280.2009.02322. [PMC free article] [PubMed] [CrossRef] [Google Scholar]
