Handbook of Research Methods in Human Memory, Hajime Otani and Bennett L. Schwartz (Eds.), Routledge, 2019 (scan, 497 pages)


HANDBOOK OF RESEARCH
METHODS IN
HUMAN MEMORY

The Handbook of Research Methods in Human Memory presents a collection of chapters on
methodology used by researchers in investigating human memory. Understanding the basic
cognitive function of human memory is critical in a wide variety of fields, such as clinical
psychology, developmental psychology, education, neuroscience, and gerontology, and studying
memory has become particularly urgent in recent years due to the prominence of a number
of neurodegenerative diseases, such as Alzheimer’s. However, choosing the most appropriate
method of research is a daunting task for most scholars. This book explores the methods that are
currently available in various areas of human memory research and serves as a reference manual to
help guide readers’ own research. Each chapter is written by prominent researchers and features
cutting-edge research on human memory and cognition, with topics ranging from basic memory processes to cognitive neuroscience to further applications. The focus here is not on the “what,” but the “how”—how research is best conducted on human memory.
Hajime Otani is a professor of psychology at Central Michigan University. His current research
focuses on emotion and memory.
Bennett L. Schwartz is a professor of psychology at Florida International University. He conducts research on memory and metamemory. He is currently Editor-in-Chief of New Ideas in Psychology.



HANDBOOK OF RESEARCH
METHODS IN
HUMAN MEMORY

Edited by Hajime Otani and Bennett L. Schwartz



First published 2019
by Routledge
711 Third Avenue, New York, NY 10017
and by Routledge
2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2019 Taylor & Francis
The right of Hajime Otani and Bennett L. Schwartz to be identified as
the authors of the editorial material, and of the authors for their individual
chapters, has been asserted in accordance with sections 77 and 78 of the
Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or
utilised in any form or by any electronic, mechanical, or other means, now
known or hereafter invented, including photocopying and recording, or in
any information storage or retrieval system, without permission in writing
from the publishers.
Trademark notice: Product or corporate names may be trademarks or
registered trademarks, and are used only for identification and explanation
without intent to infringe.
Library of Congress Cataloging-in-Publication Data
A catalog record for this title has been requested
ISBN: 978-1-138-21794-2 (hbk)
ISBN: 978-1-138-21795-9 (pbk)
ISBN: 978-0-429-43995-7 (ebk)
Typeset in Bembo
by Apex CoVantage, LLC


To our fathers
Yo Otani and Foster Schwartz




CONTENTS

List of Contributors x
Foreword xiii
Preface xv

 1 History of Methods in Memory Science: From Ebbinghaus to fMRI
   Hajime Otani, Bennett L. Schwartz, and Abby R. Knoll 1

 2 Dependent Measures in Memory Research: From Free Recall to Recognition
   Anne M. Cleary 19

 3 Measures of Forgetting
   Benjamin C. Storm 36

 4 Accuracy and Bias in Episodic Memory
   Aysecan Boduroglu and Aycan Kapucu 50

 5 Response Time Measures in Memory Research
   Motonori Yamaguchi and Richard Schweickert 67

 6 Methods of Studying Working Memory
   Zach Shipstead and Ashley Nespodzany 84

 7 Methods of Studying Text: Memory, Comprehension, and Learning
   Kathryn S. McCarthy, Kristopher J. Kopp, Laura K. Allen, and Danielle S. McNamara 104

 8 The Methodology of Metamemory and Metacomprehension
   Deborah K. Eakin and Jarrod Moss 125

 9 Research Methods for Studying the Emotion-Memory Relationship
   Hajime Otani, Terry M. Libkuman, Abby R. Knoll, and Cody J. Hensley 154

10 Methods for Studying Memory Differences Between Young and Older Adults
   Aslı Kılıç and Amy H. Criss 178

11 Discovering Memory: Methods in the Study of Memory Development
   P. Douglas Sellers II and Karin Machluf 192

12 Assessing Autobiographical Memory Disruptions in Psychiatric Populations
   Laura Jobson 205

13 Methods of Studying Memory Without Awareness
   Neil W. Mulligan 222

14 Methods of Studying False Memory
   Henry Otgaar, Sanne T. L. Houben, and Mark L. Howe 238

15 Methods of Studying Eyewitness Memory
   Nadja Schreiber Compo, Jonathan Vallano, Jillian Rivard, Angelica Hagsand, Michelle Pena, and Christopher Altman 253

16 The Assessment of Autobiographical Memory: An Overview of Behavioral Methods
   Adam R. Congleton and Dorthe Berntsen 267

17 Methods of Studying Prospective Memory
   Melissa J. Guynn, Gilles O. Einstein, and Mark A. McDaniel 284

18 Face Memory
   Karen Lander and Vicki Bruce 313

19 Challenges in Music Memory Research
   Zehra F. Peynircioğlu, Esra Mungan, and Bennett L. Schwartz 330

20 A User’s Guide to Collecting Data Online
   Kalif E. Vaughn, Jeremy Cone, and Nate Kornell 354

21 Neuropsychological Methods in Memory Research
   Kata Pauly-Takacs, Celine Souchay, Alastair D. Smith, and Chris J. A. Moulin 374

22 Applications of Functional MRI in Memory Research
   Joey Ka-Yee Essoe and Jesse Rissman 397

23 From the Laboratory to the Classroom: Challenges and Solutions for Conducting Memory Research in Educational Contexts
   John Dunlosky, Kayla Morehead, Amanda Zamary, and Katherine A. Rawson 428

24 Methods of Studying Individual Differences in Memory
   Kimberly M. Wingert and Gene A. Brewer 443

Index 459


CONTRIBUTORS

Laura K. Allen, Mississippi State University, USA
Christopher Altman, Florida International University, USA

Dorthe Berntsen, Aarhus University, Denmark
Gene A. Brewer, Arizona State University, USA
Aysecan Boduroglu, Boğaziçi University, Turkey
Vicki Bruce, Newcastle University, UK
Anne M. Cleary, Colorado State University, USA
Jeremy Cone, Williams College, USA
Adam R. Congleton, Aarhus University, Denmark
Nadja Schreiber Compo, Florida International University, USA
Amy H. Criss, Syracuse University, USA
John Dunlosky, Kent State University, USA
Deborah K. Eakin, Mississippi State University, USA
Gilles O. Einstein, Furman University, USA
Joey Ka-Yee Essoe, University of California, Los Angeles, USA
Melissa J. Guynn, New Mexico State University, USA
Angelica Hagsand, University of Gothenburg, Sweden
Cody J. Hensley, Central Michigan University, USA
Sanne T. L. Houben, Maastricht University, The Netherlands
Mark L. Howe, Maastricht University, The Netherlands, and City, University of London, UK
Laura Jobson, Monash University, Australia

Aycan Kapucu, Ege University, Turkey
Aslı Kılıç, Middle East Technical University, Turkey
Abby R. Knoll, Central Michigan University, USA
Kristopher J. Kopp, Arizona State University, USA
Nate Kornell, Williams College, USA

Karen Lander, University of Manchester, UK
Terry M. Libkuman, Central Michigan University, USA
Karin Machluf, The Pennsylvania State University, USA
Kathryn S. McCarthy, Georgia State University, USA
Mark A. McDaniel, Washington University, USA
Danielle S. McNamara, Arizona State University, USA
Kayla Morehead, Kent State University, USA
Jarrod Moss, Mississippi State University, USA
Chris J. A. Moulin, Université Grenoble Alpes, France
Neil W. Mulligan, University of North Carolina at Chapel Hill, USA
Esra Mungan, Boğaziçi University, Turkey
Ashley Nespodzany, Arizona State University, USA
Hajime Otani, Central Michigan University, USA
Henry Otgaar, Maastricht University, The Netherlands, and City, University of London, UK
Kata Pauly-Takacs, Leeds Beckett University, UK
Michelle Pena, Florida International University, USA
Zehra F. Peynircioğlu, American University, USA
Katherine A. Rawson, Kent State University, USA
Jesse Rissman, University of California, Los Angeles, USA
Jillian Rivard, Barry University, USA
Bennett L. Schwartz, Florida International University, USA
Richard Schweickert, Purdue University, USA
P. Douglas Sellers II, The Pennsylvania State University, USA
Zach Shipstead, Alma College, USA
Alastair D. Smith, Plymouth University, UK
Celine Souchay, Université Grenoble Alpes, France
Benjamin C. Storm, University of California, Santa Cruz, USA

Jonathan Vallano, University of Pittsburgh, USA
Kalif E. Vaughn, Northern Kentucky University, USA
Kimberly M. Wingert, Arizona State University, USA
Motonori Yamaguchi, Edge Hill University, UK
Amanda Zamary, Kent State University, USA



FOREWORD
Finding Out How Our Memories Work Versus
How We Think Our Memories Work

In a chapter that Elizabeth Bjork and I wrote for a festschrift honoring William K. Estes, we discussed what we referred to as “important peculiarities” of human memory (pp. 36–41; Bjork & Bjork, 1992). We argued that certain characteristics of human memory are peculiar because they differ so fundamentally from the corresponding characteristics of manufactured memory devices—such as a compact disk or the memory in a computer—and we argued that such characteristics are important because optimizing one’s own learning, or one’s children’s or students’ learning, requires understanding the unique functional architecture of human memory.
This handbook, in addition to providing a survey of the current array of methods that researchers
use to explore the complexities and mysteries of how our memories work (or fail to work), provides
a picture of the complexities (and important peculiarities) of human memory. Collectively, and via
the editors’ opening summary of the 133-year history of controlled research on human memory,
this handbook summarizes the progress, pitfalls, and evolving methodologies that have characterized
research on the complexities and idiosyncrasies of the human memory system.
Individually, each chapter comprises an important resource for researchers and practitioners who
have a particular research interest, but the book as a whole is indeed a “handbook” from multiple perspectives. There are chapters that focus on the basic encoding, retention, and competitive dynamics that characterize human long-term memory—and on the role short-term/working memory plays in the functioning of our memories. In addition, there are chapters that summarize
the range of methodologies now being used to examine the behavioral and brain dynamics of more
specialized topics, such as how our memories change across the life span, how our memories malfunction during amnesia or when we suffer from psychiatric disorders, how emotion and memory
interact, and how some memories, such as autobiographical memories and memory for faces or
music, have special properties. Other chapters summarize research methods that let us explore metamemory processes, such as judgments of learning, and other methods that let us explore aspects of
memory functioning that are not accompanied by conscious awareness.
What also comes through in this handbook is the real-world importance of research on human memory—for optimizing education, self-regulated learning, eyewitness-testimony procedures, and treatment of memory disorders. Finally, this handbook also constitutes a kind of methodological toolkit for researchers. There are chapters on the proper uses of alternative measures of memory and forgetting, such as accuracy and reaction-time measures, and there is a chapter on the potential and the problematic aspects of examining memory processes via online experimentation.


In total, this handbook testifies to the importance of understanding how human memory works and to the complexity and the vitality of current research on human memory. I cannot help wondering what Hermann Ebbinghaus would think were he able to read this handbook today—133 years after he began exploring human memory by learning lists of nonsense syllables. I assume he would be amazed—but then again, given his insights and prescience—maybe not.
Robert A. Bjork

Reference
Bjork, R. A., & Bjork, E. L. (1992). A new theory of disuse and an old theory of stimulus fluctuation. In A.
Healy, S. Kosslyn, & R. Shiffrin (Eds.), From learning processes to cognitive processes: Essays in honor of William
K. Estes (Vol. 2, pp. 35–67). Hillsdale, NJ: Lawrence Erlbaum.




PREFACE

This book is about the methods that researchers use to investigate human memory. The story starts
a little over 100 years ago when Hermann Ebbinghaus began scientific research on human memory
using a modest methodological invention called nonsense syllables. Since then, memory science
has made tremendous advances in understanding the workings of memory. We argue that such advances would not have been possible without the creative methods researchers invented to uncover what some
psychologists many years ago thought was hopelessly beyond the reach of science. Of course, we
fully acknowledge that research should be dictated by theories rather than methodology. Nonetheless, without methodology, theories cannot be empirically tested. For example, educators often give
advice to students about how to optimize learning. However, without empirically testing these
ideas, we will never know whether the advice is actually effective. And, empirical testing requires
methodology.
How do scientists study human memory? At the beginning, methodology was simple; there were nonsense syllables (e.g., käb, Ebbinghaus) and paired associates (e.g., Wez—319, Mary Whiton Calkins). However, a shift from the behavioristic approach to the information-processing approach necessitated the development of other, more refined, methods, from implicit memory tests to source monitoring to feeling-of-knowing judgments. And more recently, ecological considerations have played a major role in expanding memory research methodology, such as diary recording and free narratives. Adding to this are advances in neuroscience technology such as EEG and fMRI.
With the proliferation of studies on human memory and the hundreds of thousands of papers currently available, going through the entire literature on human memory to determine the best methods for a study is too laborious for any researcher. Thus, for busy researchers and graduate students,
we thought a book cataloging the methods that are available would be tremendously useful. In fact,
such a book was published in 1982 by C. Richard Puff (Handbook of Research Methods in Human
Memory and Cognition, Academic Press). This book has been valuable to graduate students (including
one of us, Otani, when he was a graduate student many years ago) particularly because it emphasized
the “how to” of using these methods. Furthermore, even for seasoned researchers it is difficult to
know all the methods that are available, and therefore, this book was useful in introducing novel
approaches to researchers entrenched in their own familiar methodology.
Today, Puff’s handbook is still making valuable contributions; in fact, one of us (Otani) still assigns chapters from this book in his graduate seminar. However, this book needs to be updated as
new topics and methods have emerged since its publication. For this reason, we brought together
many of the most noted researchers in the field of human memory and asked them to describe in
detail the methodologies that they employ in investigating the myriad topics covered under the umbrella of human memory. As such, the Handbook of Research Methods in Human Memory covers such diverse topics as working memory (Chapter 6, Shipstead and Nespodzany), false memory (Chapter 14, Otgaar, Houben, and Howe), autobiographical memory in psychiatric populations (Chapter 12, Jobson), how to measure forgetting (Chapter 3, Storm), memory without awareness (Chapter 13, Mulligan), and how to collect data online for memory experiments (Chapter 20, Vaughn, Cone, and Kornell). The emphasis is on behavioral methods, although several of the chapters discuss neuroscientific approaches to memory (Chapter 21, Pauly-Takacs, Souchay, Smith, and Moulin; Chapter 22, Essoe and Rissman). Many deal with strictly laboratory science (Chapter 5, Yamaguchi and Schweickert), whereas others discuss methodologies used in more applied settings (Chapter 23, Dunlosky, Morehead, Zamary, and Rawson; Chapter 15, Schreiber Compo et al.). We think that these chapters will
be helpful to anyone interested in doing or simply understanding the science of memory research
as it is practiced today.
There are many people we need to thank for assistance in making this book become a reality. We
are particularly grateful to Paul Dukes for listening to our pitch and having the faith in us to take
this book to Routledge. We are also grateful to Marie Louise Roberts and Claudia Bona-Cohen at Routledge for their input on this book. We also thank Abby Knoll, a graduate
student, for lending her hand in reviewing some of the chapters. We, of course, thank our authors.
We also thank our families for being patient with all the time that we both needed to devote to these
chapters to make this book the best it can be.
We would also like to thank the late Dr. C. Richard Puff. His seminal book inspired us to take
on this book project. We hope that he would be pleased to see how much progress memory science
has made in methodology since the publication of his book in 1982.

Hajime Otani and Bennett L. Schwartz
January 17, 2018



1
HISTORY OF METHODS IN
MEMORY SCIENCE
From Ebbinghaus to fMRI
Hajime Otani, Bennett L. Schwartz, and Abby R. Knoll

Hermann Ebbinghaus once said that psychology is a discipline with a long past but a short history
(Shakow, 1930). It has a short history because psychology did not receive formal recognition as a
unified and independent discipline until Wilhelm Wundt established his laboratory in Leipzig, Germany in 1879 (Murray, 1983). A few years later in 1885, Ebbinghaus published his first book, On
Memory: A Contribution to Experimental Psychology, and introduced “an experimental and quantitative
approach” to investigate the “manifestations of memory” (Ebbinghaus, 1885/­2011, p. xiii; see also
Nicolas, 2006). From the very beginning, Ebbinghaus knew that his approach was a radical departure from the dominant, descriptive approach. In fact, in the preface of his book, he pleaded with
his readers to withhold judgment about the “practicability” of this approach. Since then, memory
science has flourished and knowledge has expanded; however, it is remarkable that all this was
accomplished within a span of a little over 100 years. By comparison, the natural sciences had a head
start by at least 100 years; for instance, in physics, Nicolaus Copernicus proposed the heliocentric
theory of the universe in 1543, and in chemistry, Antoine-Laurent de Lavoisier discovered the role
of oxygen in combustion in 1778.
Nevertheless, as Ebbinghaus said, psychology indeed has a long past, and the topic of human memory is no exception. According to Yates (1966), in antiquity, memory was considered a critical component of rhetoric, which was important for politics, religion, and art, for the reason that by training in the art of memory, orators could develop the skill to deliver a lengthy speech without making errors, not only to tell the truth but also to testify to the divinity of the soul. Three treatises became profoundly influential as textbooks that spread the teaching of the art of memory (Yates, 1966): Ad Herennium (anonymous, 86 to 82 BC), De Oratore (Cicero, 55 BC), and Institutio Oratoria (Quintilian, 1 AD). These texts
advocated the use of locations and imagery to create artificial memory, which are still mentioned in
psychology textbooks today as effective mnemonic techniques (e.g., the method of loci). Throughout
the history of the Western world, the art of memory was practiced by many scholars (Yates, 1966), and a number of these scholars made contributions to the understanding of memory, including Aristotle, Plato, Augustine, Aquinas, Da Vinci, and Francis Bacon (Hermann & Chaffin, 1987). According to Hermann and Chaffin (1987), the majority of these scholars took theoretical and/or pragmatic approaches, even though some, such as Aristotle and Plato, took an empirical/descriptive approach.
There is no doubt that the work of these scholars has provided insights and inspirations to the modern
scholars of memory. However, there is no denying that the modern scientific approach to memory
began with Ebbinghaus, who showed that methodology can be developed to bring the “realm of mental phenomena” (Ebbinghaus, 1885/­2011, p. xiii) under scientific scrutiny. In the present chapter, our
goal is to trace the development of research methods since Ebbinghaus to show that during the short
history, memory science has made impressive advances in methodology in search of answers to
increasingly sophisticated questions. However, our goal is not to create an exhaustive list of methods
that have been developed since 1885, but rather to present the highlights of the methods that became
familiar to many of us conducting empirical memory research.

Ebbinghaus: The Modest Beginning
Among his many contributions, Ebbinghaus became famous for developing nonsense syllables to
achieve simplicity and homogeneity in the materials that he used in his experiments. He was aware
that these materials were not free of variations; however, he thought that these syllables were better
suited for quantitative analysis of memory because, unlike poetry or prose, these syllables were less
susceptible to various influences (such as content and style), more plentiful, and easy to break down
into quantifiable pieces. Thus, he tried to achieve scientific rigor by simplifying the materials, and
by doing so, he established the list-­learning paradigm, which has been the workhorse of memory
research ever since. Furthermore, he learned lists of these syllables in the order of presentation, paced
by the stroke of a metronome or the ticking of a watch, a method referred to as serial learning. The use of nonsense syllables received accolades from Ebbinghaus’ contemporaries; for
example, Titchener said that the use of these syllables represented “the most considerable advance in
this chapter of psychology, since the time of Aristotle” (Titchener, 1910, pp. 380–381). However,
Ebbinghaus also used stanzas from Byron’s Don Juan as comparison materials in at least one of his
experiments, indicating that he was interested in connecting what he found with nonsense syllables with real-world memory phenomena, a fact often missed in textbook descriptions of Ebbinghaus’
contributions. He also invented a measure of memory, which he called a saving score, to quantify
how much faster he could learn a list when he relearned it (expressed in percent time saved) relative
to how long it took for him to learn the list for the first time. Although the saving score did not
become as popular as nonsense syllables as a method in human memory research, it was a sensitive
measure capable of revealing memory traces even after a 31-­day retention interval.
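To illustrate the arithmetic (the numbers and function name below are ours, not Ebbinghaus’s), a saving score can be computed from the time, or the number of trials, needed for original learning and for relearning:

```python
def saving_score(original_time: float, relearning_time: float) -> float:
    """Percent of the original learning time saved at relearning.

    Ebbinghaus expressed savings as the reduction in learning time on
    relearning, relative to the time needed for the original learning.
    """
    if original_time <= 0:
        raise ValueError("original learning time must be positive")
    return 100.0 * (original_time - relearning_time) / original_time

# A list first learned in 20 minutes and relearned in 13 minutes
# yields a saving score of 35 percent.
print(saving_score(20.0, 13.0))  # 35.0
```

The same formula works whether time or trials-to-criterion is used, as long as both phases are measured in the same unit.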
Using these methods, Ebbinghaus (1885/­2011) investigated issues that are still pertinent today:
the repetition effect, overlearning, massed versus distributed practice, forgetting, and remote associations. What was remarkable about Ebbinghaus was that he used himself as the sole participant
in his experiments, learning a total of 84,590 nonsense syllables in 6,609 lists during the approximately 832.6 hours he spent on testing (Hoffman, Bringmann, Bamberg, & Klein, 1987), a feat that
is unthinkable for any modern researcher to replicate. His experiments consisted of three phases,
which are familiar to any modern memory researcher: a learning phase, a retention interval, and a
recitation phase. Furthermore, he was careful about controlling extraneous variables, such as time
of day, fatigue, and any extreme changes in the “outer and inner life” (Ebbinghaus, 1885/­2011,
p. 26) that might have influenced the results of the experiments. In these experiments, he learned a
list to the criterion of perfect learning, and when he investigated the issue of retention, he relearned
the list to compute saving scores. It is important to note that despite his reputation as a pioneer of
the associationistic tradition in psychology, his research was not strictly focused on the nature of
associations (see Verhave & van Hoorn, 1987, for many misrepresentations about Ebbinghaus). In
fact, Ebbinghaus defined memory broadly as “Learning, Retention, Association and Reproduction”
(Ebbinghaus, 1885/­2011, p. xiii).

Post Ebbinghaus: Calkins, Behaviorists, and Bartlett
Newman (1987), who described the development that took place in memory research immediately after Ebbinghaus (1885/2011), said that between 1885 and 1905, 44 memory papers were published,
and of these, 11 referenced Ebbinghaus. However, Newman’s comment for the period between 1891 and 1895 is noteworthy; he said that he was surprised by “the variety of methods” (p. 81) that
appeared during this period, such as memory span, serial reconstruction, paired-­associate learning,
free recall, recognition, and card sorting (for studying transfer and retroactive interference). Thus,
soon after Ebbinghaus, some of the methods that we are familiar with in today’s laboratory were
already at work.
Mary Whiton Calkins (1894, 1896a, 1896b) developed the paired-­associate learning paradigm
to study the formation of associations, even though she did not name the method or provide the
rationale for the method (Madigan & O’Hara, 1992). In this paradigm, participants were presented with pairs of items during learning; memory for the associations was then tested by presenting the first item of each pair and asking participants to recall the second.
In her 1894 paper, Calkins described how she conducted experiments with ten participants at the Harvard University laboratory, with an average of 80 experiments per participant, and with 25 participants at Wellesley College, with an average of 16 experiments per participant. Apparently, Calkins followed Ebbinghaus’ example of extensive within-subjects investigation, except that she did not use
herself as the sole participant in her experiments. Furthermore, following Ebbinghaus, she adopted
the list-­learning paradigm by presenting lists of nonsense syllable-­digit pairs when she investigated
auditory memory and lists of color strip-­digit pairs when she investigated visual memory to study
the laws of associations: frequency, recency, primacy, and vividness. In her 1896a paper, she introduced two ways of visually presenting pairs of items, successive (that is, the first item is presented
first followed by the second item) and simultaneous (both the first and second items are presented
at the same time), with the latter becoming the standard way of studying verbal learning over many
subsequent years.
Despite the impressive series of experiments that she conducted, Calkins did not receive recognition for developing the paired-­associate paradigm (Madigan & O’Hara, 1992), even though
Titchener (1901) referenced her name and included her experiments as exercise experiments in his
instructor’s manual of laboratory practice (but not in the textbook itself). Indeed, Harvard University ultimately rejected a petition to grant her a doctoral degree, and when Radcliffe College offered
her a doctoral degree, Calkins refused to accept it; thus, she never was able to call herself Dr.
Calkins (Furumoto, 1979; Madigan & O’Hara, 1992). The paradigm, however, was adopted by
behaviorists, and according to Madigan and O’Hara (1992), Thorndike (1908) was the earliest to
use the term paired associate but also did not reference Calkins’ work. It is well-­known that behaviorally oriented researchers (see Kausler, 1966) made extensive use of the paired-­associate learning
paradigm because this paradigm was well suited for analysis within the framework of stimulus-­
response associations, particularly for the issues regarding transfer of learning and interference.
For instance, Osgood (1949) developed a model of transfer of training (the transfer and retroaction surface) based on a paradigm consisting of two lists of paired-associate items that shared or did not
share stimulus and response terms. This paradigm was later referred to as an A-­B, A-­C paradigm
because both lists shared the stimulus terms (A) but the response terms were different (B and C).
After learning both lists, participants were presented with the stimulus terms (A) and were asked to
reproduce the response terms from List 1 (B) to test for retroactive inhibition/interference and from
List 2 (C) to test for proactive inhibition/­interference (Crowder, 1976). According to Hintzman
(2011), the popularity of the paired-­associate paradigm peaked in the middle of the 1960s and has
been on a precipitous decline since, with only a small number of recent publications mentioning
the keywords paired associate. However, the demise of the paired-associate paradigm may have been exaggerated because the modern equivalent of this paradigm is the cued recall method, which
played a crucial role in studying phenomena, such as encoding specificity (Tulving & Thomson,
1973), the generation effect (Slamecka & Graf, 1978), and judgments of learning (Nelson & Dunlosky, 1991) during the 1970s, 1980s, 1990s, and beyond.

While verbal learning research inherited Ebbinghaus’ list-­learning tradition, another tradition
was emerging. Bartlett (1932) broke away from Ebbinghaus because using nonsense syllables in
his experiments led him to “disappointment” and “a growing dissatisfaction” due to the artificiality of the experiments. His remedy was to select the materials that people commonly dealt with in
their “daily activities” to make the experiments more “realistic” (Bartlett, 1932/1995, p. xvii). It is
well known that Bartlett used a prose passage, The War of the Ghosts, to investigate repeated recall
attempts over time. However, what is less known is that he used many other prose passages, such
as The Son Who Tried to Outwit His Father (see Appendix). Furthermore, he developed methods to
study picture memory, such as presenting simple line drawings to examine how the reproduction
of these drawings changed over time. Bartlett’s approach did not become influential, at least in the
United States, until the everyday memory movement began during the 1970s.

Emergence of Information Processing Approach in the 1960s
Toward the end of the 1950s, a sufficient number of psychologists became dissatisfied with the dominance of the behavioristic approach and began constructing a new approach based on an information-processing analogy. This so-called cognitive revolution resulted in a new set of questions that led to the development of new methods to answer these questions. Among the major developments during this era was the Brown-Peterson paradigm, developed to study short-term memory. The notion
that immediate memory is somehow different from long-­term memory was already suggested by
William James (1890), who described primary memory as different from secondary memory because we have conscious awareness of the former but not the latter. The distinction received
increased attention after Miller (1956) published his paper titled “The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information.” In this paper, he presented the capacity limitation in absolute judgments and immediate memory from the perspective
of information theory. However, what was important for memory researchers was that despite the
limitation, recoding of information could increase transmitted information by increasing the amount
of information per chunk. Miller lamented that experimental psychologists had little to say about the recoding phenomenon because it is “less accessible to experimental manipulation than nonsense syllables or T mazes” (Miller, 1956, p. 96).
Against this backdrop, Brown (1958) and Peterson and Peterson (1959) developed their methods
to investigate how the memory trace decayed immediately following its creation. It is interesting to
note that Brown (from the United Kingdom) said in his paper that he was inspired by a lecture given
by Bartlett in 1950, even though he also mentioned Hull (Hull et al., 1940), a famous behavioral
theorist, in the introduction section. Peterson and Peterson (from the United States) did not mention Bartlett but mentioned Hull (1952) and Underwood (1949). Although these researchers were
motivated by different perspectives, their aim was the same: to prevent participants from rehearsing following the presentation of a stimulus. Their approaches were similar: Brown presented a series of five digit pairs after presenting the stimuli (pairs of consonants), and Peterson and Peterson asked participants to count backwards by threes after presenting a trigram. The critical
part of both methods was that the distractor activity had to be intense enough to prevent rehearsal.
Brown accomplished this by presenting the digit pairs at a rate of one per 0.78 seconds, and Peterson
and Peterson asked participants to count the numbers at a rate of twice per second. Both methods
showed rapid forgetting immediately after a stimulus was presented, and based on the results, Brown
concluded that the memory trace decays rapidly without rehearsal, whereas in keeping with the
functional tradition, Peterson and Peterson carefully avoided a theoretical conclusion. The Brown-­
Peterson paradigm became important over the years because toward the end of the 1960s, Atkinson
and Shiffrin (1968) proposed a formal model of memory based on the information processing perspective, which included the short-­term memory store as one of three separate memory systems.



History of Methods in Memory Science

The assumption that humans are an information processing system also led to a new emphasis on
free recall and how participants organize recall outputs. Unlike paired-­associate learning, free recall
allows participants to recall items in any order, and if participants are actively processing information, it is reasonable to assume that free recall outputs would reflect the active role participants play
in remembering information. Bousfield (1953) reported evidence showing that participants were
actively organizing information, in line with the notion of recoding mentioned by Miller (1956). He
presented a list consisting of words from four different categories (e.g., animals) followed by a free
recall test. Then, he analyzed the recall output in terms of repetitions of words from the same category (e.g., zebra followed by otter). The results showed that these repetitions occurred significantly
more frequently than expected by chance, indicating that participants were actively clustering words
from the same category in their recall output. A similar organization of recall output was reported by
Tulving (1962) in his experiment using a list that did not have a categorical structure like Bousfield’s.
Because the list was not categorized, Tulving used multiple study-test trials to count the number of times the same two words were recalled together (e.g., desk followed by dog) on two successive test
trials. The results showed that such repetitions increased across 16 study-­test trials, indicating that
participants organized their recall even though the study list did not have obvious structure, a phenomenon referred to as subjective organization. Over the next decade, there was a flurry of activity
in developing appropriate measures of organization (Puff, 1979), as reflected in the popularity of free
recall, which peaked shortly after 1970 (Hintzman, 2011). The subsequent decline in the popularity
of free recall also corresponded with a sudden decline in the investigation of organization.
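To make the logic of these organization scores concrete, here is a minimal sketch of the raw counts behind Bousfield’s category clustering and Tulving’s subjective organization. This is illustrative only: the function names and data formats are our own, and the published measures apply chance corrections not shown here.

```python
def category_repetitions(recall_order, category_of):
    """Count adjacent pairs in a recall output that come from the same
    category -- the raw repetition count Bousfield (1953) compared
    against chance expectation."""
    return sum(1 for a, b in zip(recall_order, recall_order[1:])
               if category_of[a] == category_of[b])

def shared_adjacent_pairs(trial_n, trial_n_plus_1):
    """Count word pairs recalled adjacently (in either order) on two
    successive test trials -- the raw count behind Tulving's (1962)
    subjective-organization measure."""
    pairs_1 = {frozenset(p) for p in zip(trial_n, trial_n[1:])}
    pairs_2 = {frozenset(p) for p in zip(trial_n_plus_1, trial_n_plus_1[1:])}
    return len(pairs_1 & pairs_2)

# Example: "zebra" followed by "otter" is one same-category repetition.
cats = {"zebra": "animal", "otter": "animal", "desk": "furniture"}
print(category_repetitions(["zebra", "otter", "desk"], cats))  # 1
# "desk"/"dog" recalled adjacently on both trials: one shared pair.
print(shared_adjacent_pairs(["desk", "dog", "sun"],
                            ["dog", "desk", "tree"]))          # 1
```

Both functions return raw counts; in practice these are compared against the counts expected if recall order were random, which is what the chance-corrected measures in the literature formalize.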
Nevertheless, the analysis of free recall protocols has shown a resurgence in recent years. For
instance, Kahana, Howard, Zaromb, and Wingfield (2002) analyzed free recall outputs in terms
of the probability of first recall and the conditional response probability as a function of lag. They
analyzed free recall outputs after the first item was recalled and showed that the probability of recall
was higher among the items from neighboring serial positions, revealing, yet again, the usefulness
of analyzing free recall. Brainerd, Reyna, and Howe (2009) also developed a method based on
Markov chains to decompose the free recall protocol, which enables predictions as to which patients
with mild cognitive impairment would progress to Alzheimer’s dementia. To promote the use of some of the organization measures, Senkova and Otani (2012, 2015) developed spreadsheet calculators that make these measures more accessible to researchers, because computing them by hand is prohibitively laborious.
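The two protocol analyses just described can be sketched in a few lines, assuming each recall sequence is recorded as the serial (study) positions of recalled items in output order. This is a simplified scoring; published lag-CRP analyses add further exclusions (e.g., for intrusions) not shown here.

```python
from collections import defaultdict

def prob_first_recall(sequences, list_length):
    """Probability that recall begins at each serial position."""
    counts = [0] * list_length
    for seq in sequences:
        if seq:
            counts[seq[0] - 1] += 1
    total = sum(counts)
    return [c / total for c in counts]

def lag_crp(sequences, list_length):
    """Conditional response probability as a function of lag, in the
    spirit of Kahana, Howard, Zaromb, and Wingfield (2002): actual
    transitions at each lag divided by possible transitions, skipping
    repeated recalls."""
    actual = defaultdict(int)
    possible = defaultdict(int)
    for seq in sequences:
        recalled = set()
        for prev, curr in zip(seq, seq[1:]):
            recalled.add(prev)
            if curr in recalled:  # repetition: exclude this transition
                continue
            # every not-yet-recalled position was an available lag
            for pos in range(1, list_length + 1):
                if pos not in recalled:
                    possible[pos - prev] += 1
            actual[curr - prev] += 1
            recalled.add(curr)
    return {lag: actual[lag] / n for lag, n in sorted(possible.items())}

print(lag_crp([[1, 2, 3]], 3))  # {1: 1.0, 2: 0.0}
```

A recall protocol dominated by short positive lags, as in this toy example, is exactly the neighboring-serial-position pattern Kahana and colleagues reported.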

Another line of research worth mentioning from this era was pioneered by Hart (1965) and Brown and
McNeill (1966), who began conducting experiments in a field that later became known as metacognition. Hart conducted experiments on feeling of knowing (FOK), or a feeling that one knows the
answer even though, for the moment, one is unable to retrieve it. Brown and McNeill investigated
the tip-­of-­the-­tongue (TOT) state, in which one experiences an intense feeling that the sought-­after
answer is on the verge of being retrieved. Psychologists have written about these phenomena since
William James (1890); however, for many years, these phenomena did not attract the attention of empirical researchers, with the only exception being a German researcher, Wenzl (1932, 1936). The
method Hart developed was later referred to as the RJR (recall-­judgment-­recognition) paradigm
because in this paradigm, participants are asked to answer a series of questions (e.g., “Which planet is
the largest in our solar system?” Jupiter), and if they fail to answer a question, they are asked to make an FOK judgment, indicating whether they feel they know the answer (even though at the moment
they cannot recall it) and whether they will be able to recognize the answer if it is presented among
distractors. The results showed that participants were more likely to recognize correct answers when
they had an FOK than when they did not have an FOK. Brown and McNeill elicited TOTs by presenting the definitions of rare words (e.g., apse, nepotism, cloaca), and when participants indicated that
they were experiencing a TOT, they were asked to write down partial information about the target
word (number of syllables, initial letter, similar sounding words, words sharing similar meaning).


Hajime Otani et al.

The results indicated that when participants experienced a TOT, they showed generic recall, which
consisted of the partial information of the target word. Over the years, research on metacognition
became a major field in human memory research because FOK and TOT represent monitoring and
control of cognition (Nelson & Narens, 1994; Schwartz & Brown, 2014).
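Hart’s RJR finding reduces to a conditional-probability comparison. Here is a minimal sketch, assuming each unrecalled item is scored as a (made an FOK judgment, later recognized) pair; the data format and function names are our own, not Hart’s.

```python
def recognition_by_fok(items):
    """For items a participant failed to recall, return the proportion
    later recognized, separately for items given an FOK judgment and
    items not. Hart (1965) found the first proportion to be higher.
    Assumes both groups are non-empty."""
    def proportion(rows):
        return sum(1 for _, recognized in rows if recognized) / len(rows)
    with_fok = [it for it in items if it[0]]
    without_fok = [it for it in items if not it[0]]
    return proportion(with_fok), proportion(without_fok)

# Hypothetical data: (had_fok, later_recognized) for six unrecalled items
data = [(True, True), (True, True), (True, False),
        (False, True), (False, False), (False, False)]
print(recognition_by_fok(data))  # FOK items recognized more often (2/3 vs 1/3)
```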

1970s: Levels of Processing, Multiple Memories, Emergence of the Ecological Approach, and Working Memory
During the 1970s, a new perspective, the levels of processing framework (Craik & Lockhart, 1972),
emerged, and new methods were developed to test its hypotheses. The core assumption of

this framework was that processing proceeds from shallow (perceptual analysis) to deep (semantic
analysis) and that memory becomes more durable as the depth increases. Researchers relied heavily
on the incidental learning method with orienting tasks to test this hypothesis. The assumption of
this method was that by asking participants to perform an orienting task (e.g., check whether the
word is printed in capital letters), it would be possible to manipulate the depth at which they process information. Hyde and Jenkins’ (1969) paper is often cited as the study that inspired the levels
of processing hypothesis, even though the goal of their experiments was to investigate the effect
of orienting tasks on organization in free recall. In Hyde and Jenkins’ study, the researchers used
three orienting tasks: (1) rate the pleasantness of each word, (2) check whether each word included
a letter E, and (3) estimate the number of letters in each word. These tasks were administered with
an incidental and/or intentional learning instruction. The results showed that the E-checking and number-of-letters tasks reduced both recall and organization. The relation between the orienting
tasks and levels of processing was formalized by Craik and Tulving (1975), who used the orienting
tasks to test the notion that encoding at a deep level would produce more durable memories than
encoding at a shallow level. These researchers asked participants orienting questions that directed
them to process structural (“Is the word in capital letters?”) and phonemic (“Does the word rhyme
with WEIGHT?”) information in the shallow condition and categorical (“Is the word a type of
fish?”) or semantic (“Would the word fit the following sentence?”) information in the deep condition. The results supported the levels of processing hypothesis, and numerous other studies used this
method and replicated the effect. However, soon, the notion of levels of processing was challenged
(e.g., Morris, Bransford, & Franks, 1977), and alternative notions have been proposed as a replacement, such as elaboration (Bradshaw & Anderson, 1982), transfer appropriate processing (e.g., Morris et al., 1977), distinctiveness (e.g., Jacoby & Craik, 1979; Hunt & Mitchell, 1982), and relational
and item-­specific processing (e.g., Einstein & Hunt, 1980). Nevertheless, to this day, the incidental
learning method has remained the primary method of studying encoding processes in memory.
Another approach that emerged during the 1970s was the system approach, which was an extension of the information processing model of memory proposed by Atkinson and Shiffrin (1968).
The assumption of this approach was that memory consists of distinct systems that can be dissociated. Endel Tulving was the pioneer of this approach, and among the many contributions he made
in human memory research was the distinction between episodic and semantic memory systems.
Tulving (1972) conceptualized episodic memory as a type of memory studied by the list-­learning
paradigm in the laboratory and semantic memory as a type of memory needed to use language.
Regarding the methods to study semantic memory, the prominent methods have been sentence
verification and semantic priming. Collins and Quillian (1969) used a sentence verification task to
test their hierarchical network model, which assumed that semantic memory consists of an associative
network organized in a hierarchy and that the time it takes to verify a sentence reflects the distance

one must travel in this network. They presented sentences such as “An animal has skin” and “A canary is a canary,” and asked participants to verify each sentence. The results showed that, as predicted, participants took longer to verify some sentences than others, providing initial support for the model.
However, as evidence contrary to the model accumulated, Collins and Loftus (1975) proposed a
revised model, the spreading-­activation theory, which assumed that in semantic memory, concepts
are linked with other concepts with varying degrees of distance based on relatedness. The core
assumption, which was supported by the semantic priming phenomenon, was that once a concept is
activated, the activation would quickly spread to nearby concepts, increasing the accessibility of these
concepts. Rosch (1975) conducted a series of experiments using the semantic priming paradigm, in
which pairs of words or pictures (e.g., chair-­dresser) were presented, and participants were asked to
decide whether both items belonged to the same category. The critical manipulation was whether a
prime category (e.g., furniture) or a blank preceded the pair. The results showed that presenting a category name as a prime reduced the decision time, showing evidence of semantic priming. Although
Rosch did not present these results as evidence to support or refute any particular model of semantic
memory, Loftus (1975) in her rejoinder article argued that Rosch’s results were consistent with the
spreading-­activation theory, which became enormously influential over the years.
While the levels of processing framework continued the list-­learning tradition of Ebbinghaus, the
tradition of Bartlett received renewed attention during the 1970s because a sizable number of researchers were dissatisfied that traditional laboratory investigation had not yielded useful information. Reflecting this sentiment, at the conference on Practical Aspects of Memory, Neisser (1978)
lamented that “If X is an interesting or socially significant aspect of memory, then psychologists have
hardly ever studied X” (p. 4). This movement was referred to as the everyday memory movement,
and a collection of papers representing it was published in an edited book by Neisser (1982); a second edition followed from Neisser and Hyman (2000). By reading
the first edition, one would be impressed with the eclectic nature of this movement. Furthermore,
many of the topics from this movement, such as eyewitness memory, autobiographical memory,
flashbulb memory, and prospective memory, have developed into major fields of research today.
During the 1970s, researchers also took another look at short-­term memory, focusing more on

its wider function than its storage capacity and forgetting curve. Baddeley and colleagues (e.g., Baddeley, Grant, Wight, & Thomson, 1975; Baddeley & Hitch, 1974; see Baddeley, 1986, for an extensive review) began asking about the importance of short-­term memory in cognitive functioning and
found that even when participants were asked to repeat digits, which presumably filled up short-­
term memory, there was no dramatic decline in cognitive performance. Based on their findings,
these researchers proposed the notion of working memory, which consists of three subsystems that
are semi-­independent of each other: the central executive, the phonological loop, and the visual-­
spatial sketch pad (VSSP). The method they used to test these systems was a concurrent memory
task, based on the assumption that if a proposed system is important, disabling it by a concurrent
memory task would create disruptions in performance. A concurrent memory task they used to test
the phonological loop system was an articulatory suppression task, in which participants were asked
to say something repetitive such as “the, the, the” while performing a main task. The results showed
that robust phenomena in a memory span task, the phonological similarity and word-­length effects,
were disrupted, showing the involvement of the phonological loop in these phenomena. The VSSP
system was examined using a spatial suppression task, in which participants were asked to perform
a pursuit rotor task while memorizing five spatial or nonsense sentences. The spatial sentences
described the locations (left, right, up, and down) of digits in a 4 × 4 matrix, whereas the nonsense
sentences replaced the location words with nonsense words (quick, good, bad, and slow). The results
showed that when participants did not perform the pursuit rotor task, remembering of the spatial
and nonsense sentences was similar; however, when participants performed the pursuit rotor task,
remembering of the spatial sentences, but not the nonsense sentences, was disrupted (Baddeley et al.,
1975). These results led the researchers to conclude that remembering the spatial sentences required
the VSSP system. Baddeley (1986) described the central executive as the most elusive of the three subsystems. He speculated that this system is similar to the supervisory attention
system proposed by Norman and Shallice (1980), which is assumed to be responsible for attentional
control of various actions, such as reasoning, reading comprehension, and encoding into long-­term
memory.

Baddeley and colleagues took a system approach to working memory; however, other researchers conceptualized working memory as an individual difference variable and began developing
tasks to measure working memory capacity (e.g., Daneman & Carpenter, 1980; Engle, Cantor, &
Carullo, 1992). These tasks were referred to as complex memory span tasks, and it has been shown
that working memory capacity measured by these tasks is associated with performance on a variety
of cognitive tasks that tap into fluid intelligence. Working memory has become a major area of
research today because it is assumed to play a critical role in executive functions, such as planning,
monitoring, focusing, directing attention, and inhibiting distractions.

1980s: Unconscious Memory, Dissociation, and Dual Processing
During the 1980s, researchers began questioning the role of consciousness in remembering, which
led to the distinction between implicit and explicit memory (Schacter, 1987). Explicit memory
requires conscious retrieval of memory, whereas implicit memory does not. Explicit memory has
been studied since Ebbinghaus using measures such as recall and recognition, whereas implicit memory is studied by asking participants to respond with the first word that comes to mind or in a manner such that they do not consciously access their memories (Schacter, 1987). A number of tasks have
been developed to investigate implicit memory. For instance, Jacoby and Dallas (1981) presented
a list of words and manipulated encoding conditions using the incidental learning paradigm. These
researchers then tested explicit memory using a standard yes-no recognition test and implicit memory using a perceptual identification test. In the latter test, a target or distractor word was presented
for 35 milliseconds followed by a mask, and participants were asked to simply identify the word.
The assumption was that the exposure to these words during the study phase would increase the
likelihood that these words would come to mind quickly, even without making a retrieval effort,
a phenomenon referred to as repetition priming (Schacter, 1987). The results showed that explicit
memory was sensitive to encoding manipulations such as levels of processing and difficulty of processing, whereas implicit memory was not. The difference in sensitivity to manipulations between
implicit and explicit memory was referred to as a dissociation, which was considered as evidence that
implicit and explicit memory represent two separate memory systems. During the 1980s, there was
a flurry of research to find a dissociation using a variety of implicit memory tasks, such as word-­stem
completion (e.g., Graf, Mandler, & Haden, 1982; Graf, Squire, & Mandler, 1984), word-­fragment
completion (Tulving, Schacter, & Stark, 1982), and lexical decision (e.g., Duchek & Neely, 1989;
Moscovitch, 1982). Furthermore, these implicit memory tests were classified as perceptually
driven or conceptually driven (see Toth, 2000, for a complete list).
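The dissociation logic above rests on a simple priming score: identification of studied items relative to an unstudied baseline. A minimal sketch (illustrative; the function name and data format are ours, not Jacoby and Dallas’s):

```python
def priming_score(studied_identified, unstudied_identified):
    """Repetition priming in a perceptual identification test:
    proportion of briefly flashed words correctly identified for
    studied items minus the baseline rate for unstudied items
    (1 = identified, 0 = not)."""
    def rate(outcomes):
        return sum(outcomes) / len(outcomes)
    return rate(studied_identified) - rate(unstudied_identified)

# Hypothetical outcomes for four studied and four unstudied items:
# 0.75 identification for studied vs. 0.25 baseline -> 0.5 priming.
print(priming_score([1, 1, 1, 0], [1, 0, 0, 0]))  # 0.5
```

A dissociation is then a case where an encoding manipulation (e.g., levels of processing) changes recognition accuracy but leaves this priming score essentially unchanged.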
Another distinction that became important during the 1980s was that between remember and know judgments (Tulving, 1985). Tulving conceptualized remembering as the subjective experience of retrieving information from episodic memory and knowing as the subjective experience
of retrieving information from semantic memory. However, in subsequent years, these judgments

were used to test the dual-­process theory of recognition memory, which proposes that recognition
can be accomplished by two processes: familiarity and recollection (e.g., Mandler, 1980). Familiarity
is a feeling that one has encountered an item in the past even though contextual information surrounding the encoding of the item is absent (know), whereas recollection is recognition of an item
that is accompanied by the retrieval of contextual information (remember). In studies investigating
familiarity and recollection, participants are asked to take a recognition test in which they are asked
to recognize an item and indicate whether they have a remember or know experience. Numerous studies using this method showed that remember and know judgments can be dissociated (see