
Practical Meta-Analysis
David B. Wilson
Evaluators’ Institute
July 16-17, 2010



Overview of the Workshop


Topics covered will include:
- Review of the basic methods
  - Problem definition
  - Document retrieval
  - Coding
  - Effect sizes and computation
  - Analysis of effect sizes
  - Publication bias
- Cutting-edge issues
- Interpretation of results
- Evaluating the quality of a meta-analysis




Forest Plot from a Meta-Analysis of Correctional Boot-Camps

[Forest plot: odds ratios for roughly 40 boot-camp evaluations published between 1991 and 2001 (e.g., Harer & Klein-Saffran, 1996; Mackenzie & Souryal, 1994; Fl. Dept. of JJ, 1996-1997; NY DCS, 2000), plotted on an axis running from "Favors Comparison" to "Favors Bootcamp", with the overall mean odds-ratio shown at the bottom.]


The Great Debate










- 1952: Hans J. Eysenck concluded that there were no favorable effects of psychotherapy, starting a raging debate
- 20 years of evaluation research and hundreds of studies failed to resolve the debate
- 1978: To prove Eysenck wrong, Gene V. Glass statistically aggregated the findings of 375 psychotherapy outcome studies
- Glass (and colleague Smith) concluded that psychotherapy did indeed work
- Glass called his method "meta-analysis"


The Emergence of Meta-analysis


Ideas behind meta-analysis predate Glass' work by several decades:
- Karl Pearson (1904)
  - Averaged correlations for studies of the effectiveness of inoculation for typhoid fever
- R. A. Fisher (1944)
  - "When a number of quite independent tests of significance have been made, it sometimes happens that although few or none can be claimed individually as significant, yet the aggregate gives an impression that the probabilities are on the whole lower than would often have been obtained by chance" (p. 99).
  - Source of the idea of cumulating probability values



The Emergence of Meta-analysis


Ideas behind meta-analysis predate Glass' work by several decades:
- W. G. Cochran (1953)
  - Discusses a method of averaging means across independent studies
  - Laid out much of the statistical foundation that modern meta-analysis is built upon (e.g., inverse variance weighting and homogeneity testing; a sketch follows below)
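To make that foundation concrete, here is a minimal sketch of fixed-effect inverse-variance pooling and the Q homogeneity statistic; the effect sizes and variances are hypothetical illustrative values, not data from any study mentioned in this workshop.

```python
# Minimal sketch of fixed-effect inverse-variance pooling and the Q
# homogeneity statistic; the effect sizes and variances below are hypothetical.
import math

es  = [0.30, 0.12, 0.45, 0.22]   # per-study effect sizes (illustrative)
var = [0.04, 0.02, 0.09, 0.03]   # per-study sampling variances (illustrative)

w = [1.0 / v for v in var]                                # inverse-variance weights
mean_es = sum(wi * e for wi, e in zip(w, es)) / sum(w)    # weighted mean effect size
se_mean = math.sqrt(1.0 / sum(w))                         # its standard error
q = sum(wi * (e - mean_es) ** 2 for wi, e in zip(w, es))  # homogeneity statistic Q
df = len(es) - 1                                          # Q ~ chi-square(k - 1) if effects are homogeneous

print(f"pooled effect = {mean_es:.3f} (SE = {se_mean:.3f}), Q = {q:.2f} on {df} df")
```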



The Logic of Meta-analysis




- Traditional methods of review focus on statistical significance testing
- Significance testing is not well suited to this task
  - It is highly dependent on sample size (see the sketch below)
  - A null finding does not carry the same "weight" as a significant finding
    - A significant effect is a strong conclusion
    - A nonsignificant effect is a weak conclusion
- Meta-analysis focuses on the direction and magnitude of the effects across studies, not statistical significance
  - Isn't this what we are interested in anyway?
  - Direction and magnitude are represented by the effect size
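To illustrate how strongly significance depends on sample size, here is a small sketch using a Fisher z test of H0: rho = 0; the correlation of 0.20 is an illustrative value, not data from any study.

```python
# Sketch: the same observed correlation is or is not "significant" depending
# only on sample size; r = 0.20 is an illustrative value, not study data.
import math

def fisher_z_stat(r, n):
    # Fisher z-transformed correlation divided by its standard error;
    # approximately standard normal under H0: rho = 0.
    return math.atanh(r) * math.sqrt(n - 3)

for n in (30, 68, 300):
    z = fisher_z_stat(0.20, n)
    verdict = "significant" if abs(z) > 1.96 else "not significant"
    print(f"r = 0.20, N = {n}: z = {z:.2f} -> {verdict} at the .05 level")
```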


Illustration
Table 1
21 Validity Studies, N = 68 for Each

Study    Observed validity coefficient
1        0.04
2        0.14
3        0.31 *
4        0.12
5        0.38 *
6        0.27 *
7        0.15
8        0.36 *
9        0.20
10       0.02
11       0.23
12       0.11
13       0.21
14       0.37 *
15       0.14
16       0.29 *
17       0.26 *
18       0.17
19       0.39 *
20       0.22
21       0.21

* p < .05 (two-tailed).

Simulated data from 21 validity studies. Taken from: Schmidt, F. L. (1996). Statistical significance testing and cumulative knowledge in psychology: Implications for training of researchers. Psychological Methods, 1, 115-129.


Illustration (Continued)
Table 2
95% Confidence Intervals for Correlations From Table 1, N = 68 for Each

Study    Observed validity coefficient    95% CI lower    95% CI upper
1        0.39                              0.19            0.59
2        0.38                              0.18            0.58
3        0.37                              0.16            0.58
4        0.36                              0.15            0.57
5        0.31                              0.09            0.53
6        0.29                              0.07            0.51
7        0.27                              0.05            0.49
8        0.26                              0.04            0.48
9        0.23                              0.00            0.46
10       0.22                             -0.01            0.45
11       0.21                             -0.02            0.44
12       0.21                             -0.02            0.44
13       0.20                             -0.03            0.43
14       0.17                             -0.06            0.40
15       0.15                             -0.08            0.38
16       0.14                             -0.09            0.37
17       0.14                             -0.09            0.37
18       0.12                             -0.12            0.36
19       0.11                             -0.13            0.35
20       0.04                             -0.20            0.28
21       0.02                             -0.22            0.26
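Intervals like those in Table 2 can be computed with a large-sample approximation. Here is a minimal sketch using the simple normal-approximation interval r ± 1.96(1 − r²)/√(N − 1), which appears consistent with the tabled values; a Fisher z interval is a common alternative and gives slightly different, asymmetric limits.

```python
# Approximate 95% confidence interval for an observed correlation, using a
# large-sample normal approximation; the tabled intervals appear consistent
# with this formula, but check against your own source before relying on it.
import math

def correlation_ci(r, n, z_crit=1.96):
    se = (1.0 - r ** 2) / math.sqrt(n - 1)   # approximate standard error of r
    return r - z_crit * se, r + z_crit * se

for r in (0.39, 0.22, 0.02):                 # a few coefficients from Table 2
    lo, hi = correlation_ci(r, n=68)
    print(f"r = {r:.2f}: 95% CI [{lo:.2f}, {hi:.2f}]")
```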



When Can You Do Meta-analysis?


Meta-analysis is applicable to collections of research that:
- Are empirical, rather than theoretical
- Produce quantitative results, rather than qualitative findings
- Examine the same constructs and relationships
- Have findings that can be configured in a comparable statistical form (e.g., as effect sizes, correlation coefficients, odds-ratios, proportions)
- Are "comparable" given the question at hand



Forms of Research Findings Suitable to Meta-analysis


- Central tendency research
  - Prevalence rates
- Pre-post contrasts
  - Growth rates
- Group contrasts
  - Experimentally created groups
    - Comparison of outcomes between treatment and comparison groups
  - Naturally occurring groups
    - Comparison of spatial abilities between boys and girls
    - Rates of morbidity among high and low risk groups



Forms of Research Findings Suitable to Meta-analysis


- Association between variables
  - Measurement research
    - Validity generalization
  - Individual differences research
    - Correlation between personality constructs



Effect Size: The Key to Meta-analysis


The effect size makes meta-analysis possible:
- It is the "dependent variable"
- It standardizes findings across studies such that they can be directly compared



Effect Size: The Key to Meta-analysis



Any standardized index can be an "effect size" (e.g., standardized mean difference, correlation coefficient, odds-ratio) as long as it:
- Is comparable across studies (generally requires standardization)
- Represents the magnitude and direction of the relationship of interest
- Is independent of sample size

Different meta-analyses may use different effect size indices; a sketch of common indices follows below.
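As a rough illustration of how such indices are computed from study-level summary statistics, here is a minimal sketch; every input number is hypothetical and chosen only for illustration.

```python
# Sketch of two common effect size indices computed from summary statistics;
# all input numbers are hypothetical illustrations, not data from real studies.
import math

def standardized_mean_difference(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference: mean difference over the pooled SD."""
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sd_pooled

def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table: a/b = treatment successes/failures, c/d = comparison."""
    return (a * d) / (b * c)

print(standardized_mean_difference(105.0, 100.0, 15.0, 14.0, 50, 50))  # ~0.34
print(odds_ratio(30, 20, 20, 30))                                      # 2.25
# A correlation coefficient can serve as an effect size directly. Note that
# none of these indices depends on sample size the way a p value does.
```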


The Replication Continuum
Pure Replications <------------------------------> Conceptual Replications

You must be able to argue that the studies you are meta-analyzing examine the same relationship. This may be at a broad level of abstraction, such as the relationship between criminal justice interventions and recidivism or between school-based prevention programs and problem behavior. Alternatively, it may be at a narrow level of abstraction and represent pure replications.

The closer your collection of studies is to pure replications, the easier it is to argue comparability.



Which Studies to Include?


It is critical to have explicit inclusion and exclusion criteria (see pages 20-21).
- The broader the research domain, the more detailed they tend to become
- Refine the criteria as you interact with the literature
- Components of detailed criteria:
  - Distinguishing features
  - Research respondents
  - Key variables
  - Research methods
  - Cultural and linguistic range
  - Time frame
  - Publication types


Methodological Quality Dilemma


Include or exclude low-quality studies?
- The findings of all studies are potentially in error (methodological quality is a continuum, not a dichotomy)
- Being too restrictive may limit the ability to generalize
- Being too inclusive may weaken the confidence that can be placed in the findings
- Methodological quality is often in the "eye of the beholder"
- You must strike a balance that is appropriate to your research question



Searching Far and Wide






- The "we only included published studies because they have been peer-reviewed" argument
- Significant findings are more likely to be published than nonsignificant findings
- It is critical to try to identify and retrieve all studies that meet your eligibility criteria




Searching Far and Wide (continued)


Potential sources for identification of documents:
- Computerized bibliographic databases
- "Google" internet search engine
- Authors working in the research domain (email a relevant Listserv?)
- Conference programs
- Dissertations
- Review articles
- Hand searching relevant journals
- Government reports, bibliographies, clearinghouses




A Note About Computerized Bibliographies







- Rapidly changing area
- Get to know your local librarian!
- Searching one or two databases is generally inadequate
- Use "wild cards" (e.g., random? will find random, randomization, and randomize); a rough script analogue is sketched below
- Throw a wide net; filter down with a manual reading of the abstracts
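For screening search results exported outside a database interface, a truncation wildcard like random? has a rough regular-expression analogue; here is a minimal sketch (the abstract strings are hypothetical):

```python
# Rough analogue of a database truncation wildcard (random? / random*)
# applied to exported abstracts; the abstract strings below are hypothetical.
import re

pattern = re.compile(r"\brandom\w*", re.IGNORECASE)  # random, randomized, randomization, ...

abstracts = [
    "Participants were randomly assigned to boot camp or standard probation.",
    "A qualitative study of staff perceptions of shock incarceration.",
]

# Keep only abstracts that mention a random* term, then screen them manually.
hits = [a for a in abstracts if pattern.search(a)]
print(hits)
```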




Strengths of Meta-analysis










- Imposes a discipline on the process of summing up research findings
- Represents findings in a more differentiated and sophisticated manner than conventional reviews
- Capable of finding relationships across studies that are obscured in other approaches
- Protects against over-interpreting differences across studies
- Can handle a large number of studies (which would overwhelm traditional approaches to review)


Weaknesses of Meta-analysis









- Requires a good deal of effort
- Mechanical aspects don't lend themselves to capturing more qualitative distinctions between studies
- "Apples and oranges" criticism
- Most meta-analyses include "blemished" studies to one degree or another (e.g., a randomized design with attrition)
- Selection bias poses a continual threat
  - Negative and null finding studies that you were unable to find
  - Outcomes for which there were negative or null findings that were not reported
- Analysis of between-study differences is fundamentally correlational


