Improving Decisions on Policy and Practice Via International Systematic Reviews

Saturday, February 18, 2017: 3:00 PM-4:30 PM
Room 202 (Hynes Convention Center)
David Wilson, George Mason University, Fairfax
The Campbell Collaboration is an international association engaged in producing systematic reviews of social science research. The organization promotes positive social and economic change through policies and practices informed by the findings from these reviews. Currently, we conduct reviews in five areas: crime and justice, education, international development, nutrition, and social welfare. Our products use state-of-the-art systematic review methods, including the statistical methods of meta-analysis when possible and appropriate. Non-systematic reviews can be criticized in any policy debate as relying on a biased selection of studies. In our reviews, we strive to minimize bias through transparency, peer review, and rigorous methods. Both the protocol for a planned review and the completed review are peer reviewed by substantive and methods experts. The required methods include an international search for all available studies, published or unpublished, that meet explicit and detailed eligibility criteria. Data and descriptive information are extracted from all eligible studies by at least two independent reviewers. Potential biases and limitations of the research must be considered when drawing overall conclusions.

We have experienced several challenges related to the use of our reviews. First, our reviews are typically lengthy and highly technical. To address this, we produce a short, plain-language summary for each review and have started producing policy briefs that summarize multiple related reviews. Second, our reviews take a year or more to complete, making it difficult to respond quickly to a policymaker's request for an assessment of evidence in a particular area. Third, there is often limited research of sufficient quality on a topic from which to draw firm conclusions, reducing the clarity of the guidance a review can offer for a specific policy. Finally, the locations and contexts in which the reviewed research was conducted may not match the contexts where new policies or practices are being considered. However, with enough studies, analyses can explore differential effects across persons and contexts.

Despite these challenges, we have gained credibility within our areas of focus as a source of unbiased and transparent assessment and synthesis of the scientific evidence.