What does the science say? Q&A with Neal Haddaway on systematic reviews and systematic maps
While there are growing calls for policy-making to reflect the scientific evidence, that evidence doesn’t always speak with one voice. Frequently, two studies seem to offer diametrically opposed answers to the same question. And it’s all too easy for bias to slip in when we’re selecting, weighing up and interpreting the literature. SEI’s Neal Haddaway introduces two methods that aim to distil objective, transparent and credible answers from the scientific evidence – and new tools that make them easier to use.
Q: What are systematic reviews and systematic maps?
A: Systematic review and systematic mapping aim to make evidence synthesis as transparent, objective and comprehensive as possible. They are specific approaches, with required stages and processes. For example, both start with setting out, in a written protocol, the methods you plan to use for the research; the protocol is then peer-reviewed.
Other required steps include searching multiple databases using a tried-and-tested search string, screening articles for relevance against a pre-determined set of inclusion criteria, and extracting data in a standardised way.
Systematic review and systematic mapping use very similar approaches and, as noted, both begin with a peer-reviewed protocol that outlines the planned methods. The difference lies in what comes next. In systematic mapping, the focus is on identifying and describing the evidence base: selecting the studies that meet criteria of relevance and scientific credibility and detailing them in a searchable database. A systematic review goes a step further, critically appraising those studies and synthesising their findings to answer a specific question.
Q: What’s the added value of systematic reviews and systematic maps?
A: The reason we have systematic reviews and systematic maps is that the traditional ways of synthesising scientific evidence are susceptible to a number of different biases. For example, researchers tend to base their reviews or meta-analyses on studies that they’re already familiar with – and so the results might not be representative of the whole body of evidence.
Q: How have you worked with systematic reviews in projects before?
A: I’ve been working with systematic reviews for a number of years across a range of different topics. Sometimes these have been dedicated systematic review projects, but often we’ve used systematic reviews to find out the state of knowledge on a particular topic at the start of a project.
That was the case with BONUS RETURN, where evidence synthesis formed part of a longer, more elaborate project. In BONUS RETURN we produced two closely related systematic maps looking at which ecotechnologies have been used to reuse carbon and nutrients in the Baltic Sea region.
In fact, we realised that there was quite a lot of variety in what people understood by the term ecotechnology. So, we conducted a systematic review of research literature where people had used the term and tried to build a conceptual model of what they meant by it. That way we could be sure of what was relevant to BONUS RETURN, and what wasn’t.
Q: How long does a systematic review or systematic map take?
A: Some can be very quick; I’ve been involved with one that took about six weeks. And some can be quite time-consuming; one of my colleagues recently finished one that took about five years. It really depends, first, on how big the evidence base you’re dealing with is and, second, on how detailed you are in the information you extract and the kind of synthesis you want.
In fact, we recently did a review on just this question, looking at around 80 systematic maps and reviews published by the Collaboration for Environmental Evidence. We found that on average it takes around 164 person-days for a systematic review, and about 211 person-days for a systematic map. We used the results to create a free online tool called PredicTER, where people can get an estimate of how long a map or review might take, depending on the specifics of their project.
Q: Do you have any advice for people interested in conducting a systematic review or map?
A: The first thing I’d recommend is to connect with one of the many communities of practice and networks of people who are interested in systematic reviews and maps and their methods. For instance, the Collaboration for Environmental Evidence works on environmental subjects, Cochrane works in the field of health care, and the Campbell Collaboration works in the areas of social welfare, crime and justice, and international development.
Each one is a voluntary group of people interested in the methodology. They can offer a lot of advice and support, such as guidance documents and standards on what you should report in a review or map; our tool ROSES is a useful, descriptive checklist designed for this purpose. There are also many methodological articles covering the different steps along the way. But the easiest thing to do is to connect with someone who has conducted a systematic review before; we’re happy to help.