
    5 take-away messages from 'Using Research Evidence: A Practical Guide'

    Authors: NESTA and Alliance for Useful Evidence, 2016


    Dr Nilam Ashra-McGrath, Research Uptake Manager for COMDIS-HSD, shares her 5 'take-away' messages from the recently published Using Research Evidence: A Practical Guide.
    Although this guide, published by the Alliance for Useful Evidence and NESTA, is aimed at those working to change UK policy, much of it is applicable to international development. It offers a good overview of the methods commonly used to generate research evidence, and touches on their pitfalls and assumptions. Among the many sensible messages are 5 good reminders:
    1: Not all evidence is equal
    Selective use of evidence (known as confirmation bias), via literature reviews rather than systematic reviews, can skew everything from research design through to policy discussions. Literature reviews allow us to 'cherry pick' the evidence that best fits our argument and our justification for research. The guide reminds us that bias, whether based on optimism, hindsight or loss-aversion, can lead us to engage not in 'evidence-based policy' but in 'policy-based evidence', where our conclusions have already been formed and we are retrofitting literature and methods to support that conclusion.
    Let’s not forget that esteemed journals are also prone to bias: positive results are more likely to be published, while a staggering 65% of negative studies are never even written up. This has far-reaching consequences, and it raises the question of how many policies have been implemented on the basis of positive studies alone. Surely knowing what doesn’t work is just as important?
    2: Quality matters
    Peer-reviewed papers are still seen as the ‘gold standard’ for assessing quality, but if journals are prone to publication bias, then this clearly presents a problem; likewise with another gold standard of quality: the systematic review. As the guide points out, a systematic review is “only really as good as the quality of the studies it is based on” (p36). Judging the quality of evidence can therefore be slippery. Researchers have varying views on what quality means (though they won’t always admit this). Is it about methods, integrity, utility, or usefulness? The default position appears to be that research methods determine the quality of evidence, but here we have another ‘hierarchy of evidence’ to consider (see pp32-33).
    3: Rapid reviews – timing matters
    It’s a long road from research to publication, and policymakers often have only days, if not hours, to make decisions. A robust systematic review can take 6-12 months, while a ‘rapid review’ can take 1-2 months. This means there are clear advantages to rapid reviews, but the guide makes clear that a rapid review should still follow a systematic process, i.e. have a clear methodology so that the review can be added to when new research becomes available. Databases such as 3ie and Cochrane contain ‘off-the-shelf’ systematic reviews that will help researchers working in time-sensitive contexts.
    4: Context matters
    The guide contains an important reminder that we are predisposed to use the methods we are comfortable with, rather than those that suit the context in which the research will take place.
    5: Communication matters
    By this I mean sharing everything from your research methodology through to interim and full results, rather than leaving everything until the end of your research to be shared in one academic paper. The Easy, Attractive, Social and Timely (EAST) framework will help decision makers, and those who supply them with evidence, make more effective and efficient policy.
    Aside from its advice to use Google Scholar and stay off Wikipedia, this guide offers many important reminders (all evidence-based, of course). You could skim the key messages at the end of each section, or you could invest a couple of hours reading it cover to cover. I recommend taking the couple of hours.