Few deny there are major global challenges ahead — from climate change and loss of ecosystems and biodiversity, to malnutrition, poverty and disease, to how to harness new technologies and innovations for the betterment of all humankind. It should not have to be said, but the policies that will provide solutions to these complex problems have to be based on evidence, often from science, because many of these challenges are, at their heart, scientific. Yet the journey from evidence to policy is far from simple: it is always lengthy, frequently divisive and often ineffective. Why is that? To get to the bottom of this question we first need to ask another — what is evidence?

Evidence is based on objective investigation, reasoning and analysis, as opposed to subjectivity and prejudice, belief and myth. Evidence may be derived from a purely scientific investigation, but it has to be allied to the political, cultural, economic and social dimensions of these global problems, and herein lies the origin of the difficulty. As arguments emerge about the validity and worth of the evidence, bias and prejudice become difficult to remove. These arguments are amplified by two interrelated aspects of 21st-century life: first, the “post-truth” era, in which the opinions of experts are viewed with skepticism, everyone’s opinion is treated as equally valid and opposing views (and evidence) are dismissed as “fake news”; and second, the sharing of information through the Internet and social media, personalized by algorithms designed to harvest and respond to existing preferences, fostering an “echo chamber” effect that entrenches the views of like-minded individuals.

Together these foster a view that scientific evidence is unclear or inconclusive, that it is produced behind closed doors and that it is elitist, giving rise to conspiracy theories about who produced the evidence and for what purpose. In this environment, subjectivity prevails over objectivity as policy-makers cherry-pick the evidence to fit the preconceived views and aspirations of their supporters, as well as their own political mantras. To some, evidence becomes nothing more than any “fact” (whether true or not) that can be used to support a particular viewpoint.

So, what to do about it?

To see clear evidence ignored, distorted or diluted in favour of ill-informed subjective views leads to frustration and anger. But frustration is not enough; it achieves nothing. A new approach is needed, one that moves away from the naive assumption that good evidence will be readily accepted and will quickly and easily contribute to policy.

Appreciating the sheer complexity of many of the intractable problems that science is addressing is a good first step. From there we need new ways to gather, assimilate and communicate evidence. In our recent article in Palgrave Communications, we propose a rigorous protocol of mapping, analysis, visualization and sharing:

  • Mapping: to define the boundaries of a problem and the people and organizations involved
  • Analysis: to identify what is known, what is not known, what the important drivers are and what works
  • Visualization: to find ways to present the accumulated knowledge in a transparent, accessible way
  • Sharing: to communicate the evidence to all sectors of society.

This process (see figure 1) is essentially what is called evidence synthesis. Visualization and sharing are particularly important — all too often, evidence synthesis results in lengthy and impenetrable reports, which make evidence sharing all but impossible. But what are the best ways to visualize and then share evidence? More research is needed. Could evidence be shared better through Web-based national and international events, new online publishing models and social media? Should people with expert knowledge be more active and proactive rather than passive and reactive? Is collective action needed? Do we need better ways to share experience and approaches, to find out what works and what doesn’t? Should evidence be supplemented with powerful, “real life” stories to increase the power of the message? Can evidence be democratized in a way that does not undermine science itself, while also recognizing that science relies on specialist knowledge, technical expertise and years of training? Why do some issues grab public attention, facilitating remarkably rapid development of remedial policy (such as plastics in the ocean), while others of even greater importance and impact (such as the increasing frequency and severity of extreme weather events resulting from global warming) do not? These are just some of the questions we need answers to.

The next step in our protocol is evidence evaluation. This too has to be open and transparent: a critical process that questions the validity of the evidence. Again, research is needed. Who should lead the evaluation process? Is there a central role for universities, academic organizations, commissions and the like? Success requires independence and inclusivity, so should we break away from the traditional model of the “expert panel” (of mostly white male senior academics) and strive toward diversity of experience, ethnicity and gender?

An important part of our protocol is that evidence evaluation should simultaneously and equally combine not only the testing of that evidence under further independent scientific scrutiny but also wider discussion, debate and deliberation. Within the evaluation process it is important not only to locate where evidence is lacking, inconclusive or ambiguous, but also to understand how evidence is perceived, misunderstood or ignored. The same piece of evidence can be interpreted in different ways by different stakeholders, leading to disagreement and conflict. Deliberative forums involving the protagonists locate and challenge misconceptions and ideological stances, undermining enclave thinking and creating opportunities to reach agreement on contested pieces of evidence. These forums are currently physical meetings facilitated by researchers, governments or experts, and are of necessity restricted in the number of people taking part. The Internet could change that — broadening the scope of deliberative forums through innovation would allow much wider participation and larger sets of data to be collected and evaluated, with analysis aided by artificial intelligence techniques.

The results of this two-pronged evidence evaluation would enable a policy idea to be efficiently transformed into a policy plan, since all the evidence would have been validated and all stakeholder viewpoints either reasonably satisfied or properly discredited. The formal protocol, as described in our paper, would be a source of stability, discipline and confidence-building, a recourse when problems arise and a way to break through logjams and overcome barriers. It would establish trust between scientists, government and the public, and build a more effective science-policy interface.

This article first appeared in the LSE Impact Blog on July 24, 2018. It was republished under the Creative Commons License 3.0.




Peter Horton
Peter Horton is emeritus professor of biochemistry in the Department of Molecular Biology and Biotechnology and chief research advisor to the Grantham Centre for Sustainable Futures at the University of Sheffield.
Garrett Wallace Brown
Garrett Wallace Brown is chair of political theory and global health policy at the School of Politics and International Studies at the University of Leeds.

