Policymakers have a hard time finding a way ahead

Should mosquito nets be distributed for free, or should people buy them instead?

January 25 – Last year the Washington Post featured an article by two academics, Michael Callen and Adnan Khan, who asked how effective a government’s economic policies can be without evidence. It makes interesting reading for anyone involved in shaping our future.

In international development, the ‘evidence revolution’ has generated a surge in policy research over the past two decades. We now have a clearer idea of what works and what doesn’t.

In India, performance pay for teachers works: students in schools where bonuses were on offer got significantly higher test scores. In Kenya, charging small fees for malaria bed nets doesn’t work — and is actually less cost-effective than free distribution.

The American Economic Association’s registry for randomized controlled trials now lists 1,287 studies in 106 countries, many of which are testing policies that very well may be expanded. But can policymakers put this evidence to use?

Here’s how we did our research

We assessed the constraints that keep policymakers from acting on evidence. We surveyed a total of 1,509 civil servants in Pakistan and 108 in India as part of a program called Building Capacity to Use Research Evidence (BCURE), carried out by Evidence for Policy Design (EPoD) at Harvard Kennedy School and funded by the British government.

We found that simply presenting evidence to policymakers doesn’t necessarily improve their decision-making. The link between evidence and policy is complicated by several factors.

1. There are serious constraints in policymakers’ ability to interpret evidence.

We asked civil servants to interpret numerical information presented in a two-by-two table, shown below, that displays the crop yields obtained by farmers who had or had not used a new seed variety. We asked whether farmers who had used the new seeds were more likely to harvest more crops than those who had not used the new seeds.

To answer the question, respondents had to calculate ratios and note that among farmers using the new variety, about three times as many saw yields increase rather than decrease — while among farmers who didn’t adopt the new variety, about five times as many saw yields increase. Comparing these ratios would show that farmers using the old seed variety were the ones who were more likely to see crop yields rise.

[Table: Farmer yield question]
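A quick worked example makes the ratio comparison concrete. The counts below are invented for illustration only, chosen to match the roughly three-to-one and five-to-one ratios described above; the survey’s actual table is not reproduced here.

```python
# Hypothetical counts, chosen only to mirror the ratios described in the
# text. Rows: seed variety; columns: how many farmers saw yields rise/fall.
table = {
    "new seeds": {"rise": 150, "fall": 50},  # roughly 3:1
    "old seeds": {"rise": 100, "fall": 20},  # roughly 5:1
}

for variety, counts in table.items():
    ratio = counts["rise"] / counts["fall"]
    print(f"{variety}: {ratio:.0f} farmers saw yields rise "
          f"for every one who saw them fall")

# The common mistake: comparing only the top row (150 vs. 50) or only the
# left column (150 vs. 100), both of which wrongly suggest the new seeds
# did better. Comparing the ratios (3:1 vs. 5:1) shows that farmers who
# kept the old variety were more likely to see yields rise.
```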

Our civil servants could not interpret the table; their answers were no more accurate than random guesses. When we asked open-ended questions, we found that many respondents misread the data: some compared only the numbers in the top row, others only the numbers in the left column, and few converted those numbers into ratios.

We also showed respondents a bar chart displaying unemployment rates in three districts and asked which district had the largest number of unemployed people, without giving them the population data. Only 17% of mid-career civil servants in Pakistan responded correctly that they did not have enough information to answer.
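The unemployment question trips respondents up the same way. A toy calculation, with invented district figures, shows why a rate-only chart cannot identify the district with the most unemployed people.

```python
# Invented figures: district B has half of A's unemployment RATE but a much
# larger labor force, so it has more unemployed PEOPLE.
districts = {
    "A": {"rate": 0.12, "labor_force": 200_000},
    "B": {"rate": 0.06, "labor_force": 900_000},
}

for name, d in districts.items():
    unemployed = d["rate"] * d["labor_force"]
    print(f"District {name}: {d['rate']:.0%} rate -> "
          f"{unemployed:,.0f} unemployed")

# District A: 12% -> 24,000 unemployed; District B: 6% -> 54,000.
# Without the population (labor-force) numbers, the rates alone cannot
# answer "which district has the largest number of unemployed people?"
```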

Civil servants in India and Pakistan are well educated and selected through competitive exams. Our results are consistent with psychology researchers’ findings that common heuristics, the rules of thumb we use to make complex decisions quickly, keep people without statistical training from correctly interpreting quantitative information. In one study of US adults, for example, only 41% correctly answered a question like the farmer-yield question above.

When we asked survey respondents what barriers they faced in relying on evidence to make decisions, they answered that the most serious barriers were a lack of training in both data analysis and in how to evaluate and apply research findings. Such training might be useful. Our colleague Dan Levy and others at Harvard Kennedy School used these insights to design digital learning modules for using data and evidence.

2. Organizational and structural barriers get in the way of using evidence.

The policymakers we surveyed agreed that evidence should play a greater role in decision-making but offered a variety of reasons why it does not. Notably, few said they had trouble getting data, research or relevant studies. Rather, they said their departments lacked the capacity to analyze or interpret data, that they had to make decisions too quickly to consult evidence, and that they weren’t rewarded when they did. Apparently, just disseminating evidence may not be enough.

In conversations, civil servants told us that their organizations centralized decisions, strongly favored the status quo and discouraged innovation. Here’s how one civil servant in Lahore, Pakistan, put it: “The people at the top are not interested in evidence. They want things done their way irrespective of the outcome.”

3. When presented with quantitative versus qualitative evidence, policymakers update their beliefs in unexpected ways.

We then tested whether our Pakistani policymakers were more likely to change their views about a government policy after we presented supporting evidence that was either “soft,” meaning anecdotal and based on a single story, or “hard,” meaning quantitative and based on a representative survey or experiment. Both types of evidence increased policymakers’ support for a policy, although on average hard evidence changed minds significantly more.

But it depended on what kind of policy we were talking about. In most cases, support changed more in response to data. Numbers supporting a reduction in customs tariffs to increase revenue or privatizing the postal service led to much greater increases in support for these policies compared with anecdotes. But in others, support changed more in response to stories. For example, we gave a quote from a parent about how tracking and reporting on teacher attendance and school facilities had increased the quality of their child’s education. We offered similar stories for other policy options, but this quote struck a chord where numbers could not.

For some policies, providing evidence in favor of the policy actually reduced support. When we gave hard evidence in support of a policy, our civil servants reduced their support for four out of 12 policies — although the drop was not statistically significant. Positive stories about appointing specialists instead of generalists to senior civil service positions led to decreased support for appointing specialists.

The implication here is that decision-making depends not only on the quality of evidence that is presented, but also on the context in which this evidence is received. In cases where the policymaker holds strong beliefs and is inclined to discount evidence, an intervention to soften the policymaker’s priors may be more useful than generating rigorous evidence.

Researchers and policymakers alike say that government would be more effective if decisions were made based on carefully derived evidence. But before that can happen, civil servants need better training in interpreting data and governments need to introduce systems that support and incentivize the use of evidence.

Michael Callen is assistant professor of economics and strategic management at the Rady School of Management at the University of California, San Diego.

Adnan Khan is research and policy director of the International Growth Centre at London School of Economics, co-chair of the Commission on State Fragility, and a board member of the Center for Economic Research in Pakistan.
