NEWS 2013

Policymakers: Caveat Emptor for Research

Economist recommends a dose of skepticism for findings


Charles Manski discusses uncertainty in policy analysis at the
British Academy in London.

In a recent lecture at the British Academy, IPR economist Charles F. Manski bluntly told an audience of U.K. civil servants and citizens that, while relying on experts for policy analysis is important, they should be more discriminating consumers of policy research.

“You are the consumers of research from economists and other social scientists, and I’m here to tell you that you should be skeptical of what is reported to you,” Manski said to the approximately 130 attendees on March 27 in London.

Drawing from his book Public Policy in an Uncertain World: Analysis and Decisions (Harvard University Press, 2013), Manski spoke about how current policy analysis tends to hide uncertainty. He said that policy analysis would be more credible—and salient—if researchers acknowledged upfront the limits of their data and findings.

In his response, Lord (Gus) O’Donnell, former U.K. cabinet secretary for three British prime ministers and head of the civil service, praised the book for its readability and its attempt to get policymakers to “think about the language we use in thinking about uncertainty and decision making.”

Manski underscored that he attempted to write the book “in English” for a much wider public instead of in “math” for his fellow economists. So the only formula in his book is:

Data + assumptions => conclusions

While the formula may seem trivial, it is a fundamental one, he said.

“We tend to think that what science does, is it brings data,” he said. “What’s often forgotten is that the data alone don’t do it.”

In research, assumptions have to be combined with data to predict what could happen. But critical assumptions are often hidden.

Starting in the 1980s, Manski began examining what kinds of conclusions could be drawn from strong and weak assumptions. This led him to formulate his “law of decreasing credibility,” which expresses the inherent tension between strength and credibility: a strong conclusion requires a strong assumption, yet the stronger the assumption, the less credible it tends to be. Disagreements about the credibility of strong assumptions are common, and some of the most important assumptions cannot be tested.

This lack of “testability” allows one side in a debate to stake out a position by maintaining a particular assumption, while another side stakes out a different position based on another untestable assumption.

“These debates can just continue forever because neither side can prove the other wrong,” Manski said.

Plus, the research community and the public reward strong findings, not ambiguous ones. Such incentives lead scientists and analysts to “sacrifice” credibility “to maintain assumptions far stronger than they can defend,” he pointed out.

Manski suggests “facing up to the uncertainty.” He identifies eight different types of “incredible” certitude in his book, providing real-world examples for each, such as confounding advocacy with research, as in the case of economist Milton Friedman’s advocacy of school vouchers.

His research led him to conclude that it is better to use assumptions of varying strength and to report upper and lower bounds, such as stating that unemployment ranges from 7 to 12 percent, rather than giving an exact figure, like 7 percent.
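To make the bounds idea concrete, here is a minimal illustrative sketch, not drawn from the lecture or the book, of the kind of worst-case calculation behind such ranges: when some survey responses are missing, the data alone pin down only a range for a rate, and a single point estimate requires the added assumption that nonrespondents resemble respondents. The function name and all the numbers below are hypothetical.

    # Illustrative sketch (hypothetical numbers): worst-case bounds for a rate
    # when some survey outcomes are missing, in the spirit of partial identification.
    def worst_case_bounds(n_yes, n_no, n_missing):
        """Bound the share of 'yes' outcomes without assuming anything about the missing cases."""
        n_total = n_yes + n_no + n_missing
        lower = n_yes / n_total                # as if every missing case were a 'no'
        upper = (n_yes + n_missing) / n_total  # as if every missing case were a 'yes'
        return lower, upper

    # Hypothetical survey: 70 unemployed, 880 employed, 50 nonrespondents.
    lo, hi = worst_case_bounds(n_yes=70, n_no=880, n_missing=50)
    print(f"Unemployment rate lies between {lo:.0%} and {hi:.0%}")  # 7% and 12%
    # Reporting 7% alone would require the strong, untestable assumption
    # that nonrespondents look just like respondents.

Stronger assumptions about the nonrespondents would narrow the range, but only at the cost of credibility.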

He acknowledged that, despite the uncertainty, policy decisions still must be made. Diversification, much as an investor diversifies a financial portfolio, could provide an answer in some contexts, he suggested.

So he developed the idea of “adaptive diversification” in policy analysis. It could address the problem of imperfect and partial knowledge by balancing potential errors and treating policymaking as a dynamic process that unfolds over time.

Using diversification, a planner would assign different treatments to different groups. Taking the example of tax policy, Manski suggested that in theory different groups could be subject to different tax rates and policies; then, depending on the results, the government could adjust the policy every five or ten years. Where politically feasible—and there would be cases where the approach would not work—this would allow a social policy planner to “set it in place, see what happens, and then learn,” he said.
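As a purely schematic sketch of that adaptive logic, and not Manski’s own procedure, one can imagine a planner who splits the population between two hypothetical policies, observes noisy average outcomes each review period, and gradually shifts the allocation toward whichever policy appears to perform better. The policy names and effect sizes below are invented for illustration.

    # Schematic sketch of adaptive diversification (hypothetical policies and
    # effects): split the population between two policies, observe noisy
    # outcomes each review period, and shift the allocation toward the better
    # performer while keeping every option in play.
    import random

    random.seed(0)
    TRUE_EFFECT = {"policy_A": 0.45, "policy_B": 0.55}  # unknown to the planner

    share_a = 0.5  # start fully diversified, 50/50
    for period in range(1, 6):  # e.g., a review every five or ten years
        outcome_a = TRUE_EFFECT["policy_A"] + random.gauss(0, 0.02)
        outcome_b = TRUE_EFFECT["policy_B"] + random.gauss(0, 0.02)
        # Move a modest share of the population toward whichever policy did better,
        # but keep at least 10 percent on each so learning can continue.
        share_a += 0.1 if outcome_a > outcome_b else -0.1
        share_a = min(0.9, max(0.1, share_a))
        print(f"Period {period}: {share_a:.0%} under policy A, {1 - share_a:.0%} under policy B")

Keeping every option in play at some level is what distinguishes diversification from simply picking the policy that looks best today.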

“I am presenting an innovation, a different way of doing policy analysis,” Manski concluded, noting that he hoped this was the start of a broad discussion on the topic.

Charles F. Manski is Board of Trustees Professor in Economics and an IPR fellow. The full video of his speech can be viewed here. You can also watch a brief interview with the Royal Economic Society.