The “Academia to Policy” column seeks to bridge academic debate and the adoption of policy in practice. In this inaugural Academia to Policy article, Natasha Somji discusses the challenges of translating academic work into policy.
Policymakers frequently turn to academics as consultants who can conduct analysis and provide background when considering how to change existing laws or propose new policies. However, the ways in which academic research is translated into policy can be rather problematic.
It is often the case that policymakers have little awareness of the tools used in the quantitative studies carried out by academics. There is, then, a need to ‘dumb down’ academic work to make it more accessible to the masses. While this is not problematic in itself, if a study’s methodology is flawed, policymakers may propose changes that rest on questionable assumptions. Consider Gary Kleck and Britt Patterson’s study, “The Impact of Gun Control and Gun Ownership Levels on Violence Rates” (1993), which has been cited time and again in the policy world to justify repealing gun laws. The research finds that gun prevalence and most gun laws have no effect on violence rates. Upon closer examination, however, the study is fraught with methodological concerns that render its findings implausible. Not only does this make the very basis of such policies problematic; perhaps even more concerning, a policymaker who never considered that the findings might be questionable is unlikely to be aware that there is a problem at all. This has implications for how future policies are generated.
There are, of course, other ways to influence policies. But, just as policymakers get their hands on studies that are problematic, so too do their constituents. If individuals lobby on the basis of a methodologically flawed study and policymakers have an incentive to be re-elected, they may concede to the demands of their electorate, even if the policies passed have a negative effect on the very people who pushed them through.
The interpretation of academic work becomes even more problematic when studies intended to test the effect of a policy on a very specific sample are then used by policymakers to generalise to entire populations that may have inherently different structures and characteristics. Consider a study by Thomas Lemieux and Kevin Milligan (2008), “Incentive effects of social assistance: A regression discontinuity approach”, which looks at the effect of higher unemployment benefits for those over the age of 30 on labour supply. The findings, reproduced in Table 1 below, show that the 1986 increase in social assistance benefits in Quebec for male high school dropouts between the ages of 25 and 40 is associated with a reduction in employment of three to five percentage points, significant at the 5% level.
Table 1: Regression discontinuity estimates of the effect of higher social assistance benefits on labour supply in Quebec, 1986
While the authors make clear that their research applies to a very specific population – male high school dropouts aged 25 to 40 in Quebec in 1986 – it has been used to draw more general conclusions about the negative effect of unemployment insurance on employment rates. This interpretation may have some basis, but it holds only for a particular structure of unemployment benefits in a very particular context; a different form of social assistance or unemployment benefit may not have the same impact. Indeed, an article in The Gateway (Bushby, 2013), a business and careers newspaper distributed at the LSE, reports that one element of Iceland’s recovery from the economic crisis was an increase in its welfare budget. Given that these two forms of social assistance exist in completely different settings, it is dangerous to draw sweeping generalisations from quantitative research: the same policy in a different context may produce contradictory results.
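For readers curious about what a regression discontinuity design actually does, the logic of the Lemieux and Milligan approach can be sketched in a few lines of code. This is a toy illustration on simulated data, not the authors’ analysis: the age-30 cutoff comes from the study, but the sample, the “true effect” of -0.04, and the bandwidth are all assumptions chosen for the demonstration.

```python
# Toy regression discontinuity (RD) sketch: eligibility for higher
# benefits jumps at age 30, and we estimate the resulting jump in
# employment. All data are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(0)

n = 20_000
age = rng.uniform(25, 40, n)            # running variable
above = (age >= 30).astype(float)       # eligible for higher benefits
true_effect = -0.04                     # assumed effect, not the paper's estimate

# Simulated employment outcome: a smooth trend in age plus a
# discontinuous jump at the age-30 cutoff.
employed = (0.6 + 0.01 * (age - 30) + true_effect * above
            + rng.normal(0, 0.1, n))

# Local linear RD: keep observations near the cutoff, fit separate
# slopes on each side, and read the effect off the jump at age = 30.
bandwidth = 3.0
mask = np.abs(age - 30) <= bandwidth
x = age[mask] - 30
d = above[mask]
y = employed[mask]

# Columns: intercept, treatment dummy, slope below cutoff, slope above
X = np.column_stack([np.ones_like(x), d, x * (1 - d), x * d])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rd_estimate = beta[1]
print(f"RD estimate of the benefit effect on employment: {rd_estimate:.3f}")
```

The key identifying assumption is that people just below and just above the cutoff are otherwise comparable, so the jump in employment at exactly age 30 can be attributed to the benefit change – which is precisely why the estimate speaks only to that narrow population near the cutoff.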
So is it all doom and gloom for citizens like us? Not at all: it is our duty to educate ourselves about the research that policymakers use to make decisions, and to assess its validity critically. Students like us, who are being equipped with the skills to assess public policy, have an obligation to use our new-found knowledge to understand how policy circles work. Learning econometrics, then, should be viewed as part of creating a world in which we become critical consumers of knowledge, not just passive learners.
Works Cited
Bushby, A. (2013, February 13-26). Alternative economics in Iceland. The Gateway, p. 10.
Kleck, G., & Patterson, E. B. (1993). The impact of gun control and gun ownership levels on violence rates. Journal of Quantitative Criminology, 9(3), 249-287.
Lemieux, T., & Milligan, K. (2008). Incentive effects of social assistance: A regression discontinuity approach. Journal of Econometrics, 142(2), 807-828.