Closer than they look at first glance: A systematic review and a research agenda regarding measurement practices for policy learning
International Review of Public Policy, ISSN: 2706-6274, Vol: 3, Issue: 2, Page: 146-171
2021
- 6 Citations
- 15 Captures
Article Description
Learning is a cognitive and social dynamic through which diverse types of actors involved in policy processes acquire, translate and disseminate new information and knowledge about public problems and solutions. In turn, they maintain, strengthen or revise their policy beliefs and preferences. Despite conceptual and theoretical developments in recent years, concerns about the measurement of policy learning persist. Based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) approach, this article reports the results of a systematic review of existing practices for measuring policy learning in public administration and policy research. In addition to operationalizations, data sources, methods of analysis and levels of analysis, we examine how the reviewed articles deal with the processual nature of policy learning. We show that existing measurement practices largely transcend the research streams on policy learning, which extends Dunlop and Radaelli's (2018) argument that policy learning is an analytical framework of the policy process. Based on these results, we argue for more transparent operationalizations, discuss the strengths and weaknesses of direct and indirect measurement approaches, and call for more creativity in designing measurement methods that recognize the multilevel nature of policy learning.