Policy Analytics Symposium
Dustin Chambers, Senior Affiliated Scholar
Danilo Friere, Adam Smith Fellow
Scott Baier, Senior Affiliated Scholar
Joshua C. Hall, Associate Professor, Department of Economics, West Virginia University
Edward J. Timmons, Senior Affiliated Scholar
Narendra Regmi
Wolfgang Alschner
Bryan McCannon
Yang Zhou
Conor Norris
Patrick McLaughlin
The Mercatus Center at George Mason University is pleased to announce the creation of the Policy Analytics Society as part of its QuantGov initiative. Policy analytics is, at its core, the measurement of government policy for the purpose of studying its causes and consequences.
The new Society will serve scholars, researchers, and practitioners who use advanced analytical methods, such as machine learning and natural language processing, to identify and measure latent variables in public policy text. It aims to foster discussion of effective and legitimate ways to create new data-driven research using human- and AI-powered algorithms. Because most public policy is written as unstructured text, using policy analytics to convert that text into structured data will enable more and better research on the causes and effects of government policy.
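As a rough illustration of what converting unstructured policy text into structured data can look like, the sketch below tallies a handful of restriction-style terms in a document. The term list, function names, and sample text are assumptions made for this example; it is not the QuantGov implementation or the method of any paper in the symposium.

```python
import re
from collections import Counter

# Example restriction-style terms; this list is an illustrative assumption,
# not the term set used by QuantGov or by any symposium paper.
RESTRICTION_TERMS = ["shall", "must", "may not", "prohibited", "required"]

def count_restriction_terms(policy_text: str) -> Counter:
    """Return a count of each restriction-style term found in the text."""
    text = policy_text.lower()
    counts = Counter()
    for term in RESTRICTION_TERMS:
        # Word boundaries keep, e.g., "shall" from matching inside other words.
        counts[term] = len(re.findall(r"\b" + re.escape(term) + r"\b", text))
    return counts

if __name__ == "__main__":
    sample = (
        "Each licensee shall maintain records for five years. Applicants "
        "must complete the required training and may not practice before "
        "a license is issued."
    )
    counts = count_restriction_terms(sample)
    print(dict(counts))          # per-term counts: one row of structured data
    print(sum(counts.values()))  # a simple document-level score
```

Even a simple tally like this turns a block of legal prose into a structured variable per document, which is the kind of output the paragraph above describes as the starting point for data-driven policy research.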
To launch this effort, the Policy Analytics Society has released a series of working papers that showcase the development of policy analytics through novel datasets and methodologies. The papers advance best practices for measuring government policy and seek to empirically identify innovations in data-driven policy research. Topics covered by the symposium include a theoretical framework for policy analytics, methods for data validation, the use of machine learning to analyze trade agreements, data analytics applied to occupational licensing, and a computational analysis of contracts.
Concept and Methodology
Towards a Formalization of Policy Analytics – Dustin Chambers
How to Improve Data Validation in Five Steps – Danilo Friere
Democratizing Policy Analytics with AutoML – Danilo Friere
Specific Applications
Using Machine Learning to Capture Heterogeneity in Trade Agreements – Scott Baier and Narendra Regmi
Validating Readability and Complexity Metrics: A New Dataset of Before-and-After Laws – Wolfgang Alschner
Measuring a Contract's Breadth: A Text Analysis – Joshua C. Hall, Bryan McCannon, and Yang Zhou
Man vs. Machine: A Novel Evaluation of Data Analytics Using Occupational Licensing as a Case Study – Edward J. Timmons and Conor Norris
Summaries of the symposium working papers, all published March 25, 2021:
Towards a Formalization of Policy Analytics (Dustin Chambers): A relatively new field within economics, policy analytics has experienced rapid growth and yielded insights in many disciplines within the profession.
How to Improve Data Validation in Five Steps (Danilo Friere): Social scientists are awash with new data sources.
Democratizing Policy Analytics with AutoML (Danilo Friere): Machine learning methods have made significant inroads in the social sciences.
Using Machine Learning to Capture Heterogeneity in Trade Agreements (Scott Baier and Narendra Regmi): In this paper, we employ machine learning techniques to capture heterogeneity in free trade agreements.
Validating Readability and Complexity Metrics: A New Dataset of Before-and-After Laws (Wolfgang Alschner): If algorithms are to be the policy analysts of the future, the policy metrics they produce will require careful validation.
Measuring a Contract's Breadth: A Text Analysis (Joshua C. Hall, Bryan McCannon, and Yang Zhou): We use a computational linguistic algorithm to measure the topics covered in the text of school teacher contracts in Ohio.
Man vs. Machine: A Novel Evaluation of Data Analytics Using Occupational Licensing as a Case Study (Edward J. Timmons and Conor Norris): For researchers of state regulatory policy, the difficulty of gathering data has long presented an obstacle. This study compares two new databases for state-level occupational licensing laws.