05/26/2022 | Press release | Distributed by Public on 05/27/2022 01:19
Typically, the logic behind the need for an intervention is not made explicit. However, if one takes a step back, it is not immediately obvious why we expect an intervention to be necessary or even helpful. Do we think that individuals are not optimizing? If they are optimizing (as we typically assume in economics), then why do we want them to deviate from this behavior? Why do we think that any intervention, even if successful, will have effects that are not just temporary? After the intervention ends, won't the participants just return to the prior equilibrium?
The dynamics of human behavior outlined here provide one way to understand one possible motivation behind such policies… Following a change in the state of the world, individuals will tend to hold on to preexisting traditions that are suited to the prior environment rather than the current one. In these cases, interventions that help aid the adoption of new beliefs, values, or actions better matched to the contemporary environment can improve welfare."
o Marina Agranov and Pietro Ortoleva have an overview paper documenting a growing body of evidence, drawn from different approaches, that people may have an explicit desire to randomize when making choices where they are unsure or their preferences are incomplete. They note this has implications for how we use revealed preference to link choices with utilities, but it may also provide another reason for thinking about stochastic choice as a way of allocating scarce resources.
o Rebecca Dizon-Ross and Seema Jayachandran have a short paper suggesting that eliciting willingness to pay using the Becker-DeGroot-Marschak (BDM) mechanism can be improved by also asking willingness to pay for small, inexpensive household goods unrelated to the focal good or service. This helps reduce measurement error by capturing effects such as short-term liquidity constraints, tiredness, and social desirability bias.
o The reports at the end are always interesting reading and useful for benchmarking statistics too. For example:
o Academic-year salaries from the survey of US economics departments: mean academic-year (9-month) salaries in the top 15 departments are now $349,014 for Full Professors, $226,932 for Associate Professors, and $180,303 for Assistant Professors. For departments ranked 16-30, the respective salaries are $279K, $186K, and $161K, while for the mean PhD-granting institution they are $221K, $156K, and $138K. The same report notes that PhD program applications were 20,345 at N=62 universities, with 2,478 offers of admission and 778 new students enrolled. 884 PhDs were awarded by 99 institutions in 2020-21, of which one-third went to female students.
o The editors' reports of the different AEA journals are also there.
o For the AER, Esther Duflo reports 1,910 submissions in 2021 and 123 papers published; the desk-rejection rate was 39%, the acceptance rate 5-7%, and the median decision time for papers sent to referees was 77 days (185 days at the 90th percentile). I appreciate the editor stating an explicit objective of increasing the number of revisions decided on without sending the revisions back to the referees.
o For AER: Insights, Amy Finkelstein reports 777 submissions, a desk-rejection rate of 37%, and an acceptance rate of 5%. Reviewing times are shorter: 55 days at the median and 88 days at the 90th percentile.
o For AEJ: Applied, Ben Olken reports 780 submissions, 37 papers published, a 54% desk-rejection rate, and an acceptance rate of 5.6%. He notes a desire to be known for quick turnaround of papers, with 87% handled within 3 months and 99% within 5 months. He also discusses the changes in the policies for handling papers rejected at the AER/AER: Insights: authors can submit the referee reports and a cover letter without making a big revision, avoiding lots of time spent responding to comments when the responses aren't going to be the determining factor in the decision. 76 papers previously sent to the AER were submitted this way, of which 9 were accepted.
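As a quick sanity check on these figures, the implied publication-to-submission ratios can be computed directly from the numbers above. This is only a back-of-the-envelope sketch: papers published in 2021 were typically submitted in earlier years, so the ratio of the two reported counts need not equal the acceptance rate each editor reports.

```python
# Back-of-the-envelope check of implied acceptance rates from the
# editors' reports. Submissions and publications come from different
# cohorts, so these ratios are only rough approximations of the
# reported acceptance rates.
journals = {
    # name: (submissions in 2021, papers published, reported acceptance rate)
    "AER": (1910, 123, "5-7%"),
    "AEJ Applied": (780, 37, "5.6%"),
}

for name, (subs, pubs, reported) in journals.items():
    implied = 100 * pubs / subs
    print(f"{name}: {pubs}/{subs} = {implied:.1f}% (reported: {reported})")
```

For the AER this gives roughly 6.4%, within the reported 5-7% range; for AEJ: Applied it gives roughly 4.7%, a bit below the reported 5.6%, consistent with the cohort mismatch.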