Presentation by Philip Cohen on collaborative work with Micah Altman as part of the MIT CREOS research talk series. Presented in fall 2018, in Cambridge, MA.
Contemporary journal peer review is beset by a range of problems. These include (a) long delays to publication, during which research is inaccessible; (b) weak incentives to conduct reviews, resulting in high refusal rates as the pace of journal publication increases; (c) quality control problems that produce both errors of commission (accepting erroneous work) and omission (passing over important work, especially null findings); (d) unknown levels of bias, affecting both who is asked to perform peer review and how reviewers treat authors; and (e) opacity in the process that impedes error correction and more systematic learning, and enables conflicts of interest to pass undetected. Proposed alternative practices attempt to address these concerns -- especially open peer review and post-publication peer review. However, systemic solutions will require revisiting the functions of peer review in its institutional context.
2. Goals of the talk
• Framework for asking what works and what doesn’t
• Basis for evaluation facilitates interventions
• Interventions are interdependent
3. Areas of concern for peer review
With growing volume, is the peer review system succeeding on:
• Time
• Incentives
• Quality
• Bias
• Accountability
7. • Why publishing? Inertia: peer review is a mechanism
• The system is overburdened: competing demands
8. What journals ask peer reviewers to do
Review criteria lists from 17 journals, grouped into 5 categories
Veracity (44% of all criteria / 100% of journals included at least one)
Analysis methods, data, conclusions justified, literature review, scientific soundness or validity, measurement, evidence, research design, accuracy, concepts appropriately defined and used
Importance (22% / 82%)
Contribution to knowledge, importance / significance, novelty, theoretical contribution, statement of contribution, stimulate research
Presentation (18% / 88%)
Writing, argument, English quality, presentation
Market (13% / 65%)
Relevance to sub-discipline, social relevance, interest to readers, international interest, policy relevance
Transparency (3% / 12%)
Data availability, ethical concerns
* Journals: American Sociological Review (guidelines, no reviewer form), Social Forces, European Sociological Review, Acta Sociologica, Journal of Marriage and Family, Demography, Demographic Research, Population Studies, Sociological Science, Organization Science, MDPI journals, Comparative Population Studies, Health and Human Rights, Royal Society Open Science, Nature Research, American Archivist, PLOS One
9. • Why publishing? Inertia: peer review is a mechanism
• The system is overburdened: competing demands
• Unbundle its functions: technology
Cautionary tale: APCs
What could go wrong
• Systemic effects: practice of scholarship; allocation of money
16. Bias
American Journal of Sociology
Reviewer invitation
"If you have a professional or friendship tie to the
undisclosed author of this paper (e.g., past or
current faculty peer, coauthor, contemporaneous
PhD-institution peer, etc.) please let us know."
17. Bias
Teplitskiy et al. (2018). “The sociology of scientific validity: How professional networks shape judgement in peer review.”
22. Five innovations
Post-publication peer review
• Time
Research is available immediately
(including bad work)
• Quality
More engaged reviews
(but maybe some things are never reviewed)
23. Open peer review
• Incentives
Credit for quantity and quality, including citations
• Quality
Visibility encourages good reviews
(or less honest reviews)
• Bias
Less discrimination under cover of blindness
(but maybe retribution)
• Accountability
Reviewers and editors make decisions in public view
24. Recognition for APT
• Incentives, quality
People review for credit, citations
• Bias
Reviewing record deters discrimination
25. Decouple peer review
• Incentives
New platforms and initiatives can develop incentives
• Quality
More focused review practices
• Accountability
Institutional actors with more public accountability
26. Replication with review
• Time
Start at point of review rather than publication
• Incentives
Reviewing has more impact, and leads to new research
• Quality
Better feedback, error detection