Learning Objectives:
LO7a: To understand the history of peer review, and place current developments in Open Peer Review in that context (knowledge).
LO7b: To gain insight into the process of responsible research evaluation, and the role that peer review and traditional and next-generation metrics play in this (knowledge).
LO7c: To be able to identify and apply a range of metrics to demonstrate the broader impact of your research outputs (tasks).
Fundamentals of good peer review.
History of peer review and scholarly publishing.
Types of open peer review and new models.
Pros and cons associated with different types of open peer review, including post-publication peer review, commenting and annotation.
Issues with traditional methods of research assessment and evaluation.
The San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto, and The Metric Tide report.
Next-generation metrics (also known as altmetrics), responsible metrics use, and peer review.
Role of metrics in research evaluation, funding, promotion, signalling and reporting.
Differentiating between impact and attention.
Individuals: Nikolaus Kriegeskorte, Irene Hames, Tony Ross-Hellauer, Peter Kraker, Michael Markie, Sabina Alam, Elizabeth Gadd, William Gunn.
Organisations: OpenAIRE, ScienceOpen, Publons, PubPeer, OpenUP, Altmetric, ImpactStory, BioMed Central, Frontiers, eLife, PEERE.
Other: Editorial staff at journals offering traditional peer review.
Tools
Peer review report template, Authorea.
Eigenfactor project.
Hypothes.is
Research Articles and Reports
Why the impact factor of journals should not be used for evaluating research (Seglen, 1997).
Effect of open peer review on quality of reviews and on reviewers' recommendations: a randomised trial (van Rooyen et al., 1999).
A Reliability-Generalization Study of Journal Peer Reviews: A Multilevel Meta-Analysis of Inter-Rater Reliability and Its Determinants (Bornmann et al., 2010).
Effect on peer review of telling reviewers that their signed reviews might be posted on the web: randomised controlled trial (van Rooyen et al., 2010).
Open peer review: A randomised controlled trial (Walsh et al., 2000).
Deep impact: unintended consequences of journal rank (Brembs et al., 2013).
Excellence by Nonsense: The Competition for Publications in Modern Science (Binswanger, 2014).
Attention! A study of open access vs non-open access articles (Adie, 2014).
Publishing: Credit where credit is due (Allen et al., 2014).
The Metric Tide report (Wilsdon et al., 2015).
Grand challenges in altmetrics: heterogeneity, data quality and dependencies (Haustein, 2016).
Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effective Method for Increasing Transparency (Kidwell et al., 2016).
A framework to monitor open science trends in the EU (Smith et al., 2016).
Peer Review Survey 2015: Key Findings (Mark Ware Consulting, 2016).
Point of View: How open science helps researchers succeed (McKiernan et al., 2016).
Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals (Wicherts, 2016).
Next-generation metrics: Responsible metrics and evaluation for open science (European Commission, 2017).
Evaluation of Research Careers fully acknowledging Open Science Practices: Rewards, incentives and/or recognition for researchers practicing Open Science (European Commission, 2017).
Research: Gender bias in scholarly peer review (Helmer et al., 2017).
"Excellence R Us": university research and the fetishisation of excellence (Moore et al., 2017).
Metrics for openness (Nichols and Twidale, 2017).
What is open peer review? A systematic review (Ross-Hellauer, 2017).
Survey on open peer review: Attitudes and experience amongst editors, authors and reviewers (Ross-Hellauer et al., 2017).
A multi-disciplinary perspective on emergent and future innovations in peer review (Tennant et al., 2017).
Reviewer bias in single- versus double-blind peer review (Tomkins et al., 2017).
Prestigious science journals struggle to reach even average reliability (Brembs, 2018).
Making research evaluation more transparent: Aligning research philosophy, institutional values, and reporting (Dougherty et al., 2018).
Research excellence indicators: time to reimagine the 'making of'? (Ferretti et al., 2018).
The Journal Impact Factor: A brief history, critique, and discussion of adverse effects (Larivière and Sugimoto, 2018).
Scholarly Communication Librarians' Relationship with Research Impact Indicators: An Analysis of a National Survey of Academic Librarians in the United States (Miles et al., 2018).
Assessing scientists for hiring, promotion, and tenure (Moher et al., 2018).
Ten considerations for open peer review (Schmidt et al., 2018).
Key posts
Six essential reads on peer review, ASAPbio.
Peer reviews are open for registering at Crossref, Jennifer Lin.
Why we don't sign our peer reviews, Jeremy Yoder.
The Fractured Logic of Blinded Peer Review in Journals, Hilda Bastian.
The peer review process: challenges and progress, Irene Hames.
Responsible metrics: Where it's at?, Lizzie Gadd.
Goodhart's Law and why measurement is hard, David Manheim.
Academe's prestige problem: We're all complicit in perpetuating a rigged system, Maximillian Alvarez.
Let's move beyond the rhetoric: it's time to change how we judge research, Stephen Curry.
Blockchain offers a true route to a scholarly commons, Lambert Heller.
There is an absence of scientific authority over research assessment as a professional practice, leaving a gap that has been filled by database providers, Arlette Jappe, David Pithan and Thomas Heinze, LSE Impact Blog.
Other
Metrics and Research Assessment, ScienceOpen collection.
The Open Access Citation Advantage, ScienceOpen collection.
Citation Behaviour and Practice, ScienceOpen collection.
Scholarly Publication Practices and Impact Factor Calculation and Manipulation, ScienceOpen collection.
Peer Review in the Age of Open Science, Tony Ross-Hellauer, 2017.
The San Francisco Declaration on Research Assessment (DORA) and Leiden Manifesto.
NISO Alternative Assessment Metrics (Altmetrics) Initiative.
Snowball Metrics, standardized research metrics.
Tasks
Perform one open peer review on a paper of your choice at ScienceOpen, and get a DOI for it.
Integrate one peer review (pre- or post-publication) experience into Publons.
Use the Publons journal list to check the open peer review policies of journals in your discipline.
Sign DORA in either a personal or organisational capacity.
Define your impact.
Write a personal impact statement about your research (actual or predicted). Avoid using journal titles or the journal impact factor.
Discover the Altmetric scores for your published items using their bookmarklet (a scripted alternative via the public API is sketched after this list).
Track your research impact by integrating your ORCID profile with either ScienceOpen or ImpactStory (or both).
Do you have a personal website? If not, now is a good time to design one and make all of the above information part of your digital profile.
Find out what your research department's or institute's research evaluation criteria are, and discuss them with your research colleagues.
Find out who wrote the criteria, and ask them what evidence was used to support them.
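If you would rather script the Altmetric task than use the bookmarklet, the minimal sketch below queries Altmetric's free public details endpoint for a handful of DOIs. It is an illustration under assumptions: check Altmetric's current API documentation for the endpoint, rate limits, terms of use, and field names (score, readers_count, details_url), and swap the placeholder DOIs for your own published items.

```python
"""Minimal sketch: look up Altmetric attention scores for a list of DOIs."""
from typing import Optional

import requests

# Free, rate-limited public endpoint; verify against Altmetric's current
# documentation and terms of use before relying on it.
ALTMETRIC_DOI_ENDPOINT = "https://api.altmetric.com/v1/doi/{doi}"

# Placeholder DOIs; replace these with the DOIs of your own outputs.
MY_DOIS = [
    "10.12688/f1000research.11369.2",  # e.g. Ross-Hellauer (2017)
    "10.7554/eLife.16800",             # e.g. McKiernan et al. (2016)
]


def altmetric_summary(doi: str) -> Optional[dict]:
    """Return a small summary for one DOI, or None when Altmetric has no
    recorded attention for it (the API answers with HTTP 404 in that case)."""
    response = requests.get(ALTMETRIC_DOI_ENDPOINT.format(doi=doi), timeout=10)
    if response.status_code == 404:
        return None
    response.raise_for_status()
    record = response.json()
    # Field names follow Altmetric's public JSON; confirm them in their docs.
    return {
        "doi": doi,
        "title": record.get("title"),
        "score": record.get("score"),
        "readers": record.get("readers_count"),
        "details": record.get("details_url"),
    }


if __name__ == "__main__":
    for doi in MY_DOIS:
        summary = altmetric_summary(doi)
        if summary is None:
            print(f"{doi}: no attention recorded yet")
        else:
            print(f"{doi}: attention score {summary['score']} ({summary['details']})")
```

The same loop can feed evidence into your personal impact statement, but remember the distinction drawn above: the Altmetric score records attention, not impact, so pair any numbers with qualitative evidence of who used your work and how.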