Bold ideas and critical thoughts on science.

Janet Hering

Counting is not enough

Janet Hering's take on reconnecting academic research with societal needs.
12 February 2019

Let’s start with the obvious. Evaluation and assessment are part and parcel of the scientific profession. Universities want to hire the best faculty, funding agencies want to support the best projects, and journals want to publish the best papers. To this end, scientists serve as members of hiring and promotion committees and on panels for funding agencies. We also write evaluation letters and reviews of manuscripts and proposals. Of course, this all takes time, which could otherwise be spent on our own research.

These competing demands on our time tempt us to rely on indicators as proxies for quality. But saving time does not justify disregarding the many critiques of indicators such as the Journal Impact Factor (JIF) and h-index. Indicators are often opaque and fail to accommodate field-specific characteristics; furthermore, they can be subject to manipulation and create perverse incentives and, in the worst case, compromise scientific integrity (Alberts et al., 2015; Hering, 2018; Hicks et al., 2015; Klein and Falk-Krzesinski, 2017). This issue becomes even more pressing when the scientists being evaluated are conducting applied research whose outputs are not well aligned with conventional indicators (Hering et al., 2012; Moher et al., 2018).

I would certainly not argue that we should ignore an individual’s professional record, but we need to be aware of the anchoring effect (Kahneman, 2011) of our first glance at a CV, a list of publications or even the affiliations that commonly appear on the cover page of a proposal or manuscript. What if the first information that we received about someone applying for a faculty position or submitting a proposal or manuscript did not provide us with shortcuts for assessing quality?

STARTING WITH A PLAIN LANGUAGE STATEMENT

I would propose that all reviews (i.e., of job applications, funding proposals, and manuscripts) start with the reviewer reading a plain language statement (AGU, 2018). This document should be concise, clearly written, and focused on the scientific and societal significance of the job application, funding proposal or manuscript. Authors would be instructed to focus on content and to avoid, as much as possible, mentioning specific institutions or other aspects of their “pedigree”.

Reviewers would be asked to assess the plain language statement(s) (e.g., as exceptional, adequate or inadequate) before they are given access to the complete application file, proposal or manuscript. This would help to anchor their subsequent responses to content rather than to proxies for quality (i.e., indicators).

This process would have the greatest benefit for bodies like hiring committees or funding-agency panels that have to compare a large number of potential applicants or proposals. Faced with this workload, committee and panel members are more likely to feel overburdened and to be tempted to take shortcuts. Pre-assessment of the plain language statement could encourage committee or panel members to consider applications or proposals with exceptional statements even if the applicants are not the strongest when judged from the perspective of conventional indicators. Taking this even further, applications with “inadequate” statements could be excluded from further consideration.

Even when the evaluation is made on a case-by-case basis (i.e., for promotion and tenure decisions or individual reviews of manuscripts or proposals), pre-assessment based on a plain language statement could help to reinforce the recommendation that “the candidate should be evaluated on the importance of a select set of work, instead of using the number of publications or impact rating of a journal as a surrogate for quality” (Alberts et al., 2015).

I would certainly not claim that such adaptations of review processes would change the culture of academic assessment of scientists overnight. But I think they could go a long way toward shifting the focus from indicators to content. Such a shift is vital to the future success of the scientific enterprise, not only in the context of problem-driven or solution-oriented research but also for truly creative curiosity-driven research.

Author info

Janet Hering is a Professor in the Department of Environmental Systems Science at the Swiss Federal Institute of Technology (ETH) in Zurich and in the School of Architecture, Civil and Environmental Engineering at ETH Lausanne. As the Director of the Swiss Federal Institute of Aquatic Science and Technology (Eawag), she interacts with stakeholders from policy and practice. She is a member of the U.S. National Academy of Engineering.

Digital Object Identifier (DOI)

https://doi.org/10.5281/zenodo.2562817

Cite as

Hering, J. (2019). Counting is not enough – rediscovering the value of narrative. Elephant in the Lab. DOI: https://doi.org/10.5281/zenodo.2562817

References


AGU (2018). Creating a plain-language summary, LINK. (accessed: 11th February, 2019)

Alberts, B., Cicerone, R.J., Fienberg, S.E., Kamb, A., McNutt, M., Nerem, R.M., Schekman, R., Shiffrin, R., Stodden, V., Suresh, S., Zuber, M.T., Pope, B.K. and Jamieson, K.H. (2015). Self-correction in science at work. Science, 348, 1420-1422. LINK. (accessed: 11th February, 2019)

Hering, J.G. (2018). Reconnecting academic research with societal needs through assessment (Quick File). Open Science Framework, 2 pp. LINK. (accessed: 11th February, 2019)

Hering, J.G., Hoffmann, S., Meierhofer, R., Schmid, M. and Peter, A.J. (2012). Assessing the Societal Benefits of Applied Research and Expert Consulting in Water Science and Technology. Gaia-Ecological Perspectives for Science and Society, 21, 95-101. LINK. (accessed: 11th February, 2019)

Hicks, D., Wouters, P., Waltman, L., de Rijcke, S. and Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429-431. LINK. (accessed: 11th February, 2019)

Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux, New York.

Klein, J.T. and Falk-Krzesinski, H.J. (2017). Interdisciplinary and collaborative work: Framing promotion and tenure practices and policies. Research Policy, 46, 1055-1061. LINK. (accessed: 11th February, 2019)

Moher, D., Naudet, F., Cristea, I.A., Miedema, F., Ioannidis, J.P.A. and Goodman, S.N. (2018). Assessing scientists for hiring, promotion, and tenure. PLOS Biology, 16, e2004089. LINK. (accessed: 11th February, 2019)
