Open science

It seems that in psychology and communication, as in other fields of social science, much of what we think we know may be based on a tenuous empirical foundation.

“Psychology emergency” by atomicity (Flickr).

Concerns have been raised about the integrity of the empirical foundation of psychological science: low statistical power, publication bias (an aversion to reporting statistically nonsignificant or “null” results), poor availability of data, high rates of statistical reporting errors (meaning that the reported statistics may not support the conclusions drawn), and the blurring of boundaries between exploratory work (which generates new theory or alternative explanations) and confirmatory work (which tests existing theory).

However, a number of open science initiatives have recently succeeded in raising awareness of the benefits of open science and in encouraging the public sharing of datasets. These are discussed by Malte Elson (Ruhr University Bochum) and the OII’s Andrew Przybylski in their special issue editorial, “The Science of Technology and Human Behavior: Standards, Old and New”, published in the Journal of Media Psychology.

What makes this issue special is not its topic but its approach to hypothesis testing: the articles are explicitly confirmatory, that is, intended to test existing theory. All five studies are registered reports, reviewed in two stages. First, the theoretical background, hypotheses, methods, and analysis plan of each study were peer-reviewed before any data were collected, and studies that passed this stage received an “in-principle” acceptance before the researchers proceeded to conduct them. Second, the soundness of the analyses and the discussion section were reviewed, with the publication decision not contingent on the outcome of the study: there was no bias against reporting null results.

The authors made all materials, data, and analysis scripts available on the Open Science Framework (OSF), and the papers were checked for statistical reporting errors using the freely available R package statcheck (see also: www.statcheck.io). All additional (non-preregistered) analyses are explicitly labelled as exploratory. This makes it easier to see…
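statcheck itself is an R package, but the consistency check it automates is straightforward to sketch. The Python snippet below is an illustrative sketch only, not statcheck’s actual implementation: the function name and tolerance are assumptions, and statcheck’s real decision rule also accounts for rounding in the reported values. The idea is simply to recompute the p-value implied by a reported test statistic and its degrees of freedom, and flag a mismatch with the reported p-value:

```python
from scipy import stats

def check_t_test(t_value, df, reported_p, tol=0.01):
    """Recompute the two-tailed p-value implied by a reported t-statistic
    and flag it if it differs from the reported p-value by more than tol.
    (Simplified sketch of the kind of check statcheck automates.)"""
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)
    consistent = abs(recomputed_p - reported_p) <= tol
    return recomputed_p, consistent

# Example: a paper reports "t(28) = 2.20, p = .04"
p, ok = check_t_test(t_value=2.20, df=28, reported_p=0.04)
print(f"recomputed p = {p:.3f}; consistent with report: {ok}")
```

Applied at scale to the test statistics extracted from a paper, checks like this catch cases where a reported p-value does not match its own test statistic, which is one way the data can fail to support the conclusions.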
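The effect of publication bias mentioned above can also be illustrated with a small simulation (all parameters here are hypothetical, chosen only to make the point visible): when underpowered studies of a weak true effect are run and only the statistically significant ones are “published”, the published literature substantially overstates the true effect.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies, n, true_d = 1000, 20, 0.2   # many small studies of a weak true effect

published = []
for _ in range(n_studies):
    treatment = rng.normal(true_d, 1.0, n)   # SD = 1, so mean difference ~ d
    control = rng.normal(0.0, 1.0, n)
    t, p = stats.ttest_ind(treatment, control)
    if p < 0.05:                              # the "file drawer": null results stay unpublished
        published.append(treatment.mean() - control.mean())

print(f"true effect: {true_d}")
print(f"mean published effect: {np.mean(published):.2f} "
      f"({len(published)} of {n_studies} studies significant)")
```

With these (hypothetical) numbers, only a small fraction of studies reach significance, and those that do overestimate the true effect considerably; this is one mechanism by which a field’s empirical foundation can look more solid than it is.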