Spin. Clickbait. Exaggerated headlines. The rise of social media platforms such as Facebook and Twitter has changed how health-related research and news are presented to audiences around the world, and it is not unheard of for researchers and reporters to overstate the findings of a study.
To better understand this issue, a May 30, 2018, study in PLOS One took a detailed look at the 50 most-shared academic journal articles linking any exposure with a health outcome, along with the media stories that covered those articles. The multidisciplinary research team was led by Noah Haber, who recently completed his ScD in health economics in the Department of Global Health and Population at Harvard T.H. Chan School of Public Health. Using a novel systematic review tool, the reviewers assessed each study's strength of causal inference, that is, whether the study could determine that the exposure itself changed the health outcome. They then compared those assessments with the strength of the causal language used to describe the results in both the academic journal articles and the media articles.
The study found that 34% of the academic studies reviewed used language that reviewers considered too strong given their strength of causal inference, and that 48% of media articles used stronger language than their associated academic articles. Moreover, 58% of media articles inaccurately reported the question, results, intervention, or population of the academic study. The team is now researching how academia, media, and social media contribute to this issue, and what interventions might help fix it.
In addition to the PLOS One paper, the team created a website called MetaCausal that includes a public explainer of the research, along with the full dataset, protocol, review tool, analysis code, reviewer profiles, and results from the study.
Read the PLOS One paper: "Causal language and strength of inference in academic and media articles shared in social media (CLAIMS): A systematic review"