Should Scientists Publish Their Personal Biases?

What if scientists were more transparent about their values? Would their results and recommendations be better received and more trusted if they acknowledged any relevant personal beliefs that may have shaped their research? Nope. Transparency hurts.

Photo Illustration by Francesco Izzo / Wikicommons


A lot of modern science challenges us to change our behaviors. Results related to climate change, for example, suggest we travel, shop, and eat differently. Psychology and sociology ask us to shift our perceptions of each other.

Once the science is done, though, the work is not over. The next step involves facing one of a scientist’s “most significant challenges,” according to a recent study in PLoS ONE: managing how their message comes across to an ideologically polarized public.

What if scientists were more transparent about their values? Would their results and recommendations be better received and more trusted if they acknowledged any relevant personal beliefs that may have shaped their research? That’s what Kevin C. Elliott and colleagues, authors of the PLoS ONE study, sought to determine with a series of online experiments. They recruited 494 U.S. participants from Amazon Mechanical Turk (a “convenience sample”—more “male, younger, more highly educated, and more liberal” than a representative sample) to take a survey; it was advertised vaguely as “Your Attitudes about Important Social Issues in the US” to solicit a broad cross-section of people not particularly interested in, or opinionated about, the issues discussed in the experiments.


In the first experiment, subjects read a vignette of a scientist giving a public talk in Washington, D.C. on the health risks of Bisphenol A (BPA), an endocrine-disrupting compound used in plastic consumer products whose effects are the subject of ongoing controversy. The researchers manipulated the vignette so that the scientist’s conclusion (BPA harmful or not harmful) either comported with or diverged from her explicitly stated values (a policy preference for either “protecting public health” or “promoting economic growth”); they also accounted for whether the subjects’ values aligned with the scientist’s in each case. In the second experiment, the vignette is manipulated in the same way but features a policy recommendation from the scientist rather than a statement summarizing the weight of the scientific evidence.

In each experiment, participants registered their impressions of the scientist and their conclusions—were they competent, credible, expert, honest, intelligent, sincere, or trustworthy?—on a 7-point scale. Elliott and his team concluded that disclosing a scientist’s values doesn’t boost his or her credibility or the trustworthiness of the conclusion reached. In fact, the additional transparency can reduce them!

Green arrows are used when the scientist’s conclusion coincides with the scientist’s values; red arrows are used when the scientist’s conclusion conflicts with the scientist’s values. Effects are relative to a scientist not expressing a preference for particular values and are net of control variables.

Kevin C. Elliott, Aaron M. McCright, Summer Allen, and Thomas Dietz

The effect also depends, the researchers conclude, “on whether scientists and citizens share values, whether the scientists draw conclusions that run contrary to their values, and whether scientists make policy recommendations.” A broad recommendation to a disparate set of communities holding a spectrum of values will produce a spectrum of responses—without, the research suggests, an overall boost in credibility.

Perhaps scientists don’t need the boost. Last year, a Pew Research Center report found that Americans trust scientists, alongside the military, much more than religious and business leaders, the news media, and elected officials. Yet the trust Americans place in scientists might mean they should be more transparent and vocal about their findings and policy preferences than they already are: The same Pew report found that, on the topic of climate change, 39 percent of Americans trust scientists “a lot” to give “full and accurate information”—far more than energy industry leaders (7 percent), the news media (7 percent), or elected officials (4 percent).

What do you think? Should scientists take this as a cue to be more open about their values and policy preferences in their research, given this discrepancy in trust in their favor? Or should they continue to keep their values to themselves?

Brian Gallagher is the editor of Facts So Romantic, the Nautilus blog. Follow him on Twitter @brianga11agher.
