Trust in science – factor in culture and belief


The following article is authored by Justin Sulik.


  • Trust in science should be considered in relation to people’s cultural and cognitive processes of belief formation.
  • We need to broaden our conceptualisation of trust in science to not rely solely on top-down cues of trustworthiness.
  • We need to develop better models that clarify how social influences impact the endorsement of scientific consensus.
  • Trust in science could be better understood through greater efforts to analyse social barriers to participation.

When the COVID-19 pandemic required the implementation of new policies, politicians, health authorities and social scientists underscored the importance of trust in science for compliance with those policies and for motivating large-scale shifts in social norms. But were these reasonable burdens to place on trust in science? I argue that trust in science should not be understood as manufacturing compliance, but rather as something deeply embedded in the cultural and cognitive processes of belief formation.


Earlier in the pandemic, findings that trust in science was associated with compliance with disease-prevention policies (Pagliaro et al., 2021) seemed to cement its immediate relevance for top-down social change. However, subsequent waves of research showed that trust does not matter directly for compliance. Rather, the role of trust in science is to bridge people’s pre-existing worldviews with the new, uncertain situations that they faced during the pandemic (Sulik et al., 2021a).


People have always had strategies, philosophies, schemas or frameworks for understanding the world around them and for guiding action. But when they are faced with complex decisions about unfamiliar scenarios, they increasingly rely on others (Siegrist, 2021). In the pandemic, this includes the behaviour of their social networks (Tunçgenç et al., 2021), perceived social norms (Bicchieri et al., 2021) and, of course, scientific experts.


In this more bottom-up view, trust in science is not some uniquely rational ideal. Rather, it is just one way among many that ordinary people use social information to connect their worldviews and their decisions. But if trust in science is just one avenue of social influence, then to make progress on trust, we simultaneously need to think more about what people believe and about how they form those beliefs, because both the what and the how are subject to social influence.


Concerning the what, trust in science has long been considered ideologically polarised, with more conservative or religious populations having lower trust (Gauchat, 2012). However, this increasingly seems a particular problem in the US: in the US context, not only is there an unusually strong association between ideology and trust in science, but trust in science also has an unusually strong association with people’s attitudes and behaviours in the pandemic (Sulik et al., 2021a). As lower trust in science is thus not a necessary feature of a conservative ideology, a more global view should help illuminate where genuine tensions exist between certain ideologies and science, versus where differences are amplified by cultural influence (Hornsey et al., 2018; Pröpper et al., 2022; Rutjens et al., 2022).


However, the issue is not just that scientific consensus sometimes competes with other beliefs. Science also clashes with people’s epistemologies – frameworks for processing information into beliefs and knowledge. For instance, as coronavirus vaccinations have stalled at suboptimal levels, a priority is to identify populations that, though vaccine hesitant, might yet be persuaded to vaccinate. Certain users of Complementary and Alternative Medicine (CAM) are less likely to vaccinate (Soveri et al., 2021), but their reasons for hesitancy are rooted in unique philosophical frameworks. In particular, some CAM users view knowledge as arising from intuition or mystical experiences (Browne et al., 2015; Lindeman et al., 2022). This presents a challenge because it constitutes an ascientific (cf. amoral vs immoral) epistemology.


It is still worth persuading ascientific groups to vaccinate, but that does not mean persuasion efforts should bank on the ideals that make science trustworthy from a scientific perspective. Doing so is like spending time and resources fixing bugs in a Linux program and then trying to run it on a Windows system. Scientific ideals already assume epistemological commitments that the target audience might not endorse (Morisseau et al., 2021). In expanding the focus from what people believe (e.g., their ideologies) to how they form beliefs (their epistemologies), I propose three steps we can take towards a broader conceptualisation of trust in science.


First, we should not rely solely on top-down cues of trustworthiness. Realistically, when people decide whether to trust a source of information, they have some freedom in choosing what they would like to learn about it – if, indeed, they are motivated to learn anything at all, itself a non-trivial assumption (Morisseau et al., 2021). The things that we scientists take to be convincing cues of our trustworthiness are of little help when sceptics can choose to bypass that information entirely. In a current project where participants can choose what cues to explore (and where they previously rated their trust in science), we found that people who already trusted science chose cues relevant to scientific expertise; people who distrusted science chose social or informal cues that were at best noisier signals of expertise, or worse, entirely orthogonal (Sulik et al., 2022).


Second, work on trust in science needs better models of social influence. We already know that cognitive biases, by impacting how people form beliefs, can reduce their endorsement of scientific consensus (Lindeman et al., 2022; Pennycook, 2022; Sulik et al., 2020). However, I have shown experimentally that social influence can amplify such cognitive biases (Sulik et al., 2021b). Rather than just being fed misinformation, people can be socially influenced to disregard sources of empirical evidence. As a corollary, if any group wants to lastingly stake its identity on something demonstrably untrue, it will likely have to make scorn for sources of empirical evidence part of that identity. This is worrying because such feedback loops can lead to snowballing situations where it becomes increasingly difficult to influence a group from outside.


Finally, I question whether trust in science can be fully understood without consideration of social barriers to participation. Open science practices are associated with greater trust in science (Rosman et al., 2022), and it may seem as though this is just an epistemic issue: if people can check the research behind the conclusions, they have more reason to endorse them. However, open science also has value because of its role in justice and inclusion (Azevedo et al., 2019; 2022; Elsherif et al., 2022). As psychological distance to science is itself a predictor of science denial (Većkalov et al., 2022), better inclusion of marginalised groups in the process of producing science – a way of reducing that psychological distance – must be part of long-term progress on trust in science.






Azevedo, F., Parsons, S., Micheli, L., Strand, J., Rinke, E., … & FORRT (2019, December 13). Introducing a Framework for Open and Reproducible Research Training (FORRT).


Azevedo, F., Liu, M., Pennington, C.R., Pownall, M., Evans, T. R., Parsons, S., Elsherif, M. M., Micheli, L., Westwood, S. J., & Framework for Open, Reproducible Research Training (FORRT). (2022). Towards a culture of open scholarship: the role of pedagogical communities. BMC Research Notes, 15, 75.


Bicchieri, C., Fatas, E., Aldama, A., Casas, A., Deshpande, I., Lauro, M., Parilli, C., Spohn, M., Pereira, P., & Wen, R. (2021). In science we (should) trust: Expectations and compliance across nine countries during the COVID-19 pandemic. PloS One, 16(6), e0252892.


Browne, M., Thomson, P., Rockloff, M. J., & Pennycook, G. (2015). Going against the herd: psychological and cultural factors underlying the ‘vaccination confidence gap’. PLoS One, 10(9), e0132562.


Elsherif, M., Middleton, S., Phan, J. M., Azevedo, F., Iley, B., Grose-Hodge, M., Tyler, S., Kapp, S. K., Gourdon-Kanhukamwe, A., Grafton-Clarke, D., Yeung, S. K., Shaw, J. J., Hartmann, H., & Dokovova, M.  (2022, June 20). Bridging Neurodiversity and Open Scholarship: How Shared Values Can Guide Best Practices for Research Integrity, Social Justice, and Principled Education. MetaArXiv.


Gauchat, G. (2012). Politicization of science in the public sphere: A study of public trust in the United States, 1974 to 2010. American Sociological Review, 77(2), 167-187.


Hornsey, M. J., Harris, E. A., & Fielding, K. S. (2018). The psychological roots of anti-vaccination attitudes: A 24-nation investigation. Health Psychology, 37(4), 307-315.


Lindeman, M., Svedholm-Häkkinen, A. M., & Riekki, T. J. (2022). Searching for the cognitive basis of anti-vaccination attitudes. Thinking & Reasoning. Advance online publication.


Morisseau, T., Branch, T. Y., & Origgi, G. (2021). Stakes of knowing the truth: a motivational perspective on the popularity of a controversial scientific theory. Frontiers in Psychology, 12, 708751.


Pagliaro, S., Sacchi, S., Pacilli, M. G., Brambilla, M., Lionetti, F., Bettache, K., ... & Zubieta, E. (2021). Trust predicts COVID-19 prescribed and discretionary behavioral intentions in 23 countries. PloS One, 16(3), e0248334.


Pennycook, G. (2022, August 31). A framework for understanding reasoning errors: From fake news to climate change and beyond.


Pröpper, H. Y., Geiger, S., Blanken, T. F., & Brick, C. (2022). Truth over identity? Cultural cognition weakly replicates across 23 countries. Journal of Environmental Psychology, 83, 101865.


Rosman, T., Bosnjak, M., Silber, H., Koßmann, J., & Heycke, T. (2022). Open science and public trust in science: Results from two studies. Public Understanding of Science. Advance online publication.


Rutjens, B. T., Sengupta, N., van der Lee, R., van Koningsbruggen, G. M., Martens, J. P., Rabelo, A., & Sutton, R. M. (2022). Science skepticism across 24 countries. Social Psychological and Personality Science, 13(1), 102-117.


Siegrist, M. (2021). Trust and risk perception: A critical review of the literature. Risk Analysis, 41(3), 480-490.


Soveri, A., Karlsson, L. C., Antfolk, J., Lindfelt, M., & Lewandowsky, S. (2021). Unwillingness to engage in behaviors that protect against COVID-19: the role of conspiracy beliefs, trust, and endorsement of complementary and alternative medicine. BMC Public Health, 21, 684.


Sulik, J., Ross, R., & McKay, R. (2020, July). The contingency illusion bias as a driver of science denial. Proceedings of the 42nd Annual Conference of the Cognitive Science Society. Toronto, Canada.


Sulik, J., Deroy, O., Dezecache, G., Newson, M., Zhao, Y., El Zein, M., & Tunçgenç, B. (2021a). Facing the pandemic with trust in science. Humanities and Social Sciences Communications, 8(1), 1-10.


Sulik, J., Efferson, C., & McKay, R. (2021b). Collectively jumping to conclusions: Social information amplifies the tendency to gather insufficient data. Journal of Experimental Psychology: General, 150(11), 2309-2320.


Sulik, J., Niella, T., Origgi, G., & Deroy, O. (2022). Cues to trustworthiness: The expert/layman problem. Manuscript in preparation.


Tunçgenç, B., El Zein, M., Sulik, J., Newson, M., Zhao, Y., Dezecache, G., & Deroy, O. (2021). Social influence matters: We follow pandemic guidelines most when our close circle does. British Journal of Psychology, 112(3), 763-780.


Većkalov, B., Zarzeczna, N., McPhetres, J., van Harreveld, F., & Rutjens, B. T. (2022). Psychological distance to science as a predictor of science skepticism across domains. Personality and Social Psychology Bulletin. Advance online publication.




Justin Sulik is an assistant professor in the interdisciplinary Cognition, Values & Behavior lab at Ludwig Maximilian University of Munich. He works on the psychology of science, including the socio-cognitive drivers of science denial, the role of trust in science in individual decision-making, how non-experts explain the world around them, and the value of cognitive diversity (especially neurodiversity) in collective problem-solving.


The facts, ideas and opinions expressed in this piece are those of the authors; they are not necessarily those of UNESCO or any of its partners and stakeholders and do not commit nor imply any responsibility thereof. The designations employed and the presentation of material throughout this piece do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries.