
Why Should We Ever Trust Science Over Intuition?

  • Writer: Manuela Medeiros
  • Jul 20
  • 14 min read


When a scientist predicts an outcome based on a proven theory, and someone else claims a different outcome based on a strong feeling, who should we believe? In a world filled with opinions and gut instincts, what makes scientific reasoning more trustworthy than intuition?


In a time defined by uncertainty–climate upheavals, pandemics, political instability–society faces a defining dilemma: whom shall we trust to predict our future? On one hand stands the scientist, grounded in theory, data and falsifiable methodology. On the other, the individual who claims to have a powerful premonition–vivid, visceral and deeply felt. This question, deceptively simple, is the nucleus of epistemology: what constitutes knowledge, and why should some forms of it command more authority than others?


This essay argues that we ought to defer to the scientist because of science's epistemic foundations–systematic methodology, empirical scrutiny, and communal accountability. Not because science is infallible, but because those foundations make it uniquely suited to produce reliable predictions. Premonitions, emerging from opaque psychological activity, are insulated from critical evaluation and stand opposed to those foundations. To mistake the immediacy of intuition for the reliability of knowledge is to blur the distinction between feeling and knowing–a distinction on which society depends.


Drawing on the philosophical frameworks of Karl Popper, Thomas Kuhn, Immanuel Kant, and contemporary epistemologists such as Alvin Goldman and Bas van Fraassen, this essay explores the contrast between empirical science and intuitive belief. It argues that scientific prediction earns our deference not merely through its successes but through its justificational structure, and it asks why premonitions remain alluring–and why resisting them is, paradoxically, a form of intellectual humility.


The scientist’s prediction is neither a guess nor an intuition–it is a claim rooted in theory. But what makes a theory “well-attested,” and why should it matter?


A well-attested theory is one that has survived repeated attempts at falsification. As Karl Popper famously argued, science advances not by “proving theories true” but by trying to prove them false and failing. Falsifiability, for Popper, is what distinguishes science from pseudoscience. Newtonian mechanics dominated physics for centuries not because it was invulnerable but because it consistently yielded accurate predictions–until better explanatory models, like Einstein's relativity, emerged.

When scientists predict an event, they do so within this framework: they take a theory that has withstood scrutiny and apply it to a particular case through logical inference. The process is transparent, repeatable, and accountable–the antithesis of premonition, which is personal and non-inferential.


From a Bayesian lens, beliefs should be updated in proportion to the evidence; scientific theories adjust in light of new data. Meteorologists predict storms using models that evolve as weather patterns change; they are bound to make mistakes, and when they do, the models are simply recalibrated. By contrast, a premonition is impervious to revision; it simply is, and its truth or falsehood is revealed only in retrospect. On this basis, premonitions are static and backward-confirmed, whilst scientific predictions are dynamic and forward-facing.
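The Bayesian updating described above can be made concrete with Bayes' rule. The storm scenario and all the numbers below are hypothetical illustrations chosen for this sketch, not figures from the essay:

```python
# A minimal sketch of Bayesian belief updating:
# P(H | E) = P(E | H) * P(H) / P(E), where
# P(E) = P(E | H) * P(H) + P(E | not-H) * P(not-H).
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H | E) from a prior P(H) and two likelihoods."""
    evidence = (p_evidence_given_h * prior
                + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / evidence

# Hypothetical forecast: prior belief that a storm will form is 20%.
belief = 0.20
# A radar signature appears 90% of the time when a storm forms,
# but only 15% of the time when it does not.
belief = bayes_update(belief, 0.90, 0.15)
print(round(belief, 3))  # the prior rises from 0.20 to 0.6
```

The point is structural: each new observation moves the belief by a calculable amount, and a bad forecast shows up as a miscalibrated likelihood that can be corrected. A premonition offers no such quantities to revise.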


Over time, belief systems that evolve are more likely to align with truth, and science, despite its missteps, converges towards accuracy. Premonitions, lacking mechanisms of refinement, do not. This difference matters.


Even so, we are still prone to question the reliability of scientific predictions–and to ask why so many people continue to trust premonitions. The answer lies in the psychology of belief and in the conflation of certainty with clarity.

Premonitions feel so compelling because they are internally vivid. “Just knowing” is an experience in which the intensity of emotion and sensation reigns over knowledge. William James, writing on religious experience, noted the power of subjective certainty. Yet this intensity tells us nothing about truth itself. As David Hume warned, the strength of a belief does not guarantee its correspondence to reality. We may feel certain, and be entirely wrong.


Neuroscience confirms this. The human brain is a pattern-recognition engine, evolved to detect threats–even where none exist. Intuition is guided by heuristics: mental shortcuts that are efficient but flawed. The availability heuristic, for example, makes rare events seem probable when they are emotionally salient. A premonition of a plane crash may be nothing more than the echo of recent news, fuelled by fear and anxiety.


The remembrance of premonitions is fluid–they are recalled when they succeed and forgotten when they fail. This fluidity creates an illusion of accuracy. Someone who dreams of an earthquake and then reads that one occurred thousands of miles away may feel vindicated, ignoring the many dreams that went unfulfilled. This is commonly known as confirmation bias–one of the most pervasive cognitive distortions.

In contrast, scientific predictions are carefully archived, compared to outcomes, and assessed with statistics. A failed prediction is not an inconvenience; it is data. In this way, science honors its own failure. Premonitions do not. 


To defer to scientific prediction is to make a commitment not just to a claim, but to a method–one that embodies transparency, accountability, and intersubjectivity.


Plato's classic definition of knowledge as “justified true belief” remains influential but insufficient. The “Gettier problem” shows that a justified true belief can occur by luck, challenging the justified-true-belief (JTB) account. Contemporary epistemologists like Alvin Goldman have argued for a more robust foundation: the idea that knowledge arises from reliable cognitive processes–a view that reinforces the epistemic virtues of science.


Scientific reasoning is such a process. Double-blind studies, replication, statistical inference, and peer review are mechanisms designed to reduce bias and increase reliability. Even when it errs, science errs in public–and corrects its blunders. Intuition lacks these epistemic safeguards.


Helen Longino and Philip Kitcher have argued that science is not merely an individual activity but a communal one, and that its credibility arises from critical interaction. When a theory is tested across laboratories and cultures, refined through disagreement, and scrutinised by rivals, it earns trust not from authority but from accountability.

Conversely, premonitions are solipsistic. They are not subject to communal critique, nor can they be tested–you cannot test someone's dream. Even when one proves accurate, it offers no clear pathway for understanding the future.


To prefer scientific prediction over premonition is not only an epistemic judgment–it is an ethical one. It determines how we allocate trust, resources, and responsibility.

Imagine two warnings of a pandemic: one based on epidemiological models, the other on a gut feeling. To whom should governments listen? To defer to premonition in such contexts is not only irrational–it is negligent.


The COVID-19 pandemic illustrates this argument. Nations that trusted scientific modelling, such as South Korea and New Zealand, fared better than those that wavered between intuition and political convenience (Fig. 1). Predictions grounded in theory enabled preparation; vague feelings did not.


Fig. 1. Graph illustrating the best and the worst prepared countries for a pandemic; New Zealand and South Korea being amongst the best prepared. 

A healthy democracy relies on the public’s ability to evaluate claims and hold institutions accountable. Scientific predictions are a public dialogue–they can be interrogated and contested. Premonitions are private beliefs. To govern by the latter is to abandon deliberation in favour of mysticism.


This is why scientific prediction should remain our default in law, medicine, economics, and policy–not because it is perfect, but because it is flexible and corrigible. Premonitions do not admit correction; they demand either belief or silence.


Still, some may challenge the privileged status of science on various grounds. This essay considers three common objections:

"But science has been wrong before." True–but so has every human endeavour. The fact that science changes is a virtue, not a flaw. As Thomas Kuhn's work on paradigm shifts illustrates, science progresses through revolutionary rethinking, each shift bringing us closer to explanatory adequacy. Newton may have been wrong, but usefully so. Premonitions, when wrong, teach us nothing. And in that way, science prevails.

Some argue that intuition complements science: by suggesting hypotheses or alerting us to anomalies, it may inspire inquiry. But it cannot justify belief or predict the future. As Einstein once remarked, “Imagination is more important than knowledge”–yet imagination is the beginning, not the end, of understanding.


Certain indigenous or traditional knowledge systems are often dismissed as mere premonition when they are in fact grounded in long-term empirical observation. This criticism is valid: not all non-scientific knowledge is irrational. Such systems often exhibit their own methodological rigour, and they remind us that what matters is a system's justification and accountability, not whether it is Western or indigenous.


Believing is never neutral. Beliefs govern our acts, and how we act has consequences. Thus, our epistemic commitments carry an undeniable moral weight. To trust the prediction of a scientist over a premonition is not just a question of accuracy–it is a question of responsibility. 


The philosopher Miranda Fricker introduced the concept of epistemic injustice: the harm done when someone is wrongfully discredited or trusted on the basis of social bias rather than evidence. In this context, elevating a premonition over a well-substantiated scientific model commits a testimonial injustice against the scientist.


Whether in physical, social, or global contexts, the harm is not only epistemic. Amid climate change, public health crises, and technological risk, our survival depends on our ability to distinguish beliefs that simply “feel right” from beliefs that have earned our trust. Every time a scientific warning is ignored in favour of a gut feeling, society gambles with the truth–and with lives.


The question, then, becomes not just “Who is likely to be correct?” but “Whose guidance are we justified, morally and rationally, in following?” The answer lies in systems that expose themselves to challenge, that allow for error and correction, and that put evidence before ego. Only science does this.


At the core of this essay’s question lies a fundamental issue in epistemology: what counts as knowledge, and why are some forms of knowing privileged over others? To properly evaluate the difference between scientific prediction and intuitive premonition, we must first explore what philosophers have said about belief, justification, and the structures that underwrite reliable claims.


Plato famously defined knowledge as justified true belief. According to this view, for someone to know something, the belief must be true, held, and justified–the intersection of all three conditions (Fig. 2).

Fig. 2. A Venn Diagram illustrating the preliminary analysis of the JTB Theory

Scientific predictions clearly aspire to this triad: they aim to describe reality accurately (truth), are asserted confidently (belief), and are supported by empirical and theoretical evidence (justification). Premonitions, though occasionally true, rarely meet the standard of justification: they are backed by no public evidence, no validation process, and no mechanism for distinguishing insight from error.


Yet Plato's formulation faced challenges. Edmund Gettier upended this account with a brief 1963 paper showing that a belief could be justified and true by accident, and thus not qualify as knowledge. In the later twentieth century, philosophers like Alvin Goldman proposed “reliabilism”: the idea that beliefs are justified if they arise from reliable cognitive processes.


This shift is crucial for our dilemma. Scientific methods—data analysis, controlled experiments, peer review—are reliable processes. They are designed to reduce error and increase the likelihood of truth. A premonition, by contrast, arises from processes (e.g., dreams, gut feelings) that are not reliable predictors of truth. Even when they are occasionally accurate, they are not systematically so—and thus, under “Reliabilism”, do not produce justified belief.



Recent developments in epistemology emphasise “epistemic virtue”: traits like intellectual humility, open-mindedness, and the willingness to revise beliefs in light of evidence. Scientific inquiry institutionalises these virtues. Premonitions, conversely, are typically resistant to challenge; they demand faith, not openness.

A final, relevant insight comes from Immanuel Kant's Critique of Pure Reason, which argues that knowledge requires conceptual structure, not just sensory input (intuition, in his terms). Categories like causality and time provide the framework through which experience becomes comprehensible. Scientific theories likewise offer shared frameworks. A premonition, however vivid, lacks the conceptual scaffolding to connect sensation with rational belief. It may be felt, but it cannot be known–it cannot be proven.


Not all beliefs stem from the same root. Some are fleeting impressions; others are born of years of inquiry. The distinction between a scientific prediction and a person's premonition is not one of sincerity but of epistemic status–the degree to which a belief is supported by reliable grounds and open to public scrutiny. Drawing on Aristotle, Mill, Peirce, and contemporary virtue epistemology to explore this layered nature of belief, this essay argues that the scientist's belief–though fallible–is epistemically superior to that of someone acting on intuition alone.


In Posterior Analytics, Aristotle distinguishes between doxa (opinion), episteme (scientific knowledge), and nous (intellectual intuition). True knowledge (episteme), for Aristotle, must be demonstrable: it must proceed from first principles through systematic derivation. Mere opinion, by contrast, can be persuasive without being reliable.

Scientific prediction clearly aspires to episteme, depending on explanatory chains, formal logic, and empirical justification. Premonition, even when described as “intuitive knowledge”, operates at the level of doxa: it lacks demonstrability, and however empowering it may feel, it leaves no room for replication, correction, or justification.


John Stuart Mill warned against the dangers of “dead dogma”: the uncritical acceptance of truth on the basis of tradition or emotion. Mill recognised that expertise earned through rigorous method deserves qualified deference, but insisted that it face alternatives; he advocated a liberal society in which all ideas may be questioned and contested.

Mill's framework clarifies the distinction between premonition and science. A scientific view can be challenged, and it is grounded in a structure that welcomes such scrutiny. The intuitive prediction, paradoxically, resists critique: it is either believed or not. Premonition behaves as dogma.


The American pragmatist Charles Peirce contrasted four ways of fixing belief: tenacity (believing what feels right), authority (believing what institutions say), a priori reasoning (believing what seems logical), and science (believing on the basis of experience, testing, and correction). Peirce argued that only the scientific method could resolve doubt in a stable and progressive way.


Premonition aligns most closely with Peirce's “method of tenacity”: holding to a belief regardless of evidence. It may be comforting, but it cannot build communal knowledge. A scientist, even when wrong, participates in the only method that justifies belief over time. Thus society should ask not which prediction feels more convincing, but which belongs to a self-correcting and publicly answerable process.

Contemporary virtue epistemologists like Linda Zagzebski and Jason Baehr posit that a belief's reliability depends not solely on method but on the intellectual character of the believer. Traits like curiosity, flexibility, intellectual humility, and courage are epistemic virtues that increase the odds of well-founded belief.


Scientific practice embodies these virtues: it demands doubt, openness to falsification, public input, and the courage to be proven wrong. Premonitions, by their nature, resist the very vulnerability that allows science to change over time. Insulated from doubt and hostile to disconfirmation, they fall short even at the level of intellectual character–and on that level too, the scientist's prediction is ethically and epistemically preferable.


While philosophy traditionally asks what we know and how, another equally important question is: how does belief shape the way we experience the world? 

Phenomenologists Edmund Husserl and Martin Heidegger rejected the idea that knowledge is a detached, analytical process. Instead, they focused on how humans experience the world from a first-person perspective. For Heidegger, we are not observers of the world—we are thrown into it, navigating meaning through tools, signs, and moods.

A premonition, like a dream, vision, or “gut feeling”, may carry immense existential weight for the person experiencing it. It feels embedded in their world–part of their mood, history, and hopes. To dismiss it outright as irrational is to ignore how deeply people’s inner experiences shape their sense of reality.


Nevertheless, phenomenology emphasises intentionality–the mind's directedness toward its objects. Scientific reasoning, while less emotionally charged, attains its own phenomenological clarity: it connects the individual to a wider intersubjective reality in which meaning is communicated, examined, and refined through dialogue. That is where its epistemic advantage lies.


Humans are storytellers. We construct meaning through narrative, not through numbers. 

Premonitions persist not because they are accurate, but because they are compelling. They allow individuals to feel connected to something larger—fate, destiny, mystery. In this way, they function existentially rather than epistemically.


Scientific forecasts, on the other hand, frequently miss this storytelling appeal. They are arid, statistical, detached. But that doesn't render them weaker—it makes them more disciplined. They are freed from the misconceptions that give premonitions a sense of strength. They resist the entirely human urge to assign meaning when there is none. 

In this sense, the scientist’s belief may be emotionally thinner, but it is intellectually sturdier. It opens the door to shared knowledge, not just private meaning. It admits its own fallibility and invites correction, which is the highest expression of epistemic humility.


To defer to scientific prediction is to defer not to authority, but to methods that are refined through centuries of intellectual discourse and collective evaluation. It is to choose clarity over opacity and shared accountability over isolated conviction.


This is not a denial of intuition’s value. Intuition can be creative, personal, even revelatory. However, revelation does not equate to justification. Premonitions may inspire the spirit—but they cannot direct the vessel. In a world on the brink of ecological, technological, and political crises, our survival hinges on placing trust not where the voice is the loudest or the emotions are most intense, but where the logic is most robust. 




Mistaking what feels correct for what is actually correct is a perilous cognitive trick. In the midst of life's unpredictability, we must resist the allure of instinct and instead seek those systems—no matter how flawed—that are willing to be mistaken, to be challenged, and to be improved. 

Respect for science, therefore, is not unconditional belief. It is a matter of ethical and intellectual duty: the bravery to rely on what inspires trust, and the insight to recognize the distinction.


Works cited:

 Goode, Terry M, and John R Wettersten. “How Do We Learn from Argument? Toward an Account of the Logic of Problems.” Canadian Journal of Philosophy, vol. 12, no. 4, 1982, pp. 673–689. JSTOR, www.jstor.org/stable/40231287, https://doi.org/10.2307/40231287.

Talbott, William. “Bayesian Epistemology.” Stanford Encyclopedia of Philosophy, Metaphysics Research Lab, Stanford University, 2016, plato.stanford.edu/entries/epistemology-bayesian/.

 Shermer, Michael. The Believing Brain. Times Books, 2011.


 David, Marian. “The Correspondence Theory of Truth.” Stanford Encyclopedia of Philosophy, 10 May 2002, plato.stanford.edu/entries/truth-correspondence/.

 Gettier Problems | Internet Encyclopedia of Philosophy. iep.utm.edu/gettier/#H5.

 Sosa, Ernest. “Goldman’s Reliabilism and Virtue Epistemology.” Philosophical Topics, vol. 29, no. 1/2, 2001, pp. 383–400. JSTOR, www.jstor.org/stable/43154371, https://doi.org/10.2307/43154371.

 “Infographic: The Countries Best and Worst Prepared for an Epidemic.” Statista Infographics, 9 Dec. 2021, www.statista.com/chart/20629/ability-to-respond-to-an-epidemic-or-pandemic/.

 James, William. The Varieties of Religious Experience. Harvard University Press, 1985 (originally 1902).

 Sinclair, Rob. “Epistemic Injustice | Internet Encyclopedia of Philosophy.” Utm.edu, 2016, iep.utm.edu/epistemic-injustice/.

 Plato. Theaetetus. Trans. M.J. Levett. Hackett, 1992.

 “A Preliminary Analysis of the JTB Analysis and the Gettier Problem.” LUO Xingyu’s Blog, 11 Feb. 2018, luoxingyu.wordpress.com/2018/02/11/a-preliminary-analysis-of-the-jtb-analysis-and-the-gettier-problem/. Accessed 19 July 2025.


 Gettier, Edmund L. “Is Justified True Belief Knowledge?” Analysis, vol. 23, no. 6, 1 June 1963, pp. 121–123.


 Goldman, Alvin, and Bob Beddor. “Reliabilist Epistemology.” Stanford Encyclopedia of Philosophy, Metaphysics Research Lab, Stanford University, 2016, plato.stanford.edu/entries/reliabilism/.


 Turri, John, Mark Alfano, and John Greco. “Virtue Epistemology.” Stanford Encyclopedia of Philosophy, Metaphysics Research Lab, Stanford University, 2019, plato.stanford.edu/entries/epistemology-virtue/.

 Holzek, Maximilian. “A Summary of ‘The Critique of Pure Reason’ – Immanuel Kant.” Medium, 23 July 2023, medium.com/@gryzbeck.maximilian/a-summary-of-the-critique-of-pure-reason-immanuel-kant-d030aebb3b8b.

 Lizka. “Epistemic Status: An Explainer and Some Thoughts.” Effectivealtruism.org, 31 Aug. 2022, forum.effectivealtruism.org/posts/bbtvDJtb6YwwWtJm7/epistemic-status-an-explainer-and-some-thoughts. Accessed 19 July 2025.

 “The Internet Classics Archive | Posterior Analytics by Aristotle.” Classics.mit.edu, classics.mit.edu/Aristotle/posterior.1.i.html.

  Aristotle. Posterior Analytics. Translated by Jonathan Barnes, Oxford University Press, 1994.


 Mill, John Stuart. On Liberty. Penguin Classics, 1985.

 Peirce, Charles S. “The Fixation of Belief.” Popular Science Monthly, vol. 12, 1877, pp. 1–10.

 Zagzebski, Linda. Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge. Cambridge University Press, 1996.

 Baehr, Jason. The Inquiring Mind: On Intellectual Virtues and Virtue Epistemology. Oxford University Press, 2011.

 Descartes, René. Meditations on First Philosophy. Trans. John Cottingham. Cambridge University Press, 1996.

 Heidegger, Martin. Being and Time. Translated by John Macquarrie and Edward Robinson, Harper & Row, 1962.

 Thorburn, Malcolm, and Steven A. Stolz. “Intersubjectivity, Embodiment and Enquiry: A Merleau-Ponty and Husserlian Informed Perspective for Contemporary Educational Contexts.” Educational Philosophy and Theory, Apr. 2025, pp. 1–11, https://doi.org/10.1080/00131857.2025.2486675.
