The New Scientific Enlightenment

There is a massive paradigm shift occurring: beliefs about the nature of scientific inquiry that have held for hundreds of years are being questioned.

As laypeople, we see the symptoms all around us: climatology, economics, medicine, even fundamental physics; these domains (and more) have all become battlegrounds with mounting armies of Ph.D.s and Nobel Prize winners entrenching in opposing camps.  Here’s what’s at stake:

. . .

Scientific Objectivity

In 1972, Kahneman and Tversky launched the study of human cognitive bias, work that later won Kahneman the Nobel Prize.  Even a cursory reading of this now-vast literature should make every logically minded scientist deeply skeptical of their own work.

A few scientists do take bias seriously (see, e.g., Overcoming Bias and Less Wrong).  Yet, nearly 40 years later, it might be fair to say that its impact on science as a whole has been limited to improving clinical trials and spawning behavioral economics.

In 2008, Farhad Manjoo poignantly illustrated that our indifference to the pervasiveness of cognitive bias, combined with digital-age network effects, has led us to a crisis of truth; none of us, from Dittohead to Nobel Laureate, is able (practically speaking) to distinguish objective, scientific truth from carefully crafted stories.

What’s worse is that we continually craft these stories for ourselves as we go through life… it’s the basis for rational, conscious thought.

What’s even worse is how individual biases combine with institutional scientific biases (e.g., publication bias) to unintentionally corrupt the scientific process.  It has gotten to the point where, in 2010, headlines like these have seemingly little impact on the scientific community, and go unnoticed by policy makers and the general population alike:

• Million-dollar industry payments to doctors go undisclosed
• Science fails to face the shortcomings of statistics
• We’re so good at medical studies that most of them are wrong

“New Kinds” of Science

“[A]mong the great majority of active scientists… [reductionism] is accepted without question…. [T]he relationship between the system and its parts is intellectually a one-way street.” (Nobel Physicist, P.W. Anderson, 1972)

“if the stars in the universe were fractally distributed it would not be necessary to rely on the Big Bang theory” (new math applied to a cosmological paradox by Mandelbrot in 1974)

“The revolutionary new discoveries [complexity] researchers have made… could change the face of every science from biology to cosmology to economics.” (Waldrop, 1993)

“[U]nexpected results force a whole new way of looking at the operation of our universe…. [T]he ultimate scope and limitations of mathematics, the possibility of a truly fundamental theory of physics, the interplay between free will and determinism, and the character of intelligence in the universe.” (summary of A New Kind of Science, 2002).

“Reductionism is so rigid in its hopes to ‘entail’ everything in the unfolding of the universe…. I think reductionism is incomplete.” (Stuart Kauffman, 2009)

The “newness” that underlies each of these claims is not exactly new; it has been discussed since at least the time of Aristotle.  The idea is that the whole is greater than the sum of its parts.

And that if the scientific method focuses solely on reductionism (dividing the whole into easily understood parts), it will blind us to true understanding.

What’s new is the scientific approach to studying the bottom-up dynamic known as emergence.

The Nature of Scientific Proof

What counts as scientific evidence is an ever-evolving target, and an issue that can make or break careers.  But when human lives are on the line, scientific proof is no mere academic matter.  Inasmuch as science is more than self-indulgent theory, challenges to the scientific rules of evidence must be taken very seriously.

All proofs assume some specific form of consistent logic.

All logics are either incomplete or inconsistent.

Therefore, no proof is complete.

Ever since Karl Popper, scientists have been obsessed with syllogisms like this one (a.k.a. deductive reasoning) and its handmaiden, falsifiability.

This obsession has come at the expense of completeness, though, as there are dozens of forms of logic that are equally consistent (like induction and abduction) and that rely on and promote creativity.

Statistical methods have played an increasing role in all of science since the advent of the computer.  Yet, their well-known limitations have been summarily ignored by science.  In the aforementioned excoriation of statistics it was observed that “The difference between ‘significant’ and ‘not significant’ is not itself statistically significant”.
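That quoted line (usually credited to statisticians Andrew Gelman and Hal Stern) is easy to see with a toy calculation.  The sketch below is my own, with illustrative numbers: two studies with the same standard error, one just above the significance threshold and one below it, whose difference is itself nowhere near significant.

```python
import math

def z_score(effect, se):
    """Standardized effect: the estimate divided by its standard error."""
    return effect / se

# Two hypothetical studies with the same standard error (illustrative numbers).
effect_a, se_a = 25.0, 10.0   # z = 2.5 -> "significant" (|z| > 1.96)
effect_b, se_b = 10.0, 10.0   # z = 1.0 -> "not significant"

# The difference between the two estimates, with its own standard error
# (assuming the two studies are independent).
diff = effect_a - effect_b
se_diff = math.sqrt(se_a**2 + se_b**2)

print(z_score(effect_a, se_a))           # 2.5
print(z_score(effect_b, se_b))           # 1.0
print(round(z_score(diff, se_diff), 2))  # 1.06 -- the gap itself is not significant
```

So declaring study A a success and study B a failure draws a sharp line through data that cannot support one.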

In The Black Swan, the author (Taleb) pulls even fewer punches about what these shortcomings mean for scientific pursuit.  Ultimately he concludes that the combination of “unknown unknowns” with cognitive biases and increasing universal complexity means we are doomed to an eternally widening gap between what we know about the world and what we can know.  Even worse, we will become increasingly ignorant of our ignorance — by construction(!) — leading to dark times ahead.

Some scientists claim that the same epistemological conundrum is leading in the exact opposite direction: towards enlightenment.  Ironically(?), a modern convergence of science and spirituality is occurring, epitomized by integral theory beginning in the late 1960s but branching into related forms at an accelerating pace.  See, for example, a popular documentary, the rise of nonduality, and a new scientific conference.

Life, the Universe and Everything

For centuries, mathematicians and philosophers believed that everything true about the universe could, if not in practice then at least in principle, be written down and codified.  Then, in 1931, Kurt Gödel proved an incompleteness theorem that dashed even that hope.

It’s taken time for the gravity of Gödel’s discovery to sink into the “grey matter” surrounding fundamental physics, but it’s finally starting to happen: “Some people will be very disappointed if there is not an ultimate theory…. I used to belong to that camp, but I have changed my mind.” (Stephen Hawking, Gödel and the end of physics, 2002)

The reluctance of the scientific community to accept the incontrovertible could be seen as an example of science “advancing one funeral at a time” (a quip attributed to Nobel physicist Max Planck).  But it turns out the core issue is a paradox that is actually thousands of years old.

The issue is self-reference and it leads to many forms of paradox and uncertainty, including Heisenberg’s famous uncertainty principle.  The implications of self-reference for physics were summed up recently as follows: “After decades of debate, disputes over the mathematical rules governing reality remain unresolved” (ScienceNews, 2010)

Self-reference not only gives fits to quantum physicists, it calls into question one of the most basic premises of science, “the assumption that our world operates according to causal laws.” (Science’s First Mistake, 2010)

And it’s not just theoretical: one of the largest scientific industries in the world is desperately trying to fight the real-world effects of self-reference.

It should come as no surprise then that, in the void, theories that were once laughable to mainstream science are getting serious attention.

One such theory, biocentrism, challenges the primacy of physics in the pantheon of science, and replaces it with biology.  Moreover, it picks up the thread of observer-dependence laid down by quantum physics and weaves it into a narrative of a conscious universe, reminiscent of integral theory and its ilk.

If, as Einstein noted, “the belief in an external world independent of the perceiving subject is the basis of all natural science”; and if, as Popper proposed, falsifiability is the criterion demarcating science from non-science; then…

We must take biocentrism seriously, because unlike every other “new age” theory accused of over-fitting the data, biocentrism makes falsifiable predictions that will likely be settled soon.

Plus, it may be the case that biocentrist theory explains phenomena that are often observed but not politically correct to talk about.  Says one noted scientist, “What Lanza says in this book is not new. Then why does Robert have to say it at all? It is because we, the physicists, do NOT say it––or if we do say it, we only whisper it, and in private––furiously blushing as we mouth the words.”

The Truth Wears Off

As if on cue, in December of 2010, The New Yorker published a tour de force of scientific journalism by Jonah Lehrer, subtitled, “Is there something wrong with the scientific method?”  Here are quotes from two well-respected scientists in the article:

“Whenever I start talking about this, scientists get very nervous” (Jonathan Schooler)

“[Michael] Jennions admits that his findings are troubling, but expresses a reluctance to talk about them publicly. ‘This is a very sensitive issue for scientists’”

What they are talking about is a mysterious phenomenon that resembles the bizarre observer-dependent causality of quantum mechanics, but at the scale of human population studies.  Referred to alternately as the “decline effect” and “cosmic habituation”, the effect has been shown to be both repeatable and predictable.  In a nutshell, the decline effect is as follows:

Many well-established, scientifically validated “facts” seem to decline in their validity over time as they are retested.

This is despite collective best efforts to recreate the exact conditions of initial experiments and account for red herrings like flawed statistical application, regression to the mean, publication bias, reporting bias, confirmation bias, survivorship bias, “significance chasing,” financial and other conflicts of interest, data collection issues, improper experimental controls, a priori faulty experimental design, and so on.

And yes, if you were paying attention, you would notice that the decline effect has a strong element of self-referentiality….

. . .

Not the Wind, Not the Flag

When reading about the decline effect, biocentrism and nonduality, it’s difficult for us scientifically trained folk to turn down our highly-attuned falsification mechanisms long enough to allow the synapses to connect the dots.  But the dots do seem to be connecting themselves lately.

The year after receiving a Master of Science in Computer Science / Artificial Intelligence, I read a book that I still recommend to people as the most important book I’ve read to date.  It begins with the following claims:

“[T]he mind is inherently embodied, reason is shaped by the body, and since most thought is unconscious, the mind cannot be known simply by self-reflection.”

“What universal aspects of reason there are arise from the commonalities of our bodies and brains and the environments we inhabit…. [R]eason is not entirely universal.”

“[W]e have no absolute freedom in Kant’s sense, no full autonomy.”

“The utilitarian person… does not exist…. People seldom engage in a form of economic reason that could maximize utility.”

“The phenomenological person, who through… introspection alone can discover everything there is to know about the mind and the nature of experience, is a fiction.”

“There is no poststructuralist person — no completely decentered subject for whom all meaning is arbitrary, totally relative, and purely historically contingent, unconstrained by body and brain.”

“[T]he classical correspondence theory of truth is false…. [T]hat statements are true or false objectively, depending on how they map directly onto the world….”

“There is no such thing as a computational person…. The neural structures of our brains produce conceptual systems and linguistic structures that cannot be adequately accounted for by formal systems that only manipulate symbols.”

“[T]here is no Chomskyan person, for whom language is pure syntax…. [C]entral aspects of language arise evolutionarily from sensory, motor, and other neural systems that are present in “lower” animals.”

At the time, these statements seemed scientifically blasphemous, but I suspect they would seem mundane and obvious to a recent MS graduate.  Though I sensed the inherent truth being spoken, it’s taken me a while to truly come to terms with the implications for my own understanding of the world.

As best as I can articulate, the questions being raised by “all of this” are as follows:

  • Are there any universal truths at all?
  • When we observe the world scientifically, are we studying the universe as it truly is, or are we studying the nature of our minds, the structure of consciousness?
  • Can scientific breakthrough occur without some form of completely irrational faith or spirituality?

These are questions that I still do not know the answer to, but I am getting more comfortable with the not knowing.

What I like best about the practice of embracing “I don’t know” is that it allows us to step back and get clarity on questions like, “Are there any universal truths?”  And when we do step back, we realize that the question is not about “truth” or “universal” but rather “are there”.

. . .

“Do numbers exist?”

Which brings us full circle to the question of independence.

When ScienceNews ran the article chastising scientists for rampant misapplication of statistical methods, the word “independence” was used only once (in the context of replicating an observed finding).

This is a curious fact considering that the real elephant in the room, statistically speaking, is always whether our observations are truly independent of one another.  If they are not, then the statistical methods that scientists use violate a fundamental assumption they require.

Of course, we all know that there is no such thing as true independence, don’t we?  The butterfly effect and so on.  But we use statistical theory and make probabilistic arguments all the time, both in science and in life.  Without it, we’d be paralyzed to the point of inaction.
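How badly things can go wrong is easy to simulate.  The sketch below is my own illustration (the AR(1) correlation model and all numbers are arbitrary choices): it applies the textbook standard-error formula, which assumes independent observations, to correlated data, and the nominal 5% false-positive rate balloons.

```python
import math
import random

def false_positive_rate(rho, n=50, trials=4000, seed=0):
    """Fraction of trials in which a naive z-test (which assumes independent
    observations) rejects a true null hypothesis, for AR(1) data with
    lag-one autocorrelation rho.  The true mean is always zero."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        xs = []
        x = rng.gauss(0, 1)  # stationary start
        for _ in range(n):
            # Each observation is correlated with its predecessor by rho.
            x = rho * x + math.sqrt(1 - rho**2) * rng.gauss(0, 1)
            xs.append(x)
        mean = sum(xs) / n
        var = sum((v - mean) ** 2 for v in xs) / (n - 1)
        se = math.sqrt(var / n)      # this formula ASSUMES independence
        if abs(mean / se) > 1.96:    # nominal 5% two-sided test
            rejections += 1
    return rejections / trials

print(false_positive_rate(rho=0.0))  # near the nominal 0.05
print(false_positive_rate(rho=0.9))  # far above 0.05: wildly miscalibrated
```

The test itself never changes; only the hidden dependence between observations does, and that alone is enough to manufacture “significant” findings out of noise.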

As scientists though, we’ve got to admit, it bothers us (doesn’t it?) that one of the fundamental assumptions of our daily activity is false.  That we lie to ourselves every day so that we can get on with our work.

The biggest elephant of all, though, is the question of whether the world exists independently of our observing it.  You and Descartes may be sure that it does, but I still don’t know.

There’s a question that was posed to me in my freshman calculus class in college that I thought I knew the answer to, but which has nagged at me ever since: “Do numbers exist, or are they constructs of the human mind?”

To this day, I ask the same question of luminaries at scientific gatherings, and I have yet to get a definitive answer.  Sometimes I ask it in a slightly different way: “Is math inherently part of the structure of the universe?”  I have yet to get a satisfying answer.

. . .

When the Levee Breaks

None of this existential teeth gnashing is new in science.  Thomas Kuhn coined the phrase paradigm shift to describe revolutions of thought such as the transition from Ptolemaic to Copernican cosmology in the 16th century, or the shift from classical to quantum mechanics last century.

But what is new is the number of paradigm shifts occurring within the span of a single human lifetime.  In other words, we are at a crossroads in history at which multiple pillars of the existing scientific method are cracking at once.

What exactly does this mean?

I don’t know.

But the levee is about to break.

And in the spirit of scientific progress I will give you my own falsifiable prediction (or is it resolution?) as the new year approaches:

Something big is happening.  Something positive.  We can all feel it, can’t we?  It is hard to define, but we will all recognize it in hindsight.  And when we do, historians will agree that the levee broke in 2012….

Related posts:

  1. Why Falsifiability is Insufficient for Scientific Reasoning
  2. The Process
  3. Hive Mindstein
  4. Non-Dualism
  5. Science 2.0

  • Sam Chauhan

    Great job Rafe!

  • Anonymous

    Unsurprisingly, I am very skeptical of this argument. I interpret the story you tell as proving that the scientific method _works_. There is nothing unexpected here:

    (1) We know a priori that most hypotheses are wrong.

    (2) We know a priori that there is a base rate of errors that will lead to spurious positive results: experimental design flaws, measurement errors, statistical errors, confirmation bias, and publication bias, to name the most obvious.

    Therefore, we can calculate from Bayes’ Rule that most positive results will actually be wrong. The exact proportion depends on your priors for the two variables. For the sake of argument, say that 95% of hypotheses are wrong and that testing a wrong hypothesis yields a spurious positive result 10% of the time. For completeness, assume that testing a true hypothesis yields a false negative 10% of the time.

    Bayes’ Rule says that in this very reasonable scenario, 68% of all positive results are actually wrong, even though experiments are “90% accurate” in the colloquial sense.
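    The 68% figure checks out. Here is a minimal sketch of the calculation (the function name and structure are mine; the numbers are the scenario above):

```python
def share_of_positives_that_are_wrong(p_true, fp_rate, fn_rate):
    """Bayes' Rule: the fraction of positive results that come from
    false hypotheses, given the base rate of true hypotheses and the
    false-positive / false-negative rates of experiments."""
    p_false = 1 - p_true
    true_positives = p_true * (1 - fn_rate)   # true hypotheses, correctly detected
    false_positives = p_false * fp_rate       # false hypotheses, spuriously "confirmed"
    return false_positives / (true_positives + false_positives)

# The scenario above: 95% of hypotheses wrong, 10% error rates both ways.
print(round(share_of_positives_that_are_wrong(0.05, 0.10, 0.10), 2))  # 0.68
```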

    If the scientific method works, however, it should thus eventually conclude that most positive results are wrong. Which, as your post illustrates, is what happens.

    No need to invoke observer effects, Gödel, or spirituality. Just good old-fashioned probability at work here.

  • Alex

    I think this is the key question:
    “When we observe the world scientifically, are we studying the universe as it truly is, or are we studying the nature of our minds, the structure of consciousness?”

    Here are some related ponderings:
    1. - How’s the Internet changing us? We are losing feeling and obsessing with data, structures, info. But “losing” and “obsessing” are opinionated observations. Restructure of consciousness? What is data? What is music? onto #2

    “Weed through the slang of pompous modern man, past every status phrase coined for new roles, back past the jargon forged for every plan that sought to raise mankind to higher goals, then back beyond the crafty printing press that made a civil language of each tongue, yes, back when words could fluidly express a hero’s tale, when praise and prayer were sung, then back into the prehistoric slime to find that beastly grunt or frightened groan and like a riddle trace a word through time. It’s age-old journey’s not unlike our own, for with each subtle change you’ll find unfurled within that word the history of the world”

    “Will we ever realize the possibility that “truth” has already arrived on earth and that we don’t know this because we keep looking for it with our eyes rather than feeling for it with our hearts?”

    On a more personal note. Here’s a little rhyme:

    A Decent into the Male-Störm
    going through the Poe-tree in motions
    counterclockwiseguy intuitive emotions

    “He is demanding of art a task far more difficult, that is, to undo the alienation of the corporeal sensorium, to restore the instinctual power of the human bodily senses for the sake of humanity’s self-preservation, and to do this, not by a…voiding the new technologies, but by passing through them.”

    Sensory Alienation… Alien Nation. You feel me? Word play… different perspectives… flashing the light at the elephant from a multitude of perspectives faster and faster until he turns into a silent film.

    The ratio of senses defines consciousness, but it is transcribed in our individual language paradigms in our minds. I think we’re all becoming aware faster and faster, but we’re leaving a trail of geological, biological, cultural, etc… evolution behind.

    And as a true scientist, I’ll leave you with a bitter taste in your mouth:
    “A family of trees wanted, to be haunted”. “Fiat ars-pereat mundus (Let art be created though the world shall perish) had become Fascism’s creed and influenced its actions.” or hopefully just an army of hipsters… or musicians… or sirens. You decide how to “pay” attention.

    “I’m extremely excited about something that’s only cool to me.”

    Again, on a personal note:

    Process aMuse

    take no prisoner dilemma,
    the situation is dire,
    burn all bridges, sex on fire,
    Kings lie naked undercover admire,
    pillage village gun for higher
    order narrative streetcar desire
    slam the brakes, ID expired
    personality trashed, Killers Bones in a pile,
    home runaway. Tom Cruise? McGuire?
    Hello? Goodbye! spell check required
    hate-stoned love - blonde Barby wire
    Eye glazed like starecase. Black ice tripwired
    She gets my back. Disappointed inspire

  • Rafe Furst

    That’s the beauty of falsifiable predictions, you don’t have to be convinced, you just have to wait until the truth claim can be settled….

  • Anonymous

    “It is hard to define, but we will all recognize it in hindsight. And when we do, historians will agree that the levee broke in 2012.”

    I do not consider this the paragon of a falsifiable claim. This is typical of the vague, open-to-interpretation statements that carnival fortune tellers make. If we were to bet, what would the settlement procedure be?

  • Rafe Furst

    You name it and I will probably agree.

  • Anonymous

    OK. In January of 2020, we will ask the professor who has most recently taught “Introduction to Philosophy of Science” (or its successor course) at Stanford the following question:

    In what year did the theory of biocentrism, as originally formulated by Lanza and refined by others, begin to achieve mainstream acceptance among physical scientists as an essential part of the scientific method?

    If he says anything other than 2012, you lose the bet.

  • Anonymous

    OK. In January of 2020, we will ask the professor who most recently taught “Introduction to the Philosophy of Science” at Stanford (or its successor course as specified in the course catalog or by the Chair of the Philosophy Department) the following question:

    In what year did the theory of biocentrism, as originally formulated by Lanza and refined by others, begin gaining mainstream acceptance among physical scientists as an essential part of the scientific method?

    If he says 2011, 2012, or 2013, you win. If he says another year or something to the effect that biocentrism never achieved such mainstream acceptance, I win.

  • Rafe Furst

    My claim was “something big” in 2012, but I would accept the multi-year on biocentrism as you propose if you prefer that. What’s the wager? Let’s make it big…

  • Anonymous

    As I noted, my objection to “something big” is vagueness. Biocentrism seemed the most salient specific point in your post. I’m glad you think the 3 year spread is a fair compromise.

    I’ll put up $1000. I propose that you syndicate this out to the Tiltboys if you want more action.
