The New Scientific Enlightenment

There is a massive paradigm shift occurring: beliefs about the nature of scientific inquiry that have held for hundreds of years are being questioned.

As laypeople, we see the symptoms all around us: climatology, economics, medicine, even fundamental physics; these domains (and more) have all become battlegrounds with mounting armies of Ph.D.s and Nobel Prize winners entrenching in opposing camps.  Here’s what’s at stake:

. . .

Scientific Objectivity

In 1972, Kahneman and Tversky launched the study of human cognitive bias, work that later won Kahneman the Nobel Prize.  Even a cursory reading of this now-vast literature should make each and every logically minded scientist deeply skeptical of their own work.

A few scientists do take bias seriously (cf. Overcoming Bias and Less Wrong).  Yet, nearly 40 years later, it might be fair to say that the literature’s impact on science as a whole has been limited to improving clinical trials and spawning behavioral economics.

In 2008, Farhad Manjoo poignantly illustrated that our indifference to the pervasiveness of cognitive bias, combined with digital-age network effects, has led us to a crisis of truth: none of us, from Dittohead to Nobel Laureate, is able (practically speaking) to distinguish objective, scientific truth from carefully crafted stories.

What’s worse is that we continually craft these stories for ourselves as we go through life… it’s the basis for rational, conscious thought.

What’s even worse is how individual biases combine with scientific institutional biases (e.g. publication bias) to unintentionally corrupt the scientific process.  It has gotten to the point where, in 2010, headlines like these seemingly have little impact on the scientific community, and go unnoticed by policy makers and the general population alike:

• Million-dollar industry payments to doctors go undisclosed
• Science fails to face the shortcomings of statistics
• We’re so good at medical studies that most of them are wrong

“New Kinds” of Science

“[A]mong the great majority of active scientists… [reductionism] is accepted without question…. [T]he relationship between the system and its parts is intellectually a one-way street.” (Nobel Physicist, P.W. Anderson, 1972)

“if the stars in the universe were fractally distributed it would not be necessary to rely on the Big Bang theory” (new math applied to a cosmological paradox by Mandelbrot in 1974)

“The revolutionary new discoveries [complexity] researchers have made… could change the face of every science from biology to cosmology to economics.” (Waldrop, 1993)

“[U]nexpected results force a whole new way of looking at the operation of our universe…. [T]he ultimate scope and limitations of mathematics, the possibility of a truly fundamental theory of physics, the interplay between free will and determinism, and the character of intelligence in the universe.” (summary of A New Kind of Science, 2002).

“Reductionism is so rigid in its hopes to ‘entail’ everything in the unfolding of the universe…. I think reductionism is incomplete.” (Stuart Kauffman, 2009)

The “newness” that underlies each of these claims is not exactly new; it has been discussed since at least the time of Aristotle.  The idea is that the whole is greater than the sum of its parts.

And if the scientific method focuses solely on reductionism, dividing the whole into easily understandable parts, it will blind us to true understanding.

What’s new is the scientific approach to studying the bottom-up dynamic known as emergence.

The Nature of Scientific Proof

What counts as scientific evidence is an always-evolving target, and an issue that can make or break careers.  And when human lives are on the line, it is far more than an academic question.  Inasmuch as science is more than self-indulgent theory, challenges to the scientific rules of evidence must be taken very seriously.

All proofs assume some specific form of consistent logic.

All logics are either incomplete or inconsistent.

Therefore, no proof is complete.

Ever since Karl Popper, scientists have been obsessed with syllogisms like this one (a.k.a. deductive reasoning) and its handmaiden, falsifiability.

This obsession has come at the expense of completeness, though, as there are dozens of forms of logic (like induction and abduction) which are equally consistent, but which rely on and promote creativity.

Statistical methods have played an increasing role in all of science since the advent of the computer.  Yet, their well-known limitations have been summarily ignored by science.  In the aforementioned excoriation of statistics it was observed that “The difference between ‘significant’ and ‘not significant’ is not itself statistically significant”.
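The quoted observation is easy to demonstrate with a toy calculation.  The numbers below are entirely hypothetical, a minimal sketch of the point using two made-up studies with normally distributed effect estimates:

```python
from math import sqrt
from statistics import NormalDist

def p_value(effect, se):
    """Two-sided p-value for a normally distributed estimate, null effect = 0."""
    z = effect / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Two hypothetical studies of the same effect:
p_a = p_value(25, 10)  # z = 2.5, comfortably "significant"
p_b = p_value(10, 10)  # z = 1.0, "not significant"

# Yet the difference between the two estimates is itself not significant:
p_diff = p_value(25 - 10, sqrt(10**2 + 10**2))

print(f"study A: p = {p_a:.3f}, study B: p = {p_b:.3f}, difference: p = {p_diff:.3f}")
```

Study A clears the p < 0.05 bar and study B does not, yet the gap between them could easily be noise.  Treating “significant” and “not significant” as categorically different findings is exactly the trap being described.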

In The Black Swan, the author (Taleb) pulls even fewer punches about what these shortcomings mean for scientific pursuit.  Ultimately he concludes that the combination of “unknown unknowns” with cognitive biases and increasing universal complexity means we are doomed to an eternally widening gap between what we know about the world and what we can know.  Even worse, we will become increasingly ignorant of our ignorance, by construction(!), leading to dark times ahead.

Some scientists claim that the same epistemological conundrum is leading in the exact opposite direction: towards enlightenment.  Ironically(?) a modern convergence of science and spirituality is occurring, as epitomized by integral theory beginning in the late 1960s, but branching into related forms at an accelerated pace.  See, for example, a popular documentary, the rise of nonduality, and a new scientific conference.

Life, the Universe and Everything

For centuries, mathematicians and philosophers believed that everything true about the universe could — though not in practice, at least theoretically speaking — be written down and codified.  But then in 1931, Kurt Gödel proved an incompleteness theorem that dashed this hope even in principle.

It has taken time for the gravity of Gödel’s discovery to sink into the “grey matter” surrounding fundamental physics, but it’s finally starting to happen: “Some people will be very disappointed if there is not an ultimate theory…. I used to belong to that camp, but I have changed my mind.” (Stephen Hawking, Gödel and the end of physics, 2002)

The reticence of the scientific community to accept the incontrovertible could be seen as an example of science “advancing one funeral at a time” (a quip attributed to Nobel physicist Max Planck).  But it turns out the core issue is a paradox that is actually thousands of years old.

The issue is self-reference and it leads to many forms of paradox and uncertainty, including Heisenberg’s famous uncertainty principle.  The implications of self-reference for physics were summed up recently as follows: “After decades of debate, disputes over the mathematical rules governing reality remain unresolved” (ScienceNews, 2010)

Self-reference not only gives fits to quantum physicists, it calls into question one of the most basic premises of science, “the assumption that our world operates according to causal laws.” (Science’s First Mistake, 2010)

And it’s not just theoretical: one of the largest scientific industries in the world is desperately trying to fight the real-world effects of self-reference.

It should come as no surprise then that, in the void, theories that were once laughable to mainstream science are getting serious attention.

One such theory, biocentrism, challenges the primacy of physics in the pantheon of science, and replaces it with biology.  Moreover, it picks up the thread of observer-dependence laid down by quantum physics and weaves it into a narrative of a conscious universe, reminiscent of integral theory and its ilk.

If, as Einstein noted, “the belief in an external world independent of the perceiving subject is the basis of all natural science”; and if, as Popper proposed, falsifiability is the criterion demarcating science from non-science; then…

We must take biocentrism seriously. Because unlike every other “new age” theory accused of over-fitting the data, biocentrism makes falsifiable predictions which will likely be settled soon.

Plus, it may be the case that biocentrist theory explains phenomena that are often observed but not politically correct to talk about.  Says one noted scientist, “What Lanza says in this book is not new. Then why does Robert have to say it at all? It is because we, the physicists, do NOT say it––or if we do say it, we only whisper it, and in private––furiously blushing as we mouth the words.”

The Truth Wears Off

As if on cue, in December of 2010, The New Yorker published a tour de force of scientific journalism by Jonah Lehrer, subtitled, “Is there something wrong with the scientific method?”  Here are quotes from two well-respected scientists in the article:

“Whenever I start talking about this, scientists get very nervous” (Jonathan Schooler)

“[Michael] Jennions admits that his findings are troubling, but expresses a reluctance to talk about them publicly. ‘This is a very sensitive issue for scientists’”

What they are talking about is a mysterious phenomenon that resembles the bizarre observer-dependent causality of quantum mechanics, but at the scale of human population studies.  Referred to alternately as the “decline effect” and “cosmic habituation”, the effect has been shown to be both repeatable and predictable.  In a nutshell, the decline effect is as follows:

Many well-established, scientifically validated “facts” seem to decline in their validity over time as they are retested.

This is despite collective best efforts to recreate the exact conditions of initial experiments and account for red herrings like flawed statistical application, regression to the mean, publication bias, reporting bias, confirmation bias, survivorship bias, “significance chasing,” financial and other conflicts of interest, data collection issues, improper experimental controls, a priori faulty experimental design, and so on.
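To see why one of those red herrings, regression to the mean combined with publication bias (sometimes called the “winner’s curse”), has to be ruled out at all, here is a hypothetical simulation.  Every parameter is made up for illustration: each study measures the same modest true effect, but only the lucky, inflated estimates clear the publication bar, so exact replications appear to “decline” back toward the truth:

```python
import random
from statistics import mean

random.seed(42)

TRUE_EFFECT = 0.3   # the same modest effect underlies every study
NOISE_SD = 0.5      # sampling noise per study
THRESHOLD = 1.0     # stylized "significance" cutoff on the observed effect

# Run many initial studies; only the lucky ones clear the publication bar.
initial = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(10_000)]
published = [e for e in initial if e > THRESHOLD]

# Exact replications of the published studies, with fresh noise.
replications = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in published]

print(f"published mean effect:   {mean(published):.2f}")
print(f"replication mean effect: {mean(replications):.2f}")  # near TRUE_EFFECT
```

The point of the article, of course, is that the decline effect seems to persist even after mechanisms like this one have been accounted for.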

And yes, if you were paying attention, you would notice that the decline effect has a strong element of self-referentiality….

. . .

Not the Wind, Not the Flag

When reading about the decline effect, biocentrism and nonduality, it’s difficult for us scientifically trained folk to turn down our highly-attuned falsification mechanisms long enough to allow the synapses to connect the dots.  But the dots do seem to be connecting themselves lately.

The year after receiving a Master of Science in Computer Science / Artificial Intelligence, I read a book that I still recommend to people as the most important book I’ve read to date.  It begins with the following claims:

“[T]he mind is inherently embodied, reason is shaped by the body, and since most thought is unconscious, the mind cannot be known simply by self-reflection.”

“What universal aspects of reason there are arise from the commonalities of our bodies and brains and the environments we inhabit…. [R]eason is not entirely universal.”

“[W]e have no absolute freedom in Kant’s sense, no full autonomy.”

“The utilitarian person… does not exist…. People seldom engage in a form of economic reason that could maximize utility.”

“The phenomenological person, who through… introspection alone can discover everything there is to know about the mind and the nature of experience, is a fiction.”

“There is no poststructuralist person–no completely decentered subject for whom all meaning is arbitrary, totally relative, and purely historically contingent, unconstrained by body and brain.”

“[T]he classical correspondence theory of truth is false…. [T]hat statements are true or false objectively, depending on how they map directly onto the world….”

“There is no such thing as a computational person…. The neural structures of our brains produce conceptual systems and linguistic structures that cannot be adequately accounted for by formal systems that only manipulate symbols.”

“[T]here is no Chomskyan person, for whom language is pure syntax…. [C]entral aspects of language arise evolutionarily from sensory, motor, and other neural systems that are present in “lower” animals.”

At the time, these statements seemed scientifically blasphemous, but I suspect they would seem mundane and obvious to a recent MS graduate.  Though I sensed the inherent truth being spoken, it’s taken me a while to truly come to terms with the implications for my own understanding of the world.

As best as I can articulate, the questions being raised by “all of this” are as follows:

  • Are there any universal truths at all?
  • When we observe the world scientifically, are we studying the universe as it truly is, or are we studying the nature of our minds, the structure of consciousness?
  • Can scientific breakthrough occur without some form of completely irrational faith or spirituality?

These are questions that I still do not know the answer to, but I am getting more comfortable with the not knowing.

What I like best about the practice of embracing “I don’t know” is that it allows us to step back and get clarity on questions like, “Are there any universal truths?”  And when we do step back, we realize that the question is not about “truth” or “universal” but rather “are there”.

. . .

“Do numbers exist?”

Which brings us full circle to the question of independence.

When ScienceNews ran the article chastising scientists for rampant misapplication of statistical methods, the word “independence” was used only once (in the context of replicating an observed finding).

This is a curious fact considering that the real elephant in the room, statistically speaking, is always whether correlated observations are independent of one another.  If they are not, then the statistical methods that scientists use violate the fundamental assumption required of them.

Of course, we all know that there is no such thing as true independence, don’t we?  The butterfly effect and so on.  But we use statistical theory and make probabilistic arguments all the time, both in science and in life.  Without it, we’d be paralyzed to the point of inaction.
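The consequences of that violated assumption are easy to demonstrate.  The sketch below (with made-up parameters) feeds correlated observations to a test that assumes independence; the nominal 5% false-positive rate balloons:

```python
import random
from math import sqrt
from statistics import mean, stdev

random.seed(0)

def naive_z(sample):
    """z statistic for 'mean = 0', *assuming* independent observations."""
    return mean(sample) / (stdev(sample) / sqrt(len(sample)))

def ar1_sample(n, rho=0.9):
    """Correlated observations: each one drags along most of the previous."""
    x, out = 0.0, []
    for _ in range(n):
        x = rho * x + random.gauss(0, 1)
        out.append(x)
    return out

trials = 2000
false_positives = sum(abs(naive_z(ar1_sample(100))) > 1.96 for _ in range(trials))
print(f"nominal 5% test rejects {false_positives / trials:.0%} of the time")
```

The data here have a true mean of zero, so every rejection is a false positive; correlation alone pushes the error rate far past the advertised 5%.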

As scientists though, we’ve got to admit, it bothers us (doesn’t it?) that one of the fundamental assumptions of our daily activity is false.  That we lie to ourselves every day so that we can get on with our work.

The biggest elephant of all, though, is the question of whether the world exists independent of our observing it.  You and Descartes may be sure that it does, but I still don’t know.

There’s a question that was posed to me in my freshman calculus class in college, that I thought I knew the answer to, but which has nagged at me ever since: “Do numbers exist, or are they constructs of the human mind?”

To this day, I ask the same question of luminaries at scientific gatherings, and I have yet to get a definitive answer.  Sometimes I ask it in a slightly different way: “Is math inherently part of the structure of the universe?”  I have yet to get a satisfying answer.

. . .

When the Levee Breaks

None of this existential teeth gnashing is new in science.  Thomas Kuhn coined the phrase paradigm shift to describe revolutions of thought such as the transition from Ptolemaic to Copernican cosmology in the 16th century, or the shift from classical to quantum mechanics last century.

But what is new is the accelerating number of paradigm shifts occurring within the span of a single human lifetime.  In other words, we are at a crossroads in history during which multiple pillars of the existing scientific method are cracking at once.

What exactly does this mean?

I don’t know.

But the levee is about to break.

And in the spirit of scientific progress I will give you my own falsifiable prediction (or is it resolution?) as the new year approaches:

Something big is happening.  Something positive.  We can all feel it, can’t we?  It is hard to define, but we will all recognize it in hindsight.  And when we do, historians will agree that the levee broke in 2012….


  • Sam Chauhan

    Great job Rafe!

  • Anonymous

    Unsurprisingly, I am very skeptical of this argument. I interpret the story you tell as proving that the scientific method _works_. There is nothing unexpected here:

    (1) We know a priori that most hypotheses are wrong.

    (2) We know a priori that there is a base rate of errors that will lead to spurious positive results: experimental design flaws, measurement errors, statistical errors, confirmation bias, and publication bias, to name the most obvious.

    Therefore, we can calculate from Bayes’ Rule that most positive results will thus actually be wrong. The exact proportion depends on your priors for the two variables. For sake of argument, say that 95% of hypotheses are wrong and that 10% of all positive results are flawed. For completeness, assume that 10% of all negative results are also wrong.

    Bayes’ Rule says that in this very reasonable scenario, 68% of all positive results are actually wrong, even though experiments are “90% accurate” in the colloquial sense.
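    Spelling out that arithmetic, using the illustrative rates assumed above:

```python
# Priors from the argument above: 95% of hypotheses are false, and both
# false-positive and false-negative error rates are 10%.
p_false = 0.95
p_pos_given_false = 0.10     # a flawed study "confirms" a false hypothesis
p_pos_given_true = 1 - 0.10  # a true hypothesis is correctly confirmed

p_positive = p_false * p_pos_given_false + (1 - p_false) * p_pos_given_true
p_false_given_positive = p_false * p_pos_given_false / p_positive

print(f"{p_false_given_positive:.0%} of positive results are wrong")  # → 68%
```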

    If the scientific method works, however, it should thus eventually conclude that most positive results are wrong. Which, as your post illustrates, is what happens.

    No need to invoke observer effects, Godel, or spirituality. Just good old fashioned probability at work here.

    • Rafe Furst

      That’s the beauty of falsifiable predictions, you don’t have to be convinced, you just have to wait until the truth claim can be settled….

      • Anonymous

        “It is hard to define, but we will all recognize it in hindsight. And when we do, historians will agree that the levee broke in 2012.”

        I do not consider this the paragon of a falsifiable claim. This is typical of the vague, open-to-interpretation statements that carnival fortune tellers make. If we were to bet, what would the settlement procedure be?

        • Rafe Furst

          You name it and I will probably agree.

          • Anonymous

            OK. In January of 2020, we will ask the professor who most recently taught “Introduction to the Philosophy of Science” at Stanford (or its successor course as specified in the course catalog or by the Chair of the Philosophy Department) the following question:

            In what year, did the theory of biocentrism as originally formulated by Lanza and refined by others, begin gaining mainstream acceptance among physical scientists as an essential part of the scientific method.

            If he says 2011, 2012, or 2013, you win. If he says another year or something to the effect that biocentrism never achieved such mainstream acceptance, I win.

            • Rafe Furst

              My claim was “something big” in 2012, but I would accept the multi-year on biocentrism as you propose if you prefer that. What’s the wager? Let’s make it big…

              • Anonymous

                As I noted, my objection to “something big” is vagueness. Biocentrism seemed the most salient specific point in your post. I’m glad you think the 3 year spread is a fair compromise.

                I’ll put up $1000. I propose that you syndicate this out to the Tiltboys if you want more action.

              • Rafe Furst

                I realized just now that I forgot to accept your wager! I’d like to appeal to your sense of conviction though and suggest we make it a little scary for you, say $10K? You could escrow $1000 now in Rightside and have ample funds to pay me off in 2020.

                Also, shall we post it on LongBets.org?

              • Anonymous

                As you know, I’m not really a betting man. $1000 is plenty scary for me. I think this will only be my third prop bet ever over $100. But if you want to be more scared, I’m sure the Tiltboys will oblige.

              • Rafe Furst

                Ok, what about going more public with it then (i.e. LongBets.org)?

              • Anonymous

                Feel free to publish it anywhere you would like. But you pay any listing fees :-)

                Probably goes without saying, but nothing that could be interpreted as specifically lobbying people in the Stanford Philosophy Department.

              • Rafe Furst

                Hmm, I guess it doesn’t go without saying b/c I’m not sure what you mean.

              • Anonymous

                My agreement to publish anywhere extends only to general fora. I don’t want you, for instance, sending the bet to a Philosophy Department listserv with a long argument for your position. The only priming the expert gets is the question written above plus whatever general background they have.

                Now, if you want to write a paper about biocentrism for a philosophy journal, that’s cool. But no sending free copies to everyone in the Stanford department.

              • Rafe Furst

                Just to be perfectly clear, I accept the wager for $1000. And I will not lobby anyone in the Stanford Philosophy Department, in letter or spirit.

  • Alex

    Rafe,
    I think this is the key question:
    “When we observe the world scientifically, are we studying the universe as it truly is, or are we studying the nature of our minds, the structure of consciousness?”

    Here are some related ponderings:
    1. http://www.media-studies.ca/articles/echoland.htm – How’s the Internet changing us? We are losing feeling and obsessing with data, structures, info. But “losing” and “obsessing” are opinionated observations. Restructure of consciousness? What is data? What is music? onto #2

    2. http://www.poetryvisualized.com/media/1950/ETYMOLOGY/
    “Weed through the slang of pompous modern man, past every status phrase coined for new roles, back past the jargon forged for every plan that sought to raise mankind to higher goals, then back beyond the crafty printing press that made a civil language of each tongue, yes, back when words could fluidly express a hero’s tale, when praise and prayer were sung, then back into the prehistoric slime to find that beastly grunt or frightened groan and like a riddle trace a word through time. It’s age-old journey’s not unlike our own, for with each subtle change you’ll find unfurled within that word the history of the world”

    3. http://www.cgjungpage.org/index.php?option=com_content&task=view&id=134&Itemid=40
    “Will we ever realize the possibility that “truth” has already arrived on earth and that we don’t know this because we keep looking for it with our eyes rather than feeling for it with our hearts?”

    On a more personal note. Here’s a little rhyme:

    A Decent into the Male-Störm
    going through the Poe-tree in motions
    counterclockwiseguy intuitive emotions

    “He is demanding of art a task far more difficult that is, to undo the alienation of the corporeal sensorium, to restore the instinctual power of the human bodily senses for the sake of humanity’s self-preservation, and to do this, not by a…voiding the new technologies, but by passing through them.”

    Sensory Alienation… Alien Nation. You feel me? Word play… different perspectives… flashing the light at the elephant from a multitude of perspectives faster and faster until he turns into a silent film.
    http://hypem.com/#!/item/nkad/A+Silent+Film+-+You+Will+Leave+A+Mark
    http://hypem.com/#!/item/17f24/A+Silent+Film+-+Driven+by+Their+Beating+Hearts

    The ratio of senses defines consciousness, but it is transcribed in our individual language paradigms in our minds. I think we’re all becoming aware faster and faster, but we’re leaving a trail of geological, biological, cultural, etc… evolution behind.

    And as a true scientist, I’ll leave you with a bitter taste in your mouth:
    “A family of trees wanted, to be haunted”. “Fiat ars-pereat mundus (Let art be created though the world shall perish) had become Fascism’s creed and influenced its actions.” or hopefully just an army of hipsters… or musicians… or sirens. You decide how to “pay” attention.

    “I’m extremely excited about something that’s only cool to me”:
    http://hypem.com/#!/item/17qfv/yakballz+-+Machine+produced+by+Chapter7+

    Again, on a personal note:

    Process aMuse

    take no prisoner dilemma,
    the situation is dire,
    burn all bridges, sex on fire,
    Kings lie naked undercover admire,
    pillage village gun for higher
    order narrative streetcar desire
    slam the brakes, ID expired
    personality trashed, Killers Bones in a pile,
    home runaway. Tom Cruise? McGuire?
    Hello? Goodbye! spell check required
    hate-stoned love – blonde Barby wire
    Eye glazed like starecase. Black ice tripwired
    She gets my back. Disappointed inspire

  • Jay G.

    Thanks for the great read, Rafe. Your post and the references you provided left me quite unsettled — raises some fundamental questions about our most basic assumptions. It’s cool, but scary.

    One thing that concerns me greatly is the question of what we do in the interim — in the years between now and the time when a more accurate metaphor/paradigm is developed. Obviously we need to look at studies and experiments with a more critical eye and reconfigure our notion of what constitutes scientific proof. But I find myself frightened by the potential power of uncertainty.

    If proof is endlessly elusive, it empowers anybody to say just about anything. If a study or finding doesn’t comport with a pre-defined world view, the sceptic can just say he doesn’t believe in the study or experiment.

    I’ve encountered this a lot recently: smart people, really smart people, taking a position WAY outside the mainstream, who when challenged about the study or body of literature in question say, “It’s flawed.”

    It’s tough to argue with such statements, particularly when they’re coming from very bright, educated people. But what I find is that, in the absence of 100-percent-certain studies, really smart people start relying on the assumptions and biases they’ve developed in their upbringing. There’s a tendency to substitute inadequate analyses with none.

    Anyway, great stuff. Much appreciated.

  • Alex

    [Thread moved from underneath the bet thread by KD.]

    just because the two of you agree on a wager as decided by a third, so called expert, doesn’t mean it’s not “vague, open-to-interpretation statements”. “that carnival fortune tellers make” puts that whole sentence in the realm of “metaphor” 4 me. It’s more “rating agency” free market realm, than science. Enough to settle a “bet”, but also leaves a gaping void.

    your and anyone’s use of language is like this:
    http://www.little-dragon.se/
    call it self-referential science if you want, but the jetpacks lie in metaphors and not reduction of type I errors or betting, which tend to get stale with age. I’d love to get a response without getting more aggressive with my assertions in order to simply solicit one. :)

    • Anonymous

      Dude, I have no idea what your comment means. Feel free to start a different top-level comment thread to discuss, as it’s not directly on point in Rafe’s and my bet thread. Or put up a blog post about betting.

      Rafe and I have a mutually agreeable knowledge discovery procedure based on willingness to bet. I don’t really care if anyone else likes it or not. It’s really about him and me triangulating on what the other believes.

      • Alex

        I’m talking about my comment from a week ago. Rafe’s whole post boils down to one question:

        “When we observe the world scientifically, are we studying the universe as it truly is, or are we studying the nature of our minds, the structure of consciousness?”

        Your mutually agreeable knowledge discovery system “scientifically”. And your agreed upon professor sounds like the Oracle or Architect in The Matrix. If that works for you guys, that’s wonderful. But I assume you want to share your arguments and views with the world if you blog about it and we are here to EXPLORE complex adaptive systems, which is why my post is about metaphors, poetry, and music. Those things are to the scientific method, what world wide web is to structured data. It’s already happening. Poetry and music are becoming considerably more personalized and evolving faster and faster. Transmission of life frameworks. It’s also a personal choice to enter this biocentric paradigm and we’re definitely not here to establish what we won’t talk about or do till 2020, Kevinsdick. You can statusquote me on that.

        • Anonymous

          What about “feel free to start a different top level comment thread to discuss” was unclear? My prior on the value of engaging you on this topic is close to zero. I’d rather we didn’t pollute Rafe’s and my specific discussion with that vain endeavor. I plan to move the previous three comments to the top level within 24 hours.

          • Alex

            Make it 4 anytime. If the two main contributors don’t follow or engage, then I don’t see any value in contributing. You gotta engage your audience or it will just be you two duking it out without duking it out till 2020

            • Anonymous

              It’s not that I don’t want to engage on this topic with anyone else before 2020. Rather, I don’t want to debate the merits of betting with you in the same thread as negotiating a specific bet with Rafe.

              (a) It makes it hard to track the different discussions and (b) the indenting causes comments to get obscured after too many posts. In fact, your previous comment was completely invisible until I moved the thread.

  • Techofrays1

    Interesting stuff! Of course, you are crossing over into the realm of Belief, which science is not equipped to deal with. The very basis by which science operates is flawed in that it is not designed to incorporate much of the content you are putting forward.

    Add the human elements that seem to pervade every aspect of theory and peer review, and you have a system that is incapable of producing any meaningful truth where the human condition is concerned.

    The facts are simple…

    We are still infants groping in the darkness, clinging to any comfort we can glean.

    We willfully ignore the obvious connection between ourselves and the universe as a whole. We see ourselves as independent of each other and refuse to acknowledge the obvious (quantum entanglement, anyone?).

    We embrace our own destruction and snarl at any attempt to remove the security blanket of lies we need to ensure our march towards oblivion. We do this for no better reason than apathy!

    My point is simple. Humanity is doomed by its very nature. You may argue that science will save us, or even that God will if science can’t, but in the end, any logical conclusion about the ultimate fate of humanity must take into account the nature of humanity as a whole. Once that understanding is in place, the outcome is obvious (given that a massive shift in the nature of humanity does not occur).

    So if science 2.0 does emerge it will count little in the fate of our species. It may indeed prolong the inevitable, but it will not change the nature of humanity any more than Science 1.0, Religion, or any other breakthrough in human history.

    So go watch some porn- or go make some!!!! The clock is ticking.

  • We have been taught a slightly idealized version of the Scientific Method, one which emphasizes Reductionist (Model Based) Methods. But for decades, starting in the Life Sciences and spreading, we have increasingly been forced to use Holistic (Model Free) Methods. Genomics and Drug Discovery are prime examples; the term Model Free Methods was coined by Lionel S. Penrose, a famous pioneer geneticist (and incidentally the father of Sir Roger Penrose), in a paper in 1935 (!).

    I discuss this at some length, and even provide both a zoo of Model Free Methods and an example of them in action (the NetFlix Challenge), in my talk, available at http://videos.syntience.com under the name “Science Beyond Reductionism”. It starts slow and loose… stay with it :-)

    My other talks there discuss how Artificial Intelligence research could start making major progress the moment we re-classify it NOT as a programming problem but as a Life Science. For more on this aspect, Google for Artificial Intuition and my name.

    We unabashedly use the word “Holism” to mean context-supported. This is not the “holistic” stuff often associated with crystals and aromatherapy; this is the hardcore epistemological holism that has been debated in the Epistemology and Philosophy of Science community every 20 years or so ever since Smuts (1926) and Schrödinger (1946).