Thinking With Only Half a Brain
A podcast appearance on science, religion, and the predictive brain
I sat down recently for a conversation with Yovanny Pulcini on her YouTube podcast, “Face to Face.” We covered my long strange journey from the hippie New Age counterculture through the human sciences to orthodox (Catholic) Christianity, the tensions between religion and science, and why a Trinitarian worldview makes surprisingly good sense. It’s a good conversation, and I hope you’ll check it out and enjoy.
One of the points Yovanny and I covered in this interview is that science, for all its many triumphs and successes, acts like only one-half of a brain.
The most cutting-edge theories in neuroscience (and its mutated cousin, artificial intelligence) see the brain as basically a prediction machine. The brain is always trying to predict your next experience, updating its models as each new bit of information comes in. This “predictive processing” theory explains a lot of basic facts — for example, why we feel unsteady when we step onto dry land after hours on a boat in choppy waters.
After a while on the boat’s deck, our brain picks up on the persistent rocking beneath our feet. It starts to predict that this movement will continue. That prediction percolates through our nervous system, and we acquire “sea legs” — that is, we become better able to keep our balance and walk because our brain is taking all the pitching and rolling into account.
But once we step back onto dry land, our legs suddenly feel like noodles, and we keep missing beats as we try to walk. This unsteadiness results from the lag between the change in our environment and our brain’s models. Basically, our brain is still predicting that the floor will be rocking beneath our feet. The disorientation we feel on dry land is a signal of “prediction error.” It takes a few minutes for the brain to replace the falsified prediction of constant movement with a new, more accurate one.
The goal of predictive processing is to minimize the gap between our predictions and reality. Updating our beliefs is one way to do that: when the environment changes, we fix our models (eventually).
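If you like seeing ideas in code, here’s a minimal sketch of that first strategy, belief updating. It’s a toy delta rule, not a model drawn from the neuroscience literature, and every number in it is invented:

```python
# Toy belief updating: nudge the estimate toward each new observation in
# proportion to the prediction error. The 0.2 learning rate is arbitrary.

def update_belief(belief: float, observation: float,
                  learning_rate: float = 0.2) -> float:
    prediction_error = observation - belief
    return belief + learning_rate * prediction_error

belief = 0.0  # expected ground motion (0.0 = stable dry land)

# Hours on a rocking deck: the brain comes to expect motion ("sea legs").
for _ in range(30):
    belief = update_belief(belief, observation=1.0)
print(f"Stepping off the boat, expecting motion of {belief:.2f}")

# Back on land the ground is still, but the stale prediction keeps
# throwing error -- the wobbly feeling -- until the model catches up.
for step in range(1, 6):
    belief = update_belief(belief, observation=0.0)
    print(f"step {step}: predicted motion {belief:.2f}")
```

The estimate climbs toward the boat’s motion, then decays back toward zero on land; the lag in between is the “prediction error” we feel as unsteadiness.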
But according to cutting-edge theories, there’s also another way to minimize prediction error: change the world to match our predictions.
This second tactic is sometimes called “active inference.” It’s what allows us to, say, set goals and work to achieve them. The brain essentially tells the body how to act to fulfill its most important predictions. An example: if I set out to build a shed, I’m “predicting” the existence of a shed in my yard. At first, this expectation throws a big prediction error signal, because there’s objectively no shed there. But as I gather materials, set up the frame, and hang the siding, the gap between my prediction and reality lessens, until eventually — voilà! — my prediction becomes reality.
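Here’s the complementary sketch, in the same invented toy setup: the prediction is held fixed, and action, not belief revision, closes the gap. This is a cartoon of the shed example, not real active-inference machinery:

```python
# Active inference, cartoon version: hold the prediction constant and
# change the world until it matches. The 0.25 "work per step" is invented.

predicted = 1.0  # "I predict a finished shed in my yard"
actual = 0.0     # objectively, no shed yet

step = 0
while actual < predicted:
    actual = min(predicted, actual + 0.25)  # gather lumber, frame, side...
    step += 1
    print(f"step {step}: shed {actual:.2f} built, "
          f"error {predicted - actual:.2f}")

print("Prediction fulfilled: there is now a shed.")
```

Same error signal, opposite fix: the first loop revised the belief to fit the world, and this one revises the world to fit the belief.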
Predictive processing came up in my conversation with Yovanny because I think science cares only about the first tactic for minimizing prediction error. It’s only one-half of a brain. Scientists try to figure out what’s objectively true out there in the world by formulating and testing hypotheses. A well-designed but failed experiment means that something is wrong with your model, and you should (ideally) update your predictions. But building a shed doesn’t work like that. And things like governments, courts, universities, businesses, and marriages — that is, institutions — really don’t work like that. They aren’t empirically out there in the world.
You can run an experiment to test your prediction that hydrogen has an atomic mass of 1.008 daltons, but you can never run a controlled experiment to test whether Charles III is objectively the King of the United Kingdom. His status as king is mind-dependent: it hangs on everyone’s beliefs. Millions of UK citizens have to behave in ways that corroborate the prediction “Charles III is king,” or else he stops being king.
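The asymmetry can be caricatured in a few lines. Nothing below is a real model; the population figures and the 50 percent threshold are stand-ins, meant only to show that one function ignores belief entirely while the other is made of nothing else:

```python
import random

def measure_hydrogen_mass() -> float:
    # Observer-independent: noisy readings scatter around ~1.008 Da
    # no matter what anyone thinks.
    return 1.008 + random.gauss(0.0, 0.001)

def is_king(believers: int, population: int) -> bool:
    # Observer-dependent: the status holds only while enough people
    # keep behaving as if it does. The threshold is purely illustrative.
    return believers / population > 0.5

print(f"measured mass: {measure_hydrogen_mass():.3f} Da")
print(is_king(60_000_000, 68_000_000))  # True while widely corroborated
print(is_king(5_000_000, 68_000_000))   # the status simply dissolves
```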
A culture that submits entirely to science will inevitably have a hard time with this layer of constructed social reality. Our culture, for example. Skepticism is deeply embedded in the early-modern philosophies that laid the groundwork for modern Western science. Francis Bacon thought that “in nature nothing really exists besides individual bodies, performing pure individual acts according to fixed law.” But if that’s all you think really exists, then eventually you disbelieve not only in God and spiritual realities, but in more prosaic invisible realities, too. If everything is just matter in motion, there’s no room for the soul. But there’s also no room for school boards.
In a science-ruled culture like our own, then, we can expect a lot of chronic legitimacy problems: insufferable arguments about whether various institutions are real or credible, endless disputes among scholars and activists about whether institutional precedent has any legitimate hold on our loyalties or commitments, spiteful dismissals of anything that seems mind-dependent or socially constructed.
If you’ve spent any time at all in secular academia in the past two decades, this should all sound painfully familiar.
Scientists themselves don’t have to be the ones spearheading these attacks on our shared social reality. As individuals, they mostly take institutions for granted like the rest of us. Instead, the whole problem of deflationary skepticism is just thoroughly ambient in our culture. Obsessed with discerning only what’s objectively true, we lose the knack for actively — and collectively — willing things into existence. Whether cynically or good-naturedly, we’re instinctively suspicious of anything whose existence depends on our own faith, which is what active inference is really about.
André Breton wrote that “the imaginary tends to become real.” That’s true most of the time. In fact, it’s one of humanity’s superpowers. Chimpanzees are intelligent, but they can’t dream up a temple or nation-state and then build it. Lacking language, their imaginative visions stay forever locked up in their own skulls. Chimps are limited socially to the here and now, the concrete and tangible.
When a human society decides to filter out the imaginary, ditch faith, and pursue scientism, it’s rejecting this uniquely human superpower. It winds up pushing humans back toward being chimpanzees: lacking institutions, unable to act in faith, stuck in the here and now.
Obviously, we’re not fully there yet. Despite our baked-in scientism, we Westerners still have plenty of institutions. We “predict” the existence of governments and scientific funding bodies and marriages and then use active inference to fulfill these predictions. (Although this is happening less and less in the case of marriage and probably many other institutions as well.) But my point is that the whole show is less stable than it might be if our civilization weren’t based on scientific skepticism.
This is all really just a complicated way of restating the old saw that science deals with facts, not values. Our scientistic society simply has a lot of trouble with the values part. We’re so busy trying to find out what’s true out in the world that we forget that we’re part of the world, and so our values and ambitions shape what the world contains.
Philosopher and psychologist William James viewed religious faith as basically a scaled-up version of the way everything from marriages to monarchies relies on us to first believe in it before we can call it real. Want to see whether God exists? James would tell us to do active inference on that prediction[1]: act as if God exists, then see how reality responds. Our culture’s default Occamian skepticism undermines this basic human ability to act in faith not only for God, but for everything else as well. As goes God, so goes marriage.
Of course, a lot of these observations might soon be obsolete. The era of domineering scientism is rumbling to a close. Modernity’s bluster seems less compelling, less confident, than it did ten or even five years ago. Macho New Atheism is long gone, and millions of young people (especially men) are suddenly reverting to Christianity.
But part of what’s driving young people into the arms of the Church is surely the sheer social and metaphysical instability that’s spooled out from five centuries of Science™ running things, first quietly in the background, then publicly as our sacred organizing principle. Saint Augustine argued that you can only live without God for so long. But millions of Millennials and Gen Zers know that you can only live without a stable social order for so long, too. The attempt to see the world purely through objective eyes is equally toxic to both of these needs.
The cultural commentator Mary Harrington thinks that, as interest in religion reawakens and technocratic scientism loses its prestige, we’re running the 17th century in reverse. If we are, maybe the future will see us relearn how to think — and believe — with more than half a brain. Maybe we’ll be able to balance objectivity with active inference. That is, with faith.
Anyway, enjoy the video!
[1] After he had a quick tutorial on 21st-century neuroscience.