Let's Build A Christian Doomsday Vault for Science
Academia pumps out 5 million articles a year. Forget about keeping up. Let's identify the best stuff for posterity
Imagine this scene: it’s late in the evening. Somewhere in the labyrinthine stacks of a university library, a grad student is scrolling on his laptop, his face illuminated by the blue glow of the monitor. On that monitor is a long, long list of search results on PubMed, the National Institutes of Health’s massive life sciences database. Glassy-eyed, the bedraggled student scrolls through thousands of results, overwhelmed with the volume of articles, studies, and abstracts. He hopes for that one hit that will clear things up, tell him what he needs to know. But as midnight approaches, he’s still wading through it all, hoping to find signal in the noise.
Note to readers: after countless hours of editing, cutting, re-writing, and hair-pulling, the next post in my series on the crisis of mental health in liberal teen girls, begun way back in May, hasn’t gelled. So, with autumn tipping into full swing, I decided to temporarily put it aside and pivot to another topic that’s been capturing my imagination: the possible role of Christians in consolidating, preserving, and beginning to interpret the enormous amount of scientific knowledge that’s been produced over the past few centuries.
This might seem like a dramatic departure from my last post, but the topics are actually connected: late modernity operates off a flawed understanding of human nature that, when articulated in policy, both causes widespread mental illness (hence the mental health crisis) and leads us, as members of a scientistic culture, to prioritize producing lots of scientific information (a decentered, machine-like task) over understanding what it means (a centered, organism-like task). I’ll return to the series on women’s mental health soon, but in the meanwhile I may publish several more essays like this one, exploring the role of the church in a rapidly de-Christianizing technological society — one whose false beliefs about human nature are colliding dramatically with reality.
Yes, the grad student in that late-night library was me. I was often overwhelmed by the tsunami of database search results, the panicky realization that I’d never be able to read it all. But every grad student, every postdoc, every researcher, period, in contemporary academia, knows this problem. Like most inhabitants of late modernity, scholars are awash in an excess of data —“information overload,” as Alvin Toffler called it. The amount of new research now published each year is so gargantuan — 2022 alone saw the publication of around 5.14 million peer-reviewed articles1 — that no one could hope to keep up with it, not even within the tiniest sub-discipline.
But as the mountain of data grows, the amount of meaning in it seems to shrink. The deluge of academic research is making research itself increasingly senseless. Obviously, this presents a challenge for the pursuit of knowledge. Maybe even a crisis.
But it also presents an opportunity.
Signal in the Noise
Perverse incentives induce today’s academics to publish far more than their peers can absorb, journals to accept more papers than their readers could ever read, and universities to churn out more PhDs than will ever find academic jobs. Academia is now an enormous Ponzi scheme that no one can stop; our only option is to wait for it to flame out on its own.
But meanwhile, its researchers are generating a tremendous amount of knowledge. And although much of it is bollocks, some of it — more or less, depending on the field — is important and very much worth keeping. Somebody smart should be sifting through it, assimilating it into the worldwide body of knowledge for the (very) long term. This would entail, say, identifying the most important and seminal papers in different fields and preserving them in a durable format, asking at every step: “What ideas or findings would a scholar or scientist 500 years from now, when the 1900s and 2000s have been mostly forgotten, most want to know?”
I’m talking about doing a kind of preemptive exegetical archeology on the currently available body of scientific research — not running new experiments, but sorting through the findings and theory that we already have, assessing their value, marking them for preservation, and, in time, collating their implications for future scholars. Digesting, if you like, the enormous feast of ideas and facts that the past centuries have poured down our collective gullet.
The Golden Age Is Temporary. Its Legacy Can Last Forever
I think we should do this because (1) scientific advancement in the future requires building on what’s been done in the past, which in turn requires knowing what’s been done in the past; (2) societies do collapse sometimes, which leads to massive knowledge decay unless important texts are intentionally preserved (or forgotten in desert caves); and (3) it would be interesting and fun.
Let me focus on reason #1, facilitating ongoing and future research. Successfully sifting through the currently available knowledge in a way that could actually benefit future researchers would depend on a number of other things going right. First, we’d need skilled prioritization. In the midst of our 21st-century deluge of research, we have to identify which portions could actually be important for future researchers. It won’t be easy. A lot of what’s being dumped into journals is so low-quality that it’s noise. Its authors aren’t trying to advance truth; they’re just hoping to get hired or tenured somewhere, or maybe advance a political agenda (or both, since in many fields these are increasingly the same thing). So we need to find the signal in that noise. Find the journal articles, books, findings, and reports that matter most, and give them priority.
To do this, though, we need space and time. Just as animals and humans dream each night to condense and prune the memories accumulated over the previous day, periods of exceptional cultural productivity can leave peoples and societies in need of long-term consolidation and perspective. Making sense of the past few centuries of knowledge production may take a while.
Periods of extraordinary intellectual and cultural ferment are the exception, not the rule. Classical Rome lasted for well over a thousand years, but its best literature all comes from a relatively short period in the first centuries AD and BC.2 The Greek Golden Age — the era of Democritus, Plato, and Aristotle — covered only a handful of decades around the year 400 BC. It seems that it’s hard to maintain that level of ferment for very long.
In all probability, the past two centuries have been our modern, Western Golden Age. In a remarkably short burst, we’ve gone from basic Newtonian physics to Einsteinian relativity and quantum mechanics, from a 6,000-year-old earth to the neo-Darwinian synthesis and deep-space astrophysics. It’s been a tremendous historical explosion in terms of the sheer pace and volume of discoveries. But if history is any guide, we won’t keep it up forever.
In fact, our Golden Age may already be winding down. Returns on public investment in science have declined dramatically in recent decades, even as we pour ever-more collective resources into it. Top scientists think the most important findings in their respective fields mostly date from the early 20th century, or at least from before the 1990s. While the mythology of modernity may depict an ever-expanding technological empire of Progress™, the more realistic view is that Golden Ages rarely last more than two centuries, and our time is coming up.
This might sound like a dreary prognosis, but Golden Ages aren’t necessarily more fun to live in than other ages. In fact, they’re usually marked by tremendous upheaval and dislocation. It was probably a bummer to live through the Julian civil wars that inspired Virgil and Ovid. But Golden Ages are enormously culturally productive, leaving behind a tremendous feast of ideas and art for future generations to chew on. Plato and Aristotle wrote all their great works in a few short decades after the embarrassing collapse of the Athenian empire (also a bummer for Athenians), but in doing so they laid the foundation for more than 2,000 years of subsequent Western philosophy.3
Most people throughout history have lived in the long shadow of some previous Golden Age, writing stories and poems about its heroes, sifting through the knowledge and writings it left behind. Our descendants will do the same. Long after English has died out as a spoken tongue, students will be tortuously learning it, simply because the dazzling treasury of literature and science in our global language is, and hopefully will remain, incomparable.
So I’m suggesting that we help them out. Preemptively weed out the worst and most uninformative output, collate the most important ideas, and maybe identify outstanding questions that future geniuses might solve. It’s a win-win. Even if our scientific Golden Age catches a second wind, we’ll be setting ourselves up for ongoing progress by boosting the signal in the scientific noise. Researchers working now, or a decade from now, will thank us.
But of course no one currently employed in formal secular academia would or could take on a job like the one I’m proposing. Ours is (in its own self-image) an era of breakneck progress and innovation. Academic careers always depend on producing something new, on overturning old discoveries and debunking dated knowledge. The game lacks reward structures for preserving knowledge; trying to do so would kill your career.
So the task, if it’s ever begun, will have to fall to someone else. Someone with a very long time horizon, who’s not entangled in the perverse incentives of contemporary elite academia. Someone whose career doesn’t depend on generating splashy findings or supporting the ruling ideology, someone who can step back and think seriously about what things mean.
An Unfilled Niche
One of my basic assumptions for this Substack is that, as the postmodern world and its institutions race toward the Brave New World, the church (by which I mean in general all serious Christian believers, not just a particular institution or denomination) is increasingly going to have to step in and help society and individuals in unexpected, often practical ways. In some cases, it may even need to be ready to fulfill certain once-secular public functions, simply because other institutions increasingly won’t be willing or able to.
It’s not hard to find evidence for this take: just in the past couple of years, many institutions we used to trust, from media outlets to governments to the CDC, have lost tremendous credibility. Their competency seems to be in tatters, their leaders bogged down in vicious competitions for status and prestige, their operations warped by incentives for short-term thinking over the public good.
If this continues, serious long-term cultural planning will become the de facto preserve of private, organic communities and organizations. The church is just such a community. And conveniently or not, many serious Christians are finding that we have fewer and fewer incentives, or indeed opportunities, to play the secular academic game. This can be disappointing. But it also means that, like industrious immigrants to a wealthy city, scientifically trained Christians may be able to identify and serve new functions that no one else is serving. Such as preserving and sorting through the vast storehouse of scientific knowledge.
Now, let’s explore what this would actually look like.
How It Could Work
There are already plenty of libraries of record. There’s the Library of Congress. There’s the Boston Public Library and the library of Trinity College Dublin, and many more besides. These august institutions are doing a fine job of storing and recording the collected output of the sciences and humanities for the Western, and to some extent the entire, world. We don’t need to reinvent the wheel.
What I’m talking about is instead a curated storehouse of scientific and scholarly production, something much smaller and, potentially, replicable. Instead of the complete Reviews of Modern Physics — nearly a hundred years of quarterly issues — it would collate the most transformative, seminal, impactful, or (occasionally, when productive) controversial articles in the history of (for example) modern physics.
This still wouldn’t be a small project, by any means. There are a lot of sub-disciplines within physics. It would take a lot of work to identify and collate articles that would fit the bill for the distinct areas. No one could do this by themselves; this would be an effort of team “science.” You’d have to conduct interviews with experts, survey physicists across different fields, get input from historians of physics, and take quantitative measures of citations and other impacts.
Algorithms Are Blind
It would be tempting to just use algorithms. Feed the relevant data — each article’s list of other articles cited, author relationships, etc. — into an app, then let the machine crank out an impersonal list of, say, the 100 most influential articles in each physics sub-discipline. Repeat for each scientific field. Then save copies of each of the articles or books to thumb drives and store them all in an underground vault somewhere. Voilà — problem solved, for a tiny fraction of the cost of a multi-year team science effort.
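To make the idea concrete, here is a minimal sketch of that naive approach — ranking papers purely by raw citation counts. The citation graph here is invented for illustration; a real effort would pull data from a service like Crossref or Semantic Scholar. Note how the newest paper ("E"), which no one has had time to cite yet, never shows up at all — exactly the recency problem discussed next.

```python
# Illustrative sketch of the "let-the-algorithm-do-it" approach:
# rank papers by how many times other papers cite them.
# The citation data below is hypothetical.

from collections import Counter

def rank_by_citations(citations, top_n=3):
    """Count incoming citations for each paper and return the top_n."""
    counts = Counter()
    for paper, cited_list in citations.items():
        for cited in cited_list:
            counts[cited] += 1
    return [paper for paper, _ in counts.most_common(top_n)]

# Hypothetical graph: each paper maps to the list of papers it cites.
citations = {
    "A": ["B", "C"],
    "B": ["C"],
    "D": ["C", "B"],
    "E": ["C"],  # a three-month-old paper: it cites others, but no one cites it yet
}

print(rank_by_citations(citations))  # ['C', 'B'] — paper "E" is invisible
```

The algorithm is blind in exactly the way the next paragraphs describe: it can only see what the citation record has already rewarded.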
But there are several fatal problems with this let-the-algorithm-do-it approach. First, it could undervalue recently published research. By definition, papers that have been out for only three months are less influential than older ones. You could try to correct for this by reverse-handicapping more recent articles, but this would introduce a lot of noise into your conclusions. There just wouldn’t be enough variance between different papers. You’d be scooping up a lot of writings that would be better left forgotten.
Maybe you could institute an arbitrary cutoff — say, excluding anything published in the last five years. But that would artificially constrain your choices. Just in physics alone, you’d be eliminating the first photographic images of the galaxy’s black hole center, ignoring heated controversies about the mass of the W boson, and nixing findings (or lack thereof) that raise questions about the very reliability of scientific reductionism.
To make matters worse, older articles can be under-cited. The 1974 paper that successfully predicted the mass of the charm quark, a subatomic particle that helps explain symmetry at small scales, has been cited 480 times over the past half-century. That’s pretty good, but one 2014 paper from the Fermi Large Area Telescope team already has 345 citations, and its main finding was that the scientists didn’t find the evidence of dark matter they were looking for. Comparatively, the 1974 paper is “wildly underappreciated.”
Interestingly, that 1974 paper was lead-authored by a woman. I don’t know whether this affected her paper’s reception, but the question raises an important point: a scholarly article’s reception can be shaped by factors that are irrelevant to its actual scientific value. As a woman, Rosalind Franklin saw her findings completely overshadowed by Watson and Crick, even though her work in x-ray crystallography helped prove that the DNA molecule was a double helix. Any survey of citation counts from the mid–20th century would have underweighted her findings.
And of course, some people get attention for the wrong reasons. Every scientist knows at least one showboat colleague who expertly whips up hype for his or her projects. These people are often over-represented in the scientific record, but their work usually doesn’t age very well. They’re like Herbert Spencer, who went from the leading English-language philosopher in the late 19th century to a forgotten nobody just a couple of decades later. Any purely quantitative algorithm would scoop up a lot of flashy but substanceless Herbert Spencers.
Reintroducing the View From Somewhere
No, to do a good job making a curated record of modern science, you’d need real people to complement any data-based methods by sifting through the real output and exercising careful, considered, and informed judgment. This would of course mean introducing some subjectivity into the process. But that’s kind of the point. This project is about prioritization, after all: picking out the important signals from unimportant background noise. And acts of prioritization always require a point of view, a particular location from which relative importance can pop out. A decentered universe is a flat one.
Of course, science is designed to show us a decentered universe — what Thomas Nagel called “the view from nowhere.” Scientists attempt to parse out human biases and attain an observer-neutral point of view. This strategy has a great success rate. It seems like the sun orbits the earth, but science (specifically, Galileo) showed that it’s the opposite. More recent astrophysics decentered us even more, showing that the sun is only orbiting a galactic center, which in turn is only a minor part of a larger local group of galaxies.
In the view from nowhere, we’re not the center of reality. We’re in an unimportant suburb of a middling town in the void.
But this only reveals the limitations of the view from nowhere. If you push it far enough, it sees everything as unimportant. A scientistic civilization will therefore always tend to multiply data rather than sussing out its meaning. It will pump out countless papers rather than making sense of the facts. We are that scientistic civilization. Our information landscape is characteristically flat.
So by assembling a team of scientifically trained Christians and fellow travelers (we’d need a lot of help from all kinds of people, Christian and non-Christian alike) to collect and collate the most important findings of science, we’d self-consciously be reintroducing a particular perspective into the scientific record. This wouldn’t mean suppressing findings that support, say, evolution. I was an evolutionary social scientist. Cherrypicking the scientific record would be anathema, and anyway all truth, including evolutionary biology, belongs to the church.
But I am talking about selecting books, articles, and findings with an eye to their future usefulness, and definitions of usefulness will differ. A lot of Silicon Valley billionaire-gurus believe we’re only decades away from the total obsolescence of biological humanity. They foresee a future of human-machine interactions, or even the replacement of humanity by machines. A scientific doomsday vault built by Silicon Valley transhumanists would hoard a lot of papers about the cognitive science of motor control and perception, but probably not much developmental psychology (because who needs to remember how humans developed?) and almost no anthropology.
Any intentional preservation of the vast riches of our scientific centuries will have to conform, then, to some particular vision of the future: a bet, if you like, about what the world that discovers our time capsules will be like. Unlike the techno-futurist vision, Christians see human nature as chugging on relatively unchanged, despite fluctuations in the fortunes of individual civilizations, for as long as the Second Coming is still pending. That could be tomorrow. It could be hundreds of thousands of years from now. Either way, there will never be any radical break from the past. Middle Earth, nestled between the Creation and the Conclusion, will remain Middle Earth, no matter how much technology we dream up.
I think this is a much more accurate understanding of reality than transhumanism, although of course I would think that. In any case, let the transhumanists build their own curated scientific repositories for the AI conquerors to peruse. We’ll build one that’s tailored for a Christian vision of the future, in which flawed people who happen to bear the image of God will always still be mucking their way through life, trying to do the best they can, in need of a meaningful legacy from the past. They may want to take a crack at solving the problem of quantum gravity someday.4 We can set them up for that.
Conclusion: Only the Beginning
Collecting and collating the world’s scientific knowledge with an eye to aiding future scientists in a possibly Christian civilization is about as pie-in-the-sky an idea as you can get. But I’m offering it for the purpose of daydreaming wildly about what Christian culture could look like as its wayward offspring, Western secular civilization, loses its power to shape the world’s imagination. It’s already happening, and as it continues, those of us who live in the West will probably experience some sort of collapse, just as Russia did when the USSR fell apart. But Russia’s still around, in case you haven’t noticed. Collapse doesn’t mean the end of the world.
In fact, it can mean opportunity. There are a lot of settled assumptions, many hardened ways of doing things, that a crisis may help us unsettle. The exploitive mechanism of knowledge production grinds on, but few of its workers have time or leisure to appreciate what they’re making. An outsider, or a collection of outsiders, with a different perspective — one shaped by the Lord of the Sabbath — could make a thousand breakthroughs possible simply by sorting through what we’ve got and asking seemingly naïve questions. It could be the civilizational equivalent of taking a break, even allowing sleep to reorganize our thought processes. After more than two centuries’ worth of frantic work, that may be exactly what we need.
This essay was obviously partly inspired by Walter Miller, Jr.’s classic science fiction novel A Canticle for Leibowitz, about the slow recovery of civilization and science by Catholic monks after a global catastrophe. You should read it.
Assuming that articles average 6,000 words each and you can read 40 pages an hour, it would take you almost 300 years to read them all, without bathroom or meal breaks. Or sleeping. And that’s just for the research published last year! You’re going to need a lot of coffee.
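For the curious, the back-of-envelope arithmetic behind that figure works out if we assume roughly 300 words per journal page (my assumption; the footnote only specifies words per article and pages per hour):

```python
# Back-of-envelope check of the reading-time estimate above.
# Assumption (mine, not stated in the footnote): ~300 words per journal page.

articles = 5_140_000        # peer-reviewed articles published in 2022
words_per_article = 6_000
words_per_page = 300        # assumed
pages_per_hour = 40

total_pages = articles * words_per_article / words_per_page
hours = total_pages / pages_per_hour
years = hours / (24 * 365)  # reading around the clock, no breaks

print(round(years))  # ~293 years of nonstop reading
```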
Now that I’m no longer a professional academic, I don’t have to keep using the made-up terms “CE” and “BCE.” Hallelujah for that, at least.
The 20th-century mathematician and philosopher Alfred North Whitehead famously thought that all of Western philosophy consisted “of a series of footnotes to Plato.”
Full disclosure: my suspicion is that no one will hit upon a real solution to the problem of quantum gravity until the current civilizational-scientific order has gone the way of the dodo. Just like you or I sometimes need to walk away from a quandary for a few days before the obvious solution will appear, many seemingly intractable scientific problems are probably awaiting a time when we, distracted by more urgent issues, collectively take a break from solving them for a few decades.