Civilizational Memory and Resilient Knowledge
[Update: this is a subconscious paraphrase, or at least extension, of Jonathan Blow’s excellent talk Preventing the Collapse of Civilization, which I watched a few months ago. Thanks JP for the reminder.]
The US government used to make a substance called Fogbank, a critical component in fusion bombs. Then we stopped making it for a while, and “forgot” how to make it.
Actually, it turns out we never really understood how to make it at all. When we wanted to make more of it, we built a new facility based on the historical records of the first time around. That plan didn’t work. It turns out that what made Fogbank work were certain impurities in the final product, traceable to chemical impurities in the inputs. The modern inputs were more thoroughly purified than the historical ones, so the impurities disappeared, and Fogbank 2 didn’t work until this was all tracked down and figured out. No one knew this the first time around. (See page 20 here, via a comment on MR.)
This story is both terrifying and comforting. It’s good that we were able to eventually figure out the problem. But fusion bombs are one of the most technically sophisticated artifacts of modern civilization, produced by the wealthiest country in world history. Massive resources have been put into their research, engineering, and the institutions responsible for them. What else could fail similarly? What would it cost to fix? What would the ramifications of temporary failure be?
Most of our modern technosphere relies on extreme specialization in complex engineering disciplines. However, much of this knowledge is implicit, rather than explicit.[1] A very short version might say: explicit knowledge is what you can write down, implicit knowledge is what you can’t. “Carbon has 6 protons” is explicit knowledge, “how to evaluate a novel” is implicit. Of course you could write a guide to evaluating a novel, but reading that guide would not lead people to perform the same evaluation that you do. Another example: most recipes are actually a blend of explicit and implicit knowledge. The amounts of the ingredients and the order of adding them are usually explicit, but knowing when something is done, or when it has been suitably mixed, or small adjustments based on variable ingredients, are all implicit.
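For programmers, one way to see the divide is to try writing a recipe as a program (a toy sketch of mine; all the function names are made up): the explicit half codifies naturally as data and control flow, while the implicit half survives only as predicates whose bodies nobody knows how to write.

```python
# Toy sketch: a bread recipe as a program. The amounts, the order of steps,
# and the times are explicit knowledge and codify trivially. The two
# judgment calls are implicit knowledge, and the stubs are honest about it.

def suitably_mixed(dough) -> bool:
    # An experienced baker judges this by feel and appearance.
    # No one knows how to write the body of this function.
    raise NotImplementedError("tacit knowledge lives here")

def looks_done(loaf) -> bool:
    # Crust color, hollow sound when tapped, smell: also tacit.
    raise NotImplementedError("tacit knowledge lives here")

def make_bread():
    # Explicit: ingredient amounts and the order of combining them.
    dough = {"flour_g": 500, "water_g": 350, "salt_g": 10, "yeast_g": 7}
    while not suitably_mixed(dough):  # implicit gate #1
        pass                          # knead a bit more
    while not looks_done(dough):      # implicit gate #2
        pass                          # bake another five minutes
    return dough
```

A written guide to evaluating a novel has the same shape: the scaffolding ships fine, but the predicates stay stubs.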
Worse, these processes are often highly context-dependent, and the people performing them don’t know which elements of the context really matter. This was the case for Fogbank – the nuclear physicists didn’t know that the impurities were relevant.
This implicit/explicit divide isn’t just on the level of individuals, but also institutions. Fully codifying a process is virtually impossible, and any system with humans in it is organic and adaptive, so defined processes become obsolete almost immediately. If an institution (a research lab, a company, a division) dies, that knowledge doesn’t live on in any one individual: it dies as well. Even explicit knowledge is often under-documented in organizations. Most broadly, there is an intelligence in systems, and we often don’t know how to recreate it.
Markets can probably ameliorate some of these concerns. If components are truly critical, there should be strong incentives for firms to maintain these systems of knowledge. And one would hope that for critical components, the market incentives are such that things could be rediscovered quickly. But firms can go out of business for unrelated reasons, and much of our critical infrastructure is highly concentrated or actually quasi-governmental, so markets cannot be the only solution.
I’d like to read more about this – is there a good literature out there? What would the field be called – knowledge resilience?
Some related links and ideas:
- Some works that gave me a sense of the complex technical underpinnings of modern civilization
- The old joke about cutting the ends off a roast: https://www.snopes.com/fact-check/grandmas-cooking-secret/
- Agnes Callard’s recent essay on the difference between advice and coaching – to do X, you have to become the kind of person who can do X, most pertinently by learning to evaluate X well enough to know how well you’re doing at it. https://thepointmag.com/examined-life/against-advice-agnes-callard/
- PCR is a critical process in modern biology, but it’s hard to teach! https://meaningness.com/metablog/rational-pcr
- Passing along critical survival knowledge in evolutionary history: https://slatestarcodex.com/2019/06/04/book-review-the-secret-of-our-success/
- Learning by noticing: you have to know which elements of a process are critical https://dash.harvard.edu/bitstream/handle/1/9804491/RWP12-044_Hanna.pdf
- This is closely related to the replication crisis across so many scientific disciplines — not the statistical p-hacking part, but the practical difficulty of exactly replicating a scientific protocol.
- I had a conversation with Matt in 2013 about the state of behavioral science, which has collected a wide array of different biases but has no underlying theory of which of them matter most in which situations. He brought up a great analogy (don’t know if it was original): imagine scientists taking measurements of the temperature at which water boils all over the world, before we had a concept of “pressure” – which is the relevant context. (A concrete version of this follows the list.)
- Mek’s recipes for rebooting civilization https://fromscrat.ch/
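To make that boiling-water analogy concrete: once you have “pressure” as a concept, scattered boiling-point measurements collapse onto a single curve. Here’s a minimal sketch using the Antoine equation with standard published constants for water; the altitude-to-pressure pairs are rough illustrative values, not measurements.

```python
import math

# Antoine equation for water (valid roughly 1-100 degrees C):
#   log10(P_mmHg) = A - B / (C + T_celsius)
# Solved for the boiling point T at ambient pressure P:
A, B, C = 8.07131, 1730.63, 233.426  # standard Antoine constants for water

def boiling_point_c(pressure_mmhg: float) -> float:
    """Temperature at which water's vapor pressure equals ambient pressure."""
    return B / (A - math.log10(pressure_mmhg)) - C

# Rough ambient pressures (mmHg) at a few altitudes:
for place, p in [("sea level", 760), ("Denver, ~1600 m", 631),
                 ("La Paz, ~3600 m", 487)]:
    print(f"{place}: water boils at {boiling_point_c(p):.1f} C")
# sea level: ~100.0, Denver: ~94.9, La Paz: ~88.0
```

Without the pressure column, the same measurements look like a replication failure; with it, every discrepant reading is just a different point on one curve.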
[1] I don’t have a good citation for this – I really only know this dichotomy in an informal sense. Some googling suggests that it traces back to the work of Michael Polanyi, but please chime in if you know more!
To leave a comment on this post, send me an email.