Obliquely Weekly #4 - an addendum to this week's 'epic'
Additional pointers on themes discussed last time
This piece threatened to get out of hand for length (Substack estimated a 33 min read time), which meant leaving out quite a lot.
One strand omitted there is a nagging thought that cognitive cause-effect relationships are often taken as read, unhelpfully passing into received wisdom.
That's what was happening when the fetish for 'fact-checking' and 'debunking' morphed from an individual activity into a mini industrial complex, mostly used to defend specific ideological doctrines.
People saw false information being shared online and jumped to these conclusions:
False information is bad
Producing false information is bad
The answer is to identify and hopefully stop the production of false information by showing people who like it that the information is false
My old 'disinformation is really a demand-side problem'™ theory.
A related thought that didn't make it into the 'epic' is this:
Everyone talks about 'confirmation bias' as if reducing that would make us more open to countervailing information.
As if confirmation bias is the disease we need to cure to gain epistemic health.
But confirmation bias is downstream of something else, which is a need to totalise.
Totalising, the urge to categorise things as wholly desirable or undesirable, entirely responsible or completely blameless, seems to be among the least useful ways of observing. It prevents progress in properly understanding things that don't have a single cause or motivation. Which is most things and most people.
See also, flattening.
It's hardly surprising that totalising is so common because it's modelled for us as a way of being, by all kinds of authority figures.
Political parties encourage it, corporations use advertising to do it, and those whose mission is to manage our behaviour do it too.
The supposedly reasonable assumption behind gross oversimplification of issues is that people cannot be trusted with nuance. But all that happens when inconvenient truths are buried, because there are bigger truths that might matter more, is that suspicions are aroused and trust ends up torched.
I've watched with dismay as this has unfolded in the dispute over whether Covid vaccines are 'safe' or 'unsafe'. Itself a totally flattening dichotomy.
Demanding that Covid vaccines be deemed 'safe' or 'unsafe' is as helpful as demanding that cars be deemed 'safe' or 'unsafe'. Because it depends on the circumstances.
That's why it's refreshing to see the nuance in a piece like this one. It's the best discussion I've seen yet around why so many people think there's a rash of people dying from Covid vaccines and whether they really are or not.
This piece landed just as my 'epic' was nearing completion. Rather than reference it in passing there, it warranted recommending separately as an important read in itself.
On the topic of polarisation, Tom Stafford introduces the notion of seeing it through the lens of complexity theory.
Three years into a nascent interest in complexity and emergent phenomena, I don't yet feel competent to opine on it myself, but this happens to offer a perfect introduction to at least part of that field.
I also appreciate how it chimes with my own priors about open-mindedness seeming to require a certain solidity of self and willingness to relinquish control.
This piece - and the recommendations in it - will keep you going for a few hours.
Brian Chau touches on a subject I ponder often. What is it about the educated professional managerial class (my own milieu really) and its fondness for credentialed 'technocracy' as a panacea? Which I tend to suspect stems from a feeling of being threatened by the 'plebs' and their attachment to ideas deemed unfashionable by the 'intelligentsia'.
After all, Brexit and Trump showed that these people were just too stupid to listen to us. This was certainly how I felt too, at the time.
What else could be going on that triggers a mix of grandiose and vulnerable narcissism among the midwit class?
And on the same theme, this one's a gem.
The part about educated Chinese favouring state control of information because they know how to get around it is glorious.
Very good! If I can make a recommendation on emergence and complexity readings, "The Origin of Wealth" by Eric Beinhocker is a really good place to start. It focuses on economics, which I like, but it builds from the bottom up, so the focus is almost irrelevant: it teaches a ton on the subject along the way. It really kick-started my interest in agent-based modeling, complex adaptive systems modeling and the like. It's a long read too, because it is very content-dense; I found I couldn't read it when at all drowsy because I missed things, and I never felt like skipping ahead because the author was repeating something for the millionth time and I GET IT ALREADY! One of the rare books where I didn't think "This could have been like 200 pages shorter."