How the hell do you work out which experts to trust?
And can we trust Dr. Robert Malone? Plus, bonus practical content
TL;DR:
This piece is about a horrible bind. Unless you think it's a cool and interesting bind, as I tend to. The bind is that we don’t trust experts when we don't like what they say. And we mostly aren't equipped with the necessary expertise to appoint experts we can trust anyway, so we do it in subtle ways that tend to remain private to us. The same often goes for the skills we need to sift relevant information from irrelevant or misleading information, even when all the information is real and 'true'.
It finishes up with some ideas based on my own process for determining who and what to trust.
Prelude:
This week, during meditation, the session guide says:
'Notice the weight of your body as gravity pulls it into the chair'
Cool. I'm noticing it. Normally the effect of gravity is outside my immediate awareness. Now, suddenly, it isn't.
'Follow each breath, from the moment it begins to the moment it ends'
Cool. Noticing that. Ah, there's a thought flickering too, about how rarely I remember to notice that I'm breathing during the day. So I'm now noticing the breath and that thought. Cool cool.
'Notice any mood. Are you sleepy. Restless. Happy'
Cool. I'm noticing an air of contentment and gladness that things are as they are, at present. Ah, there's another thought flicker. It's about how nice the day ahead looks from here.
'Who or what is noticing that and from where does the noticing come?'
Haven't a clue. That's the thought I'm noticing. I haven't the faintest idea what is noticing these things. I could say 'I am noticing them', but that isn't an adequate answer. The noticing just is. That's the state. Noticing is happening. I haven't a clue what is doing the noticing.
If this doesn't make sense, you've probably never investigated mindfulness. It's a hoot.
And if you've never investigated mindfulness you'll think there's a better response to the guide's question than mine.
Before flexing your superior reasoning skills I'll just let you know that the guide here is a widely cited neuroscientist who knows a fuck tonne about how brains work. So that's the bar for you to clear. The guy isn't a woo woo merchant.
The self-referential I is a mystery and anyone who'll tell you otherwise just hasn't thought about it properly.
This anecdote illustrates how rare it is to be comfortable with a vacuum, where confident opinions or certainty usually sit. Counter-intuitively, it's actually a kind of skill to be OK with not knowing.
How we think we know things:
The experience above was a far cry from what it was just over two years ago when I began formal mindfulness practice.
Throughout my life I'd have instantly shoehorned some flakey opinion into the space that the question leaves.
I would have just had to fill it.
Using some circular reasoning that misses the point. Stuff about synapses and chemicals and emergent properties. Maybe ideas covered as a philosophy student, when we looked at Identity and Mind and what words like that really signify.
Most of my life it would have literally been stressful to have no answer to that question.
I'm not talking about the same kind of stressful that accidentally smashing a mug is stressful. Or having a looming deadline.
What I'm talking about is an ambient kind of tension that you never actively notice. Perhaps slightly like a toned down impulse to change your state, akin to being hungry or horny or vaguely worried about something not quite immediately pressing.
All of this sets me thinking about our need to fill the space marked 'what do I think of x' and how we go about satisfying it.
We are scared of not knowing things. Ignorance feels like a threat to OKness. A question leaves an instant vacuum that gives rise to an urge to fill it.
This, to me, is a red flag. It suggests that the honest answer to 'why do you think that about x' is typically 'because I have to think something'.
Humans have done particularly well as a species on the basis of this urge, so it has evolutionary advantages.
But it does seem to give rise to a lot of nonsense too, which typically manifests as people like me and you having opinions on things of which we have no understanding.
Here are some problems and paradoxes. If you can solve them, you are a genius and should be publishing your own newsletter rather than reading this one.
Obviously there are experts with deep area knowledge.
Obviously there are categories of information we can call 'relevant' or 'extraneous' in specific contexts.
Obviously more information is better than less information when you want to understand things properly.
Obviously we can't all have deep area knowledge of everything, so we outsource to experts who we think do have it.
Obviously we need to classify information in certain contexts as relevant or extraneous. So we typically outsource those decisions to people we think are good at that.
Finally, there's obviously more information at hand to the average person than at any previous point in history. Happy days, right?
You know that meme? 'This is fine'? Where the dog calmly sits there, chugging from a mug, in the burning house.1
That's where we are.
It's partly because of a thing philosophers call 'the expert problem'. And another stinker someone has handily labelled 'relevance realisation'.
C. Thi Nguyen is a University of Utah philosopher I've occasionally been in touch with.2 He shared a pre-publication draft of a paper looking at the problem of achieving organisational transparency by using expert performance assessment. This part leaped out:
expert reasoning often requires expertise to assess. This is true even when the reasoning is fully explicit. The climate change scientist’s evaluation of computer climate change models; the statistician’s assessment of whether a statistical argument was improperly p-hacked: these evaluations may proceed from arguments and evidence that have been made fully explicit, and the evaluation process itself may be fully explicable. However, proper evaluation depends on a significant degree of expertise.
It's one of those circular binds. You need expertise to decide which experts to outsource your information processing to, before you can decide where you stand.
The obvious crucible where this applies right now is the pandemic and what's right or wrong about how it's being managed.
How many of us have a significant degree of expertise in the field of pandemic management and medicine? How did I land on my chosen experts then? What qualified me to choose them? Why didn't I choose different ones? What would I think differently now, if I'd chosen different ones?
The relevance realisation problem is another stinker.
John Vervaeke, a University of Toronto professor of cognitive psychology whose thinking I admire in the realm of 'sensemaking', coined 'relevance realisation' as a description of the way we decide which facts are useful. He's a fascinating theorist.
I'm not going to discuss what Vervaeke has to say about the process of determining relevance because you'll get way more insight from the source himself. Suffice to say that it requires a degree of self-understanding, which is why I jumped on him in the first place. Because I keep thinking that we can't hope to make sense of the world without understanding ourselves. There’s a link in the footnotes3 to a discussion Vervaeke had fairly recently that I've listened to repeatedly because there's so much meat in there.
If I were going to assign a hierarchy to the problems in our information landscape, my gut sense is that the problem of bad faith experts falls somewhere below poor relevance realisation skills in an age when information is so freely available, because the clarion call 'do your own research' has led untold numbers of unqualified people to fall foul of exactly that problem.
One of those 'do your own research' people said to me the other day that 'the numbers don't lie', while pointing at the increased number of young deaths in Britain, which began to be recorded around the time mass vaccination started.
This claim has been popular of late. That 'the numbers' show that there's an uptick in deaths among vaccinated young people.
The guy who pointed this out to me was looking at genuine data. So you can rule out misinformation. He thought it was relevant. Because 'the numbers don't lie'.
I don't trust that guy. For reasons. Chiefly that he copies and pastes phrases from a source I don’t trust either. I also think his relevance realisation skills are unexamined.
But it was an interesting moment because I’d just read a different interpretation of the same numbers by someone I do trust. For reasons I'll also get into.
It was a piece by the science writer Tom Chivers and you can read it here. I think that Chivers has reasonable mastery of relevance realisation, so he's one of the people I outsource mine to.
What the hell are we meant to do then?
I'm not qualified to tell you what to do when you want to understand things.
One of the things I never wanted Rarely Certain to be was yet another of those 'here's how you should think about things' publications. Or - look at this insight porn - phwoar.
But here's how I approach outsourcing my understanding of things I can't hope to work out for myself.
This person is positioned as an expert. Can I trust them then?
Here’s what I ask…
Do they have credible area knowledge?
If not, move on.
What is their incentive for saying x rather than y?
Assign a rough + or - weighting, depending on the nature of the incentive.
Are they using emotive language or attempting to spark feelings in their audience?
Assign one or more red flags.
Do they seem to be an 'activist'?
Assign a negative weighting, if they do.
Do they act like dicks on social media?
If yes, more negative weighting.
Can I understand what they're saying?
If not, move on. Don't not trust them. Forget them.
Where are they publishing or speaking?
Assign positive weighting or a red flag, depending on this.
Who else cites them?
Bonus positive weighting if that's anyone I also trust.
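For the fun of it, the checklist above can be sketched as a rough scoring function. To be clear, this is my own tongue-in-cheek formalisation: the function name, the specific weightings and the example inputs are all hypothetical illustrations, not anything anyone literally computes.

```python
# A playful, hypothetical sketch of the trust checklist.
# Weightings and signal names are invented for illustration only.

def trust_score(signals: dict) -> int:
    """Sum rough +/- weightings for a putative expert."""
    # "Do they have credible area knowledge? If not, move on."
    if not signals.get("credible_area_knowledge", False):
        return 0
    score = 1

    # "Can I understand what they're saying? If not... forget them."
    if not signals.get("understandable", True):
        return 0

    # Incentive: rough + or - weighting depending on its nature
    score += signals.get("incentive_weighting", 0)

    # Emotive language: each red flag counts heavily against
    score -= 2 * signals.get("emotive_red_flags", 0)

    # Activism and social-media dickishness: negative weightings
    if signals.get("activist", False):
        score -= 1
    if signals.get("dick_on_social_media", False):
        score -= 1

    # Where they publish: positive weighting or red flag
    score += signals.get("venue_weighting", 0)

    # Bonus for each citation by someone already trusted
    score += signals.get("cited_by_trusted", 0)
    return score

# Two hypothetical profiles, loosely in the spirit of the essay
sober_writer = trust_score({"credible_area_knowledge": True,
                            "incentive_weighting": 1,
                            "venue_weighting": 1,
                            "cited_by_trusted": 2})
shouty_activist = trust_score({"credible_area_knowledge": True,
                               "incentive_weighting": -1,
                               "emotive_red_flags": 2,
                               "activist": True,
                               "dick_on_social_media": True,
                               "venue_weighting": -1})
```

In practice, of course, none of this runs on numbers; the point of the sketch is only that the signals combine, and that one disqualifier (no credible area knowledge, or incomprehensibility) short-circuits everything else.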
I'll take Chivers as an example of this process. He used to be a science writer at Buzzfeed. Buzzfeed did some very good journalism, once it expanded from clickbait. He was always pretty good at acknowledging the uncertainty of things. His incentive seems to be to make a living by being a sober, insightful voice that gets published in a range of places. It seems unlikely that he is secretly promoting an agenda. He always seems to reflect the challenges posed by complexity. His language is mostly unemotive. It's never shrill or bombastic. It doesn't give me feelings.
He doesn't seem to promote any causes. He just explains things. He isn't a prick on social media. I understand his writing. He never leaves me feeling subject to shock and awe by his smarts or overwhelmed by extraneous data. Chivers published the piece I mention above in Unherd, a conservative outlet in which he was arguing that it's probably worth vaccinating children, based on the data he's looked at. It was therefore an unpopular piece and the reader comments below it were a bit of a toilet.
It’s a case of a writer refusing to submit to the phenomenon of 'audience capture' (deferring to what readers or listeners really like, rather than what you really think). Chivers has also published a book about how to read statistics. I haven't read it, but the fact that he is evidently interested in the issue of relevance disposes me further to trust his good faith.
Of course, I don't literally sit down and create a spreadsheet entitled 'Tom Chivers - Trust Scores'. This process is more organic and evolves most of the time in the background. But that's how I do it. It's not particularly sophisticated but I think it's probably saved me from some missteps around who to trust.
Speaking of trust, those figures in the science community who dismissed the lab leak hypothesis as 'racist' are people I would be wary of trusting. Hyperbolic moral claims are a glaring red flag in this process. Which kind of rules out entire swathes of the academic liberal arts cohort.
Two figures I've also come to trust in pandemic commentary are Zeynep Tufekci and Eric Topol, via a similar process.
What about negative weightings and red flags?
You can't really discuss experts and trust without mentioning the darling of the Covid antithesis4 world, Dr. Robert Malone. Famously kicked off Twitter and given an audience of 15 million the next day on Joe Rogan's podcast, Malone is evidently revered by many as an expert they trust.
I've seen many debunks of Malone's best known claims.5 After a while I noticed that I was reading and watching fact-checks of him without really understanding the technical content, enjoying them simply because they reinforced a general sense that Malone comes across, to me at least, as potentially a bit of a narcissistic sociopath.
Applying my unscientific and strongly feely 'trust tests' to Malone results in this assessment.
He has strong area knowledge. Positive tick.
His incentive looks like this, to me. To rise from unjustly overlooked (in his view) obscurity and become successful, possibly by marketing some Covid treatment protocols that he has said he's developing. And to stick it to people who received more credit for pioneering work in the early days of mRNA technology research, by undermining trust in the successful vaccines they helped pave the way for. Red flag.
He uses emotive and inflammatory language. All the time. Double red flag.
He's definitely an activist, determined to stoke fear and anger around most Covid-related issues. Yellow flag.
He acted like a dick on social media before being removed. By acting like a dick I mean things like posting something about a 17 year old footballer who died from a heart attack and suggesting that a Covid vaccine was responsible. The footballer turned out to have died in 2013. He also shared a study about heart inflammation in vaccinated people that was withdrawn by the researchers themselves when they found a serious statistical error in their work. Malone did not clarify that the paper was erroneous and left it for his audience to keep finding. Triple red flag.
I understand what he is saying. Not really a positive tick, though, due to the nature of what he is saying, which comes over as simplistic and often hyperbolic. No positive weighting. Yellow flag.
He is exclusively publishing and speaking into a bubble of likeminded people. He has not been stress tested in discussion with anyone (as far as I'm aware) who matches his area expertise but arrived at different conclusions. Red flag.
He seems to be exclusively cited by fringe figures and their fans. Red flag.
Well, that’s not a very robust method of ruling out Malone, is it?
This is where I depart from the customary, fact-checky, debunky truth seeking path. The one that relies on reductionist propositional back-and-forth, with evidence and counter-evidence that we all pretend to understand while actually just scraping Google for what looks about right. Constantly going for a coup de grâce in the comments thread.
I'll freely admit that I don't understand the science and statistics well enough to know whether Malone is trustworthy or not, while also declaring that he gives me the absolute heebie jeebies.6
I suspect that this is really how most of us operate, even while pretending (including to ourselves) that we are 100% evidence-driven, rather than making much more subtle calculations about the prominent figures and statistics that we encounter.
I also suspect that Malone's fans arrived at their trust in him for similarly subtle reasons, while also fooling themselves that really it’s because they understand the science and statistics and their man gets it.
His fans are now talking about a phenomenon they call 'mass formation psychosis' which is meant to describe how those of us who are vaccinated were kind of hypnotised in some way, by world leaders, and are going around now in a deluded fug. I'm not making this up.
The reason it’s worth mentioning is that it plays to another suspicion I have about why we hitch ourselves to certain wagons. Because they make us feel superior. More insightful, or brave, or free, or whatever, than the people on the other side of the fence. I suspect that we all do this. Those of us on the 'pro scientific consensus' side and those on the more sceptical side. The phrase 'mass formation psychosis' is, I sense, an ego stroke for Malone et al's fans. They can comfort themselves that they escaped the psychosis, thanks to superior intellectual chops, while looking down on the rest of us who fell for it. It evidently doesn't occur to them that people on the consensus side can also see them as the glassy-eyed victims of a psychological scam.
It's worth noting in passing, too, that both sides think it's largely about intelligence.
I'm doubtful that intelligence even comes into it. In fact, the role of intelligence in working out what's real is overrated and a deeply conservative preoccupation. But that argument is for another time. There’s a lot to say about the role of story (pointlessly now referred to as 'narrative', presumably to make the people saying it sound more intelligent).
There's also more to say about related issues, like intellectual sovereignty (the right of people to believe what they want), growing information deluge and even moral considerations around Covid denialism, but they'll wait.
The regular bits
Quote of the Week, from what I've enjoyed reading lately.
Matt Taibbi on Joe Biden
"Biden looks bad. During the campaign, when he was challenging strangers to pushup contests and doing sternum-pokes in crowds while nervous aides bit their lips, you could make the argument he was merely in steep mental decline, which was okay. Against Trump the standard of “technically alive” worked for a lot of voters. Biden now looks like a man deep into the peeing-on-houseplants stage, and every appearance is an adventure."
Taibbi is one of my go-to voices on the left (as opposed to university campus style leftish) and I recommend his newsletter TK News. Sometimes he annoys me a bit. I welcome that, as I continue to crawl out from under the weight of bodies eager to confirm my biases.
This Week's Bit of Joy
Go on, it's only just over a minute and it's adorable.
I could have just dropped the meme right there for the benefit of those who don't know it. But I'd rather you visit the creator's page, so here it is http://gunshowcomic.com/648 and here's a good article about how it became one of the best loved memes https://www.theverge.com/2016/5/5/11592622/this-is-fine-meme-comic
Thi was actually responsible in no small part for my decision to abandon a podcast I used to do called 'The Disinformation Age'. After this episode with Thi - https://www.mikehind.co.uk/podcast/2019/4/30/game-on-quantified-motivation-amp-value-capture-all-the-things-we-dont-notice-in-our-online-social-lives - I realised that shouting on the internet was a bigger problem for civil parsing of information than fake news.
Here's Vervaeke discussing relevance realisation with a guy called Jim Rutt. Don't be put off by Rutt's 'down home' style of chatter. He's a smart cookie too and a good faith seeker of insight https://www.jimruttshow.com/john-vervaeke-3/ For the audio-averse there's a transcript you can read instead, on the same link.
‘Antithesis’ here is used as a more respectful alternative to ‘sceptic’. I picked this up from Rebel Wisdom (see their film on the pandemic as a religious war, mentioned last week - or find it on YouTube). They talk about ‘thesis’ and ‘antithesis’ because those terms are less loaded.
If you want to see a mostly scientific critique of Malone, this one is quite detailed. Interrogating myself about whether I trust Dr Dan Wilson (aka Debunk The Funk) I come out as fairly neutral. He seems plausible and is extremely likeable. He's very 'old school' fact-checky, though, which I'm no longer convinced is a format that really achieves much, apart from rallying your own side a bit. Wilson wins over Malone, for me, because Malone is so tarnished.
There are other aspects of Malone's behaviour that don't sit right with me, some of which are detailed in this article. If I had been more on the fence about Malone I'd have done more due diligence on an obvious hatchet job, but at this point I've seen enough of him to be as settled as I ever am on a view that he's extremely suss. So I can't be bothered cutting him any slack. I'll assume that he's also sometimes unfairly attacked, but retain the view that, on the weight of evidence, he is not a good faith operator. A bit like Boris Johnson. https://www.washingtonpost.com/health/2022/01/24/robert-malone-vaccine-misinformation-rogan-mandates/
I suffer from an honesty affliction: I find it near-impossible to pretend I like something when I don't. Early on I discovered a weird phenomenon: when presented with two samples and asked which I preferred, I would often reply honestly, "I have no opinion." This confused a lot of people and sometimes made them angry. They would demand, "What do you mean, you have no opinion?" as if I were denying I existed. Not having an opinion didn't seem to be an option.