Epistemic learned helplessness - Jackdaws love my big sphinx of quartz


Epistemic learned helplessness [Jan. 3rd, 2013|01:10 am]

[Epistemic Status | Probably I'm just coming at the bog-standard idea of compartmentalization from a different angle here. I don't know if anyone else has noted how compartmentalization is a good thing before, but I bet they have.]

A friend in business recently complained about his hiring pool, saying that he couldn't find people with the basic skill of believing arguments. That is, if you have a valid argument for something, then you should accept the conclusion. Even if the conclusion is unpopular, or inconvenient, or you don't like it. He told me a good portion of the point of CfAR was to either find or create people who would believe something after it had been proven to them.

And I nodded my head, because it sounded reasonable enough, and it wasn't until a few hours later that I thought about it again and went "Wait, no, that would be the worst idea ever."

I don't think I'm overselling myself too much to expect that I could argue circles around the average high school dropout. Like I mean that on almost any topic, given almost any position, I could totally demolish her and make her look like an idiot. Reduce her to some form of "Look, everything you say fits together and I can't explain why you're wrong, I just know you are!" Or, more plausibly, "Shut up I don't want to talk about this!"

And there are people who can argue circles around me. Not on any topic, maybe, but on topics where they are experts and have spent their whole lives honing their arguments. When I was young I used to read pseudohistory books; Immanuel Velikovsky's Ages in Chaos is a good example of the best this genre has to offer. I read it and it seemed so obviously correct, so perfect, that I could barely bring myself to bother to search out rebuttals.

And then I read the rebuttals, and they were so obviously correct, so devastating, that I couldn't believe I had ever been so dumb as to believe Velikovsky.

And then I read the rebuttals to the rebuttals, and they were so obviously correct that I felt silly for ever doubting.

And so on for several more iterations, until the labyrinth of doubt seemed inescapable. What finally broke me out wasn't so much the lucidity of the consensus view as starting to sample different crackpots. Some were almost as bright and rhetorically gifted as Velikovsky, all presented insurmountable evidence for their theories, and all had mutually exclusive ideas. After all, Noah's Flood couldn't have been a cultural memory both of the fall of Atlantis and of a change in the Earth's orbit, let alone of a lost Ice Age civilization or of megatsunamis from a meteor strike. So given that at least some of those arguments are wrong and all seemed practically proven, I am obviously just gullible in the field of ancient history. Given a total lack of independent intellectual steering power and no desire to spend thirty years building an independent knowledge base of Near Eastern history, I choose to just accept the ideas of the prestigious people with professorships in archaeology rather than the universally reviled crackpots who write books about Venus being a comet.

I guess you could consider this a form of epistemic learned helplessness, where I know any attempt to evaluate the arguments is just going to be a bad idea, so I don't even try. If you have a good argument that the Early Bronze Age worked completely differently from the way mainstream historians believe, I just don't want to hear about it. If you insist on telling me anyway, I will nod, say that your argument makes complete sense, and then totally refuse to change my mind or admit even the slightest possibility that you might be right.

(This is the correct Bayesian action, by the way. If I know that a false argument sounds just as convincing as a true argument, argument convincingness provides no evidence either way, and I should ignore it and stick with my prior.)
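The Bayesian point in that parenthetical can be made concrete with a small sketch (the numbers are made up for illustration): if an argument for a false claim is exactly as likely to sound convincing as an argument for a true one, the likelihood ratio is 1 and hearing a convincing argument leaves the posterior equal to the prior.

```python
def posterior(prior, p_convincing_if_true, p_convincing_if_false):
    """Posterior probability the claim is true, given that its
    argument sounded convincing, by Bayes' rule."""
    numerator = p_convincing_if_true * prior
    denominator = numerator + p_convincing_if_false * (1 - prior)
    return numerator / denominator

# If false arguments sound convincing just as often as true ones,
# the update is a no-op: the posterior equals the prior.
print(posterior(0.1, 0.9, 0.9))  # 0.1

# Only when convincingness actually tracks truth does the
# argument move you away from your prior.
print(posterior(0.1, 0.9, 0.3))  # 0.25
```

In other words, "stick with my prior" falls straight out of the arithmetic whenever the two likelihoods are equal.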

I consider myself lucky in that my epistemic learned helplessness is circumscribed; there are still cases where I will trust the evidence of my own reason. In fact, I trust it in most cases other than very carefully constructed arguments known for their deceptiveness in fields I know little about. But I think the average high school dropout both doesn't and shouldn't. Anyone anywhere - politicians, scammy businessmen, smooth-talking romantic partners - would be able to argue her into anything. And so she takes the obvious and correct defensive maneuver - she will never let anyone convince her of any belief that sounds "weird" (note that, if you grow up in the right circles, beliefs along the lines of astrology not working sound "weird").

This is starting to sound a lot like ideas I've already heard centering around compartmentalization and taking ideas seriously. The only difference between their presentation and mine is that I'm saying that for 99% of people, 99% of the time, this is a terrible idea. Or, at the very least, this should be the last skill you learn, after you've learned every other skill that allows you to know which ideas are or are not correct.

The people I know who are best at taking ideas seriously are those who are smartest and most rational. I think people are working off a model where these co-occur because you need to be very clever to fight your natural and detrimental tendency not to take ideas seriously. I think it's at least possible they co-occur because you have to be really smart in order for taking ideas seriously to be even not-immediately-disastrous. You have to be really smart not to have been talked into enough terrible arguments to develop epistemic learned helplessness.

Even the smartest people I know have a commendable tendency not to take certain ideas seriously. Bostrom's simulation argument, the anthropic doomsday argument, Pascal's Mugging - I've never heard anyone give a coherent argument against any of these, but I've also never met anyone who fully accepts them and lives life according to their implications.

A friend tells me of a guy who once accepted fundamentalist religion because of Pascal's Wager. I will provisionally admit that this person takes ideas seriously. Everyone else loses.

Which isn't to say that some people don't do better than others. Terrorists seem pretty good in this respect. People used to talk about how terrorists must be very poor and uneducated to fall for militant Islam, and then someone did a study and found that they were disproportionately well-off, college educated people (many were engineers). I've heard a few good arguments in this direction before, things like how engineering trains you to have a very black-and-white right-or-wrong view of the world based on a few simple formulae, and this meshes with fundamentalism better than it meshes with subtle liberal religious messages.

But to these I would add that a sufficiently smart engineer has never been burned by arguments above his skill level before, has never had any reason to develop epistemic learned helplessness. If Osama comes up to him with a really good argument for terrorism, he thinks "Oh, there's a good argument for terrorism. I guess I should become a terrorist," as opposed to "Arguments? You can prove anything with arguments. I'll just stay right here and not do something that will get me ostracized and probably killed."

Responsible doctors are at the other end of the spectrum from terrorists in this regard. I once heard someone rail against how doctors totally ignored all the latest and most exciting medical studies. The same person, practically in the same breath, then railed against how 50% to 90% of medical studies are wrong. These two observations are not unrelated. Not only are there so many terrible studies, but pseudomedicine (not the stupid homeopathy type, but the type that links everything to some obscure chemical on an out-of-the-way metabolic pathway) has, for me, proven much like pseudohistory: unless I am an expert in that particular field of medicine (biochemistry has a disproportionate share of these people and is also an area where I'm weak), it's hard not to take its claims seriously, even when they're super-wrong.

I have developed a healthy dose of epistemic learned helplessness, and the medical establishment offers a shiny tempting solution - first, a total unwillingness to trust anything, no matter how plausible it sounds, until it's gone through an endless cycle of studies and meta-analyses, and second, a bunch of Institutes and Collaborations dedicated to filtering through all these studies and analyses and telling you what lessons you should draw from them. Part of the reason Good Calories, Bad Calories was so terrifying is that it made a strong case that this establishment can be very very wrong, and I don't have good standards by which to decide whether to dismiss it as another Velikovsky, or whether to just accept that the establishment is totally untrustworthy and, as doctors sometimes put it, AMYOYO. And if the latter, how much establishment do I have to jettison and how much can be saved? Do I have to actually go through all those papers purporting to prove homeopathy with an open mind?

I am glad that some people never develop epistemic learned helplessness, or develop only a limited amount of it, or only in certain domains. It seems to me that although these people are more likely to become terrorists or Velikovskians or homeopaths, they're also the only people who can figure out if something basic and unquestionable is wrong, and make this possibility well-known enough that normal people start becoming willing to consider it.

But I'm also glad epistemic learned helplessness exists. It seems like a pretty useful social safety valve most of the time.

From: celandine13
2013-01-03 01:19 pm (UTC)
I love this essay.

I love it because it's an articulation of a serious argument that I respect but still end up ultimately opposed to.

I've spent a lot of time considering "What should a person do about weird claims?" The stuff that *sounds* like the ideas of a crackpot, but potentially a crackpot so clever that you can't see a hole in his reasoning -- and, also, potentially not a crackpot at all but an insightful, correct thinker. I used to have roughly the same conclusion as you. And roughly the same problem with a tendency to believe the last thing I read, and along with it, a fear of reading things that might delude me.

But the thing is, I've come to the conclusion that it's not actually that hard to make your own judgments about ideas. I was confused about strong AI for a while. What did I do? I read a bunch of papers and textbooks. I talked to my friends who were AI researchers. I still don't *really* know what's going on because I never really learned mathematical logic, but it's a hell of a lot better than a black box. I know *some* mathematics, and I can tell the difference between a proof and a hand-wavy argument, and I've had independent confirmation of the falseness of the ideas I was skeptical about...I'm pretty sure, sure enough to go on with my life, that my picture of "what's up with AI" is more or less accurate.

I'm learning how to do this with biomedical research papers. I am not a biologist so I have to black-box a lot, but not *everything*. I can tell that claims with five conjunctive hypotheses are less likely than claims with one. I can tell when a study was done with 15 subjects or 15,000. I can certainly evaluate statistical methodology. I can come to estimates of my true beliefs -- not high confidence, but not all that biased, and way better than learned helplessness.
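The point about conjunctive hypotheses is easy to quantify (with illustrative numbers only): even if each of five independent sub-claims is individually quite plausible, their conjunction can be much less likely than a single claim of the same plausibility.

```python
# Probability that a conjunction of independent sub-claims all hold.
# p_each = 0.8 is an illustrative assumption, not a measured figure.
p_each = 0.8

one_hypothesis = p_each         # 0.8
five_hypotheses = p_each ** 5   # ~0.33 -- the conjunction is far weaker

print(one_hypothesis, five_hypotheses)
```

So a claim stitched together from five fairly plausible steps starts out less than half as credible as any one of its parts.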

I don't go to the trouble of doing this with everything. I haven't checked out climate change skeptics, because I don't know fluid dynamics and I'm a little scared of the work involved in learning. But mostly, my heuristic is, "When confronted with a weird claim that would be really interesting if true and isn't immediately obvious as bullshit, it's worth checking Wikipedia and reading one scholarly paper. If I'm still uncertain and still interested, it's worth reading several more scholarly papers and asking experts I know."

A lot of bunk is not that hard to debunk. I looked through an 1880 book of materia medica (herbal medicine) once; most treatments were not just useless but poisonous, and it took 30 seconds of googling to find that out. (Oil of tansy will *fuck you up*, ladies and gentlemen.)

A good all-purpose scientist can more or less trust his/her bullshit-o-meter. You should know where you're least able to evaluate claims explicitly (for me, that's physics, chemistry, and anything to do with war or foreign policy) and use implicit meta-techniques (were their results reproducible? do they make a lot of conjunctive claims? that sort of thing). But often, I can just *go in and check the math.* Tim Ferriss makes arithmetic errors in his books. You don't have to be a fitness expert to catch them.

I'm no longer afraid of being deluded by charlatans. I wouldn't go to a Scientology meeting, because they engage in physical brainwashing, but I can read racists without becoming a racist, read homeopaths without becoming a homeopath, and so on. I've banged my brain against a *lot* of things, and come out more or less clean.

Maybe not everyone can do this (my education certainly helped a lot), but it is *possible*, and I think most people who are comparably educated and bright (e.g. you) can get better at evaluating weird claims themselves and do better than they would with epistemic learned helplessness.
From: mindstalk
2013-01-03 04:05 pm (UTC)
But I know people with science PhDs who sound as self-aware and confident but they think global warming and Keynesianism are hoaxes and that there's some huge cover-up going on regarding Benghazi and Obama's coming for our guns any day now. (This before the election, so before the Sandy mass shooting.)
From: celandine13
2013-01-03 06:06 pm (UTC)
For a certain interpretation of "hoax," that's actually *true* (neither macroeconomic nor climate models are anywhere near as predictive as they claim to be, and I expect them to be buggy and infrequently corrected.) The Obama coming for your guns thing is empirically unlikely, but, y'know, politics is the mind-killer.

Seriously, though, if you're actually saying "smart people turn out to be wrong when they think for themselves, so you're better off not thinking for yourself too much" then...that's just passing the buck. I can understand picking your battles and deciding to reserve judgment on stuff you don't have the resources to investigate yourself; but somebody, somewhere, has always got to do the actual thinking. Bucks can't be passed indefinitely.

And "the best lack all conviction while the worst are full of passionate intensity." I'd like more of the "best" to experiment with having more conviction. At current margins it seems pretty clear that it would help.
From: ciphergoth
2013-01-04 01:27 pm (UTC)
Can't we have passionate intensity without conviction?
From: marycatelli
2013-01-04 12:18 am (UTC)
There obviously was a cover up going on about Benghazi. The White House lied about things that they knew were true.
From: tcpip
2013-01-04 01:26 am (UTC)
But I know people with science PhDs who sound as self-aware and confident but they think global warming and Keynesianism are hoaxes

Yes, but how many climate scientists think that global warming is a hoax, or macroeconomists who think that Keynesianism is?

It's a good question about epistemology, of course. The more complex and specialised a theory, the more we have to direct our attention to very specific disciplinary areas, even if the general principles of validity remain relatively simple.
From: squid314
2013-01-04 09:43 am (UTC)
The worst 99% of bullshit isn't so hard to catch, but just the existence of the best 1% is enough to cast doubt on legitimate contrarians.

I am curious whether you are genuinely better than I am at this or just read less competent crackpots. I'm going to try to find which crackpots I considered competent and see whether summaries of their writings are easily accessible online, and maybe link to some of them.
From: celandine13
2013-01-04 02:22 pm (UTC)
I don't claim to be able to personally debunk all crackpots!

I think *most* bullshit is easily recognizable; and I think the hard cases are worth *somebody* putting the time in to fact-check carefully in case they actually turn out to be correct.

In mathematics, everybody basically expects that people claiming to prove the Riemann Hypothesis are cranks. (With good reason. It's a very hard problem, everybody's heard of it, you get a million dollars and undying fame if you solve it, and there's no systematic misaligned incentives in math to bias mainstream mathematicians against successfully proving it.)

The vast majority of "proofs" are easily identified as bullshit (this is what I've heard from the mathematicians who edit journals and have to wade through them.) It takes literally thirty seconds to dismiss them. However, occasionally you get a "proof" that's wrong in a tricky and subtle way -- on priors you *expect* it to be wrong, but it'll take some time to locate the error. But you know what? Somebody checks it. It's a pain in the ass, and that poor reviewer will bitch and moan about having to do it, but that's what scientific integrity requires.
From: torekp
2013-01-06 06:43 pm (UTC)
Well, but that's you. You're way outside the norm. For that matter, so is Scott. Scott's advice seems meant more for aspiring rationalists than for those qualifying for their Nth degree black belt.