Jackdaws love my big sphinx of quartz

F#!$ epidemiology, without a condom [Dec. 2nd, 2012|11:44 pm]

Gilbert of The Last Conformer claims that sex ed doesn't work: that the constant reminders about how abstinence-only sex ed is useless are a politically-motivated distraction from the fact that it's all useless.

At this point, I am terrifyingly willing to believe this. I've been slowly making my way through Good Calories, Bad Calories, and although I am still impressed with epidemiology's ability to work hard conducting lots of carefully-designed studies into everything, I'm simultaneously horrified by the total haphazardness of the process by which the results of those studies get aggregated into conclusions and conventional wisdoms.

And Gilbert's evidence is good. I'll ignore the paper on abstinence-only, since everyone agrees that's a bad idea, but he gives a 2002 meta-analysis of 22 randomized controlled trials that shows pretty clearly no effect even among the comprehensive (i.e. not abstinence-only) programs.

But I disagree with him when he says that "the evidence is pretty clear". This is epidemiology! The evidence is never clear!

First, opponents of this study argue (rightly) that it can't be expected to capture the whole picture, since (for example) most of its studies were comparisons of "normal" sex education with some special intensive program, and only showed that the special intensive program wasn't any better than normal.

Second, a whole host of correlational studies have shown that comprehensive sex education does make a difference, sometimes even cutting teen pregnancies in half. For example, a study from the National Survey of Family Growth finds that after adjusting for age, social status, race, family intactness, etc, people who have had comprehensive (but not abstinence-only) sex-ed are much less likely to become pregnant. Correlational studies are frequently considered suspicious, especially ones which gain more significance after adjusting for confounders, but perhaps given the weaknesses of the study above we should take them at face value?

Third, there are at least three meta-analyses of experimental trials which give results somewhat unlike Gilbert's. Each is flawed in its own way, but each is instructive.

An analysis by the Preventative Services Task Force goes through 89 experimental studies and finds that comprehensive sex education is effective (sexual activity odds ratio 0.84, pregnancy odds ratio 0.88, STD odds ratio 0.65) but abstinence-only sex education is not. This is marred by a simultaneously published critique noting unacceptable levels of heterogeneity in the study results and explaining them by reanalyzing the data to show that community-based sex ed is pretty effective but school-based sex ed isn't. Since school-based sex ed is probably what most people are interested in, this isn't encouraging.
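For a sense of scale, those odds ratios can be turned into absolute risks, though the answer depends on the baseline rate, which the report's summary numbers don't fix; the 10% baseline below is my own invented illustration, not a figure from the study:

```python
def apply_odds_ratio(baseline_rate, odds_ratio):
    """Return the outcome probability implied by an odds ratio at a given baseline rate."""
    baseline_odds = baseline_rate / (1 - baseline_rate)
    new_odds = baseline_odds * odds_ratio
    return new_odds / (1 + new_odds)

# Assume (hypothetically) a 10% baseline rate for each outcome.
for name, ratio in [("sexual activity", 0.84), ("pregnancy", 0.88), ("STD", 0.65)]:
    print(f"{name}: 10.0% -> {apply_odds_ratio(0.10, ratio) * 100:.1f}%")
```

At my assumed 10% baseline, the three ratios work out to roughly 8.5%, 8.9%, and 6.7%: real but modest absolute reductions.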

Johnson et al looks at interventions to prevent HIV in African-American youth. It finds that the interventions produce greater long-term condom use and fewer STDs. But these come mostly from HIV education programs rather than safe-sex programs per se, and the external validity (whether they would generalize to white people or other populations) is not assured.

The last big review, and the most impressive, is Douglas Kirby's 200-page report for the National Campaign to Prevent Teen and Unplanned Pregnancy. It looked at a whopping 115 studies and found that about two-thirds of comprehensive sex ed programs had positive effects. But there were enough different categories of effect (fewer sexual partners, later loss of virginity, fewer pregnancies, fewer STDs) that it's hard to take this too seriously. It's too easy to find a spurious effect in one subgroup ("Half-Asian, half-Hispanic 16 year old women have a slightly lower risk of cholera after taking our program! The system works!"). More rigorously, about a third of the programs had multiple positive effects, but not all of these lasted very long.
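That subgroup worry is just the standard multiple-comparisons problem. A back-of-envelope calculation (my arithmetic, not anything from Kirby's report) shows why: if a do-nothing program is tested against 20 independent outcome-by-subgroup combinations at the usual 5% significance level, a spurious "positive effect" somewhere is more likely than not.

```python
# Chance that at least one of k independent tests of a null (do-nothing)
# program crosses the 5% significance threshold purely by luck.
def prob_any_false_positive(k, alpha=0.05):
    return 1 - (1 - alpha) ** k

print(f"{prob_any_false_positive(20):.0%}")  # ~64% for 20 tests
```

With 115 studies each reporting several outcomes across several subgroups, the number of implicit tests is far larger than 20, so some glowing subgroup results are essentially guaranteed even if nothing works.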

Most promising, some of these analyses were able to come up with criteria for which programs did or didn't work. Sex ed - even comprehensive non-abstinence-based sex ed - is such a huge category that it would be surprising if there weren't major differences in effectiveness. One studied program might be in a juvenile shelter, offering several hours of class time a week for months on end, taught by experts sent in from Planned Parenthood. Another might be a teacher getting up in front of a class one time and saying "Look, try not to have sex too much, okay?" At this point there just aren't enough studies to restrict an analysis to high-quality programs, but maybe an analysis that did would be less confusing.

Overall, I disagree with Gilbert that the evidence is clear that sex ed doesn't work. But I also disagree with the rest of the world that the evidence is clear that it does. My best guess would be that the best sex education programs in the best settings have a small effect on pregnancies and a medium effect on STDs. One study attempted a cost-benefit analysis and found a dollar spent on a good sex ed program paid back $2.65 in savings, which sounds possible, although I don't really trust a single study on anything. On the other hand, I am not sure whether noisy real-world sex education curricula are doing very much good, and I would not consider an attempt to criticize or get rid of them obviously inconsistent with the science.

Despite this, I kind of see what people like celandine13 and Gary Taubes are complaining about regarding the recommendation process. If I wanted to make an argument for the process by which "sex ed works" became the default position being hopelessly corrupt and confused, it would be very easy to do.

A lot of studies come up with very ambiguous findings when you actually read them, but the conclusion at the bottom and in the abstract is "...but in general, we think most results of our study are consistent with the hypothesis that sex education works". In the larger lists of recommendations, these just get listed as "the CDC, Harvard, and the WHO all did studies proving that sex education works". Or they get lumped in with "thousands of studies support sex education", when those studies can be anything from surveys showing that most parents are in favor of sex education (?!), to studies showing that if you ask kids "do you think this class was helpful?" on a test they will say "yes", to very sketchy correlational trials, to occasionally a good randomized experiment. If the randomized experiments fail but the kids still tell their teachers that they liked the class, then for a sufficiently motivated recommending body that can be spun as "most studies support the intervention". A few big-name people come up with all the recommendations, even though it looks like there are multiple independent ones, thus producing a false appearance of consensus. And then a few steps later people show up attacking anyone who isn't totally on board with sex education as "anti-science".

I'm still not sure how broad a problem this is. Most of the more explicitly medical things I've studied, stuff like "does this emergency heart attack treatment work?" have seemed on much firmer ground. And I would hate to have to go total Pyrrhonian skeptic on anything I haven't investigated myself, especially since my own investigations take so long to do so incomplete a job compared to someone who makes a career out of any of these fields.

I'm going to keep reading Good Calories, Bad Calories and just sorta hope he says "Haha, only kidding" at the end.

[User Picture]From: lastconformer
2012-12-03 06:03 pm (UTC)
I'll have to admit that I skimmed a few pages of the Kirby/National Campaign report before that blog post and then dismissed it purely on incidentals.

If a long report published by an organization with an ax to grind has excellent graphic design clearly done with a DTP program, magazine-style separate text blocks, pictures of happy diverse people, and math-and-science explanations on a level clearly geared to innumerate journalists and politicians, I tend to assume it's just PR looking for a home in the cylindrical file.

I think that's a reasonable heuristic, and we would live in a better world if policymakers followed it, but of course it's not actual proof of anything. It is possible for real information to hide behind that kind of mask, particularly in a world where that is exactly what policymakers want to read.

So given that you think it the most impressive, I grumblingly conclude I probably need to read the whole thing. Maybe I'll be back with an update some time after Christmas...

Also, you obviously know much more about that kind of stuff than I do, so I probably should defer.

OK, so much for eating crow, but I still have nit-picks:

The Johnson et al. meta-analysis isn't about youth reached through sex-education in school, the participants were on average 28 years old, and mostly recruited through communities or clinics. I think that's closer to the social work category I hedged for. That would also be compatible with the minority report on the Preventative Services Task Force study you already mentioned.

And I'm not quite comfortable with ignoring the abstinence-only study, because the normal (mostly comprehensive) program was the control group, so if there is no measurable difference, that points to both being ineffective. Of course the control group also served as a waiting list, so some of the abstinence-only kids could also have gotten some of the comprehensive sex ed, which could dilute an advantage it might have. Still, there is evidence against the effectiveness of either version there.

Finally, the working community-based stuff sounds much more intense, so the "it works on average but not on the margin" story for comprehensive sex ed sounds like a stretch.

But then again basically every statistical study leaves some points open to quibbling by people who dislike it. So my having complaints here doesn't mean that much either.

So for the moment I'll just take your point that it's always more complicated.
[User Picture]From: squid314
2012-12-03 06:40 pm (UTC)
I agree with you about the glossy brochure, and it does worry me that Kirby has done something like ten of these reports for the WHO and the UN and various other advocacy groups. I am trying not to let that worry influence my opinions, just because an honest and brilliant researcher's path would probably also go through getting associated with all the big groups that give money to study these topics, so I don't feel like it's good evidence against him.

When I say his is "most impressive", I mean only that it is very long and comprehensive, that it reviews the most studies, and that it has an entire chapter dedicated to possible biases in meta-analyses and how it's going to avoid them. I certainly didn't do more than skim the actual research before looking at the results. But the fact that the results weren't really all that impressive, and that he acknowledged them without trying to inflate them, strikes me as a point in his favor.

I do think some evidence supports the "more intense" hypothesis, but then that also seems like exactly the thing the meta-analysis you cited should be demolishing, since it mostly compared normal school programs to more intense programs.

Overall I'm sticking with "confused", but I'm eager to see what you find in a more comprehensive review.