F#!$ epidemiology, without a condom
At this point, I am terrifyingly willing to believe this. I've been slowly making my way through Good Calories, Bad Calories, and although I am still impressed with epidemiology's ability to work hard conducting lots of carefully-designed studies into everything, I'm simultaneously horrified by the total haphazardness of the process by which the results of those studies get aggregated into conclusions and conventional wisdoms.
And Gilbert's evidence is good. I'll ignore the paper on abstinence-only since everyone agrees that's a bad idea, but he gives a meta-analysis of 22 randomized controlled trials from 2002 that shows pretty clearly no effect even among the comprehensive (i.e. not abstinence-only) programs.
But I disagree with him when he says that "the evidence is pretty clear". This is epidemiology! The evidence is never clear!
First, opponents of this study argue (rightly) that it can't be expected to capture the whole picture, since (for example) most of its studies were comparisons of "normal" sex education with some special intensive program, and only showed that the special intensive program wasn't any better than normal.
Second, a whole host of correlational studies have shown that comprehensive sex education does make a difference, sometimes even cutting teen pregnancies in half. For example, a study from the National Survey of Family Growth finds that after adjusting for age, social status, race, family intactness, etc, people who have had comprehensive (but not abstinence-only) sex-ed are much less likely to become pregnant. Correlational studies are frequently considered suspicious, especially ones which gain more significance after adjusting for confounders, but perhaps given the weaknesses of the study above we should take them at face value?
Third, there are at least three meta-analyses of experimental trials which give results somewhat unlike Gilbert's. Each is flawed in its own way, but each is instructive.
An analysis by the Preventive Services Task Force goes through 89 experimental studies and finds that comprehensive sex education is effective (sexual activity odds ratio 0.84, pregnancy odds ratio 0.88, STD odds ratio 0.65) but abstinence-only sex education is not. This is marred by a simultaneously published critique noting unacceptable levels of heterogeneity in the study results and explaining them by reanalyzing the data to show that community-based sex ed is pretty effective but school-based sex ed isn't. Since school-based sex ed is probably what most people are interested in, this isn't encouraging.
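For readers who want a sense of what those odds ratios actually mean in absolute terms, here's a minimal sketch. The odds ratios (0.84, 0.88, 0.65) come from the review above, but the 10% baseline rate is purely my own illustrative assumption, not a figure from the study:

```python
# Convert an odds ratio into an absolute risk change, given an
# ASSUMED baseline risk (the 10% below is illustrative, not from the review).

def risk_from_odds_ratio(baseline_risk: float, odds_ratio: float) -> float:
    """Apply an odds ratio to a baseline risk and return the new risk."""
    baseline_odds = baseline_risk / (1 - baseline_risk)
    new_odds = baseline_odds * odds_ratio
    return new_odds / (1 + new_odds)

# Hypothetical 10% baseline rate for each outcome, purely for illustration:
for label, or_value in [("sexual activity", 0.84), ("pregnancy", 0.88), ("STD", 0.65)]:
    new_risk = risk_from_odds_ratio(0.10, or_value)
    print(f"{label}: 10.0% -> {100 * new_risk:.1f}%")
```

The point of the exercise: at a 10% baseline, an odds ratio of 0.88 is only about a one-percentage-point drop, while 0.65 is a drop of roughly a third, which is why the STD number looks like the interesting one.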
Johnson et al looks at interventions to prevent HIV in African-American youth. It finds that the interventions produce greater long-term condom use and fewer STDs. But these come mostly from HIV education programs rather than safe-sex programs per se, and the external validity (whether they would generalize to white people or other populations) is not assured.
The last big review, and the most impressive, is Douglas Kirby's 200-page report for the National Campaign to Prevent Teen and Unplanned Pregnancy. It looked at a whopping 115 studies and found that about two-thirds of comprehensive sex ed programs had positive effects. But there were enough different categories of effect (fewer sexual partners, later loss of virginity, less pregnancy, fewer STDs) that it's hard to take this too seriously. It's too easy to find a spurious effect in one subgroup ("Half-Asian, half-Hispanic 16 year old women have a slightly lower risk of cholera after taking our program! The system works!"). More rigorously, about a third of the programs had multiple positive effects, but not all of these lasted very long.
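The subgroup problem is easy to demonstrate with a toy simulation. This is my own illustrative sketch, not anything from Kirby's report: if a program does literally nothing, each subgroup-by-outcome test still has a 5% chance of coming up "significant", so checking enough combinations nearly guarantees a spurious win somewhere:

```python
# Toy simulation: probability of at least one false-positive "effect"
# when a null program is tested across many subgroup/outcome combinations.
# All numbers are illustrative.
import random

random.seed(0)

def spurious_hit_probability(n_tests: int, alpha: float = 0.05,
                             trials: int = 20000) -> float:
    """Fraction of simulated null studies with >= 1 'significant' result."""
    hits = 0
    for _ in range(trials):
        # Under the null hypothesis, p-values are uniform on [0, 1].
        if any(random.random() < alpha for _ in range(n_tests)):
            hits += 1
    return hits / trials

for n in (1, 5, 20):
    prob = spurious_hit_probability(n)
    print(f"{n} subgroup tests -> P(at least one false positive) ~ {prob:.2f}")
```

Analytically the figure is 1 − 0.95ⁿ, so twenty subgroup tests push the false-positive probability to roughly 64% even with zero true effect.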
Most promising, some of these analyses were able to come up with criteria for which programs did or didn't work. Sex ed - even comprehensive non-abstinence-based sex ed - is such a huge category that it would be surprising if there weren't major differences in effectiveness. One study might cover a program in a juvenile shelter offering several hours of class time a week for months on end, taught by experts sent in from Planned Parenthood. Another might cover a teacher getting up in front of a class one time and saying "Look, try not to have sex too much, okay?". At this point there just aren't enough studies to restrict an analysis to high-quality programs, but maybe there's still hope that one that did would be less confusing?
Overall, I disagree with Gilbert that the evidence is clear that sex ed doesn't work. But I also disagree with the rest of the world that the evidence is clear that it does. My best guess would be that the best sex education programs in the best settings have a small effect on pregnancies and a medium effect on STDs. One study ran a cost-benefit analysis and found a dollar spent on a good sex ed program paid back $2.65 in savings, which sounds possible although I don't really trust a single study on anything. On the other hand, I am not sure whether noisy real-world sex education curricula are doing very much good, and I would not consider an attempt to criticize or get rid of them as obviously inconsistent with the science.
Despite this, I kind of see what people like celandine13 and Gary Taubes are complaining about regarding the recommendation process. If I wanted to make an argument for the process by which "sex ed works" became the default position being hopelessly corrupt and confused, it would be very easy to do.
A lot of studies come up with very ambiguous findings when you actually read them, but the conclusion at the bottom and in the abstract is "...but in general, we think most results of our study are consistent with the hypothesis that sex education works". In the larger lists of recommendations, these just get listed as "the CDC, Harvard, and the WHO all did studies proving that sex education works". Or they get lumped in with "thousands of studies support sex education", when those studies can be anything from surveys showing that most parents are in favor of sex education (?!), to studies showing that if you ask kids "do you think this class was helpful?" on a test they will say "yes", to very sketchy correlational trials, to occasionally a good randomized experiment. If the randomized experiments fail but the kids still tell their teachers that they liked the class, then for a sufficiently motivated recommending body that can be spun as "most studies support the intervention". A few big name people come up with all the recommendations, even though it looks like there are multiple independent ones, thus producing a false appearance of consensus. And then a few steps later people show up attacking anyone who isn't totally on board with sex education as "anti-science".
I'm still not sure how broad a problem this is. Most of the more explicitly medical things I've studied, stuff like "does this emergency heart attack treatment work?", have seemed on much firmer ground. And I would hate to have to go total Pyrrhonian skeptic on anything I haven't investigated myself, especially since my own investigations take so long to do so incomplete a job compared to someone who makes a career out of any of these fields.
I'm going to keep reading Good Calories, Bad Calories and just sorta hope he says "Haha, only kidding" at the end.