Saturday, November 10, 2007

Look, it says 'sex' in the hed

From several time zones to the west comes a complaint about this AP lede:

Programs that focus exclusively on abstinence have not been shown to affect teenager sexual behavior, although they are eligible for tens of millions of dollars in federal grants, according to a study released by a nonpartisan group that seeks to reduce teen pregnancies.

... on the perfectly reasonable grounds that the conclusion seems a lot more hedged in the second graf:

"At present there does not exist any strong evidence that any abstinence program delays the initiation of sex, hastens the return to abstinence or reduces the number of sexual partners" among teenagers, the study concluded.

... and that a quick trip via intrawebs to the study itself (this is a summary, but you can find the whole thing on the same page) finds pretty good reason for the hedges. In short, "study says" and "AP says" are once again two different things. Which is particularly unfortunate in this case, for a couple of reasons. Aside from the heds, which, unsurprisingly, run along the lines of:
Report: Abstinence programs don't work
Report: Abstinence programs fail

"What do we do about fixing it?" asks the complainant. "Do we demand a retraction? Do we tell our wire editors to stop running this stuff (if they'll listen)? Do we stop hiring J-school grads in favor of anybody who has been taught to critically think (and read)? Or do we just vent at TestyCopyEditors and do our jobs until newspapers die and we have to go live on the streets?"

I'm for a judicious combination of those steps. But first, a little sermonizing. And point one is that you can do quite a bit of good just by swimming upstream to the study itself. You can almost always find an abstract, and surprisingly often the whole thing is there. If it's behind a pay wall, see if you can get around it. Anybody on the staff taking courses part time at the local U and possessed of a library account? What kind of online privileges can you get with a community borrowing pass (which is usually pretty cheap, and which you ought to have anyway)?

Point two is that "scientific method" is not a cafeteria. If you want a plate of "Study finds X" or "Study sees no relationship between A and B," you get the whole damn dinner or you're cheating. The rules that make the study's findings relevant (and they are) also govern how you interpret the things that appear, in the AP's telling, to be nonfindings.

That's important because, point three, just outside the perimeter are a bunch of people who want to see our heads on sticks, our cattle thrown in the wells, our tomato fields salted and our little editorlings sold into bondage. Why are we handing them ammunition in the form of something that's easy to mistake for a deliberate ideological distortion on a social-hot-button story? (I don't think it is -- my default bet is usually on Media Stupid rather than Media Bias -- but I have no way of ruling it out.)

OK, then: What does the study say, and how should it be read? (And a brief cheer for papers like Cleveland and Dallas, which avoided the AP's blunder.) First, it's a review study; it's trying to draw conclusions from a bunch of studies, rather than conducting original research. One thing that means is it's going to make a lot of evaluations of different research designs, samples, statistical tests, and the like that are reflected in phrases like "At present there is no strong evidence" -- which frankly don't sound nearly as interesting as "programs that focus exclusively on abstinence have not been shown to affect teenager sexual behavior."

But the boring, wishy-washy one has the advantage of being true, while the conclusive, authoritative one isn't. The most rigorously tested abstinence programs* don't produce evidence of behavior change. Moving down the scale, things get trickier. One study finds a delay in sex initiation, but it's a quasi-experiment**, there's a lot of mortality in the follow-ups, and the significance is .05 (of course, if you prefer a higher arbitrary benchmark for significance in your social science, that's fine; the trick is to remember that they're both arbitrary). A couple of other quasi-experiments find some behavior changes. And there's a true experimental design that's found a behavior change, but it's only been presented at a conference so far and isn't included in the metastudy.
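To see why the choice of benchmark matters, here's a quick sketch (hypothetical numbers, not the study's actual data) of a two-proportion z-test, the kind of comparison that sits behind a "delay in sex initiation" claim. The point is the one above: a borderline result clears the .05 bar but not a stricter one, and neither cutoff is handed down from on high.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test: is the gap between two rates significant?"""
    # Pool the two samples to estimate the common rate under the null
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 40% of 200 program teens vs. 50% of 200 controls
# report initiating sex during the follow-up period.
z, p = two_proportion_z(0.40, 200, 0.50, 200)
for alpha in (0.05, 0.01):
    verdict = "significant" if p < alpha else "not significant"
    print(f"alpha = {alpha}: {verdict}")
```

With these made-up numbers the p-value lands a hair under .05, so the same result is "significant" at one arbitrary line and "not significant" at the other. That's the wishy-washiness the hedged language is honestly reporting.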

Bottom line? The AP's lede misstates the findings; call it a simple lapse, a routine-induced hedge deletion or whatever, it's wrong. Copy editors should press wire eds, and wire eds should call the AP and push for clarification. Don't stop hiring J-grads; just make clear at the interview or job-ad stage that you value methods and stats classes as much as extra classes in How To Write Good. And keep complaining at TCEs; it's a way of changing attitudes, if indirectly.

I'm tempted to say (and said in class today, on a different topic) that it's better to be boring and right than exciting and wrong, but in this case there's no need to be boring. A pretty good meta-analysis offers some good conclusions about what kinds of sex ed programs actually do affect behavior, and abstinence-only isn't the choice. And it concludes pretty soundly that teenagers aren't driven into frenzies of sexual activity by sex ed classes. On the social-progress-for-my-side scale, I'll take that. Why give the aspirin-between-the-knees crowd the chance to ask you whether you're being deliberately dishonest or just incapable of reading the results?

* At page 113 of the full report, you'll not only find a detailed definition of "abstinence education" but a good caution from the author: "In reality, programs do not fall neatly into one of these two groups. Rather, they exist along a continuum, which makes some of them difficult to classify."
** Usually meaning that, for whatever reason, the researchers can't meet the benchmark of random assignment to conditions. Makes sense; randomly assigning some people to the "smoking" group and others to the "nonsmoking" group, for example, would raise some issues with your IRB.
