August 20, 2010

A Lesson in Methodological Ignorance


Last week, a piece appeared in The Dartmouth by columnist Emily Johnson discussing the role of Teach for America, which places graduates of top universities in classrooms to teach underprivileged students. Her column is a prime example of why Dartmouth's social science departments require majors to complete a research methods and design course. Some exposure to basic statistics could have saved Ms. Johnson from committing a pretty big journalistic and intellectual faux pas.

Emily Johnson follows many reputable newspaper columnists and reporters into the trap of presenting questionable research as conclusive. She cites a recent policy brief on the relative effectiveness of Teach for America teachers from the Great Lakes Center, a group that represents teachers’ associations and unions. Ms. Johnson questions the return on investment that society receives from TFA corps members, using the report’s conclusions as her evidence. But she never judges the data critically, instead accepting them in toto.

While there are questions of journalistic integrity here – what role should journalists play in reporting on quantitative research? – I will leave that conversation for a later date (stay tuned!). What is more pressing now is a critical review of Ms. Johnson’s conclusions about TFA. Here's a hint: they aren't worth the paper they're printed on.

According to Ms. Johnson, TFA is a misguided investment (despite the fact that the federal government underwrites only one-third of TFA’s budget) because “inexperienced TFA teachers simply do not compare to the experienced teachers they are replacing.”

She makes her point clearly enough – but fails to identify the specific data she uses to draw this conclusion. The report by the Great Lakes Center – as noted above, a group representing teachers' unions, which are notoriously hostile to the TFA program – pulls together several contradictory studies and is forced, albeit grudgingly, to conclude that TFA studies yield “decidedly mixed effects,” and that discrepancies between studies “hinge on abstruse matters of statistical methods.”

Unfazed, Ms. Johnson follows the authors of the policy brief still further, echoing their concern about teacher retention beyond TFA’s two-year commitment. True, only half of TFA corps members continue teaching after completing their contractual obligation, and only twenty percent remain in the classroom after five years. In part, though, this is the nature of service-based programs. One in three members of the Peace Corps quits before the end of their two-year commitment. Surely that does not diminish the value of the Peace Corps.

Even thinking of the rate of attrition as a simple statistic causes problems. In a sense, Ms. Johnson and the authors whom she cites attempt to quantify the unquantifiable. TFA alumni who leave the program after two years may never set foot in a classroom again, but all of them come away with a rich understanding of the challenges facing the modern American education system. Though it is nigh impossible to measure the utility of having a new class of professionals, philanthropists, and policymakers who better understand the plight of the American student, even a cursory qualitative look suggests that it is profound.

Consider the two TFA alumni who founded the Knowledge Is Power Program (KIPP), one of the nation’s most successful networks of charter schools. KIPP’s rubrics are now used by TFA and non-TFA teachers alike. Or consider the twenty percent of Tulsa’s 2009 corps who have already committed to extending their teaching obligation while attending a master’s program that will train them to be educational leaders and administrators.

Many patents approved by the federal government never make it off the graph paper they are sketched on. But some become hydrogen fuel cells or cancer-fighting drugs. These societal benefits would never be realized if the government did not invest in protecting intellectual property. The same can be said for TFA. Will a majority of corps members go on to be teachers of the year? Probably not. But many will have an inenarrable impact on education practice and policy in America.

Ms. Johnson does not address any of that, and neither does the policy brief she cites, which marks their conclusions as suspect from the start. Attempts to quantify the qualitative, combined with a loose grasp of statistical methods and inconclusive data from biased sources, do not an indictment make. Teach for America is not a panacea, but that is a foolish standard by which to judge it. TFA is, and will remain, an integral part of America’s strategy to improve teaching and education policy for the disadvantaged and underserved.

The Dartmouth and its columnists should be more careful when handling statistical research. As I wrote in a letter that The Dartmouth did not see fit to publish:
If you do not want to spend time analyzing research methodology, you should not cite the research. Journalists ought not to use shoddy statistics as a political bludgeon.

For the curious reader, here's the letter in its entirety:

To the Editor:

In her recent column, Emily Johnson commits the all-too-frequent journalistic sin of not scrutinizing the statistics she tries to cite (“Teach for Prestige,” August 10). The report she references, which suggests that Teach for America teachers are less effective than the experienced instructors they replace, is not as clear-cut as Johnson makes it seem.

Primarily, the report brings no new research to bear, instead cherry-picking studies that condemn TFA teachers while dismissing data that show the program’s strengths. The report, published by the Great Lakes Center – a group representing teachers’ unions, which are notoriously opposed to TFA – lacks academic rigor and does not deserve to have its findings replicated in newspapers of note.

That said, had Ms. Johnson read past the executive summary, she would have noted that even the report’s authors grudgingly admit that TFA studies yield “decidedly mixed effects,” and that discrepancies between studies “hinge on abstruse matters of statistical methods.” Hardly as conclusive as Johnson makes it seem.

Further, a much more robust study by the reputable Urban Institute finds “that TFA teachers are more effective than experienced secondary school teachers,” particularly in math and science classes. Many other reports back that assertion.

If you do not want to spend time analyzing research methodology, you should not cite the research. Journalists ought not to use shoddy statistics as a political bludgeon.

Brice D. L. Acree ’09

Dartmouth students should expect more from "the oldest college newspaper in the country."

