On May 4, 2011, the Detroit Regional Workforce Fund (DRWF) published a report titled Addressing Detroit’s Basic Skills Crisis, which featured the statistic that 47% of Detroit’s adult population is functionally illiterate. Predictably, this touched off a media firestorm. Major outlets including CBS, Fox, the Huffington Post, and the Daily Mail promptly covered the release, and commentators such as Matt Yglesias weighed in as well.
Dissecting the origin of this statistic says more about the poor data literacy of some of our news agencies than it does about Detroit’s literacy rates. Many of them referred to the report as a “new study,” missing the important detail that the underlying research is far from new. The 47% Detroit literacy rate comes from a 1998 analysis by the National Institute for Literacy (NIL), performed on data from the National Adult Literacy Survey (NALS), which was published in 1993. That’s right: those “alarming new statistics” rest on data almost two decades old. Almost all of the media coverage neglected to communicate that fact:
• On May 4, CBS Detroit reported “alarming new statistics” indicated that nearly half of Detroiters “can’t read.”
• On May 4, Mlive.com reported on a “new study” from DRWF indicating the 47% functional illiteracy rate.
• On the same day, Outside the Beltway also rehashed the CBS Detroit story.
• The same day, the Daily Mail quoted the “study” showing that nearly half of Detroiters “cannot read.”
• That day, Matt Yglesias at ThinkProgress commented on the report.
• On May 6, Good Magazine reported the figure and referred to DRWF’s work as both a “new report” and a “study.”
• The same day, The Root did the same.
• On May 7, the Huffington Post stated that half of the population remaining in Detroit since 2000 is functionally illiterate.
• On November 3, Bridge Magazine reported the figure, saying that “despite the best efforts of dozens of nonprofit tutoring agencies, the numbers have not been improving,” though there’s no way of knowing whether this is true.
Even in 1998, the NIL report underscored several limitations of its methodology, all of which the media coverage has ignored. At the end of Appendix A, it states that “there is no direct evidence available about the validity of the model’s predictions for the congressional district or city/town/place Census areas,” and the city of Detroit is exactly such an area. The model’s validity was confirmed only for counties, not for other geographic levels.
In addition, the report notes that the confidence interval for the city of Detroit is greater than plus or minus five percentage points, without specifying how much greater. If the interval were exactly ±5 points, the true rate could lie anywhere from 42% to 52%; since the interval is wider by an unspecified amount, the range of plausible values is wider still. In other words, not only do we not know the exact literacy rate, we also don’t know how precise the estimate is!
Literacy is an important issue in Detroit, and taking action on it requires understanding the magnitude of the problem. But acting on outdated and methodologically limited data is hardly better than acting on no data at all. This is particularly true when uncritical reporting of those data dominates the dialogue, as has been the case with literacy in Detroit. To respond effectively to the issue of literacy, we must first honestly evaluate how much, or how little, we know about it; in present-day Detroit, there are significant gaps in our knowledge.