The 2013 National Assessment of Educational Progress Scores for math and reading are in, and the results for Houston are definitely mixed.
Locally, the conventional wisdom is that Houston continues its steady improvement in math, but lags behind in reading. HISD Superintendent Terry Grier is working with administrators and (presumably) teachers to retool reading instruction again to improve test scores and (hopefully) literacy in the district.
In general, when test scores get released, there is much rending of garments, wringing of hands and spilling of ink on the subject – ranging from thoughtful to useless to actively harmful. That’s true for NAEP as well. The Washington Post’s education blogger Valerie Strauss bluntly headlined her Nov. 12 post on the initial release of nationwide and state-level results “All that Bad Information about the New NAEP Results” before demolishing self-serving press releases from several groups trying to scaremonger or overinflate progress.
The Scoop on NAEP
So where does Houston actually stand? And what does it mean? For that matter – what the heck is NAEP anyway?
NAEP is a project managed by the National Assessment Governing Board, a 26-member independent and bipartisan board appointed by the U.S. Secretary of Education.
One of the legacies of local and state control of education in the United States is that each state has developed its own evaluation system. Even after the No Child Left Behind Act provided firmer guidance for developing state standards, it is highly misleading to compare assessment results across states – or to draw conclusions about how American students are performing overall – because the exams differ from state to state.
NAEP overcomes that issue by administering the test every two years to a nationally representative sample of at least 10,000 students in a variety of subjects, including reading and math at the fourth and eighth grade levels. The test is also administered to a parallel sample of representative student populations from each state to allow state-to-state comparisons. The Department of Education then compiles the results in “The Nation’s Report Card,” which is available to the public here.
The NAEP results for Houston are actually part of a separate trial program that tracks scores in the nation's largest urban school districts, using the same test but drawing samples from each district large enough to produce comparable results. This ongoing program, the Trial Urban District Assessment, or TUDA, has allowed comparisons among large urban districts and against statewide and national scores since 2002.
The Department of Education reports NAEP scores on a 0-500 scale standardized within each subject area (meaning it is possible to compare reading scores across jurisdictions and over time, but not reading scores to math scores). For each subject area and grade level, scores are divided into four regions: below basic, basic, proficient and advanced mastery. For example, scoring at or above 208 on the 2013 fourth-grade reading assessment earns a tag of "basic" mastery, while a score of 238 or above is considered "proficient" and one of 268 or above "advanced." For the grade-level rubrics of what skills test-takers need to demonstrate to reach each level, see here.
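To make those cut scores concrete, here is a minimal sketch in Python that sorts a 2013 fourth-grade reading score into the four regions described above. The cut scores (208, 238, 268) are the ones cited in this article; the function name and structure are purely illustrative, not part of any NAEP reporting tool.

```python
# Minimal sketch: classify a 2013 fourth-grade reading score (0-500 scale)
# into the four NAEP regions, using the cut scores cited in the article.

def grade4_reading_level(score):
    """Return the achievement region for a 2013 grade-4 reading score."""
    if score >= 268:
        return "advanced"
    if score >= 238:
        return "proficient"
    if score >= 208:
        return "basic"
    return "below basic"

print(grade4_reading_level(240))  # -> "proficient"
print(grade4_reading_level(208))  # -> "basic" (right at the cut-off)
```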
Several caveats apply when placing NAEP scores in context. First, as former Assistant Secretary of Education and noted test critic Diane Ravitch of New York University points out, "proficient" is supposed to represent a fairly high-achieving score, not a "passing" grade, so interpret scores accordingly. Second, since the test is given only to a sample of students in each state or district, not the whole student population, there is a margin of error around the scores. Finally, NAEP is merely one tool in the assessment toolbox. As a single standardized measure it performs well and avoids the problems of high-stakes testing, because it doesn't affect a student's advancement, district funding or teacher pay. However, its reliance on small samples opens the possibility that states or districts could slant which students take the test, and it is no substitute for the fuller picture of a school district that multiple measures can provide.
Where is Houston?
The number that jumps out in Houston's scores (reported through TUDA – most scores are only reported at the statewide or national levels) is an apparent decline in fourth-grade reading, from an average composite score of 213 in 2011 to 208 in 2013. In context, however, the blip is quite small. Houston schools averaged 206 in 2002 and have fluctuated between 206 and 213 in the years since – showing no significant change. Nor has Houston's position decayed relative to Texas' statewide scores, which have ranged only between 217 and 220 over the same period. The gap between state and city stands at a fairly modest nine points.
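For a rough sense of why a five-point dip can disappear into sampling noise, here is a minimal sketch in Python. The standard errors used here are illustrative assumptions, not published NAEP values, and the function name is my own; the point is only that a few points of movement can fall inside the margin of error described above.

```python
import math

def change_is_significant(old_avg, new_avg, old_se, new_se, z=1.96):
    """Rough check: does a change in average scores exceed a ~95% margin of error?"""
    combined_se = math.sqrt(old_se ** 2 + new_se ** 2)
    return abs(new_avg - old_avg) > z * combined_se

# Houston's fourth-grade reading average: 213 in 2011, 208 in 2013.
# Assuming, purely for illustration, a standard error of about 2 points per estimate:
print(change_is_significant(213, 208, 2.0, 2.0))  # -> False: within the assumed margin of error
```

With smaller assumed standard errors the same dip would register as significant, which is why the official reports, not back-of-the-envelope arithmetic, are the place to check which year-to-year changes are statistically meaningful.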
True, the averages for both Houston and Texas sit at the lower end of the "basic" mastery band, whose cut-off is 208 and which is the second of four tiers (below basic, basic, proficient and advanced). But there is nothing in the numbers to suggest Houston is in a more worrisome place now than it was a decade ago.
Similar trends show up at the eighth-grade level. Houston's average reading score went from 248 in 2002 to 252 today – a statistically discernible though modest increase – while Texas' moved from 262 to 264 over the 11-year period, a change that is not statistically discernible. Both sets of scores have varied within only a six-point range across the decade, and both sit at the basic level of mastery.
In math, neither Houston nor Texas showed any real change between 2011 and 2013 at the fourth-grade level: Houston's average slipped from 237 to 236, while Texas' ticked up from 241 to 242. Both did increase significantly from 2002 to 2013, with Houston registering a nine-point gain and Texas a five-point gain. Both averages sit comfortably above the basic mastery cut-off of 214 and fairly close to the "proficient" bar of 249.
A similar story – medium-sized long-term gains paired with little short-term movement – appears in the eighth-grade math results. From 2011 to 2013, Houston's average score inched up from 279 to 280, while Texas' declined from 290 to 288. Since 2002, Houston's scores have risen 16 points, while Texas' have risen 11, from 277 to 288. The 2013 averages place both the city and the state in the "basic" mastery range, which runs from 262 to 299.
The somewhat boring truth, in a nutshell, is that Houston is pretty much where it has always been – somewhat below the national and Texas state averages, but improving at a slightly faster rate than its counterparts in math and at a similar rate in reading. That's not to say there's nothing wrong with HISD – there are certainly serious problems in the district related to student achievement. However, it's important to keep those issues in context.