When Standardized Testing, and Its Reporting, Are Not Standardized

In an earlier blog post, I wrote about the (generally unknown) fact that different DC public middle schools administered different PARCC math tests to their students last year, without any accurate or obvious public acknowledgment of it.

The lack of sunshine is extensive: the test scores reported on all four publicly available websites for comparing DC public schools (My School DC, run by the lottery to enable comparisons for school choice; the school equity reports; Learn DC, run by OSSE, the office of the state superintendent of education; and the DCPS school profiles) show only a single combined math score for each DC public middle school, folding the results of those different tests into one number.

Here, I explore further the ramifications of this misleading use of public data.

Take my middle school, DCPS’s Stuart-Hobson, as an example.

Here is the percentage of students scoring at each performance level on the SY14-15 PARCC math test, as reported for Stuart-Hobson on the My School DC, school equity reports, Learn DC, and DCPS websites:

Level 1: 24%

Level 2: 41%

Level 3: 25%

Level 4: 9%

Level 5: 1%

This information, promulgated widely on those four websites by the city agencies in charge of education, leads one to conclude that only 10% of students at Stuart-Hobson "met" or "exceeded" expectations (levels 4 and 5) in math, and that more than half of the school's students scored at the lowest levels of PARCC's math test last year.

But as OSSE confirmed last week, these numbers are not Stuart-Hobson's exact results for a single test, but rather an aggregate of scores from different tests taken by students at the school.

That is, OSSE created the numbers reported on those websites by combining the scores of students who took different PARCC math tests at Stuart-Hobson, then derived a single figure from those scores, as if the different tests were equivalent.

Not only does this omit important information (different tests were taken, and some involved more advanced math and, presumably, were more difficult), but it also obscures how Stuart-Hobson students actually performed on those advanced tests.

For instance, 15% of the students at Stuart-Hobson (grades 6 through 8) took advanced PARCC math tests in SY14-15. Of the Stuart-Hobson 8th graders who took the advanced algebra test (57 students, or about 44% of all 8th graders taking PARCC math tests at the school that year), 24% scored proficient (level 4 or above).

In other words, many Stuart-Hobson students who took the more advanced PARCC math tests last year did well on them.

But you would never know it from the aggregated 10% proficiency rate reported above, which blithely combines scores from different tests and completely obscures how actual students are performing.
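To see how this masking works arithmetically, here is a quick sketch. The algebra figures (57 test takers, 24% proficient) are from OSSE's reporting as cited above; the count and rate for the rest of the school are hypothetical stand-ins, since the per-test breakdown for the remaining students is not published. The point is only that a weighted aggregate can bury a strong result on a harder test:

```python
# Illustrative only. The algebra row reflects figures cited above; the
# "grade-level math" row is a HYPOTHETICAL remainder chosen so the
# weighted total lands near the ~10% aggregate the websites report.

groups = {
    # test name: (students tested, proficient rate at levels 4-5)
    "Algebra 1 (advanced)": (57, 0.24),    # reported figures
    "grade-level math":     (523, 0.085),  # hypothetical remainder
}

total_students = sum(n for n, _ in groups.values())
total_proficient = sum(n * rate for n, rate in groups.values())
aggregate = total_proficient / total_students

for name, (n, rate) in groups.items():
    print(f"{name}: {n} students, {rate:.0%} proficient")
print(f"Aggregate: {aggregate:.1%} proficient")  # near the 10% figure reported
```

The 24% proficiency of the algebra takers simply disappears into the school-wide number, because the (larger) grade-level group dominates the weighted average.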

Sadly, this poor handling and reporting of data is not just a problem for Stuart-Hobson: it is a problem at every DC public middle school that administered different PARCC math tests to different sets of its students, because each middle school’s PARCC math test results were similarly reported (and similarly obscured).

For the record, that is a total of 21 schools:

Brightwood EC

Cardozo EC

Columbia Heights EC

Deal MS

Eliot-Hine MS

Friendship PCS Technology Preparatory Academy

Hardy MS

Jefferson Middle School Academy

Johnson MS

Kelly Miller MS

McKinley MS

Oyster Adams

Raymond EC

School Without Walls at Francis Stevens

Sousa MS

Stuart-Hobson MS

Takoma EC

Truesdell EC

Two Rivers PCS

West EC

Whittier EC

Although this list does not include all DC public schools offering middle school grades, DCPS schools predominate. That is, of the 21 schools listed that offered advanced PARCC math tests in SY14-15, only two are charter schools, and one of those is a single campus of one charter network, Friendship.

Moreover, of the 28 DCPS schools that have an 8th grade, 19 of them (68%) used advanced tests for all or some of their 8th graders. Of the 37 DC public charter schools offering 8th grade, only Two Rivers and this one campus of Friendship (5% of all DC public charter middle schools) used these advanced tests for some or all of their 8th graders.

This means that for 68% of DCPS middle schools, the PARCC math scores reported on our city's readily available websites purposely designed for comparing schools (My School DC, Learn DC, the equity reports, and DCPS's own website) cannot be compared in ANY manner to the PARCC math scores of most DC public charter schools, or to those of the remaining DCPS middle schools.

At DC public middle schools in SY 14-15, the following PARCC math tests were administered:

6th grade math

7th grade math

8th grade math

Algebra 1 (advanced)

Geometry (advanced)

Some schools gave only the basic tests (the first three above); others gave a combination of the two; and a few gave only the advanced tests (last two).

Attached here is a chart showing which middle schools gave advanced PARCC math tests last year; how many students took those tests; and the percentage of the total students at each who took those tests.

OSSE reports these numbers directly, but they are not available at all on those school comparison websites. To access the data that inform this chart (created specifically for this blog), and to understand that not every middle school administers the same PARCC math test, you need to go to OSSE's main data reporting site and back out the information yourself.

For each school.

Did I say that this chart took hours to create?

How many parents will do that during lottery season?

How many parents will do it right now?

As that chart makes clear, however, the percentage of 8th graders taking advanced PARCC math tests varies wildly across the schools that offered them, from a low of 2% to a high of 100%. Drawing conclusions from these data is thus difficult, if not impossible: who are these students? Were all of them actually enrolled in advanced courses, or only a subset of the test takers?

As reported before on this blog, OSSE says its guidelines recommend that each school offer tests according to its own math curriculum.

For charter schools, that is a straightforward proposition: each charter school is its own decision maker.

But it is not clear (nor did OSSE know when I asked) what guidance DCPS gives its schools about which PARCC math tests are administered at each middle school, and which students take them.

Such untrackable testing methodology has ramifications. As reported before, two demographically comparable DCPS middle schools, Hardy and Stuart-Hobson, had different numbers of students taking the advanced PARCC math tests last year. Not surprisingly, the school with the larger number of advanced test takers had lower scores than the other.

This suggests that being selective about who takes the more advanced PARCC math tests is a good strategy for a middle school that wants to maximize its scores.

The problem, of course, is that this differential in math tests administered within middle school grades is not publicly acknowledged anywhere: not at OSSE, not at DCPS, not at the schools themselves. So even if one had the time to track that decision making at each school, the answers are not available.

Needless to say, the differences in which tests are given, who takes them, and how they score go directly to how parents perceive the quality of a school, how its teachers are evaluated, and how its performance is judged. After all, our public schools live and die by their test scores.

There is a bit of other nuance here as well:

DCPS education campuses, which run from elementary grades through middle school, have campus-wide PARCC reporting on those four widely available websites. That reporting combines elementary school math scores with middle school math scores. This is not merely confusing; it is downright misleading, since it is yet more combining of scores from completely different tests (in this case, taken by distinct populations of test takers: elementary-age and middle-school-age students).

Moreover, not every student is represented in the more detailed reporting available through OSSE, on which the chart here is based.

For instance, footnote #2 of the middle school table reported by OSSE says the following:

“Aggregated results also include ‘Full Academic Year’ rules, meaning that students attributed to a school’s results for accountability must be enrolled in the school on the date of the enrollment audit and on March 30; students who do not meet this rule were not applied to school aggregations.”

This means not only that an unknown percentage of test takers at each school goes unreported anywhere, but also that some students who take advanced math tests at the middle school level are omitted from the reporting entirely.

These problems with DC's PARCC reporting occur at the high school level as well, but there a different problem exists: data from students taking more advanced tests are simply omitted from the widely available, publicly reported data on those school comparison websites, unless the entire school uses the advanced tests.

Say what you will about the prior incarnation of DC standardized testing, but at least all students at each grade took the same test.

If the purpose of PARCC is to show basic levels of accomplishment and what is needed to go beyond them, then giving different tests within one grade is not merely misleading; it demands a care and attention in reporting that we in DC are clearly not getting.

Either we use these tests to show what is needed, or we use them to show how well our students are doing in advanced courses.

Barring huge changes in reporting, you cannot have it both ways.

And given how reliant we are on test scores to draw all sorts of conclusions about our public education system, such poor handling of data means that schools, teachers, and students WILL be judged wrongly.

Is that something we want?

Not likely. So how about it, OSSE? How about it, Mayor Bowser? How about it, deputy mayor for education Jennifer Niles? How about it, state board of education?

