Follow the Grades

I am sitting here looking at a mathematical formula buried in the middle of a dry bureaucratic memo, and I really don't know if I should react with tears, horror, or wrath. Every time I re-read this and think about what it really means, the first thing I feel is a green surge of nausea.

For the last several days, I have been trading e-mail with Dallas Independent School District teachers who have always suspected there was a secret formula for inflating grades on the so-called ACP exams--the tests the district gives middle- and high-school students at the end of every course to see if they have learned the material.

A teacher told me: "When I started, my students got a bunch of 85s and 90s on their ACPs, and I was thinking, man, I did great. But the older teachers laughed at me and said, 'They curve the grades on that thing. Everybody knows that.'"

She tried to find out in subsequent faculty meetings if the ACP grades really were curved, but she never got a straight answer.

The answer is yes. I have an internal DISD memo that proves it. There is a formula. The formula puffs up scores by almost a third, so that a 55 percent grade on an ACP exam becomes a 70 percent grade on a student's report card and permanent record.

District officials told me the formula is not a secret. Their proof: I managed to get it. But the copy I got was leaked to me, and then I had to threaten a lawsuit to get an official copy from DISD. And Robert Mendro, whose office devises the formula and sent out the memo, conceded that teachers may not have been informed of it because of "a lack of communication."

Yeah. Funny how that happens.

What the memo describes is a kind of double-bookkeeping for district-wide end-of-course examinations, or ACP exams (assessment of course performance). These tests are put together by the Office of Test Development to see if students have learned the minimum requirements for each course.

Normally at DISD, a score of 70 percent is passing, and everything under that is a fail. But in the case of the ACP, it's different. Very different.

For the ACP, DISD suspends its normal rules and decrees that 55 percent is a passing grade for most courses. In fact, the passing grade goes even lower in the tougher courses. In pre-advanced placement chemistry, for example, a score of 40 percent is deemed to be passing.

Now, some of these courses have more than 100 separate things the state says have to be taught in the course. So in order to swallow this grade-inflation formula, you have to buy the idea in the first place that a kid can score as low as 40 percent on a test that's testing him on as many as 100 things, and he still passes. He is still judged to have mastered the subject.

It's a little hard to figure, isn't it? Did they ask all the questions three different ways, and the kid with the 40 percent grade got them all right once but wrong twice? Did he score great on all the pre-advanced placement chemistry questions on the test but miss all of the fake questions they slipped in about French existentialism?

It's hard to figure, isn't it?

But before you even try to figure it, let me give you one more very important piece of information from my little memo. According to the memo, all teachers whose classes earn an average grade of less than 70 percent get little black marks next to their names to show that their students have not learned the stuff.

Let me make a suggestion: If all the teachers whose students score lower than 70 percent get a little black mark next to their names, doesn't that make you think that 70 percent is the real pass mark?

Add it up. You're a teacher. If your kids score under 70, you get a "double-asterisk" in your personnel file. In other words, there is a bad consequence for you on your record for all the kids who score under 70 percent. And 70 percent is the pass mark for all other DISD grades. So wouldn't you think 70 percent must be the pass mark on the ACP?

So why would they say it's 40 percent? Or 55 percent? Now, here is where we get into the hocus-pocus. On a typical ACP exam in Dallas, the vast majority of students score lower than 70 percent. In Algebra I, semester I, for school year 1999-2000, for example, ACP exams were administered to students in 34 middle schools and high schools in Dallas. Of those 34 schools, students in exactly one school had mean scores above 70 percent. If DISD had made these same results public in this same form, then you and I and everyone else in town would have known that only one of 34 schools achieved average passing grades on the Algebra I ACP exam.

That's bad. That's a big story. Especially when it happens on all of the ACP exams at pretty similar rates. That tops the 6 p.m. news, you bet!

So guess what happens? After DISD uses the "hard," or real, results to black-mark the teachers, it makes a biiiig adjustment in the grades. On a pre-advanced placement chemistry exam, a grade of 40 percent on the ACP is adjusted to 70 percent when it is reported to students and parents. A score of 60 percent--10 points below the real pass mark--becomes an 80 on a student's record. Your kid got a B, the smart little devil!
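The memo's exact equation is not reproduced in this column, but the reported conversions behave like a simple linear rescaling: the course's raw cut score is mapped to 70, a perfect 100 stays 100, and everything in between is stretched proportionally. The function below is my sketch of that arithmetic, not DISD's actual formula; its name, its form, and its handling of scores below the cut are assumptions.

```python
def scale_acp(raw_score, cut_score):
    """Hypothetical linear rescaling consistent with the memo's reported numbers.

    Maps the course's raw passing cut score to 70 and a perfect 100 to 100,
    stretching everything in between proportionally. This reproduces the
    conversions reported here (40 -> 70 and 60 -> 80 on the pre-AP chemistry
    exam, where the cut score is 40; 55 -> 70 for most other courses), but
    it is a reconstruction, not the memo's printed formula.
    """
    if raw_score >= cut_score:
        # Above the cut: interpolate linearly between (cut, 70) and (100, 100).
        return 70 + (raw_score - cut_score) * 30 / (100 - cut_score)
    # Below the cut: interpolate between (0, 0) and (cut, 70). This branch is
    # pure assumption; the column does not say how sub-cut scores are treated.
    return raw_score * 70 / cut_score

print(scale_acp(40, 40))  # pre-AP chemistry: raw 40 reported as 70.0
print(scale_acp(60, 40))  # pre-AP chemistry: raw 60 reported as 80.0
print(scale_acp(55, 55))  # most courses: raw 55 reported as 70.0
```

Under this sketch, the "almost a third" inflation mentioned earlier falls out of the arithmetic: a raw 55 becomes a reported 70, a 15-point bump on a 55-point base.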

Instead of having to tell the public that all but one school flunked the Algebra I exam, DISD juices up the grades and reports that 50 percent of the schools made passing grades and the rest of them were all crowded up there to within a few percentage points of passing.

The hocus-pocus actually gets worse here. There are two kinds of tests out there. One is called "criterion-referenced," which just means it asks a bunch of questions to see if you know stuff, and you get a straight grade based on how much of it you know. The other is called "normed." A normed test is a version of the old grading-on-the-curve deal. Your grade doesn't reflect how much of the stuff you actually know; it reflects how well you did in comparison to everyone else.

I'm told that those two kinds of tests are put together very differently. They are different critters. But DISD tries to work the ACP both ways. They build it as a "criterion-referenced" test, use it that way to put the black marks on the teacher files, and then they pump the grades full of air for public consumption. When it goes public, they say the test is "normed."

When I showed this memo to a teacher friend of mine, he became extremely angry. He said he couldn't talk about this with me on the record, because he feared reprisals against his students (this is not a paranoid guy).

Speaking on background, he did say: "Under that system, nobody's accountable except teachers and students. We're responsible. But the people who put us out there in the portable with the kids and the list of 129 different course objectives that we're supposed to teach them in 177 instructional days, they're not accountable."

I sat down with Robert Mendro, a nationally respected testing and education statistician in DISD's division of evaluation, accountability, and information services. This was his memo. He started by giving me some of the standard lines about how the test is criterion-referenced but then it's also sort of normed later on.

I asked him this: If you know that a kid has answered only 40 percent of the questions on the pre-AP chemistry test correctly, why would you tell him and his parents that he passed the test?

Mendro looked at me for a long moment. He said, "I don't suppose it sends a very good message, does it?"

Over the course of last week, I discussed this memo with a number of teachers here in Dallas and with some testing experts around the country, several of whom wanted to make an additional point--something they felt ran deeper than the mere skulduggery exposed in the memo.

Alfie Kohn, a well-known author and critic of standardized testing who lives in Belmont, Massachusetts, e-mailed me: "I'd hope you can help your readers keep their eyes on the big picture: The issue isn't just this dubious scoring practice but what it reveals about the inherent arbitrariness--the utter lack of objectivity--about standardized testing and the way it fails to provide meaningful accountability (while, in the process, disrupting and distorting classroom instruction)."

Before the statistical dust storm starts--which I fully expect as soon as this column appears--let's take a big reality check here. The state administers an Algebra I exam that is separate from the TAAS tests. It's a basic "criterion-referenced" do-you-got-it or do-you-not-got-it test. Dallas students pass the statewide algebra test at half the rate for students in the rest of Texas. Only one in four Dallas students can pass the test, according to the last round of scores.

But all year long before they take the statewide test, Dallas students tend to bring home passing grades in Algebra, according to a report provided me by DISD.

Robert Mendro didn't invent this policy. His office generates the equations that "scale" the ACP scores. This was his memo. But he and the people of his division are only following orders.

I don't know whose orders. I don't know where this started or when. But this stinks. This hurts children. We should all be ashamed.

