Gaming the System
The DOE didn't like the answers it was getting from its satisfaction survey—so it changed the questions.
Since 2003, most recently last year, HONOLULU Magazine has presented a chart grading nearly all the public schools in Hawaii. For each school, we combine five measures into a single score: student performance on math and reading tests, plus three satisfaction scores from teachers, parents and students.
All our data come from official Department of Education sources. Math and reading scores come from the Hawaii State Assessment. The satisfaction scores we derive from a survey the DOE administers in all its schools. A massive undertaking on the system’s part, the School Quality Survey (or SQS) contains nearly 200 statements, is filled out by thousands of people and enjoys a remarkable return rate. It works like the market-research surveys you’ve no doubt filled out: survey takers (teachers, parents and students) fill in bubbles, agreeing or disagreeing with statements designed to gauge how they feel about their own school’s performance in various areas.
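As a rough sketch of the mechanics: combining five measures into one score can be as simple as an average. The function below is our illustration only; the equal weighting is an assumption, and the magazine’s actual formula isn’t spelled out in this article.

```python
# Hypothetical illustration only: the magazine's actual weighting of the
# five measures is not described in this article.

def combined_score(math_pct, reading_pct, teacher_sat, parent_sat, student_sat):
    """Average five 0-100 measures into a single 0-100 school score.
    Equal weighting is assumed purely for illustration."""
    measures = [math_pct, reading_pct, teacher_sat, parent_sat, student_sat]
    return sum(measures) / len(measures)

# Example: a school with these five results would score 68.1 overall.
print(combined_score(55.0, 62.0, 71.0, 70.5, 82.0))  # -> 68.1
```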
For our “Grading the Public Schools” charts, usually published every other year, timed to the release of the SQS, we were drawn to these very direct, no-nonsense statements to ascertain satisfaction:
Teachers, agree or disagree, I would send my own child to this school.
Parents, agree or disagree, I would recommend my child’s school to other parents.
Students, agree or disagree, If I could, I would go to a different public school.
Two interesting things have happened since the last presentation of the chart in our May 2008 issue. First, the DOE accelerated its schedule for the SQS, now running the survey every year, with the most recent results published in November 2008.
Second, and most important, the DOE changed the survey. Those three statements we use to grade the schools? Gone.
In fact, the DOE eliminated or rephrased more than 70 questions in such a way as to make them less personal to the survey taker, passive in construction rather than active, general rather than specific. The old and new SQS are available online at the DOE’s Web site, where you can compare them in depth. But, to give you a few examples:
Most of the students in our school are well behaved.
… became …
Most of the students in our school follow the school rules.
Staff development at our school is excellent and focused on standards-based education.
… became …
Staff development at our school is relevant to standards-based education.
My child’s teachers are well prepared and know what they are doing.
… became …
My child’s teachers are effective in their teaching.
I feel that work in my classes is just busy work and a waste of time.
… became …
I feel the work in my classes is important and valuable for my future.
My teachers expect me to do high quality work.
… became …
My teachers expect me to do quality work.
The net result? The schools are delivering better scores than they would have if the questions had not been altered. That becomes obvious when you compare the year-to-year results for the questions that did not change.
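To make that comparison concrete, here’s a minimal sketch in Python. The statements and percentages are hypothetical stand-ins; the point is that only items asked identically in both years support a fair year-to-year reading.

```python
# Hypothetical data: percent of respondents agreeing with each statement.
sqs_2007 = {"students are well behaved": 64.0,
            "I would recommend this school": 58.0,    # dropped in 2008
            "satisfied with overall quality": 70.0}
sqs_2008 = {"students follow the school rules": 75.0, # reworded for 2008
            "satisfied with overall quality": 62.0}

# Only statements asked identically in both years permit a fair comparison;
# dropped or reworded items silently fall out of the intersection.
for statement in sorted(sqs_2007.keys() & sqs_2008.keys()):
    old, new = sqs_2007[statement], sqs_2008[statement]
    print(f"{statement}: {old:.1f} -> {new:.1f} "
          f"({(new - old) / old * 100:+.1f}% relative change)")
```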
Our test case: We compared the satisfaction dimension of the spring 2007 SQS for Farrington High School against the school’s results in its post-revision spring 2008 SQS.
According to the overall results, parent satisfaction is up 32 percent!
Sounds good, until you look at the responses for the unaltered or slightly altered measures. In 2007, the DOE asked how many parents agreed with the statement, “Overall, I am satisfied with my child’s school,” and 70.5 percent responded positively. In 2008, the statement was worded, “I am satisfied with the overall quality of this school,” and only 61.9 percent of parents agreed, a decline of roughly 12 percent.
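Spelling out the arithmetic, the 12 percent figure is a relative decline, the 8.6-point drop measured against the 2007 baseline:

```latex
\frac{61.9 - 70.5}{70.5} = \frac{-8.6}{70.5} \approx -0.122 \approx -12\%
```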
The parent score for “variety of subjects taught” (rephrased as “courses offered”), another measure only slightly altered, rose 7 percent.
So where did this 32 percent increase in overall parent satisfaction come from?
From dropping the most direct, telling statements. The DOE simply eliminated the statements that used to reveal noticeable dissatisfaction, namely:
I would recommend my child’s school to other parents.
and
If I could, I would send my child to a different public school.
Further boosting the parent score, the DOE added a new statement: I feel the work in my child’s classes is important and valuable for my child’s future.
Nearly 90 percent of parents agreed with that. Who wouldn’t?
We see the same pattern with student satisfaction at Farrington. It, too, is up, by 9 percent overall. But that’s only because statement No. 42:
If I could, I would go to a different public school.
has been replaced with:
Overall, this is a good public school.
This switch was good for a 50 percent improvement in the school’s score for statement No. 42. On the other hand, student agreement with “I enjoy coming to school” is actually down 11 percent, and agreement with “I am satisfied with the education I am receiving at my school” is also down, by 3 percent.
Another big winner for the DOE? Replacing statement No. 6, “I feel that work in my classes is just busy work and a waste of time,” with, “I feel the work in my classes is important and valuable for my future.” Student satisfaction on statement No. 6 shot up 23 percent thanks to this swap. Of course, the new statement can be affirmed, honestly and enthusiastically, whether or not the work is actually getting done.
The only area where this gamesmanship failed was with the teachers. Despite softening the teachers’ questions, too, the DOE found a 10 percent decline across the statements used to measure teacher satisfaction. Our faith in teachers to see through BS is positively reinforced.
Oh, and one more critical change caught our eye in Farrington’s SQS, though it falls outside the satisfaction dimension we’re discussing here: the issue of gangs on campus. In 2007, more than 52 percent of parents disagreed with the statement, “Gangs are not a problem at my child’s school.” Meaning, more than half the parents were, in fact, specifically concerned about gangs at Farrington. Not just bullying, which is covered in other statements, but gangs.
In 2008, the DOE simply eliminated the statement.
Now, we don’t know that the DOE eliminated the SQS statements we use in “Grading the Public Schools” just to make it harder for HONOLULU Magazine to report on the schools for our readers. We don’t have tape recordings of DOE meetings in which officials sat down and said, “We’re tired of getting beat up with our own survey, is there anything we can do with the questions so it doesn’t look so bad?”
We only know that this is absolutely the effect of the DOE’s changes. It’s harder to see the true quality of the schools now, and it is impossible to do year-to-year comparisons.
This is consistent with the DOE’s past behavior: when it doesn’t like its own grade, it changes the test. You might remember how the system crowed early last year when math and reading scores suddenly rose on the Hawaii State Assessment. Problem was, the state had changed that test, too. When the federal government tested Hawaii students with the National Assessment of Educational Progress (NAEP), a test that had not been changed, most of Hawaii’s apparent improvements evaporated.
We encourage the DOE to remember what its own teachers always told us as students: “When you cheat, you only cheat yourself.” The DOE is cheating itself out of the opportunity to learn what is broken by asking the hard questions and asking them consistently, year after year, to see if there has been any improvement.
Read more about teacher quality in "Do Teachers Make the Grade?"