The answer for the Chicago Public Schools is that between 1993 and 1999, it happened in about 2 percent of classrooms in 3rd through 8th grades, report economist Steven Levitt and U. of C. doctoral student Brian Jacob. In raw numbers, that’s about 700 cases of test tampering.

Accounting for no more than 10 percent of the test-score gains during that period, the evident cheating had no appreciable impact on the citywide results, the scholars say.

After determining the likely level of cheating, the pair sought to determine the impact of two high-profile CPS policies that are tied to testing: school probation and student retention. Their conclusion was that the practice of putting low-scoring schools on probation had increased the amount of cheating but that the practice of holding low-scoring students back had not.

The data also told Levitt and Jacob that cheating is orchestrated by individual teachers, not principals.

What they analyzed

The cheating analysis involved a database that contained the responses of every student in 3rd through 8th grade to every question on the math and reading portions of the Iowa Tests of Basic Skills administered each spring.

The statistical method Levitt and Jacob developed looks for improbable test-score gains and unusual answer patterns, classroom by classroom. High gains alone may signal good teaching; however, high gains combined with unusual answer patterns (for example, students who choose the wrong answers to easy questions but the correct answers to hard ones) constitute solid evidence of cheating, Levitt explains.

The U. of C. method can even predict the chances of a student’s answering a certain question correctly based on his or her past ITBS performance.
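The two-part screen Levitt describes can be sketched as a simple heuristic. This is an illustrative reconstruction, not the researchers' actual model; the data format, the thresholds, and the function names below are all hypothetical.

```python
# Illustrative sketch of the two-part screen described above: flag a
# classroom only when a large year-over-year score gain coincides with
# an unusual answer pattern (easy items missed, hard items correct).
# Data format and thresholds are hypothetical, not the authors' model.

def answer_pattern_oddity(answers):
    """Fraction of students who missed an 'easy' item yet answered a
    'hard' item correctly. `answers` maps difficulty to a list of 0/1
    scores per student, e.g. {"easy": [1, 0], "hard": [0, 1]}."""
    easy, hard = answers["easy"], answers["hard"]
    odd = sum(1 for e, h in zip(easy, hard) if e == 0 and h == 1)
    return odd / len(easy)

def flag_classroom(gain, answers, gain_cutoff=1.5, oddity_cutoff=0.3):
    """High gains alone may just be good teaching; flag only when a big
    gain and an odd answer pattern occur together."""
    return gain >= gain_cutoff and answer_pattern_oddity(answers) >= oddity_cutoff

# A big gain with ordinary answer patterns is not flagged:
normal = {"easy": [1, 1, 1, 0], "hard": [0, 1, 0, 0]}
suspect = {"easy": [0, 0, 1, 0], "hard": [1, 1, 0, 1]}
print(flag_classroom(2.0, normal))   # → False
print(flag_classroom(2.0, suspect))  # → True
```

The point of requiring both signals is the one Levitt makes: either signal alone has an innocent explanation, but together they are hard to produce without tampering.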

In Levitt’s view, adult cheaters are worth pursuing, however small their numbers, because of their potential corrupting influence on children. “It’s not that I don’t understand why teachers cheat,” he says. “I’m not saying I wouldn’t do the same thing in their shoes, but it’s still morally reprehensible.”

Last spring, Levitt introduced his system to then-Deputy Chief of Staff Arne Duncan, who was subsequently promoted to Chief Executive Officer.

Duncan says he needs to “get all the facts” before deciding whether to adopt the system but adds that he’s “more than intrigued with the idea.”

Looking at incentives

In his research on crime, Levitt had looked at whether certain penalties or law enforcement practices influenced criminals to commit fewer crimes. “A lot of my work is about how people respond to incentives,” he notes. Two School Board policies linked to the ITBS could give school staff incentive to cheat, he believed: a probation policy that imposes sanctions on schools that perform poorly and a promotion policy that requires low-scoring students to repeat a grade.

In September 1996, the School Board put schools on probation for the first time; the list included all schools where fewer than 15 percent of students scored at or above national norms in reading on the ITBS. Failure to improve, board officials warned, could result in staff dismissals.

The following spring, Levitt and Jacob found, cheating incidents on the ITBS reading test jumped 75 percent. Nearly all of the increase came from schools in the bottom third of the system on ITBS scores. Classrooms where students had scored poorly in reading the previous spring were most likely to have evidence of cheating.

Cheating on the math test, which does not count toward academic probation, did not increase, they discovered.

The practice of retaining low-scoring students began in the spring of 1996 with 8th-graders; the next spring, it was extended to 3rd- and 6th-graders as well. Levitt says he can think of several reasons why that policy might induce teachers to cheat on the spring test: fear of looking bad if too many children are retained, not wanting to teach certain students again, and sympathy for struggling students who face retention.

So it came as a surprise that cheating at the benchmark grades rose only slightly relative to other grade levels after the promotion policy was enacted, Levitt says. “It seems that teachers are responding to the risk [that] their school might be put on probation rather than the risk of students being sent to summer school.”

Teachers, rather than administrators, appear to be the primary cheating culprits, Levitt and Jacob say. If an administrator were responsible, they reason, cheating would appear in many classrooms in a school. What they found was that where cheating occurred, it turned up in no more than a few classrooms and typically in only one.

“Teachers just get scared,” says one new language arts teacher who recently left a school where she believes some of her colleagues cheated on the ITBS. “A lot of us aren’t tenured. A lot of us aren’t even assigned. A principal can walk into your classroom and say, ‘Okay, I’m opening your position.’”

Most new teachers in the district are hired as Full-time Basis Substitutes (FTBs), who, unlike tenured or regularly assigned teachers, can be fired at the principal’s whim. For that reason, new teachers feel tremendous pressure to post high test-score gains, according to a veteran teacher from Clinton Elementary in West Ridge, who believes an FTB is responsible for a cheating incident at her school.

In May, Channel 7 News reported that Clinton was one of eight schools being investigated for cheating on the ITBS. A teacher at Clinton allegedly obtained a copy of the math ITBS ahead of time and used it for practice with her students. No other schools were identified in that broadcast, and board officials have declined to name any.

Board safeguards

To prevent cheating on the ITBS, the board has long required schools to follow a list of test security procedures. For instance, tests and answer sheets must be kept in locked storage except during the test administration. Teachers in grades 4 through 8 may not administer the test to their own students, although they may be present during the testing.

These procedures can be circumvented, teachers report. One teacher at an elementary school that recently got off probation says that although she has not cheated, she has had the opportunity to do so. “There’s no undercover sneaky way to do it. You just do it,” she explains. “You ask the child, ‘Did you really read that?’ Or just point to the answer.”

Other teachers at her school do cheat, she believes. Teacher aides who proctor the ITBS at her school joke about the cheating but don’t report it, she says. “You don’t snitch.”

Another teacher who says her elementary school was investigated for cheating last year believes that some of her colleagues teamed up. “‘I’ll cheat for you if you cheat for me.’ I think that’s how they work it out.”

Once the spring tests are completed and scanned, the Office of Accountability does a computer analysis to search for suspicious results. First, it looks at score gains classroom by classroom. A typical classroom would show a year’s growth over the previous year’s scores. Classrooms where students gained an average of two years would merit a closer look, according to assessment director Joseph Hahn, and that involves an item-by-item analysis.
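That first pass can be expressed as a simple filter. This is a minimal sketch of the screen as Hahn describes it; the two-year cutoff comes from his account, but the data structure and function name are hypothetical.

```python
# First-pass screen from the Office of Accountability process described
# above: a typical classroom gains about one grade-equivalent year, so
# classrooms averaging two or more years merit an item-by-item look.
# The data structure here is hypothetical.

def classrooms_to_review(avg_gains, cutoff=2.0):
    """`avg_gains` maps a classroom id to its average year-over-year
    gain in grade-equivalent years; return ids meriting closer review."""
    return [room for room, gain in avg_gains.items() if gain >= cutoff]

gains = {"Rm 101": 0.9, "Rm 102": 2.3, "Rm 103": 1.1, "Rm 104": 2.0}
print(classrooms_to_review(gains))  # → ['Rm 102', 'Rm 104']
```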

Classrooms under suspicion, along with a random sample of other classrooms, are retested in May. In all, the board retests or “audits” about 120 classes at 100 schools each spring, mostly at the benchmark grades.

The audit is intended as a deterrent to cheating, as well as a method for detecting it. Proving cheating is another matter. For one, students rarely score as well on the retest as they did on the initial examination, according to Chief Accountability Officer Philip Hansen.

The retest doesn’t count for them, so there’s less incentive to do well. And sometimes students are uncomfortable with School Board staff who arrive to retest them, he says. “The kids get upset, they get nervous, they get mad.”

Sometimes innocent teachers are caught in the auditing net, and school staff get angry. For example, Hansen says that a principal wrote him an irate letter recently, complaining that one of his teachers had been audited three years in a row and that each year the audit validated the high gains. Hansen says the principal wanted to know why the board didn’t come out and congratulate the teacher instead. “So I just wrote a mea culpa and apologized for that,” he says.

Only classrooms with extreme differences in test-score results during the audit are investigated more closely, Hansen says. That involves “the laborious and time-consuming process” of comparing how consistently each student answered similar test items on each of the two tests. Student answer sheets also are examined for erasure marks. If 90 percent or more of the erased responses were switched from a wrong answer to the right one, says Hansen, that’s a sign of cheating.
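The erasure check Hansen describes reduces to a single proportion. A sketch under the assumption that each erasure on a sheet has already been classified as wrong-to-right or not; the 90 percent threshold is from Hansen's account, everything else is illustrative.

```python
# Erasure analysis as Hansen describes it: if 90 percent or more of a
# sheet's erasures switched a wrong answer to the right one, that is
# treated as a sign of cheating. The input format is illustrative.

def erasures_suspicious(erasures, threshold=0.9):
    """`erasures` is a list of (old_answer_correct, new_answer_correct)
    booleans, one pair per erased response on an answer sheet."""
    if not erasures:
        return False
    wrong_to_right = sum(1 for old, new in erasures if not old and new)
    return wrong_to_right / len(erasures) >= threshold

clean = [(False, True), (True, False), (False, False)]   # mixed erasures
dirty = [(False, True)] * 9 + [(True, False)]            # 9 of 10 wrong-to-right
print(erasures_suspicious(clean))  # → False
print(erasures_suspicious(dirty))  # → True
```

Ordinary test-takers erase in all directions; a sheet where nearly every erasure improves the answer is what stands out.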

Schools investigated

Schools suspected of cheating are referred to an investigative team in the Office of Schools and Regions, which conducts interviews at the school site. Over the past two school years, the team has sought evidence of cheating at about 15 schools, according to investigations director Thomas Sherry, and substantiated allegations at several of them. Other investigations proved inconclusive, he says, or are ongoing.

“It’s difficult to conclusively prove test cheating,” explains Marilyn Johnson, the board’s chief attorney, because it usually requires an eyewitness. Johnson can recall only two incidents in which CPS staff were disciplined in connection with a cheating investigation. In 1996, the principal and curriculum coordinator at Clay Elementary in Hegewisch were briefly suspended for failing to keep standardized tests secure. An investigation found that copies of the ITBS and a state test had been distributed to teachers.

In 2000, two staff members at Carpenter also were accused of ignoring test security protocol. The principal resigned before a hearing seeking to impose a 30-day suspension.

Now, for the first time in Johnson’s memory, the board is seeking to dismiss two teachers suspected of cheating on the ITBS. The alleged incidents occurred at two schools in May.
