Like the airplanes landing at nearby Midway Airport, noisy students cram into Hearst Elementary School’s auditorium on June 7 to celebrate the end of a different kind of journey, the completion of another school year. Rewards large and small, from MP3 players to $10 gift certificates, go to those who essentially traveled furthest—the students with the highest marks or biggest gains on standardized tests.
Those scores are the centerpiece of Principal Reginald Miller’s “data wall” of classroom-level test scores—good or bad—posted in large, color-coded charts just outside the main office. In Room 215, for example, just four out of 14 students met reading standards on one of last year’s benchmark tests.
This mix of incentives and transparency is at the heart of a turnaround effort at Hearst, a low-income, predominantly African-American school that has languished on probation. Miller, now in his second year as principal, is pushing a data-driven system that features new curricula, professional development and a reshuffling of teachers’ preparation periods.
Focusing on students’ performance data helps teachers be lifelong learners, says Miller. “But there’s a risk. You have to be working with people you trust because you’re going to be putting out your dirty laundry.”
His willingness to post classroom-level data so openly may be unique in Chicago Public Schools, and perhaps even risky for staff morale, but it ultimately reflects CPS objectives to push schools to use data to improve instruction.
“That message has surely filtered down, not just to the principals, but the teachers, too,” says Rebeca de los Reyes, Area 11 instructional officer, who oversees Hearst.
Hearst also reflects a national trend, one spurred by ever-mounting pressure to raise performance on standardized tests under No Child Left Behind. The federal law requires 100 percent of students to meet state standards by 2014, and has touched off a race to figure out how best to turn assessment data into better instruction.
Experts add a warning: Don’t “teach to the test.” Schools need to think of “data” in the broadest sense and continually work at refining their assessment, they say, instead of burning up teaching time on test-taking strategies and drill-and-kill study.
Miller, a former teacher with a background in chemical engineering, turned entrepreneur and dabbled in the restaurant business before returning to education. Because of his scientific and business background, Miller has emphasized the use of test data and a strong incentives system.
But that emphasis strained his relations with teachers. Many bristled at the posting of their students’ test results on his data wall, and 18 of the school’s 35 teachers have since left Hearst.
“It wasn’t pretty,” remembers math and science specialist Elizabeth Anthony. She says she tried to convince teachers that the data wall “formed a baseline” to measure their ongoing efforts, not an indictment of past work. Still, teachers left en masse.
Miller has plowed ahead, hiring only new teachers who he says were willing to embrace his data-driven approach.
The results so far seem to support the effort. Test scores went up in every grade level and subject last year, with one caveat: Scores among special education students remained flat and kept the school from meeting adequate yearly progress goals.
That’s a shortcoming Miller hopes to address with another round of reforms and even greater attention to data.
To get the job done, much of the heavy data lifting will fall on Anthony, Hearst’s data specialist.
Last year, with help from her area math coach, Anthony learned to open data files in Microsoft Excel and sort results by classroom and subject. She suspects Miller will ask her to run similar reports using student demographic data this year, especially as the school targets its lagging special education students.
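The kind of report Anthony describes, grouping benchmark results by classroom and subject, can be sketched in a few lines of Python. The room numbers, scores and 50 percent passing bar below are invented for illustration, not Hearst's actual data.

```python
from collections import defaultdict

# Hypothetical benchmark records: (classroom, subject, percent score).
# The rooms, subjects, and 50-percent passing bar are assumptions.
records = [
    ("Room 215", "reading", 38), ("Room 215", "reading", 62),
    ("Room 215", "math", 55),    ("Room 301", "reading", 71),
    ("Room 301", "math", 44),    ("Room 301", "math", 58),
]

def summarize(rows, passing=50):
    """Average score and pass rate for each (classroom, subject) pair."""
    groups = defaultdict(list)
    for room, subject, score in rows:
        groups[(room, subject)].append(score)
    return {
        key: {
            "avg": sum(scores) / len(scores),
            "pass_rate": sum(s >= passing for s in scores) / len(scores),
        }
        for key, scores in sorted(groups.items())
    }

report = summarize(records)
print(report[("Room 215", "reading")])  # avg 50.0, pass rate 0.5
```

A spreadsheet pivot table does the same job; the point is simply that once scores are tabulated by classroom and subject, the comparison Miller posts on his data wall falls out directly.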
But figuring out how to sort and graph test data is just the beginning of a long process of data mastery, according to Kathryn Parker Boudett, co-editor of “Data Wise” and an expert on data-driven instruction. She and other Harvard researchers spent years working with Boston Public Schools to fine-tune data gathering and the processes that make it useful to teachers.
“Teachers get pretty glassy-eyed if you tell them that 42 percent of their students passed a test,” says Boudett. “The thing that we’ve found most powerful is just looking at the actual student work produced every day.”
Schools that want to be data-driven need to expand their idea of what data is, she adds. By getting teachers to look at a student’s written work, especially if it’s buttressed by test data that can tie the discussion to state standards, the chances for a fruitful conversation multiply, she says.
Assessing student work “is what makes them teachers. It allows them to draw on what they know best,” Boudett says.
Eventually, the Boston schools that Harvard worked with found opportunities for teachers to visit classrooms and watch one another teach. That helped teachers reflect deeply on their own instruction, Boudett says, and it put a second set of eyes into classrooms during group activities.
It takes time to analyze data. But by turning teachers’ attention to state test scores and students’ written work, Boudett says, the scheduling and discipline complaints that once dominated Boston faculty meetings all but disappeared. Instead, teachers began having meaningful conversations about what matters most: student learning and how to use data to understand and improve it.
More than test scores
Many of the changes Miller has planned for Hearst this year dovetail with Boudett’s suggestions.
This year, teachers will merge two of their four preparation periods into one 90-minute block that will be used for data-intensive, grade-level meetings. Miller also has focused professional development on the data-analysis process, requiring his teachers to read “Collaborative Analysis of Student Work,” another book aimed at helping teachers make sense of student assessment data.
Hearst also will look more closely at students’ written work, a priority CPS has set for schools districtwide.
Students “have to be able to write in complete sentences [about] what they did and how they did it. It’s not just a ‘Yes,’ it’s a ‘Yes, because…,’” says Anthony.
Looking back, she says last year’s data push was ultimately about Miller setting expectations.
Teachers also had to master new math and reading curricula, which have better built-in assessment activities, according to Anthony.
Learning how to parse the Illinois Standards Achievement Test data, as well as the results from the district’s new math and reading assessments, proved beneficial but took time, Anthony notes.
Ideally, she says, all students would have been placed into groups based on the questions they missed on the tests. Anthony planned to do pullout tutoring with each student, but there were too many.
In the end, she took a triage approach, tutoring only the students who were most likely to pass the ISAT with a little boost—those students who scored between 40 percent and 50 percent (the ISAT passing score) on the Learning First and Math Benchmark tests. She narrowed her job further by focusing on the standards the students most often missed.
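Anthony's triage rule, flagging only the students within striking distance of the passing bar, amounts to a simple filter. A minimal sketch, with invented names and scores standing in for real benchmark results:

```python
# Hypothetical roster of (student, benchmark percent) pairs. The 40-50
# percent triage band mirrors the article; the names and scores do not.
scores = {"Ana": 36, "Ben": 43, "Cora": 49, "Dre": 55, "Eli": 40}

def triage(score_map, low=40, high=50):
    """Pick students close enough to the passing bar to benefit from tutoring.

    Students at or above `high` have already cleared the bar, so the
    upper bound is exclusive.
    """
    return sorted(name for name, pct in score_map.items() if low <= pct < high)

print(triage(scores))  # → ['Ben', 'Cora', 'Eli']
```

The narrower second cut, focusing on the standards those students most often missed, is the same idea applied to questions rather than students.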
The effort may not measure up to the “Data Wise” ideal, which calls for a more holistic analysis of student work, but it did lay the groundwork for Hearst’s more ambitious plans for this year: using Anthony’s reports to help teachers think about specific learning standards as they review students’ written work.
That has at least one of the new teachers excited. Kisha McNulty, an 8th-grade math teacher, expects the school’s focus on assessment and data analysis will help her be “reflective as a teacher.”
‘Data Wise’ district
De los Reyes says schools in Area 11 have taken slightly different approaches to the same task: Focus on student work, especially written work, and organize teachers into regular grade-level meetings to analyze it. “That’s data,” she says.
Her schools are following the district’s game plan by digging past test data and into writing assessments. CPS has tried to ramp up its professional development to match, says Patrick Baccellieri, deputy officer in the district’s new Instructional Design and Assessment office.
Baccellieri, former principal at South Loop Elementary, built a reputation as a savvy data analyst while piloting the school’s transformation. Not satisfied with a simple once-over of ISAT data, he spent countless hours delving into the test and the specific state standards the various questions measured. The fruits: learning standards re-worded so South Loop’s students could better grasp their meaning and prioritized so the teachers could focus on what Baccellieri said was most important.
Once he had thoroughly digested the ISAT, Baccellieri started breaking down his school’s test data in ways that helped him communicate his instructional aims to his teachers. In fact, data became his preferred communication tool. He soon went beyond test scores, even charting out discipline referrals by the hour.
Surprising trends lurked in that data, too, including a huge spike in referrals just after lunch. Though teachers knew discipline issues were greatest at that time, they had never seen the problem’s true magnitude—until Baccellieri’s charts surfaced. Teachers clamped down and referrals fell dramatically.
Baccellieri, who joined Botana’s group this summer, is essentially trying to replicate his South Loop plan across the district. To do it, CPS may get help from Harvard’s “Data Wise” researchers.
This summer, Baccellieri and 11 other principals, teachers and administrators from CPS traveled to Brown University for a weeklong data summit sponsored by The Joyce Foundation. With guidance from Boudett and her team, Chicago and representatives from Milwaukee, Cleveland and Providence tried to determine best data strategies and learn from one another’s efforts.
Chicago’s approach has, to date, been largely about its new assessments, Learning First and Math Benchmark. Both tests, given three times a year in sessions lasting less than an hour, have provided much-needed measures in the year-long gap between ISAT administrations. Results are returned to schools within two weeks and have helped schools better measure progress throughout the year.
Tying this benchmark data to a deeper analysis of written work marks the next assessment push, Baccellieri says.
“In the end, what’s really critical is to help teachers understand what’s important for 8th-graders to know and do, so they can move on and get a 20 or better on the ACT,” he says. “And then [we must] go backwards and [ask] what does that mean for 6th grade? What does that mean for 3rd?”
To help get the data in the hands of teachers, the district also is rolling out a curriculum management tool built into its new $60 million IMPACT (Instructional Management Program and Academic Communication Tool) student information system.
The ultimate goal is to have the IMPACT system tie the Learning First, Math Benchmark and ISAT data into one outlet where schools can see how the test questions connect to state learning standards and easily analyze test results by classroom, demographics or student.
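The analysis the district wants IMPACT to enable, connecting test questions to state standards, boils down to a join between an item-to-standard map and per-item results. The standard codes and data shapes below are invented for illustration; they are not IMPACT's actual format.

```python
# A toy item-to-standard map and per-item answers (1 = correct), one
# entry per student. Both are hypothetical, not taken from IMPACT.
item_standard = {"q1": "6.RL.1", "q2": "6.RL.1", "q3": "6.NS.2"}
responses = {"q1": [1, 0, 1], "q2": [1, 1, 0], "q3": [0, 0, 1]}

def mastery_by_standard(items, answers):
    """Fraction of answers correct, rolled up by learning standard."""
    totals = {}
    for item, std in items.items():
        correct, seen = totals.get(std, (0, 0))
        totals[std] = (correct + sum(answers[item]), seen + len(answers[item]))
    return {std: c / n for std, (c, n) in totals.items()}

print(mastery_by_standard(item_standard, responses))
```

Slicing the same rollup by classroom, demographic group or individual student, as the article describes, is a matter of filtering the response data before the tally.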
Boudett says the “Data Wise” project has shied away from the technical front, given the plethora of tools available to help schools slice and dice data. She says it’s important, however, to make sure teachers get the data quickly and easily, and that any software tools used make it easy to analyze the data by test question, classroom, learning standard and individual student.
For his part, Hearst’s principal hopes that despite the early snafus, the IMPACT system will deliver, freeing his math and reading specialist from the school’s heavy data lifting and analysis.
“To the extent that you can get data back faster, the better you can make decisions to improve your school,” says Miller.
To contact John Myers, call (312) 673-3874 or e-mail firstname.lastname@example.org.