The Mooresville, N.C., school district was once the toast of the ed tech community – acclaimed as a shining example of how laptop computers could improve education. PBS NewsHour broadcast a feature story showing how the district 25 miles north of Charlotte gave every student a laptop from fourth grade on up. A year later, The New York Times heralded the school district's impressive gains in test scores and high school graduation rates: 88 percent of students hit the proficient threshold on state tests, up from 73 percent before the laptops. When I reported on a failed laptop program in a New Jersey school in 2014, I used Mooresville as a counter-example of an effective one.
Now, a five-year study of Mooresville's test scores puts its success story into context and comes to several surprising and sobering conclusions.
The first is that Mooresville didn't reap any special academic benefits from using laptops in the first three years. Modest gains started to emerge in 2012, four years after the district began its laptop program. In year five, the only sustained benefit was in math.
"I see positive effects, but maybe not quite as positive as some of the media coverage a few years ago," says Marie Hull, an economist at the University of North Carolina at Greensboro and a co-author of the study. "I think it's because I have a better control group."
The New York Times didn't get its figures wrong. Test scores did go up a lot in Mooresville after 2008, when it started handing out laptops. But Hull calculated that test scores also soared by about the same amount in neighboring counties, which didn't give laptops to each student. (Hull compared Mooresville's test score gains to those in about a half-dozen surrounding counties with similar demographics.)
Then, in 2012, Mooresville's test score gains started to outpace those in neighboring counties. In the final year of the study, 2013, Mooresville's reading score improvements fell back down, returning to the same pace as its neighbors'. Only math scores remained elevated, and by a relatively modest amount.
Hull says her finding of delayed gains confirms on-the-ground reports of some teachers' initial resistance to laptops and the time it took for teachers to receive training on how to integrate laptops into the classroom. "When districts or schools implement a technology program, or really any program, sometimes there's a period of time when everyone is adjusting to it," says Hull.
In other words, one should be skeptical of dramatic test score gains immediately following any big educational change.
The study, "One-to-One Technology and Student Outcomes: Evidence From Mooresville's Digital Conversion Initiative," was published online in September 2018 in the journal Educational Evaluation and Policy Analysis.
In addition to analyzing test scores, the researchers studied changes in student behavior after students received laptops. From student surveys, the researchers found that Mooresville students reduced their time reading books by more than four minutes a day, on average, to roughly 40 minutes daily in 2011 from more than 45 minutes daily when the laptop program was introduced. Meanwhile, kids in neighboring counties increased their daily reading. Four minutes might not sound like a lot, but over the course of a year it adds up to more than 25 fewer hours of reading, which is substantial. Unfortunately, the state stopped administering that survey after 2011, so it's unknown whether book reading rebounded. But if time spent reading continued to decline, that could partially explain why reading scores didn't rise as much as math scores did.
Students continued to spend as much time on homework as before but spent more of their homework time on a computer. There's some indication that students were less likely to be absent from school after receiving laptops – a sign that students were more motivated to come to school and learn. But the absentee data bounces around a lot.
Hull and her colleague, Katherine Duch of One Minus Beta Analytics, studied only test scores, attendance records and student surveys; they didn't track how teachers and students were using the laptops. So there's no insight into what students did on the laptops to produce the test score gains in math. Hull explained that teachers were free to use the laptops in different ways: some used educational software; others used online discussion groups and email.