Raising the bar

Texas school districts analyze student test data to improve student growth

They are 270 miles apart, but the Longview and Austin (Texas) school districts share a commitment to improving student performance. Instead of focusing solely on school-level pass/fail rates, the districts use value-added analysis to follow the progress of individual students and then use that information to increase educational effectiveness.

The analysis helped Longview improve student growth so much at one underperforming “priority” school that the district was able to move the school off that designation. In Austin, administrators are measuring students’ academic growth and the factors that affect it. Both districts use SAS® EVAAS® for K-12, where the growth of every student counts: the model measures how much each student’s skills advanced from one year to the next.

Not all the growth models out there consider error …. We spoke with the psychometricians and econometricians ... and selected EVAAS because we wanted a model that we felt accurately reflected our goals.

Lisa Schmitt
Senior Research Associate, Department of Research and Evaluation
Austin Independent School District

This approach has led to in-depth discussions in both school districts about how best to help underperforming students as well as those who easily pass state-mandated tests but are otherwise stalled academically. It has also helped school officials assign teachers to classrooms based on their strengths and helped teachers identify areas where they could improve.

Rescuing struggling learners and helping high-achieving ones soar

Longview hasn’t confined the analysis to underperformers. Rebeca Cooper, Director of Planning, Research and Accountability, says the information gleaned from the value-added data has helped principals make better teacher assignments and student placements and given teachers insight into which student groups they might be struggling to reach.

“I worked with one teacher who teaches gifted and talented (GT) students,” Cooper says. “She wasn't getting the growth that she thought she should be getting. When we looked at the data, she realized she did really well with her high achieving GT kids but not as well with those at the lower end of that GT group. She decided to work on how she asks questions in class to attempt to better engage these lower-growth students.”

As the school district studied the data, it became apparent that some teachers work particularly well with certain groups of students. In the case of the underperforming turnaround school, the teachers who helped students grow the most (in some cases beyond a year’s worth of growth) were concentrated at a couple of grade levels. Moving some of those teachers to other grades helped raise student growth throughout the school. “We've started identifying the teachers who are really good with students at certain levels who can mentor teachers who may not be as good with students at that level,” Cooper says.

In addition, principals aren’t randomly assigning students to classes. If a student hasn’t shown growth (regardless of whether they are low- or high-achieving), the principal looks to match them with a teacher for the following year who has helped grow students who fall in that particular academic range. “As we moved students from grade school to middle school, we ran reports that helped us understand things like ‘this child needs a math teacher that is strong with students at this child’s level,’” Cooper explains.

Cooper is quick to point out that using value-added data doesn’t suddenly turn an underperforming school into one with high proficiency rates. The underperforming school’s rating on the state accountability system increased by four points. That may not sound like much until you realize that students at the school, on average, exceeded their growth targets.

In addition, both Longview and Austin are committed to going beyond boosting passing rates. “Passing a state test is important, but you can pass with a 71 or you can pass with a 100. What we want to see is that those kids who have always passed with a 100 are still passing with a 100, and the kids passing with a 70 are passing the following year at an 80, 85 or 90,” says Lisa Schmitt, Senior Research Associate in the Department of Research and Evaluation for the Austin Independent School District.

Helping teachers and administrators understand and appreciate value-added analysis

Since Longview uses teacher-level value-added measures as part of its educator evaluation system, it wants to make sure that students are attributed to the right teachers. Teachers verify their rosters against the district rule that a child must have been in attendance both on the last Friday in October and on the day the test was taken. “Teachers like that the students who entered the school system late didn’t have their scores used in calculating the teacher-level information,” Cooper says.

In Austin, where teacher evaluations only include school-level value-added measures, Schmitt wanted a model that didn’t set different expectations for differing student characteristics. Some growth models will adjust school or teacher growth measures based on the assumption that some students can’t make as much progress as others. These adjustments aren’t necessary with EVAAS because the model uses all available testing history for each student. Schmitt also wanted a reporting system that showed standard error. “Not all the growth models out there consider error, whether that is because of sample size or missing data,” she says. “We spoke with the psychometricians and econometricians … and selected EVAAS because we wanted a model that we felt accurately reflected our goals.”
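EVAAS’s actual methodology is proprietary, but the two properties Schmitt highlights – tolerating missing test data and reporting a standard error that reflects sample size – can be illustrated with a toy growth model. The sketch below (plain Python; the function and variable names are hypothetical) estimates a roster’s mean score gain and its standard error. It is a deliberate simplification for illustration, not the EVAAS model.

```python
import math

def mean_gain_with_se(prior_scores, current_scores):
    """Toy value-added estimate: mean score gain across a roster,
    with a standard error that shrinks as the sample grows.
    Illustrative only -- EVAAS itself uses far richer models."""
    # Pair each student's prior and current score; drop missing data.
    gains = [cur - pri for pri, cur in zip(prior_scores, current_scores)
             if pri is not None and cur is not None]
    n = len(gains)
    mean = sum(gains) / n
    var = sum((g - mean) ** 2 for g in gains) / (n - 1)  # sample variance
    se = math.sqrt(var / n)  # smaller rosters => larger uncertainty
    return mean, se, n

# A four-student roster with one missing prior score; usable gains are 8, 6 and 10.
mean, se, n = mean_gain_with_se([70, 75, None, 80], [78, 81, 90, 90])
```

Here the mean gain of 8 points rests on only three usable score pairs, so the standard error is relatively large (about 1.15); a bigger roster with the same spread of gains would report the same mean with a smaller error. That is the kind of uncertainty Schmitt wanted surfaced alongside each growth estimate.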

Schmitt says the EVAAS data is just one indicator that helps senior leaders understand what is happening in the schools. By going beyond passing rates, it helps administrators see where students are growing and where they aren’t. It has been particularly interesting, Schmitt says, to look at schools considered high-performing (virtually everyone passes the state tests) that nonetheless don’t show the children growing adequately each year – in some cases they were going backward. “It helps us ask questions about how we can meet the needs of this group.”


Help students grow academically, whether struggling performers or solid students.


SAS® EVAAS® for K-12


The Longview, Texas, school district used the solution to help turn around an underperforming school, while the Austin, Texas, district used it to understand how well its schools are growing students regardless of academic ability.

The results illustrated in this article are specific to the particular situations, business models, data input, and computing environments described herein. Each SAS customer’s experience is unique based on business and technical variables and all statements must be considered non-typical. Actual savings, results, and performance characteristics will vary depending on individual customer configurations and conditions. SAS does not guarantee or represent that every customer will achieve similar results. The only warranties for SAS products and services are those that are set forth in the express warranty statements in the written agreement for such products and services. Nothing herein should be construed as constituting an additional warranty. Customers have shared their successes with SAS as part of an agreed-upon contractual exchange or project success summarization following a successful implementation of SAS software. Brand and product names are trademarks of their respective companies.