Dublin City Schools (EVAAS)
The state of Ohio has provided its school districts with SAS® EVAAS® for K-12 since 2006. Perhaps no district has gotten more use out of the value-added educational assessment tool than Dublin City Schools. Located just north of Columbus, Dublin City Schools is one of the largest school districts in Ohio with 14,000 students. With a preponderance of upper-middle-class families engaged in their children’s education, its students historically score well on state-mandated end-of-grade tests. The district focuses not simply on passing a test, but on having all students progress from year to year. Now that Ohio is requiring that 50 percent of a teacher’s evaluation be based on how well that teacher helps students make progress, Dublin’s Director of Data and Assessment and Program Evaluation, Craig Heath, feels the groundwork the district data team has done in helping teachers use data is making them comfortable with value-added measurement as a part of their evaluation. Below, he provides some tips on how to make value-added assessment a constructive endeavor.
Focus on growth
So much of the testing emphasis is placed on whether a child met a standard, but in the U.S. federal government's Race to the Top initiative, much greater emphasis is being placed where Heath thinks it belongs: on helping each student make at least a year's worth of educational progress each year. That is part of the impetus for measuring teaching effectiveness on student growth. "People didn't become teachers simply to help a child pass a test; they did so to help a child grow," says Heath. Value-added assessments measure growth. So while a struggling fourth grader might still be shy of meeting a certain standard, a value-added model can measure how much improvement she made from the previous year. And a fifth grader, already well above the "proficient" cut of a state test, can be judged on whether he was challenged to maintain a well-above-proficient level. By adding a value-added model to teachers' reviews, educators get feedback on how much their class improved versus how many students simply passed a test.
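The difference between the proficiency view and the growth view can be made concrete with a small sketch. This is a hypothetical illustration with invented names, scores, and cutoff, using a simple mean-gain calculation; it is not the actual EVAAS methodology, which relies on far more sophisticated statistical models.

```python
# Hypothetical illustration: proficiency (pass rate) vs. growth (mean gain).
# Scores, names, and the cutoff are invented; EVAAS uses richer models.

PROFICIENT_CUT = 700  # hypothetical scale-score cutoff for "proficient"

students = [
    # (student, last year's score, this year's score)
    ("A", 640, 690),   # still below proficient, but grew 50 points
    ("B", 760, 762),   # well above proficient, but grew only 2 points
    ("C", 698, 710),   # crossed the proficiency bar
]

# Proficiency view: what share of students cleared the cut this year?
pass_rate = sum(s[2] >= PROFICIENT_CUT for s in students) / len(students)

# Growth view: how much did scores improve on average?
mean_gain = sum(s[2] - s[1] for s in students) / len(students)

print(f"Pass rate: {pass_rate:.0%}")
print(f"Mean gain: {mean_gain:.1f} points")
```

Note how the two views reward different students: student A fails the proficiency test but drives most of the growth, while student B passes easily yet barely improved.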
Educate teachers and administrators in the language and meaning of data
The language of statistical analysis, and of value-added methodologies in particular, differs from the language of education. Rather than water it down, Heath amps up training so teachers feel comfortable discussing “the stretch” on state tests, standard error, outliers and mean gain models. He makes sure they understand that a prediction of a student’s success on a key test, like the Ohio graduation exam, is a diagnostic starting point rather than a self-fulfilling prophecy. He encourages teachers to use data to develop a hypothesis and analyze the data to determine whether it is correct. “We have hundreds of teachers who can stand up and talk about a baseline and an observed score. And our principals understand as well. You can’t lead what you don’t know.”
Make sure teachers can conduct their own analysis
Heath makes sure his teachers have access to the data through the EVAAS portal that allows them to dig deeper without having to make a special request to an information technology specialist. In this way, teachers can develop a hypothesis and test it by analyzing the data. This process is ongoing at one Dublin City middle school that saw a drop in sixth-grade math scores. Teachers were quickly able to determine that the drop wasn’t associated with a specific subgroup – the economically disadvantaged, students with learning disabilities or even high achievers. The teachers are now collaborating to explore what factors might be at play, test those hypotheses and get the students back on track.
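The kind of subgroup check the middle-school teachers ran can be sketched in a few lines. The data, subgroup labels, and score values below are entirely hypothetical; the point is only to show how breaking a score change out by subgroup can rule subgroups out as the cause.

```python
# Hypothetical check of whether a score drop concentrates in one subgroup.
# All records and group labels are invented for illustration.
from collections import defaultdict

# (subgroup, last year's score, this year's score)
records = [
    ("econ_disadvantaged", 410, 402),
    ("econ_disadvantaged", 430, 421),
    ("learning_disability", 390, 383),
    ("high_achiever", 520, 511),
    ("other", 450, 441),
]

# Collect each subgroup's year-over-year gains.
gains = defaultdict(list)
for group, before, after in records:
    gains[group].append(after - before)

for group, g in sorted(gains.items()):
    print(f"{group}: mean gain {sum(g) / len(g):+.1f}")
```

If every subgroup shows a similar decline, as in this invented data, the cause likely lies elsewhere (curriculum, scheduling, test changes) rather than in any one group's performance, which is the conclusion the Dublin teachers reached before moving on to other hypotheses.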
Create a teacher-led training model
Heath trains a group from each school who then train the teachers on their grade level. “If there is a fourth-grade teacher in the building who does not know how to access her value-added data on the EVAAS site, she has someone in her school to talk to rather than calling the central office,” Heath explains. As teachers begin to have their performance tied to value-added assessments, Heath has arranged for his data-savvy teachers to meet with other teachers to answer questions and concerns.
Help teachers use data to craft differentiation plans
The original No Child Left Behind legislation that led to the testing boom shined a light on achievement gaps, particularly for the economically disadvantaged and students with learning disabilities. The ensuing emphasis on getting these subgroups up to proficiency frustrated those who felt the average and above-average students were ignored. Measuring growth brings attention to all students. But how do teachers help students who arrive at school at such varying levels? By using the data from summative assessments – and formative assessments administered before new class units – to profile student needs.
Heath encourages his teachers to get out of the mindset of teaching “fifth-grade math” and instead focus on “teaching math to the fifth graders in my classroom.” In a high-performing district like Dublin City, this has led to discussions about the “stretch” on Ohio state tests. Are there enough questions on the state tests to determine if a high-performing student is making progress? Heath helps teachers look at the test’s construction to see that the answer is yes. And he’s helped teachers analyze the results so they can see that top-scoring students are still struggling with questions that emphasize advanced critical thinking skills, meaning there is still room for improvement.
Take an analytical approach to helping the weakest students
Ohio requires a basic skills test to qualify for a high-school diploma. Advanced students begin taking – and passing – the test in 10th grade. The district used to exempt many 10th-grade students with disabilities from the requirement of passing the test for a diploma, thinking that even by 12th grade the student wouldn’t pass it, Heath explained. But SAS EVAAS for K-12 provides a projection of each student’s likelihood of passing the test that uses multiple years of previous test scores. And that projection has changed the discussion around issuing exemptions and, instead, encourages teachers to hold off and focus on helping struggling students gain the skills they need to earn the general diploma. “We’re trying to use these projections on the intervention side,” Heath explains.
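The general idea of projecting from multiple years of prior scores can be illustrated with a toy example. The linear-trend extrapolation, the scores, and the passing cut below are all invented for illustration; EVAAS projections rest on far richer statistical models than a single student's fitted line.

```python
# Toy projection from multiple years of prior scores via a least-squares
# trend line. Scores and the cutoff are hypothetical; this is NOT the
# EVAAS projection methodology, only a sketch of the general idea.

def project_next_score(scores):
    """Fit an ordinary least-squares line to yearly scores and
    extrapolate one year ahead."""
    n = len(scores)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(scores) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, scores))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * n  # value at the next time step

PASSING_CUT = 400  # hypothetical cutoff for the graduation test

history = [352, 368, 381]  # invented scores from three prior grades
projection = project_next_score(history)
print(f"Projected next-year score: {projection:.0f} (cut: {PASSING_CUT})")
```

In this invented case the projection lands just below the cut, which in the spirit of Heath's approach would argue for targeted intervention now rather than an early exemption.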
The results illustrated in this article are specific to the particular situations, business models, data input, and computing environments described herein. Each SAS customer’s experience is unique based on business and technical variables and all statements must be considered non-typical. Actual savings, results, and performance characteristics will vary depending on individual customer configurations and conditions. SAS does not guarantee or represent that every customer will achieve similar results. The only warranties for SAS products and services are those that are set forth in the express warranty statements in the written agreement for such products and services. Nothing herein should be construed as constituting an additional warranty. Customers have shared their successes with SAS as part of an agreed-upon contractual exchange or project success summarization following a successful implementation of SAS software. Brand and product names are trademarks of their respective companies.
Copyright © SAS Institute Inc. All Rights Reserved.