
Commentary: Think twice before accepting iffy data from RISE tests

(Rick Egan | The Salt Lake Tribune) Oscar Gonzolez practices the SAGE test with his 3rd grade class, at Elk Run Elementary school, Wednesday, April 22, 2015.

Last month at the Utah Capitol, the Legislature’s Public Education Appropriations Subcommittee discussed Utah’s end-of-year RISE testing debacle.

As an elementary school teacher married to a fellow educator, I have spent countless hours (at school and at home) fuming over the problems associated with the RISE testing platform and its parent company, Questar Assessment.

Like many educators throughout the state, I cheered the news earlier this month that the $44 million contract between the Utah State Board of Education and Questar Assessment had been terminated.

Often, it seems that educator feedback is lost in the ether. Attending the subcommittee meeting on RISE testing provided proof that decision-makers are, in fact, listening. I encourage them to continue listening to educators when it comes to a key question posed by a lawmaker at their meeting: Is the RISE data usable?

In addressing that question, Utah’s education leaders must decide if the 2019 RISE data can be used for determining school letter grades, labeling turnaround status and analyzing teacher performance.

During the subcommittee meeting, experts from the USBE testified that the RISE data may or may not be reliable. Determining the data’s accuracy will likely require months of deep data dives at levels well above my pay grade. Even without seeing the numbers, however, I strongly urge decision-makers to consider the impact of this year’s chaotic testing environment on the results.

While data may be obtainable from Questar, the negative impact of a disorganized testing season on student performance cannot be quantified in the reports.

As leaders consider the usability of this year’s data, I encourage them to think through the 2019 testing season from a student’s perspective. By May, all that stands between students and summer break is three or four weeks of intensive testing. Teachers have alerted parents to the testing dates and times, and families have rescheduled doctor’s appointments to avoid absences. Each class is ready to show what they know: pencils are sharpened, scrap paper sits within reach, and the computers are charged. Everyone’s test anxiety is at peak levels, but teachers are confident in their students’ growth throughout the year.

This year, however, when the moment to test arrived, RISE created numerous roadblocks: non-rostered class lists, log-in issues, technical glitches and unavailable tests. Each hurdle along the way eroded the students’ confidence in the system and impeded their ability to do their best.

Any data collected by the RISE test lost credibility in the eyes of teachers and students after days on end of frustrating sessions and negative (albeit accurate) news coverage. If leaders try to use the data for key decisions, they will do our students a disservice.

The mere ability to create data reports based on the RISE testing session does not mean we should use them the way we would in a “normal” testing year. The endless frustrations and hurdles likely had a significant impact on students’ performance, and poor data can lead to poor decisions.

I am confident that our lawmakers have student success at the center of their decisions, and I hope the challenges faced by teachers and students will inform how our state uses (or doesn’t use) the RISE data.

Otherwise, I fear our students will suffer the consequences of a faulty test for years to come.

Rachel Wright

Rachel Wright has been teaching elementary school since 2014. She is a Utah Teacher Fellow with the Hope Street Group, working to elevate the voice of educators and work with policymakers to enhance education for Utah’s students. Opinions are her own. Follow her on Twitter at @UtahRach.