In other states, the year-end tests were marked by glitches, cyberattacks and hourlong delays. One school district threw out its results because the software was so unreliable. In another, all of the students had to start over when the programming shut down and didn’t save their responses.
But even after those issues arose — and despite clearly knowing about them — Utah signed a $44 million contract with that same testing company last spring to develop the state’s standardized exams, now called RISE. And the rollout hasn’t gone well.
As students here have tried to submit their tests, their computer screens have frozen and some haven’t been able to recover their work.
“This is clearly problematic,” said Darin Nielsen, the state’s assistant superintendent of student learning. “It hasn’t performed like we had hoped or expected. There are frustrations for many people across the state.”
The outages in Utah have delayed more than 18,000 public school students in completing their assessments this April and May. For one day, no one was able to take a science exam. On at least four others, testing was stopped entirely for some school districts.
The state has had to expand the testing window into June. Now, it’s questioning whether the scores it gets back will even be valid enough to use.
Will the data accurately determine whether students are improving? Can Utah rely on the results to assign annual grades for school performance? Is it fair for members of the state Board of Education to look at the numbers to assess which schools need more funding and which should potentially be closed?
Meanwhile, students have been getting so stressed out by the delays that some teachers say they’ve broken down in tears. And educators, too, say there’s growing pressure and fears that the tests might be used to evaluate their work regardless of the circumstances.
“It’s ridiculous and a nightmare,” said Chelsie Acosta, an eighth grade educator at Glendale Middle School. “It’s been a joke with as much as the computers have been down.”
A timeline of problems
When the state sought a new testing program in 2017, it received four proposals.
Of the finalists, Questar Assessment Inc. offered the cheapest price and ultimately won the contract. It would create a computer-adaptive test for Utah for roughly half the cost of the other top competitor, Pearson, which had budgeted its services at nearly $74 million for 10 years.
The Salt Lake Tribune obtained redacted copies of the bids through an open-records request.
The documents show that Questar reported in its proposal that it had previous issues in administering tests.
In the disclosure section, where bidding vendors were required to detail whether they have been accused of poor performance, Questar listed three separate incidents. The oldest dated to 2014. And the issues involved its educational testing services software — the same one used in Utah.
In Texas, Questar said, its parent company, ETS, paid $5 million for “the disruption of testing and reporting schedules.” In California, ETS lost more than $3 million for not providing the correct testing materials and not delivering scores in the set timeframe. And with the AP and SAT exams ETS offered through the College Board, the company had “minor to moderate failures” in administering the tests in the schools where it had contracts.
None of the other three bidders reported issues as numerous or as serious. Performance Matters, a Utah-based company, reported no complaints at all, though it had less experience.
Questar, which is based in Minnesota, offered a response to an inquiry from The Tribune via email: “We understand the issues our platform’s inconsistencies have caused with administering Utah’s RISE assessments, creating challenges for students, educators and families. We are currently implementing improvements to our platform to ensure Utah RISE testing and scoring run smoothly for the remainder of the 2019 school year and beyond.”
In the months before and after submitting its proposal to Utah in April 2017, though, the company also began seeing more problems arise in a number of other states where it contracts.
In spring 2017 in Missouri, high school students taking English and math tests scored significantly lower than in previous years. The declines reached nearly 10 percentage points, well beyond any expected margin of error, The Springfield News-Leader reported.
The state threw out the unreliable results, and Questar agreed to a credit of $750,000.
More than 9,000 tests administered by the company were determined to have been scored incorrectly in Tennessee at the same time, according to The Commercial Appeal.
Both of those stories appear in the results of a Google search for “Questar testing.” Utah Deputy Superintendent Scott Jones said it’s unclear if any of the selection committee members did additional research on the bidding companies. But Utah education leaders, an employee in Jones’ office confirmed, knew about those issues — and others that happened in early spring 2018 — before the state signed the contract with Questar.
Some Tennessee students were unable to log in for the exams and others were directed to incorrect questions. In January of last year — a month before Utah finished its deal with the company — the software was hacked in New York and Mississippi, releasing the private information of at least 50 students. In a few schools, kids lost all of the answers they had tried to submit and had to retake the exams.
The issues have continued, with what education officials called a “deliberate cyberattack” on Questar delaying assessments in seven states in 2018. And in 2019, Mississippi schools had to halt testing with the company because of an issue with Questar’s data center.
But Jones said the Utah school board received from Questar “written assurance that there were no more issues” and its software had been fixed before signing the deal.
The highest score
Annual standardized exams are required by federal law in grades three through eight (as well as at least once in high school) and focus on language arts, writing, science and math.
Utah switched testing vendors last year after previously contracting with American Institutes for Research to conduct what was then called the SAGE test. That company helped direct teachers to create all of the state’s questions from scratch.
But SAGE failed to gain traction here after it was implemented in 2013. More parents each year opted their students out of the test. That’s allowed under state law, though the school board has said it undermines the accuracy of using the exams for accountability rankings.
The company vied again for the state’s contract in the 2017 bidding but was cut first. (Performance Matters was cut later.)
In its proposal, American Institutes for Research wrote of its experience: “Utah has tested 100% online with virtually no disruptions for multiple years.” It was late once in giving the state its scores and paid $200,000 for that mistake. But it also helped Utah lease its questions to others, which earned the state $15 million.
Nielsen, the state’s assistant superintendent of student learning, confirmed that there were few bumps with the company, with most happening in the first year it was put in place. None of those matched what has happened with Questar this year. But, at the time, many board members wanted to go a different direction in the hope that a new company and the new RISE tests would encourage more parents to have their kids take them.
“Much thought and discussion went into this decision, and I believe this is in the best interests of both students and taxpayers in Utah,” Terryl Warner, a former member of the Utah Board of Education and a member of the selection committee, previously told The Tribune. She did not return requests this week for comment. And school board administrators did not immediately provide a requested list of the other committee members.
Pearson, the other finalist company, also has had experience in Utah. It administered a writing assessment here from 2000 to 2009 and won a contract last year to provide end-of-year testing in high schools — which have not seen similar glitches. Cost was the main reason it wasn’t selected. It had one technical outage listed in its application, but the details were redacted.
In the final matchup, Pearson earned 703.7 points from the selection committee and Questar got 825 out of 1,000. Pearson received higher marks for software development and technical skill.
The impact on teachers
Acosta, the eighth grade teacher at Glendale Middle School, has seen her students get stressed about standardized testing during each of the four years she’s been in the classroom there.
“Every year, I have students in tears,” she said. This year, with the delays, it’s been much worse.
The process has dragged on, students have been pulled out of her class at odd times and marginalized kids, in particular, she added, are struggling with the glitches because they fear their scores won’t be tallied fairly.
Acosta believes the state has put too much weight on fetching a good price for the tests and given little consideration to how they impact students. The fact that the officials knew about the potential for problems, she said, and went ahead with the contract anyway is “unbelievable.”
Yvonne Speckman, who teaches sixth grade at Buffalo Point Elementary, said the delays are more than just a scheduling problem — lesson plans get shuffled, and teaching time is cut down.
Because the computers froze when students were in the middle of tests, no other student could log on to the same device. And Questar couldn’t fix the problem without rebooting the systems — something that took 24 hours. Each glitch, then, pushed testing back another day. It happened five times.
“It’s a tight schedule, and you have a small window to test,” Speckman said. “It completely disrupts everything.”
Still, those outages are not defined as a breach of the state’s contract, so the Board of Education can’t cancel on Questar over the issues.
Instead, it can charge the company up to $50,000 each day there was a major disruption. That would amount to about $250,000 — about half of a percent of what Utah paid Questar for its services.
When Speckman’s class took the writing exam a few months ago, it wouldn’t save her students’ work. They came in the next day to finish, she said, and their essays were erased. At least five never finished because the program timed out.
Now she worries she could be held accountable for low scores that happened due to a faulty program; Speckman said she has worked hard to teach lessons this year, but it might not show in the results.
Other teachers she’s talked to at her school have said on the science exam students were asked to look at a graphic and answer questions — but there was no picture. Some kids there got locked out of math sections and couldn’t go back to finish.
“Teachers take these tests very seriously,” Speckman added. “We use this data. It’s important to us to guide our instruction. … But the idea that we’ve spent so much money on this, I just don’t know what was wrong with SAGE. It’s a huge frustration.”
The interruptions are likely to cost the state money, too.
State Superintendent of Public Instruction Sydnee Dickson announced earlier this month that Utah will hire a third party to review the tests, how accurate they are and how much the delay might have impacted scores.
“Unfortunately, the mounting issues with the operating platform created by Questar Assessment bring up many questions that will need to be answered,” she wrote in a letter to teachers and administrators. “We are less confident about overall accountability.”
The board is also expected to weigh its options at its June meeting, added Jones, the deputy superintendent. That could include trying to find another way to end the contract or finding a new vendor.
“The extent of it still remains to be seen,” he said, “as broken as it might have been.”
As of this week, nearly 900,000 tests had been submitted in Utah (students often take two or three each year). Roughly 100,000 are still outstanding.
Clarification: 2:20 p.m. June 4: This story has been updated to reflect that Questar's disclosures specified that its parent company, ETS, paid $5 million and lost more than $3 million for poor performance in Texas and California and had failures administering AP and SAT exams.