Are Children Learning

How Indiana’s A-F rules created a two-tiered system that benefits innovation schools

PHOTO: Kelly Wilkinson / The Star
IPS School 79 has among the lowest per-pupil funding in the district.

Cold Spring School and School 79 were standouts on the recent ISTEP test. At both schools, more kids passed the state exam than the average for Indianapolis Public Schools, and their students made solid gains over last year.

So why did Cold Spring earn an A from the state while School 79 received a C?

It’s largely because Indiana lawmakers decided to judge some schools by a more generous yardstick than others.

Most elementary and middle schools are graded based on two factors: how their students score on state tests, and how much their scores improved. New schools and schools that join the IPS innovation network can opt to be graded for three years based only on the second measure, known as growth.

Advocates say the two-tiered system makes sense because schools shouldn’t be held accountable for the low passing rates of students they have only just begun educating. But in practice, the policy benefits charter and innovation schools, which enjoy strong support from Republican lawmakers.

The policy raises the question of whether grades that were supposed to be easy for parents to understand have become too distorted to be clear.

“When you start evaluating otherwise identical schools using different measures … that is not informative,” said Marcus Winters, a Boston University researcher who has found benefits to grading schools. “It’s hiding information.”

Because Cold Spring became an innovation school last year, it was graded based on growth alone. If it were graded using the same rules as School 79, it also would’ve received a C from the state. That’s a huge improvement over the F it received last year, but it’s not as remarkable as the A that appears on its report card.

Cold Spring is not unique. Six of the eight innovation schools graded received As from the state. But only one innovation school — Phalen Leadership Academy at School 93 — would’ve gotten that grade under the rules used for grading other schools.

At 18 traditional neighborhood and magnet schools in IPS, students made large enough gains on the state test that the schools would’ve received top marks if they were innovation schools. But instead, they were given Bs, Cs, Ds and even an F. (Years of repeated low letter grades can trigger state intervention or takeover.)

The disparities have led to backlash from education advocates who are skeptical of partnering with outside operators at innovation schools. IPS leaders began creating innovation schools three years ago as a way to turn around chronically struggling schools, give more freedom to successful principals and pull charter schools under the district umbrella. The schools are managed by outside nonprofit or charter partners, and their teachers are not part of the district union.

Education advocate and lawyer MaryAnn Schlegel Ruegger pointed out that this grading quirk can have cascading effects that stack the deck in favor of outsourcing school management.

“The favorable treatment on state grades (which translate into eligibility for state and federal grants and higher ratings on school and real estate marketing sites like Great Schools and Niche and Zillow, and bragging rights to parents on the new IPS/charter school combined enrollment assignment company Enroll Indy) is the incentive to convince more financially struggling school districts throughout the state to do the same thing,” she wrote on Facebook.

Jon Valant, a fellow at the Brookings Institution who studies school choice, said that the inconsistency seems troubling. But there are benefits to judging schools by growth because operators are not penalized for restarting schools that have chronically low passing rates.

“In principle, it’s growth that is the sort of true reflection of what schools are actually doing,” he said.

Many innovation schools are making real progress when it comes to student scores on state tests. But even schools that are not making gains benefit from the system. For example, at one innovation school, Kindezi Academy at School 69, passing rates and student growth fell from 2016 to 2017, but the letter grade nonetheless rose from an F to a D. Because it became an innovation school last year, its low passing rate is no longer pulling the grade down.

The growth-only grading scheme was also used at two IPS schools that were considered new: Center for Inquiry at School 70, which received an A, and the now-closed Arlington Middle School, which nonetheless received an F.

The rule change for grading innovation schools had wide support when lawmakers approved it in 2016, including from IPS Superintendent Lewis Ferebee, who said he wanted innovation schools to get “a fresh start.”

Rep. Bob Behning, the Indianapolis Republican who authored the innovation school legislation, said he thought innovation schools should have the same options that exist for other new schools.

“Innovation network schools generally are new schools or reconfigured schools, so it’s not just schools that have changed their names,” Behning said. “So we decided that it made sense because we allowed charters to have that same flexibility.”

But the rules don’t just apply to new or restarted schools — they apply to any school that joins the innovation network. As a result, even schools like Cold Spring and KIPP Indy, which were not restarted when they became innovation schools, are treated like they are brand new. Like Cold Spring, KIPP got an A under the growth-only model — after years of C and D grades.

The two-tiered system could be short-lived. Behning said he anticipates that the grading system will change in several ways as the state overhauls the way it evaluates schools under the federal Every Student Succeeds Act.

“I think the differences between the combined letter grade and the growth-only (grade) will hopefully be mitigated in the new model, so it won’t have such stark differences,” Behning said. “The goal wasn’t just to give them a pass and not to have to hold them to the same level of accountability.”

Here is the full list of the grades new and innovation schools in Indianapolis Public Schools would have received if they were graded based on growth and proficiency.

Surprising report

EXCLUSIVE: Did online snafus skew Tennessee test scores? Analysts say not much

PHOTO: TN.gov
Education Commissioner Candice McQueen will release the results of Tennessee's 2017-18 standardized test this week, but the reliability of those TNReady scores has been in question since this spring's problem-plagued administration of the online exam.

An independent analysis of the technical problems that disrupted Tennessee’s online testing program this spring is challenging the widespread belief that student scores were significantly tainted as a result.

Education Commissioner Candice McQueen said Wednesday that the disruptions to computerized testing had “small to no impact” on scores, based on a monthlong analysis by the Human Resources Research Organization, or HumRRO. The Virginia-based technical group has expertise in psychometrics, the science behind educational assessments.

“We do believe these are valid, reliable scores,” McQueen told Chalkbeat on the eve of releasing state- and district-level scores for TNReady, the state’s standardized test in its third year.


The state hired the research group to scrutinize several issues, including whether frequent online testing snafus made this year’s results unreliable. For instance, on at least seven days of the three-week testing window, students statewide reported problems logging in, staying online, and submitting their tests — issues that eventually prompted the Legislature to roll back the importance of scores in students’ final grades, teacher evaluations, and school accountability systems.

But the analysis did not reveal a dramatic impact.

“For students who experienced the disruption, the analysis did not find any systematic effect on test scores that resulted from lapses in time between signing in and submitting their tests,” McQueen told Chalkbeat.

There was, however, a “small but consistent effect” if a student had to log on multiple times in order to complete the test, she said.

“When I say small, we’re talking about an impact that would be a handful of scale score points out of, say, a possible 200 or 250 points,” McQueen said.

Analysts found some differences in test score averages between 2017 and 2018 but concluded they were not due to the technical disruptions.

“Plausible explanations could be the students just didn’t know the (academic) standards as well and just didn’t do as well on the test,” McQueen said. “Or perhaps they were less motivated after learning that their scores would not count in their final grades after the legislation passed. … The motivation of our students is an unknown we just can’t quantify. We can’t get in their minds on motivation.”

About half of the 600,000 students who took TNReady this year tested with computers, and the other half used paper materials in the state’s transition to online exams. Those testing online included all high school students.

Out of about 502,000 end-of-course tests administered to high schoolers, educators filed about 7,600 irregularity reports – about 1.4 percent – related to problems with test administration, which automatically invalidated those results.

The state asked the analysts specifically to look at the irregularity reports for patterns that could be cause for concern, such as demographic shifts or excessive use of invalidations. They found none.

TNReady headaches started on April 16 – the first day of testing – when students struggled to log on. More problems emerged during the weeks that followed until technicians finally traced the issues to a combination of “bugs in the software” and the slowness of a computerized tool that helps students in need of audible instructions. At one point, officials with testing company Questar blamed a possible cyberattack for shutting down its online platform, but state investigators later dismissed that theory.

While this year’s scores officially are mostly inconsequential, McQueen emphasized Wednesday that the results are still valuable for understanding student performance and growth and analyzing the effectiveness of classroom instruction across Tennessee.

“TNReady scores should be looked at just like any data point in the scheme of multiple data points,” she said. “That’s how we talk about this every year. But it’s an important data point.”

heads up

Tennessee will release TNReady test scores on Thursday. Here are five things to know.

PHOTO: Getty Images/Kali9

When Tennessee unveils its latest standardized test scores on Thursday, the results won’t count for much.

Technical problems marred the return to statewide online testing this spring, prompting the passage of several emergency state laws that rendered this year’s TNReady scores mostly inconsequential. As a result, poor results can’t be used to hold students, educators, or schools accountable — for instance, firing a teacher or taking over a struggling school through the state’s Achievement School District.

But good or bad, the scores still can be useful, say teachers like Josh Rutherford, whose 11th-grade students were among those who experienced frequent online testing interruptions in April.

“There are things we can learn from the data,” said Rutherford, who teaches English at Houston County High School. “I think it would be unprofessional to simply dismiss this year’s scores.”

Heading into this week’s data dump, here are five things to know:

1. This will be the biggest single-day release of state scores since the TNReady era began three years ago.

Anyone with internet access will be able to view state- and district-level scores for math, English, and science for grades 3-12. And more scores will come later. School-by-school data will be released publicly in a few weeks. In addition, Tennessee will unveil the results of its new social studies test this fall after setting the thresholds for what constitutes passing scores at each grade level.

2. Still, this year’s results are anticlimactic.

There are two major reasons. First, many educators and parents question the scores’ reliability due to days of online testing headaches. They also worry that high school students stopped trying after legislators stepped in to say the scores don’t necessarily have to count in final grades. Second, because the scores won’t carry their intended weight, the stakes are lower this year. For instance, teachers have the option of nullifying their evaluation scores. And the state also won’t give each school an A-F grade this fall as originally planned. TNReady scores were supposed to be incorporated into both of those accountability measures.

3. The state is looking into the reliability of the online test scores.

In addition to an internal review by the Education Department, the state commissioned an independent analysis by the Human Resources Research Organization. Researchers for the Virginia-based technical group studied the impact of Tennessee’s online interruptions by looking into testing irregularity reports filed in schools and by scrutinizing variances from year to year and school to school, among other things. (Editor’s note: After this story’s initial publication, Education Commissioner Candice McQueen revealed what the analysis found. Here’s that story.)

4. The reliability of paper-and-pencil test scores is less in question.

Only about half of Tennessee’s 600,000 students who took TNReady this year tested on computers. The other half — in grades 3-5 and many students in grades 6-8 — took the exams the old-fashioned way. Though there were some complaints related to paper testing too, state officials say they’re confident about those results. Even so, the Legislature made no distinction between the online and paper administrations of TNReady when it ordered that scores only count if they benefit students, teachers, and schools.

5. Ultimately, districts and school communities will decide how to use this year’s data.

Even within the same district, it wasn’t uncommon for one school to experience online problems and another to enjoy a much smoother testing experience. “Every district was impacted differently,” said Dale Lynch, executive director of the state superintendents organization. “It’s up to the local district to look at the data and make decisions based on those local experiences.”

District leaders have been reviewing the embargoed scores for several weeks, and they’ll share them with teachers in the days and weeks ahead. As for families, parents can ask to see their students’ individual score reports so they can learn from this year’s results, too. Districts distribute those reports in different ways, but they’re fair game beginning Thursday.