Are Children Learning

Explaining the ISTEP debate: 6 reasons why the test ballooned

PHOTO: Alan Petersime
Frustration over repeated ISTEP problems has lawmakers looking for solutions.

The Indiana legislature is moving fast to cut at least three hours from the state ISTEP after two weeks of sharp words and behind-the-scenes negotiations over its length. Lawmakers are expected to rush a bill through both houses for the governor to sign next week to make the changes.

But with kids just days away from taking the exam, some are still asking: what caused the blow-up?

The answer is a little complicated, but here are six reasons why ISTEP more than doubled in length from last year:

1. When standards change, tests must also change.

A big fight over Indiana’s academic standards last year ended when the state rapidly changed course and adopted quickly assembled new standards.

That disrupted a carefully coordinated plan, in place since 2010, for Indiana to adopt the Common Core standards along with 45 other states and use a shared exam whose results would be comparable across the country.

When Gov. Mike Pence and state Superintendent Glenda Ritz took office in 2013, Indiana had already adopted Common Core. Schools were putting it in place grade by grade, and a new Common Core-linked exam was scheduled to replace ISTEP this year.

But Pence was wary of the shared test, called the Partnership for the Assessment of Readiness for College and Careers, or PARCC, and in 2013 ordered the state to withdraw from the consortium creating it. Six months later, both Pence and Ritz supported the idea of Indiana dropping out of Common Core and endorsed new locally made standards that were adopted last April.

Like Common Core, Indiana’s new academic standards are more in-depth and ask students to do more analysis and critical thinking.

A test matching those expectations was needed in a hurry. Instead of taking years to adapt to the new standards and create the new exam, Indiana tried to do the whole process in a matter of months. That meant asking a lot of the 2015 ISTEP.

2. This year’s test had two extra goals — add questions to match the new standards and help create a test to replace ISTEP in 2016.

More difficult standards naturally meant Indiana needed a more difficult test. But there wasn’t time to completely overhaul ISTEP this year.

Instead, ISTEP was modified for this year to add several extra features. Many of the new standards were similar to the old standards, so many questions roughly matched the style and difficulty of past ISTEP exams. But new questions were added to also test students on new, tougher concepts included in the new standards, which were designed to make sure they graduate high school ready for college and careers.

The online version of ISTEP, for example, includes more advanced testing methods that ask kids to not only answer multiple-choice questions, but also answer questions in new ways, such as by dragging and dropping points on a graph or using drop-down menus.

Finally, this year’s ISTEP had one more job: Try out some questions that could be used on the 2016 exam.

But there was a problem. Indiana law requires release each year of all essay or short-answer test questions that are used in scoring. This would turn out to be a big factor in the length of the test.

3. A huge number of questions on this year’s test actually don’t count in a student’s score.

When test questions are released to the public they are effectively retired. They can never be used again on ISTEP.

So this year’s exam included two big sets of essay and short-answer questions: one group that counted toward each student’s score and had to be released, plus a large second set being tried out for use in 2016 that wouldn’t count.

Trying out questions is important. Test makers examine how students score on them to look for surprises. Among the questions they ask: Was the question harder or easier for students than predicted? Was there reason to believe it was confusing to children? Was there any evidence the question was unfair to certain groups of students?

Trying out enough questions to be able to make a completely new test for 2016 was the main factor that caused what is normally a six-hour test to swell to more than 12 hours this year. All along, however, this was intended as a one-year problem. Future state exams are expected to be only slightly longer than the six-hour tests of the past.

The legislature appears poised to waive for one year the requirement that all essay and short-answer questions be released. This would allow some of this year’s questions to be reused so there could be far fewer extra questions that don’t count.

4. A longer test means more school days devoted to testing.

Indiana students don’t take all of ISTEP at once. They take sections of the exam in smaller doses over several days.

At its Feb. 4 meeting, the state board increased the number of days schools are allowed to use to give the test. The tests will be given over the course of almost a month, beginning Feb. 25 and ending in late March, followed by another set of testing days over three weeks at the end of April into May.

Schools can choose how to split up the parts of the test. Students might take just one section per day or do more depending on what teachers and principals decide. Danielle Shockey, the state’s deputy superintendent, said a testing day could take many shapes. In some schools, students take one 35-minute test section each day; in others, students spend an hour each day on testing; still other schools may do more.

“They have a long window of time,” Shockey said. “They can take one session a day if they so choose. It’s a local choice.”

5. Test makers had to consider that ISTEP plays a critical role in school A-to-F grades and teacher evaluation ratings.

ISTEP is used to measure two things: how much students know of the content they were expected to learn this year, and how much they’ve improved from a previous year. Both factor into how Indiana measures the quality of schools with its A-to-F grading system, as well as how it evaluates teachers.

To determine a school’s A-to-F grade, the state considers both the percentage of students who pass ISTEP and how much students improved from last year. For teachers, the state expects to see their students’ test scores improve over the prior year.

When tests are roughly the same each year — measuring the same standards and using similar types of questions — it is easier to gauge how much students improved from the prior year. But when the standards change and the questions are crafted differently, test makers have to add extra questions to help determine each student’s improvement from the last test.

This spring’s test will include a few questions in English and math that are specifically designed to estimate roughly on what grade level each student best fits. For example, a fourth grade test might include a few third grade level questions and a few fifth grade level questions. Some students might do well on only the third grade questions but poorly on harder questions. Others might do well on all the questions, even the more challenging fifth grade questions.

Those extra questions help the test makers better estimate whether a student improved a little, a lot or not at all over the prior year. They also lengthen the test, but only by minutes, not hours, said Michele Walker, testing director for the education department. The legislature agreed they were worth keeping, so those questions will remain under the plan to shorten ISTEP.

6. Then, there’s the social studies question.

The federal No Child Left Behind Act, signed into law by President Bush in 2002, requires states to test students in English and math each year in grades 3 to 8, and once in high school, and also in science once during elementary, middle and high school.

Noticeably absent? Social studies.

Although Indiana’s social studies ISTEP test is given only to fifth- and seventh-graders each year, accounting for about an hour of testing for those grades, Pence’s test consultants recommended cutting the subject to further reduce testing time. Because the social studies test is required only by state law, not federal law, the legislature could make an exception for this year.

State board members were divided on this idea. Some worried that it would send the message that social studies is not important. Others argued that one hour for just two grades doesn’t add much test-taking time.

But the legislature liked the idea of reducing test time further this way, so the Indiana Department of Education has told schools to expect the social studies exam to be optional this year. That means some students will take it, if the school decides they should, and others will be allowed to drop it for this year only.

union power

Gutting Wisconsin teachers unions hurt students, study finds

PHOTO: Creative Commons / Michael Vadon
Wisconsin Gov. Scott Walker in 2015.

The high-profile fight to limit union power was replete with drama — including a recall election and state legislators fleeing to neighboring states.

In the 2011 battle in Wisconsin, Republican Gov. Scott Walker ultimately came out the victor. The controversial law passed, Walker won the recall, and the Democratic-aligned unions have lost much of their power.

But new research points to other losers in the fight: students in the state’s already struggling schools.

The first study to assess how Wisconsin’s high-profile weakening of unions, particularly teachers unions, affected students finds that it led to a substantial decline in test scores.

The findings come as the U.S. Supreme Court is set to hear arguments for a case, known as Janus, that could dramatically scale back union power across the country — essentially taking aspects of the Wisconsin model national. And they give credence to concerns from unions and their defenders that weakening teachers’ bargaining power would ultimately make schools worse, not better.

A report from the left-leaning Center for American Progress released Wednesday highlights this research — and the fact that teacher pay and average experience declined in the wake of the law, known as Act 10 — to argue that weakening unions ultimately harms schools.

“Those concerned about the quality of public education — and of all public services — should understand that Wisconsin’s Act 10 and associated budget cuts have not had the positive impact on education that its proponents claimed it would,” the CAP report argues.

Still, the research, which has not been formally peer-reviewed, only assesses the short-term impact of Wisconsin’s law. It adds to a complicated set of research findings on unions that doesn’t render a clear verdict.

Short-term effect in Wisconsin is negative, especially for low-achieving schools

The new research looks at the effects of Wisconsin Act 10, which became law in 2011 and severely limited the scope of collective bargaining and allowed members to opt out of unions.

The paper’s author, Jason Baron, took advantage of what was essentially a natural experiment set up by the law. Act 10 did not affect all school districts at once — a handful of school districts were allowed to maintain union rules until their existing contract expired up to two years later. That helped isolate the immediate impact of the law.

Baron found that weakening unions led to declines in test scores, particularly in math and science. The effects were fairly large, comparable to sharply increasing class sizes. And the harm was not evenly distributed: Schools that started out furthest behind were hurt the most, while higher achieving schools saw no impact.

Other research may help explain why.

The law led to big cuts in teacher compensation, particularly for veteran teachers and especially in health insurance and retirement benefits, according to one paper. There was also a spike in teacher retirement immediately following the law’s passage.

As compensation drops, it may become harder for districts to recruit and keep teachers. An increase in retirements also reduces average teacher experience, which has been linked to effectiveness.

Another study found that some Wisconsin districts moved from a single salary schedule to a performance-based pay system after Act 10’s passage. Those performance pay systems were more likely to be adopted by higher-achieving districts, potentially allowing them to lure effective teachers away from struggling schools.

“Following Act 10, high-performing schools filled vacancies from teacher retirements by poaching high-quality teachers from low-performing schools through attractive compensation schemes,” the paper concludes. So while those retirements might have hit all districts equally, high-performing districts were better able to make up the difference — at the expense of low-performing schools.

There is one study that complicates the narrative in Wisconsin. As retirements spiked, it found that academic achievement actually increased in the grades that teachers left. It’s not clear what explains this.

The larger question of how teachers unions affect learning remains up for debate

A number of other recent studies have examined the relationship between teachers unions and student outcomes outside of Wisconsin. The results aren’t consistent, but the trend has been more positive for unions of late. A caveat: Some of these studies have not been published in peer-reviewed academic journals.

  • On recent efforts to weaken unions: Research in Tennessee found that weakening unions led to a drop in teacher pay, but had no effect on student test scores. But a study of four states, including Wisconsin, that recently weakened unions found evidence of reduced teacher quality as a result.
  • On what happens when charter schools unionize: Two studies in California came to differing conclusions. One found that when charters unionize, student test scores go up, but the other showed no impact.
  • On the initial rise of collective bargaining: Another paper finds that students who went to schools where districts negotiated with unions earned less money and were more likely to be unemployed as adults. But this study looks at a fairly old data set — examining those who attended schools between 1965 and 1992.

Meanwhile, it’s not clear if any of this research is likely to influence the Supreme Court, as it considers the Janus case that could make life more difficult for unions. Last month, Chief Justice John Roberts called empirical studies on political gerrymandering “sociological gobbledygook.”

study up

Trump education nominee pleads ignorance about high-profile voucher studies showing negative results

At his confirmation hearing, Mick Zais, the nominee to be second-in-command at the Department of Education, said that he was not aware of high-profile studies showing that school vouchers can hurt student achievement.

It was a remarkable acknowledgement by Zais, who said he supports vouchers and would report to Education Secretary Betsy DeVos, whose signature issue has been expanding publicly funded private school choice programs.

The issue was raised by Minnesota Sen. Al Franken, who asked whether Zais, who was previously the South Carolina schools chief, was “aware of the research on the impact of vouchers on student achievement.”

He replied: “To the best of my knowledge, whenever we give parents an opportunity to choose a school that’s a good fit for their child the result is improved outcomes.”

Franken responded, “No, that’s not true. The academic outcomes for students who used vouchers to attend private school are actually quite abysmal.”

Franken proceeded to mention recent studies from Louisiana, Indiana, Ohio, and Washington, DC that showed declines in test scores after students moved to private schools with vouchers.

Zais responded: “Senator, I was unaware of those studies that you cited.”

Franken then asked if Zais’s initial response expressing confidence in school choice was anecdotal, and Zais said that it was.

What’s surprising about Zais’s response is that these studies were not just published in dusty academic journals, but received substantial media attention, including in the New York Times and Washington Post (and Chalkbeat). They’ve also sparked significant debate, including among voucher supporters, who have argued against judging voucher programs based on short-term test scores.

Meanwhile, it’s worth noting that the research confusion was a bipartisan affair at Wednesday’s confirmation hearing.

Although Franken, who referred to a New York Times article on voucher research in his question, was broadly accurate in his description of the recent studies, he said that a DC voucher study showed “significantly lower math and reading scores”; in fact, the results were only statistically significant in math, not reading.

Franken also did not mention evidence that the initial negative effects abated in later years in Indiana and for some students in Louisiana, or discuss recent research linking Florida’s voucher-style tax credit program to higher student graduation rates.

In a separate exchange, Washington Sen. Patty Murray grilled Jim Blew — the administration’s nominee for assistant secretary for planning, evaluation, and policy development — on the performance of Michigan’s charter schools. Murray said that DeVos was “one of the architects of Detroit’s charter school system,” describing the results as “disastrous for children.”

Blew disputed this: “The characterization of the charter school sector in Detroit as being a disaster seems unfair. The most reliable studies are saying, indeed, the charter school students outperform the district students.”

Murray responded: “Actually, Michigan’s achievement rates have plummeted for all kids. In addition, charter schools in Michigan are performing worse than traditional public schools.”

(Murray may be referring to an Education Trust analysis showing that Michigan’s rankings on NAEP exams have fallen relative to other states. The analysis can’t show why, or whether school choice policies are the culprit, as some have claimed.)

Blew answered: “The most reliable studies do show that the charter school students in Detroit outperform their peers in the district schools.”

Murray: “I would like to see that because that’s not the data that we have.”

Blew: “I will be happy to get it for you; it’s done by the Stanford CREDO operation.”

Murray: “I’m not aware of that organization.”

CREDO, a Stanford-based research institution, has conducted among the most widely publicized — and sometimes disputed — studies of charter schools. The group’s research on Detroit does show that the city’s charter students were outperforming similar students in district schools, though the city’s students are among the lowest-performing in the country on national tests.