broken promise?

Common Core tests were supposed to usher in a new era of comparing America’s schools. What happened?

PHOTO: Pete Souza / White House
President Barack Obama and former Secretary of Education Arne Duncan in 2010.

How well does an elementary school in Maryland stack up to one in New Jersey? Do California’s eighth graders make faster academic gains than their peers in Connecticut?

In 2010, then-Secretary of Education Arne Duncan made the case for common state tests that would allow parents and educators to find out — and predicted that the comparisons would lead to dramatic policy changes.

“For the first time, it will be possible for parents and school leaders to assess and compare in detail how students in their state are doing compared to students in other states,” Duncan said. “That transparency, and the honest dialogue it will create, will drive school reform to a whole new level.” It was a heady moment: Most states had signed on to at least one of the two cross-state testing groups, PARCC and Smarter Balanced.

Though their numbers have since dwindled substantially, the two groups still count over 20 members between them. But seven years later, it remains difficult to make detailed comparisons across states, as a potent mix of technical challenges, privacy concerns, and political calculations has kept the data relatively siloed. And there’s little evidence that the common tests have pushed states to compare notes or change course.

“This is one unkept promise [of] the common assessments,” said Mike Petrilli, president of the Fordham Institute, a conservative think tank that has backed the Common Core standards.

“I’ve been surprised that there haven’t been more attempts to compare PARCC and Smarter Balanced states,” said Chad Aldeman of Bellwether Education Partners.

What comparisons are available? PARCC publishes a PDF document with scores from different states, based on publicly available information. “We have more states than ever administering tests that will allow for comparability across states,” said Arthur Vanderveen, the CEO of New Meridian, the nonprofit that now manages PARCC. “That data is all public and available. I think the vision really has been realized.”

Smarter Balanced does not publish any data comparing states, though those scores could be collected from each participating state’s website.

The presentation of the data stands in contrast to the National Assessment of Educational Progress, a test taken by a sample of students nationwide. NAEP has an interactive site that allows users to compare state data. No such dashboards exist for Smarter Balanced or PARCC, though both tests could offer more granular comparisons of schools and students.

Tony Alpert, the head of Smarter Balanced, says a centralized website would be difficult to create and potentially confusing, since states report their results in slightly different ways.

“The notion of comparable is really complicated,” he said. Nitty-gritty issues like when a test is administered during the school year, or whether a state allows students who are learning English to use translation glossaries on the math exam, can make what seems like a black and white question — are scores comparable? — more gray, he said.

“Early on our states directed us not to provide a public website of the nature you describe, and [decided that] each state would be responsible for producing their results,” said Alpert.

Neither testing group publishes any growth scores across states — that is, how much students in one state are improving relative to students who took the test elsewhere. Many experts say growth scores are a better gauge of school quality, since they are less closely linked to student demographics. (A number of the states in both consortia do calculate growth, but only within their state.)

“I’m not sure why we would do that,” Alpert of Smarter Balanced said. States “haven’t requested that we create a common growth model across all states — and our work is directed by our members.”

That gets at a larger issue of who controls this data. For privacy reasons, student scores are the property not of the consortia but of individual states. PARCC and Smarter Balanced are also run by the participating states, which means there may be resistance to comparisons — especially ones that might be unflattering.

“The consortium doesn’t want to be in the business of ranking its members,” said Morgan Polikoff, a professor at the University of Southern California who has studied the PARCC and Smarter Balanced tests. “Except for the ones that are doing well, [states] don’t have any political incentive to want to release the results.”

As for PARCC, a testing expert who works directly with the consortium said PARCC has made it possible to compare growth across states — the results just haven’t been released.

“Those [growth scores] have been calculated, but it’s very surprising to me that they’re not interested in making them public,” said Scott Marion, the executive director of the Center for Assessment. This information would allow for comparisons not just of student proficiency across states, but of how much students improved, on average, from one state to the next.

Vanderveen confirmed that states have the information needed to calculate growth across states.

But it’s unclear if any have done so or published the scores.

Chalkbeat asked all PARCC states. Colorado, Illinois, and Maryland responded that they do not have such data; other states have not yet responded to public records requests.

Vanderveen said that states are more interested in whether students are meeting an absolute bar for performance than in making comparisons to other states. “A relative measure against how other students are performing in other states — and clearly states have decided — that is of less value,” he said.

The cross-state data could be a gold mine for researchers, who are often limited to single states where officials are most forthcoming with data. But both Polikoff and Andrew Ho, a Harvard professor and testing expert, say they have seen little research that taps into the testing data across states, perhaps because getting state-by-state permission remains difficult.

The difficulty of making comparisons across states and districts led Ho and Stanford researcher Sean Reardon to create their own solution: an entirely separate database for comparing test scores, including growth, across districts in all 50 states. But it’s still not as detailed as the consortia exams.

“One of the promises of the Common Core data was that you might be able to do student-level [growth] models for schools across different states and our data cannot do that,” Ho said.

heads up

Tennessee will release TNReady test scores on Thursday. Here are five things to know.

PHOTO: Getty Images/Kali9

When Tennessee unveils its latest standardized test scores on Thursday, the results won’t count for much.

Technical problems marred the return to statewide online testing this spring, prompting the passage of several emergency state laws that rendered this year’s TNReady scores mostly inconsequential. As a result, poor results can’t be used to hold students, educators, or schools accountable — for instance, firing a teacher or taking over a struggling school through the state’s Achievement School District.

But good or bad, the scores still can be useful, say teachers like Josh Rutherford, whose 11th-grade students were among those who experienced frequent online testing interruptions in April.

“There are things we can learn from the data,” said Rutherford, who teaches English at Houston County High School. “I think it would be unprofessional to simply dismiss this year’s scores.”

Heading into this week’s data dump, here are five things to know:

1. This will be the biggest single-day release of state scores since the TNReady era began three years ago.

Anyone with internet access will be able to view state- and district-level scores for math, English, and science for grades 3-12. And more scores will come later. School-by-school data will be released publicly in a few weeks. In addition, Tennessee will unveil the results of its new social studies test this fall after setting the thresholds for what constitutes passing scores at each grade level.

2. Still, this year’s results are anticlimactic.

There are two major reasons. First, many educators and parents question the scores’ reliability due to days of online testing headaches. They also worry that high school students stopped trying after legislators stepped in to say the scores don’t necessarily have to count in final grades. Second, because the scores won’t carry their intended weight, the stakes are lower this year. For instance, teachers have the option of nullifying their evaluation scores. And the state also won’t give each school an A-F grade this fall as originally planned. TNReady scores were supposed to be incorporated into both of those accountability measures.

3. The state is looking into the reliability of the online test scores.

In addition to an internal review by the Education Department, the state commissioned an independent analysis by the Human Resources Research Organization. Researchers for the Virginia-based technical group studied the impact of Tennessee’s online interruptions by looking into testing irregularity reports filed in schools and by scrutinizing variances from year to year and school to school, among other things.

4. The reliability of paper-and-pencil test scores is not in question.

Only about half of Tennessee’s 600,000 students who took TNReady this year tested on computers. The other half — in grades 3-5 and many students in grades 6-8 — took the exams the old-fashioned way. Though there were some complaints related to paper testing too, state officials say they’re confident about those results. Even so, the Legislature made no distinction between the online and paper administrations of TNReady when it ordered that scores only count if they benefit students, teachers, and schools.

5. Ultimately, districts and school communities will decide how to use this year’s data.

Even within the same district, it wasn’t uncommon for one school to experience online problems and another to enjoy a much smoother testing experience. “Every district was impacted differently,” said Dale Lynch, executive director of the state superintendents organization. “It’s up to the local district to look at the data and make decisions based on those local experiences.”

District leaders have been reviewing the embargoed scores for several weeks, and they’ll share them with teachers in the days and weeks ahead. As for families, parents can ask to see their students’ individual score reports so they can learn from this year’s results, too. Districts distribute those reports in different ways, but they’re fair game beginning Thursday.

Sharing Stories

Tell us your stories about children with special needs in Detroit

PHOTO: Patrick Wall

Parents of students with special needs face difficult challenges when trying to get services for their children. Understanding their children’s rights, getting them evaluated and properly diagnosed, and creating an educational plan are among the many issues families face.

Chalkbeat Detroit wants to hear more about those issues to help inform our coverage. We are kicking off a series of conversations called a “listening tour” to discuss your concerns, and our first meeting will focus on children with special needs and disabilities. We’re partnering with the Detroit Parent Network as they look for solutions and better ways to support parents.

Our listening tour, combined with similar events in other communities Chalkbeat serves, will continue throughout this year on a variety of topics. In these meetings, we’ll look to readers, parents, educators, and students to help us know what questions we should ask, and we’ll publish stories from people who feel comfortable having their stories told. We hope you’ll share your stories and explore solutions to the challenges parents face.

Our special education listening tour discussion will take place from 5:30-7:30 p.m. on Tuesday, July 24, at the Detroit Parent Network headquarters, 726 Lothrop St., Detroit.

As our series continues, we’ll meet at locations around the city to hear stories and experiences parents have while navigating the complexities of getting children the education and services they deserve.

Next week’s event includes a panel discussion with parents of children with special needs, responses from parent advocates, and an open discussion with audience members.

Those who are uncomfortable sharing stories publicly will have a chance to tell a personal story on an audio recorder in a private room, or will be interviewed by a Chalkbeat Detroit reporter privately.

The event is free and open to anyone who wants to attend, but reservations are required because space is limited. To register, complete this form, call 313-309-8100 or email frontdesk@detroitparentnetwork.org.

If you can’t make our event, but have a story to share, send an email to tips.detroit@chalkbeat.org, or call or send a text message to 313-404-0692.

Stay tuned for more information about listening tour stops, topics, and locations.