
The Challenge of Open Data and Metrics

One promise of open data is its ability to inform citizens and consumers about the quality of local services. At the Gov 2.0 Summit yesterday the US Department of Health and Human Services announced it was releasing data on hospitals, nursing homes and clinics in the hope that developers will create applications that show citizens and consumers how their local hospital stacks up against others. In short, how good, or even how safe, is their local hospital?

In Canada we already have some experience with this type of measurement. The Fraser Institute publishes an annual report card of school performance in Alberta, BC, Ontario and Washington. (For those unfamiliar with the Fraser Institute, it is a right-wing think tank based in Vancouver with, shall we say, dubious research credentials but strong ideological and fundraising goals.)

Perhaps unsurprisingly, private schools do rather well in the Fraser Institute’s report card. Indeed it would appear (and I may be off by one here) that the top 18 schools on the list are all private. This supports a narrative, consistent with the Fraser Institute’s outlook, that private schools are inherently better than state-run schools. But, of course, that would be a difficult conclusion to sustain. Private schools tend to be populated with kids from wealthy families with better-educated parents – kids who have been given a blessed head start in life. Also, and not noted in the report card, many private schools are comfortable turfing out under-performing or unruly students. This means that the “delayed advancement rate,” one critical metric of a school’s performance, is dramatically less affected than at a public school that cannot as easily send students packing.

Indeed, the Fraser Institute’s report card is rife with problems, something that teachers unions and, say, equally ideological but left-oriented think tanks like the Centre for Policy Alternatives are all too happy to point out.

While I loathe the Fraser Institute’s simplistic report card and think it is of dubious value to parents, I do like that they are at least trying to give parents some tool by which to measure schools. The notion that schools, teachers and education quality can’t be measured, or are too complicated to measure, is untenable. I suspect few parents – especially those in, say, jobs where they are evaluated – believe it. Nor does such a position help parents assess the quality of education their child is receiving. While parents may understand, be sympathetic to, or even agree that this is a complicated issue, the success of Ontario’s school locator makes it clear that many of them want and like these tools.

Ultimately the problem here isn’t the open data (despite what critics of the Ontario Government’s school comparison website would have you believe). Besides, are we now going to hide or suppress data so that parents can’t assess their kids’ schools? Nor is the problem school report cards per se. If anything, the problem is that the Fraser Institute has had the field all to itself. If teachers’ groups, other think tanks, or any other group believes that the Fraser Institute’s report cards are too crude, why not design a better one? The data is available (and the government could easily be pressured to make more of it available). Why don’t teachers’ groups share with parents the metrics by which they believe parents should evaluate and compare schools? What this issue could use is some healthy competition and debate – one that generates more options and tools for parents.

The challenge for government is to make data more easily available. By making educational data more accessible, less time, energy and IT skill are needed to organize the data, and precious resources can instead be focused on developing and visualizing the scoring methodology. This certainly seems to be Health and Human Services’ approach: lower transaction costs, galvanize a variety of assessment applications and foster a healthy debate. It would be nice if ministries of education in Canada took a similar view.

But the second half of that challenge is also important: groups outside of government need to recognize that they can have a role, and that there are consequences to not participating. The mistake is to ask how to deal with groups like the Fraser Institute that use crude metrics; instead we need to encourage more groups – and our own organizations – to contribute to the debate, to give it more nuance, and to create better tools. Leaving the field to the Fraser Institute is a dangerous strategy, one that will serve few people. This is even more the case since in the future we are likely to have more, not less, data about education, health and a myriad of other services and programs.

So, the challenge for readers is this: will your organization participate?


2 Comments


Steve Ressler

I agree. In the U.S., we have a big industry around ranking top U.S. colleges, based on data that colleges are asked to provide. Yes, there are flaws in the survey. Yes, there are flaws in ranking a subjective process.

But on the other hand, you need a tool to start making decisions with, and it’s useful. I think that’s the opportunity around open data: to start aggregating and creating some tools. They won’t all be perfect, but they will help.

Andrew Krzmarzick

Great point, David. In fact, I am wondering to what degree HHS should reach out directly to the institutions and groups that would be appropriate to analyze the data – possibly even encouraging divergent groups to partner in the process of analyzing it and producing these kinds of tools. They could serve as bridge builders, spearheading an incredibly rigorous look at the data.