Jack Knox: Fraser Institute school rankings fail the test of utility

Duncan Frater directs Saint Michaels University School Grade 5 students rehearsing the opera Cinderella in May 2012. SMUS was one of 20 schools that tied for first place in the annual Fraser Institute ranking of B.C. elementary schools.

The annual Fraser Institute ranking of B.C. elementary schools is out, showing that — shock! — private schools perform better than those where the kids arrive hungry and get stacked up like cordwood in the classroom.

Of course the Saint Whoever schools rank well, goes the standard response. Children are screened before being accepted, special-needs kids have better support and, as a retired teacher pointed out in a letter to the editor, class sizes “are smaller than most grade-school birthday parties.” If a parent is paying both taxes and tuition, the results had better justify the extra outlay.

Sure enough, this year’s report showed that 19 of the 20 schools that tied for first place — including Victoria’s Saint Michaels University School — were independents. West Vancouver’s Cedardale was the lone public institution. The other end of the scale was just as predictable: inner-city and remote schools that might as well be named Sisyphus Elementary, the students destined to push boulders uphill that always roll back on them.

If the Fraser Institute results never vary, neither does our reaction: we all A) complain that the rankings are statistics-twisting nonsense, then B) rush to see how our kids’ school placed. Nature abhors a vacuum; parents know the report’s methodology leaves a lot to be desired, but in the absence of a more comprehensive way to measure the quality of their children’s education, they’ll seize on this one. To which Helen Raptis says “Don’t.”

Ditto for David Johnson.

Raptis is associate dean of education at UVic. Johnson is an economics prof at Wilfrid Laurier University in Waterloo, Ont., and the education policy scholar at another think tank, the C.D. Howe Institute.

Both think the standardized testing on which the Fraser Institute rankings are partially based is useful — just not in the way the Fraser Institute is using it. The tests were never meant to be used as the education equivalent of TripAdvisor.

The rankings rely in part on the Foundation Skills Assessment, taken by all B.C. students in Grades 4 and 7 to test their numeracy, reading and writing (though note that in Greater Victoria, most elementary schools don’t go to Grade 7). Parents who really want a yardstick to measure school performance should go to the Education Ministry website and look up that data themselves, Raptis says.

But those tests account for just 45 per cent of an elementary school’s Fraser Institute ranking, she says. The balance of the weighting is based on indicators that haven’t been proven to affect school performance, but that are skewed against schools with a lot of kids of lower socio-economic status. The result is that a school full of poorer kids can be ranked below one with inferior test results.

Forget all the public-versus-private school talk, Raptis says. This is just an Orwellian exercise that pulls down good schools by measuring them with tools of uncertain usefulness. The low rankings of low socio-economic schools are inevitable, discouraging progress. It’s actually counter-productive, which is why the Times Colonist decided to stop publishing the Fraser Institute list a few years ago, she notes.

Johnson’s objections are different — and somewhat contradictory. He developed a more complete measuring system for the C.D. Howe Institute that incorporates socio-economic variables the Fraser Institute ignores, he says. That allows schools in similar circumstances to be compared, so improvements can be made. “What you really want to do is look at schools that outperform similar schools and see what you can learn from that.”

Even then, forget saying with a straight face that School X, in 132nd place, is better than 445th-ranked School Y. We all like Top 10 lists, and there’s a sexiness to ranking schools one through 982, but Johnson scoffs at the idea of rating them that finely, particularly when the rating leans on year-to-year changes in average FSA scores. In a small school, a handful of students who test particularly poorly or well can shift the marks dramatically. Better to put more weight on longer-term trends and on the percentage of students who achieve at an acceptable level.

As it is, Johnson simply doesn’t find much value in the annual fuss. “I think it just annoys people.”