
Geoff Johnson: FSA ‘report cards’ an abuse of a useful tool


Like an unavoidable visit from an annoying relative, the Foundation Skills Assessment has arrived on the doorstep of public education again.

As tests go, the FSA is as technically well-constructed as tests of its type can be. It’s developed by B.C. educators, including teams of practising classroom teachers and subject-area specialists.

Schools have four weeks during which to administer the FSA. Most schools spread the test over at least three sessions so all Grade 4 and Grade 7 students have a chance to get past “test panic.”

Sample FSA questions are available. Practice is allowed.

One smart principal I know took full advantage of this. Understanding that her multicultural student population would likely be uncomfortable being herded into the gym to take a formal test, she had them practise in the gym twice a week for a month before they took the actual test.

The result? Much better than expected, given that every one of her Grade 4 and Grade 7 students actually took the test. That’s not to say that some schools might not be tempted to fudge results by exempting certain students, but, well, the less said about that the better.

So far so good, as long as everybody realizes that, like that unwelcome relative on your doorstep, it is what happens next that sours the experience.

Specifically, it is the shotgun publication of provincewide “report cards” and the subsequent statistical “ranking” of schools by the Fraser Institute that causes so much trouble, not the FSA itself.

Paul Shaker, immediate past dean of education at Simon Fraser University, doesn’t think much of the Fraser Institute’s ranking of B.C. elementary and secondary schools. As quoted by the B.C. Teachers’ Federation after a recent debate over these “report cards,” Shaker said the think-tank’s research wouldn’t stand up to scrutiny if its findings were subjected to scientific peer review.

Shaker pointed out that there is no correlation between the Fraser Institute rankings and parental satisfaction with individual schools, as monitored by the ministry.

To bolster his argument, Shaker pointed to the U.S.-based National Assessment of Educational Progress, which, unlike the Fraser Institute, embeds demographic variables in its evaluations. This “value-added approach” takes into account cultural, social and socio-economic factors that may influence a school’s overall test results.

Maybe or maybe not, but if the FSA results, overlaid on a map of the province, were to be misinterpreted as a socio-economic and socio-cultural map of each different region, that would inevitably lead to a lot of erroneous conclusions as well.

Individual student results are available to parents and schools, but can be just as misleading if interpreted in the wrong way.

It is only reasonable to realize that FSA results are a single snapshot, not a multiple-activity, more-inclusive video of student performance, and few generalized conclusions should be drawn from that moment in time.

Results measured in a few hours out of a school year should not be ignored, but should be considered as just one indicator, along with a wide variety of other equally valid data-based information collected by districts and schools on school and student performance.

Attempting to rank entire schools or districts based on FSA results invites misleading comparisons that ignore the particular circumstances that affect achievement in each classroom or school.

So it’s that time again, and Auntie has arrived with enough baggage to make us wonder how long she plans to stay.

Let’s just hope the FSA controversy and all its baggage do not define the remainder of this school year.


Geoff Johnson is a retired superintendent of schools.