There’s a lot happening in Ontario’s Literacy and Basic Skills (LBS) program these days. On April 12 a long-awaited and very critical evaluation report was released. Then, on April 28 an influx of new funding was announced in the Ontario Budget. This year, LBS will receive an additional 20 million dollars, a 20% increase, with further funding to follow over the next two years. This is great news for the field. According to the evaluation report, many programs struggle to keep up with rising costs and sometimes make difficult choices to reduce program hours.
Something else caught my attention in the Ontario Budget—a statement that could represent a fundamental shift in the way the government understands literacy proficiency and the implications of literacy challenges in people’s lives. It reads:
The critical building blocks to economic success are literacy, numeracy and aptitude with digital tools like computers, the Internet and new workplace technologies. More than 15 per cent of working-age Ontarians have low levels of literacy and numeracy skills, meaning day-to-day activities, such as reading instruction manuals or sending emails, can be difficult or impossible. Many people with low literacy and numeracy skills are employed, but these skill gaps can close doors to other opportunities.1
Whoa, we just went from 50% to 15% of the population with literacy challenges, and it’s not a typo.
For nearly two decades, the literacy field and policymakers have worked with a damaging and frankly perplexing idea that 50% of Canadians don’t have a “suitable minimum” standard of literacy to fully participate in a knowledge-based society (or something like that). Statistics Canada, the key organizer of the first two rounds of testing in the mid-1990s and early 2000s, introduced the notion that the levels developed to convey results represented minimum performance standards. (I’ve written about this before, but it’s useful to repeat in this context.) The statement simply read:
Level 3 is considered by experts as a suitable minimum for coping with the complex information produced in the knowledge society.2
Accompanying statements about those at Levels 1 and 2 (half the population!) stated they may have trouble learning something new, have weak or poor skills and may even jeopardize the well-being of their children. (Seriously.) Absent from all reports was a discussion of who the experts were, and how the designation of a minimal proficiency level was determined. A communication device was turned into a standard in one short sentence.
For nearly 20 years the unsubstantiated Level 3 standard has been used to (mis)interpret testing results and alter policy discourse, subsuming efforts to focus on those who actually do encounter challenges. Now we know that the statement isn’t simply an exaggeration or a different interpretation of the results; it is bogus.
Once the Organisation for Economic Co-operation and Development (OECD) assumed a leadership role in the testing project (they had previously worked in partnership with Statistics Canada), reference to a Level 3 standard not only disappeared but was soundly renounced.
In 2013, at a symposium in Montréal, with many provincial and federal policymakers in attendance, William Thorn, the PIAAC project lead for the OECD, said the following:
We’re making no claim that Level 3 is essential to managing in modern life. That’s manifestly false […] That kind of description is a supplementary interpretation which has been put on these levels, which I don’t think is justified.3
He went on to explain that the five levels (and scores within them) that are used to convey results are not standards; they supply no information about actual performance and literacy ability at work, at school or in daily life.
The test design team from Educational Testing Service (ETS), including lead designer Irwin Kirsch, stated the same thing in 2001:
[T]hese data do not reveal the types of literacy demands that are associated with particular contexts in this pluralistic society. That is, they do not enable us to say what specific level of prose, document, or quantitative skill is required to obtain, hold, or advance in a particular occupation, to manage a household, or to obtain legal or community services, for example.4
Both of these statements contradict and bring into question the numerous efforts to do just that—to “say what specific level of prose, document, or quantitative skill is required to obtain, hold, or advance in a particular occupation…”—using spin-off tests like TOWES, ESEE and PDQ, along with the Essential Skills framework and its adaptations, like the OALCF curriculum framework. Not only is the aim deceptive; the methods are flawed. A test score improvement on its own tells us nothing about actual performance outside the testing situation or one’s access to opportunities. The only way to investigate the relationship is to follow individual test-takers over a long period of time, accounting for a myriad of other influences and circumstances, as only one researcher has done.
Where did the 15% come from?
The reference to 15% of Ontarians with literacy challenges is taken directly from the Programme for the International Assessment of Adult Competencies (PIAAC) report. These are test-takers who score at Level 1 or below. The 15% is also the OECD average, and is slightly below the 17% of Canadians at Level 1 or below.
We’ve always had folks testing at Level 1—those who have not completed high school, those who are older, and those who do not speak English or French as a first language (the languages of the test in Canada), including Indigenous peoples. But those folks—the very people who attend adult literacy, language and secondary level programs—have not been the focus of policy concern, until now.
It will take some time to extricate policy, various testing and curricular systems and even teaching practices from the Level 3 standard. Less than a year ago, the provincial government used the “manifestly false” claim that half the population has literacy challenges, holding them back from job success and economic opportunity, as a rationale for their Highly Skilled Workforce Initiative:
47% of Ontarians have literacy scores and 53% have numeracy scores below Level 3 identified by ABC Life Literacy as the level required to succeed in a technology-rich environment.5
To be clear, ABC Life Literacy Canada didn’t have a role in identifying or determining the standard. They and many other literacy organizations, along with policymakers, picked up on the statement to draw attention to their work, justify spending or attract new funding, and compel action—likely the very reasons that the false statements were developed in the first place.
Here’s the thing: whether one developed the statement or perpetuated it, a conscious decision was made. And we all made choices to hook into the statement for our own purposes, even when the 50% didn’t make much sense and didn’t reflect the realities of programs. Maybe now we can finally engage in a conversation about literacy challenges that starts to make more sense.
1 Ontario Budget, Ontario Lifelong Learning and Skills Plan, April 2017, paragraph 3
3 William Thorn in Cut-offs and Levels in PIAAC: What they Mean for Understanding Results, 2013
4 Irwin Kirsch and others in NCES Technical Report, 2000, page 9
5 Building the Workforce of Tomorrow: A Shared Responsibility, June 2016, page 7