The LBS evaluation report and the distinction between design and implementation problems

On April 12, Ontario’s Ministry of Advanced Education and Skills Development (MAESD) released a report detailing the findings of a program evaluation completed in November 2016. The report is here, and here is an accompanying executive summary.

The findings are hard-hitting. Those in the field will finally see their concerns acknowledged, and meticulously documented, in a ministry-sponsored effort. Cathexis, the evaluation consultants, must be commended for producing such a transparent and accessible report, filled with powerful findings and useful data. This is a compelling wake-up call and, frankly, a revealing case study of the ineffective and damaging use of a high-stakes pay-for-performance management system. Although that was not the explicit message, it’s pretty hard to ignore.

More amazing is the tone and voice conveyed in the report. It’s not a distant, authoritative voice. The authors do not gloss over people’s concerns or attempt to reformulate those concerns using abstract categories or heavy-handed theoretical analysis. The case studies at the end of the report, particularly the one of an Indigenous literacy program, are astounding. I read example after example of disregard, managerial arrogance and the overriding message that compliance with a faulty system must be upheld no matter what the cost to people and programs.


The report was released four months after a finalized version was completed, soon after a representative group from support organizations lobbied those in high-level leadership roles within the ministry. The release sends the message that the ministry has finally decided to respond to concerns that began to be documented soon after key elements of the pay-for-performance model were introduced in 2012. Accompanying the report and its executive summary are memos from the deputy minister and assistant deputy minister, stating that the ministry is ready to work with the field to address the report’s recommendations.

However, my worry is that these conversations could be curtailed, leading to some short-term solutions that fail to get at the underpinning problems. My concern stems from a prominent statement that appears in the introduction of the executive summary:

The LBS program is providing a vital, valued, and effective service to Ontarians. Its key components—the OALCF and the Performance Management Framework (PMF)—are well designed. However, serious problems have arisen in the implementation of these components.

It’s a divisive and troubling statement. I immediately thought, ‘Here we go again; shift all blame for the problems onto the field.’ Who else implements what policy analysts and others within the ministry have designed so “well”? Only after reading the report did I feel a bit better. There is no indication of blame in the report.

I’m trying to figure out where the statement came from and why it was made. The mandate of the evaluation was to examine implementation, not design. So why include a prominent, comprehensive and evaluative statement about design? Despite the mandate, a couple of design issues are directly examined and one is the focus of a recommendation. In addition, design-related problems are alluded to, although not fully explored, throughout the report, demonstrating that it’s impossible to separate implementation from design. It’s unclear if the statement represents the ministry’s perspective—and was perhaps a suggested inclusion—or if this is the perspective of the evaluators. I have requested some context and possible references or sources from the ministry.

For those of you who have followed the blog, my concerns and recent projects with LBS partners have been aimed squarely at design issues, and the misuse of international literacy testing levels, skills criteria and information-processing pedagogy in the LBS performance management system. The design-implementation divide could be used to dismiss this work and limit discussions going forward. I see this happening already in a few of the recommendations. While most of the report’s recommendations are a sound response to identified problems, those related to the performance management framework, including its testing mechanisms, need to be reconsidered.

Design is a fundamental problem that has a direct impact on implementation. For the most part, the field seems to be implementing exactly what has been designed. The report didn’t suggest that programs have gone rogue, making up their own tests (Milestones and Culminating Tasks) or creating their own learner satisfaction surveys (two ideas worth thinking about in future discussions) or ignoring mandatory performance target reporting (Service Quality Standards or SQS). Even on-the-ground confusions and mixed messages are not simply an implementation problem, since the dispersed organizational structure within MAESD and the multiple roles of 130 field consultants (Employment Training Consultants or ETCs) who have responsibility for LBS and other Employment Ontario programs contribute directly to the confusion and poor communication. Overall, it seems, the field struggles to implement what has been designed and deal with how the ministry is designed.


What has been designed is a high-stakes PMF that ties program performance directly to funding using a series of performance measurement mechanisms, including three high-stakes tests (Milestones, Culminating Tasks and the still-to-be-determined Learner Gains tool), a not-so-useful learner satisfaction survey and a list of suitability criteria that doesn’t reflect the reality of many programs. The mechanisms have been designed either by the ministry or by consultants hired to do the work.

The evaluators articulate the field’s issues with two of the test systems—Milestones and Culminating Tasks—and recommend that their “merit” be reviewed. This is encouraging. However, they don’t make a direct connection between the problems and test design, even though a recent AlphaPlus study on the Milestones came to this conclusion. They do, however, recommend that the suitability mechanism be redesigned, an indication that the evaluators are willing to extend the mandate of the evaluation past implementation and into design issues in some circumstances but not others. Overall, there is also some confusion in the way that the OALCF is examined in the report, which I will look at in a future post.

According to the memo from the assistant deputy minister, the ministry will be open to “ensuring the design and management of the LBS program” so that it “supports all learners and all service providers” and acknowledges that not all tools are “available and/or appropriate for all learners and/or service providers.” To facilitate this discussion, it will be important to fully understand inherent design issues, including the use of a pay-for-performance PMF in an under-resourced program that sees an incredibly diverse range of adult learners who attend a range of programs for anywhere from a few weeks to a few years. There are almost no homogeneous elements in LBS: time spent in the program, age and background of learners, funding per learner, class/learning group sizes, educator training, and curricula all vary widely. Yet the ministry insists on the use of the same measures and targets for all.

How the ministry is designed also has direct impacts on programs and even on the design of the mechanisms used in the PMF. MAESD’s dispersed and siloed organizational structure parses responsibility for LBS across several organizational units with no apparent and sustained connection between those units, and, more importantly, no sustained and meaningful connection with the field. Since LBS was integrated into Employment Ontario, I’ve seen a constant rotation of new names in various positions, and I’m never quite sure whom I should even contact about a particular question. Directly related to the siloed structure is a preference for public administration and managerial expertise over content expertise among policy analysts and the majority of ETCs.

This has serious consequences. Policy analysts are in charge of developing a complex high-stakes assessment system without any expertise in assessment, curriculum, education, literacy development or adult learning. This leads to an over-reliance on a small pool of consultants promoting particular products and approaches, and a limited ability to fully evaluate the relevance, rigour and usefulness of those products. The ETCs, in turn, must oversee an educational program without any knowledge of education and learning.

Then, the siloed and parsed structure within the ministry must somehow connect with a rather complex structure within LBS that includes regional networks, stream organizations and sector organizations. Of course, upending the organizational structure of Employment Ontario is not likely on the table, but understanding the multiple design challenges related to both MAESD’s internal organizational structures and the LBS structure is important when discussing possible responses and solutions moving forward.

Let’s have a truly collaborative discussion and put all problems on the table, whether framed as design or implementation issues, in order to think about and better articulate more comprehensive and lasting solutions.
