I have blogged on multiple occasions (see this paper) about how the most important thing in development is not the WHAT to do, but the HOW of implementing what’s already known.
Unfortunately, the mainstream international development discourse and funders are obsessed with the former, almost to the exclusion of the latter. The headline focus on ideas and innovations, as opposed to state capabilities, is just one illustration.
Randomised Controlled Trials (RCTs) are a good example. They inform us about the headline efficacy of an intervention. But they tell us very little about the mechanics of implementing the intervention (or about the WHY behind its effects), arguably the most important reason why good ideas rarely translate into realised development outcomes.
Consider Edtech. A government has fitted all classrooms with smart screens and established a computer laboratory with systems installed with personalised adaptive learning (PAL) software. How do we integrate digital media and its content with the physical classroom instruction across 20,000 schools, or even 100 schools?
There are several uncertainties, even if the PAL software is mapped to the curriculum. How should the computer lab be used effectively - the number of hours or classes per week, the sequencing of physical classroom and lab work, whether more than one child shares a terminal, how a child’s progress is monitored and followed up, and so on? Similarly, how should the smart screen be used effectively - a pedagogy that toggles back and forth between the blackboard and the digital content, which content to use where and when, how to deliver content effectively, and the teacher’s general felicity in intermediating the digital medium and its contents? See more here.
The efficacy of these Edtech interventions depends on getting these details right. I’ll define this implementation challenge as one of process discovery: the mapping of process details that increases the likelihood of successful implementation.
Process discovery maps provide the blueprints that frontline officials can use to implement the respective programs. They would be the default, or Minimum Viable Product (MVP), for program implementation. These maps can be simplified and made as user-friendly as possible to maximise their practical utility. Interested officials can start with the MVP and adapt it to suit their contexts and styles.
The headline efficacy evaluation RCT of development interventions is an overrated, even wasteful, pursuit. Did anybody in the world of education pedagogy practice doubt the efficacy of software like PAL, if implemented effectively, enough to require an expensive and long-drawn-out RCT? The main outcome of these RCTs is the publication of some papers and the burnishing of academic credentials.
The World Bank and other donors have now implemented dozens of RCTs involving Edtech solutions. While there’s a rich library of evidence on the efficacy of Edtech solutions when implemented effectively in pilots, there’s very little by way of process discovery maps on the HOW of their implementation.
The same could apply to the implementation of teaching at the right level, youth skilling programs, maternal and child health tracking software applications, interventions to improve public health or nutrition, measures to increase the effectiveness of health insurance, use of body cameras by police and hot spot policing, adoption of better management techniques by SMEs to improve productivity, use of dashboards by officials to monitor and follow up on programs, provision of information to improve farm productivity, etc.
As with the Edtech example above, in all these cases I’m not sure about the value proposition (to practitioners in governments and other implementers) of an efficacy evaluation alone. Who would dispute that, if done well, all of them would be efficacious, albeit in varying degrees? The challenge is to do well at scale.
In these circumstances, if donors still wish to support RCTs, here is a suitable compromise. By all means, support the RCT, but use the opportunity to also prioritise the creation of process discovery maps. These de-risk processes and can serve as templates to guide implementation at scale by large systems in business-as-usual environments.
This can create the incentives for process discovery maps to emerge alongside efficacy evidence. The researchers get to publish papers, the governments get detailed implementation templates, and the donors get efficacy accountability while also contributing to effective scaling.
As a rough rule of thumb for donors, how about mandating that every efficacy evaluation RCT also produce a process discovery map, wherever possible?