
Thursday, May 16, 2024

Thoughts on international development VI

I have blogged here highlighting the obsession in international development circles with new ideas and innovations and the neglect of regular development interventions, and examining the reasons; here questioning the belief that there are new ideas and innovations waiting to make a transformative impact; here arguing that policies in most areas of development matter very little and that it's mostly about implementation; here questioning the conventional wisdom on impact evaluations and arguing that evaluations should focus on the use of administrative data (and surveys) coupled with qualitative information to help improve the effectiveness of implementation; and here arguing that the combination of grafting externally generated ideas and innovations and the associated flow of easy money prevents developing countries from cultivating their ability to make good development decisions.

In this post, I’ll argue that any implementation of a development program at significant scale must necessarily be iterative if it’s to realise its objectives. There is virtually no program that can be taken off the shelf and implemented successfully across a region or country. Any human engagement-intensive program will require continuous, grassroots-level iteration to succeed. It’s about the repetition of implementation, observation, iteration, and adaptation till you get to a reasonably good implementation. It’s the process of iterative adaptation.

The exceptions to this are programs that involve pure logistics interventions (like cash transfers or subsidies) and the execution of engineering works.

This means it’s not enough to have ideas. Whatever the idea or innovation, the point is to execute it effectively. And such execution is generally about iterative adaptation.

Let’s examine this with some examples. Take the implementation of an Edtech program in schools that gives tablets to students. For simplicity, let’s assume that the content is finalised. The delivery of the content raises several uncertainties - how and when should students use the tablet, what’s the balance between digital and physical content, how should teachers tailor classroom instruction to facilitate effective use of the tablets, what kind of training should support the teachers, how do we monitor whether students are using the tablets as intended, and how do we know that all this is creating the desired impact?

Or take the uncertain elements of a new subsidised crop insurance scheme - what premium subsidy, what magnitude of coverage, what episodes trigger claims, who is an eligible farmer, what upper limit on eligibility, how is damage assessed, how to manage quick payouts, how to prevent abuse of the scheme, etc. These elements vary across crops and regions, and getting them right is the difference between success and failure.

Or an initiative where companies are given subsidies to hire apprentices - which types of companies should be targeted, what occupational/trade entry levels, what eligibility conditions for apprentices, how to transfer the subsidy, what stipend amount, for how many years, how to limit misuse by companies and apprentices, etc. And if the program is struggling to make an impact (as is the case in India now despite a strong commitment by the government), what’s going wrong and how can uptake be increased?

Or take the example of an initiative to encourage electric vehicle manufacturing. Here too there are many uncertain elements - what part of the value chain to support to start with, which kinds of incentives, what magnitude for each incentive, what types of import restrictions, how much domestic content to require, how to move up the value chain, how to monitor compliance, how to phase down the incentives, what strategic considerations apply, etc.

As can be imagined, there’s no way to get these uncertain elements right ex ante at the policy/program design stage itself. Issues will always emerge when the rubber hits the road. How do we deal with them?

What makes all this even more challenging is that, in all these cases, the emergent questions must be answered in the course of an implementation carried out at scale across varying contexts.

Currently, governments tend to view these initiatives in largely static terms. They see the biggest challenge as getting the design of the policy or scheme right and then monitoring its implementation. At best there will be some tweaks at the margins, triggered only when egregious failings surface. In the normal course, the design is cast in stone and gets implemented for years. There will always be enough reasons to explain away any failure to meet outcomes, or, most often, outcomes are massaged to suit administrative requirements.

Instead, there’s a need to adopt a dynamic perspective on the design and implementation of policies and programs. In each case, the initial design could be seen as a minimum viable product (MVP) of a policy or program, built on theory, common sense, and precedent. Once the MVP is rolled out, its failings and weaknesses get exposed. There should be a process to capture feedback about what’s going wrong, analyse where and why, and work out how it can be rectified. There should be another process to incorporate the changes into the program design. And there should be continuous surveillance of the implementation to watch for emerging challenges.

Finally, as mentioned earlier, given the varying contexts in countries like India, many of these schemes and policies may need differentiated designs that emerge for different regions and settings (each with its own iterations).

None of these iterations and process revisions are technical changes. The feedback and analysis are mostly observational and qualitative. The decisions made using this feedback are essentially exercises of judgment. The implementation of each program or policy is a series of such exercises of judgment, and its effectiveness is critically dependent on the quality of those judgments and decisions. This quality, in turn, depends on the strength of the state’s administrative and management capabilities. In other words, Hirschman’s description of development as the ability to make decisions boils down to the strength of state capability.

It’s pertinent to note that this dynamic is just as true for the private sector as for public policies and programs. In fact, this approach is a feature of private startups in their growth trajectories. They release the MVP of their product or service and then continuously iterate to refine their offering. It should, however, be noted that given the far simpler nature of their markets and the more rigorous preparatory work possible, the quality of their MVPs is superior to what is possible in the public setting. Such iteration and adaptation is true not only of their products and services but of the company itself. Companies keep pivoting and reinventing themselves in response to emerging market developments.

It’s said that writers don’t start writing a book with all the chapters etched in their minds. As E. L. Doctorow said, “Writing is like driving at night in the fog. You can only see as far as your headlights, but you can make the whole trip that way.” The same applies to companies and to public policy implementation in general. They are constantly iterating and adapting.
