
Wednesday, May 27, 2020

Heisenberg's Uncertainty Principle and performance management

Heisenberg's Uncertainty Principle states that it is impossible to simultaneously measure with precision both the position and momentum of a particle. Intuitively, once light is shone on a particle to reveal its position, the associated energy invariably displaces the particle, thereby changing its momentum. The smaller the particle, the more pronounced this disturbance effect.

There is a remarkable similarity between this Principle and the monitoring of performance data, especially for the many types of activities where credible performance measurement is difficult. While this is true in general, it is pervasive in governments, since the outcomes of many public agency activities depend on the intensity or quality of human engagement and therefore resist quantification.

In such cases, as the stakes and the complexity of data capture increase, the reliability of data collection decreases. Or: the higher the stakes, the lower the accurate informational content of the data.

This is because once managers start looking at data to monitor performance, those being monitored start manipulating the data being collected. The extent of manipulation only increases with rising stakes and intensity of monitoring.

In other words, in many cases involving quality, it is extremely difficult to both accurately capture data and use that data to credibly assess performance. And to do this at scale and sustainably is an even bigger challenge.

Ironically, the use of technology makes this even more challenging. One, it multiplies the opportunities for gaming the data, besides making such gaming easier. Two, the associated information overload can end up crowding out critical and relevant information. Finally, the elegance and neatness of a Dashboard creates an illusion of control that lulls users into complacency about the problems accumulating beneath it. In the words of people who have used them all their professional lives, even when the quality of the data is good, Dashboards "confer limited control", "defocus issues", "put too much on the plate". Most tellingly, "they drive people away from performing to looking good"!

This is not an argument against measuring performance data, but for being conscious of the serious limitations of such data. It is a note of caution to those who believe that chronic state capacity failures can be overcome by data analytics and Dashboards.

One strategy often suggested is to extract the data from a work-flow management solution. This is obviously better than the alternative of collecting and capturing data solely for the purpose of monitoring (as against also managing the work-flow). It is bread-and-butter stuff in businesses, where work-flow management and monitoring dashboards are integrated. But the neatness disappears for many government activities, for a variety of reasons. Once this approach is adopted, the incentives simply shift to gaming the work-flow itself. And given the less-than-neat nature of work-flows in public systems, subverting them is relatively easy. Furthermore, once you fix one node in the work-flow, the subversion merely moves to another node. Given the limited time and patience available to persist and iterate, we end up with deeply compromised work-flow management and monitoring Dashboards. In short, there are limits to this approach in most development settings.

None of this is to say we should eschew Dashboards, nor to suggest that it is impossible to develop reasonably effective ones. There are design choices and mechanisms available to build satisfactory Dashboards. But they are far more nuanced and iterative than pretty much all the prevailing attempts at making them. And then there is Heisenberg's Principle to be overcome.
