Goodhart’s law: “when a measure becomes a target, it ceases to be a good measure”. When I first encountered this law, it reminded me of a quote often attributed to Peter Drucker: “what gets measured, gets improved”.

Behind Goodhart’s law is the idea that as soon as something is tracked and comes under pressure to improve, the statistical regularity it relied on collapses, and with it goes its value as a measure.

The reasoning: a metric is always a summary of what is going on. As soon as you make it a goal, there is a good chance that while you keep reading it as a summary, the things you aren’t measuring will change underneath it. Secondly, a metric that becomes a target invites gaming.
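The dynamic can be sketched with a toy simulation. This is a hypothetical model, not anything from the post: an agent splits effort between honest work and gaming a metric, where the proxy metric counts both kinds of output but true value counts only the honest part. Once the proxy becomes the target, effort drifts toward whatever raises the proxy fastest, and the two numbers come apart.

```python
import random


def simulate(optimize_proxy, steps=200, seed=0):
    """Toy model of Goodhart's law (illustrative assumptions throughout).

    Returns (true_value, proxy) after `steps` rounds of work. Gaming raises
    the proxy twice as fast as honest work but adds nothing to true value.
    """
    rng = random.Random(seed)
    gaming = 0.0  # fraction of effort spent gaming the metric
    true_value = 0.0
    proxy = 0.0
    for _ in range(steps):
        if optimize_proxy:
            # Under pressure, effort gradually shifts toward gaming,
            # because gaming moves the tracked number fastest.
            gaming = min(1.0, gaming + 0.01)
        noise = rng.gauss(0, 0.01)  # measurement noise on the metric
        true_value += (1 - gaming) * 1.0
        proxy += (1 - gaming) * 1.0 + gaming * 2.0 + noise
    return true_value, proxy


honest = simulate(optimize_proxy=False)
targeted = simulate(optimize_proxy=True)
# When the proxy is the target, the proxy ends up higher than under honest
# work, while the true value ends up lower: the summary looks better even
# as the thing it was supposed to summarize gets worse.
```

The exact numbers are arbitrary; the point is only that any gap between a metric and the underlying goal is widened, not closed, by optimizing the metric directly.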

Be careful with metrics. Keep focusing on the bigger picture, the larger goal. Use metrics to check in, and change them regularly so they do not become goals in themselves.