In December, I heard a series of teams of Rotman business students present their analyses of the same business case. They didn't all arrive at the same answer; that wasn't the point. What struck me was that the most compelling teams were the ones who were clear about their reasoning and their process of evaluation, not the ones whose conclusions I necessarily agreed with. It was not at all about the whizbang analysis or the éclat conclusion, not in the slightest. It was about understanding what assumptions underpinned the conclusions, so I could evaluate whether I agreed with them or not.
In the debrief, I tried to explain to the students what was so compelling about seeing the inner workings of their arguments and, while they listened politely, I remember that it wasn't a concept I absorbed very well as a student, either. But it is still rolling around in my head.
One useful way to think about this is in terms of the ladder of inference concept, popularized by Chris Argyris.
This video, titled ‘The Ladder of Inference Creates Bad Judgment’, is a nifty explanation of the concept. The idea is to avoid moving too hastily from what you see, to what you believe, to what you do about it – and also to seek data that validates your conclusions, as well as contradictory data that might invalidate them.
We often hear about trying to put together a “watertight” argument, but, increasingly, I think that’s a poor metaphor in a business context. What we want is permeability, porousness, openness (look at the open source movement for one example). “Watertight” is one step away from “black box,” and I think that’s another way of saying “limited.”
What we try to do with our consulting clients is expose the ladder of inference we are using, step by step. That way, we can maximize the value of any given piece of analysis. Even if – or perhaps especially if – there isn’t resounding agreement with our final recommendations, we aspire for it always to be clear how we reached them, so our clients can use the same data and analysis to reach their own conclusions. We don’t view a consulting engagement as a way for us to write a rigid blueprint for the client’s next set of decisions, but rather as a catalyst for the client to internalize the issues and reach the right conclusions for them, on their terms.
As one Mezzanine client recently put it to us at an early meeting, “I don’t want a big ‘reveal’ at the end. Let’s work together as we go.” There are huge benefits to this approach, not least that our recommendations are more likely to be implemented if they are co-developed with the client. Don’t get me wrong – we still do the work involved in the project. But we don’t assume the expertise is all housed on our side of the table, and we don’t expect that our advice will be followed just because we put it on a pretty slide.
Reasonable people can disagree about all sorts of things. So much of business in a world of data insufficiency boils down to judgement, and the most interesting conversations are those where we all know the facts but get to discuss how we interpret them. Clarity on fact versus conclusion helps us have those kinds of conversations. And most ambitious people, I think, want to be around when tougher and bigger and trickier decisions are made based on judgement – beyond the frontiers of clear answers based on data.
Are there ways you can help make your own conclusions more open to those around you? Do your clients, employees, bosses, partners, collaborators understand not just what your conclusions are, but also why you reached them? Do you take the time to understand how people you interact with reach their own conclusions? What kinds of surprises might be in store for you?