Management books have, for as long as I can remember, talked about a phenomenon called “Not Invented Here,” which describes the resistance people have to accepting ideas that other people have come up with. Certainly this is an issue when trying to implement a new solution and get buy-in from a new group of people. However, it may be that this resistance is less about general opposition and more about an unconscious awareness that has developed about how ideas are implemented – and that unconscious awareness might need to be made conscious in order to move forward.
Best for Whom?
The implication of trying to use an idea from someplace else is that it’s a best practice – that when applied to our circumstances it will yield similarly impressive results to when it was done elsewhere. However, every set of circumstances is different. Apply a low-calorie, high-vitamin diet to someone who’s already underweight and you’re asking for trouble. Inviting people to comment on an idea for which there are already too many opinions will only serve to confuse the situation further.
On the other hand, it’s entirely possible that although the practice isn’t best for one person, it is best for the majority of folks. In the general case, a low-calorie, high-vitamin diet is a good thing. Similarly, getting comments from multiple people is generally good.
Assessing who the practice is best for is an essential part of deciding whether it’s wise to accept an idea that wasn’t invented here – or whether it’s better to resist it because it won’t work in this environment or with these people. Said differently, the trick is determining whether it applies to your situation – and whether you’re controlling the essential attributes.
Essential and Accidental
When someone does a formal study, a great deal of care is taken to ensure that only one variable is changing at a time. That one variable is supposed to be isolated so that the impact of changing it can be identified. However, getting to a single variable is very difficult when humans are involved, as the Hawthorne Effect illustrates. The effect takes its name from a study at the Hawthorne Works plant that attempted to see whether improving lighting increased productivity. The good news: when they increased the lighting, productivity went up. The bad news: it also went up when they reduced the lighting. The uncontrolled variable was how workers responded to being monitored.
When any new best practice comes out, it comes with a package of ideas and behaviors that are supposed to drive the results. However, sometimes behaviors and thinking that had no impact on the results are copied – and all too often the important behaviors and thinking aren’t included in the best practice package because they weren’t deemed essential.
Researchers often attempt to replicate other researchers’ work as a way of validating it. They do this to ensure that the initial results were correct – but also to ensure that the experiment wasn’t tainted by other factors that weren’t a part of the study. However, best practices rarely get the kind of scientific rigor that an experiment does. Often someone says that something worked for them, and it suddenly becomes a best practice – even when the practice isn’t correlated with the results at all.
If the practice is tested, the next step is to determine whether the entire package of behaviors and thinking is required for the result, or whether a few essential components of the practice are responsible for the results – or most of the results. The challenge is that sometimes it’s hard to identify the essential attributes of a best practice. A single practitioner may be able to implement the practice with great results – but no one else can get the same results. In cases where the essential attributes of the change can’t be identified, it may be because the practitioner has tacit knowledge that cannot be captured in the best practice.
Experience has value and not all experiences can be neatly coded into boxes, a report, or even in a presentation. Gary Klein noted in Sources of Power that fire captains knew more about fires than they were aware of. Solutions didn’t just come to them. Problems triggered pattern recognition which caused them to build mental models and ultimately propose solutions that matched their recognition-driven awareness of the problem – what Gary Klein would call a Recognition Primed Decision.
Lost Knowledge and The New Edge in Knowledge Management discuss at length how difficult it can be to codify tacit knowledge. There are so many ways to capture and code knowledge – but ultimately much of the contextual richness of the knowledge is lost and simply cannot be captured.
Sometimes the illusion is that the practice contains all of the knowledge necessary to be successful when in truth, there is tacit knowledge that has formed in the practitioner which can’t be easily captured and conveyed. If you can’t capture and convey the tacit knowledge that the practitioner has developed, there’s little hope that the best practice contains enough useful information to be effective. In fact, even if there is a way to capture the tacit knowledge, the problem may be Wicked enough that it doesn’t matter.
Everett Rogers exposed the issue of how difficult it is to predict the impact of an innovation in his book Diffusion of Innovations. The problem is that there are many systems that aren’t visible to the casual observer. Well-meaning missionaries introduced the steel axe head to Aboriginal tribes, with the disastrous result of seriously disrupting their societal norms. We built dams to control flooding and to produce energy, only to realize that we had disrupted fish reproduction – particularly for salmon.
What appears to be a simple change – like giving better hand tools to Aboriginal tribes – can result in an enormous change. Without the knowledge that the elders owned all the stone axe heads and loaned them out to the younger men of the tribe, you would never expect such dramatic negative effects from providing steel axe heads. Even with this knowledge, you’re unlikely to recognize the breadth of the problems that would be created – including an increase in prostitution.
Preventing flooding is a good thing. Humans lose their lives – and certainly their wealth – to floods, so flood control must surely be good. Certainly, when all of the factors are considered, it can be a good adjustment. However, the builders didn’t realize that some fish need to swim upstream to reproduce, and the reduction in river flow that results from building dams can create problems for fish populations – and ultimately for the fishermen who rely on them for their living.
The problem with undiscovered systems – which lead to best practices not being best practices – is that they have an element of wickedness to them. Wicked in the way that Horst Rittel meant it when he used it to describe the problems of urban planning.
There’s an old saying attributed to Heraclitus that a man never steps into the same river twice – he’s not the same man and it’s not the same river. Sometimes even if you’ve captured the tacit knowledge, have identified the essential attributes, and have a situation similar to the one the practice was originally proven in, you still can’t get the same results – because the environment that you’re in is different. This is the heart of a wicked problem. Every time you attempt remediation, the problem changes, and as a result, a different set of actions is needed to correct it.
In my review of Dialogue Mapping, I discussed briefly what a Wicked problem is. I pick up the thread again in my review of The Heretic’s Guide to Best Practices. One criterion for a Wicked problem is that there is no stopping rule. That is to say, there is only “more good” and “less good” – there’s never Complete or Done. However, as Horst Rittel defined it, it’s broader than that. There’s also no definitive formulation of a Wicked problem – you can’t precisely define it. Wicked problems also don’t have an enumerable set of solutions. Sometimes they’re even the solution to another problem – so in this way, Wicked problems are an interlocking set of constraints – a system of systems that all interact. Systems thinking – and how you can cause unintended consequences – was a key part of The Fifth Discipline.
Often best practices are entwined with people – from the manager to the worker to every affected customer and vendor. People and the complex systems that operate inside the best practices frequently mean that the problems best practices are designed to solve are Wicked – and therefore resistant to having a best practice at all. That’s due in part to the fact that people have egos.
Everyone has an ego – and often ego gets a bad reputation. John Dickson says in Humilitas that “One of the failings of contemporary Western Culture is to confuse conviction with arrogance.” In other words, we mistake people who are self-confident and convicted in what they believe for arrogant people. The Advantage, Good to Great, Who, and other books all caution about the out-of-control ego. Books like Heroic Leadership, Personality Types: Using the Enneagram for Self Discovery, and How to Be an Adult in Relationships speak about the need to control and channel ego in a more productive way.
Books like Daring Greatly, Mindset, and The Happiness Hypothesis speak about self-confidence and self-image instead of using the negative word ego, but they’re speaking of the same thing. They’re speaking about our belief systems, and most of us believe that we’re better than average – that we’re able to come up with our own solutions to our own problems, so we don’t need an idea if we didn’t come up with it. Our fragile ego doesn’t like having it pointed out that we’re not really the ones who are the best at something.
Humilitas spends a great deal of time with studies, including those by Thomas Gilovich, where various groups of people collectively believe that they are better than they could possibly be. For instance, 70% of high school seniors believed their leadership capabilities were above average. We’ve all got an ego, and we generally tend to see ourselves as slightly better than we actually are.
So, when we are confronted with an idea or best practice that we didn’t come up with, we want to believe we can come up with something better. Perhaps we have been bitten by a previous best practice that didn’t work out – a practice that wasn’t tested, or one that didn’t address the wickedness of our situation. However, more likely, we’re just indulging our ego by rejecting the notion that someone else could have a better solution than we do. For most folks, the mental model that we’re smarter than other people isn’t right.
Above, I mentioned Gary Klein’s research with fire commanders, documented in Sources of Power, in the context of tacit knowledge – but that’s just half the picture. The other half is that people make decisions based on what they recognize in the situation. It turns out that fire commanders – and folks in many other walks of life – learn slowly and silently about how situations unfold and build mental models on that learning. The models allow the fire commanders to simulate what the fire will do, hypothesize about its origin, and test different ways of fighting it.
These mental models are powerful. Their utility was discussed in Thinking, Fast and Slow, Efficiency in Learning, Dialogue Mapping, and The Fifth Discipline. One of the points made in The Fifth Discipline is that often our espoused beliefs and our actual mental models are different. As it pertains to our ego, we will often say that we can learn from everyone. Intellectually, it makes sense that we can learn something from other people and that we should. However, how many of us – myself included – have stopped listening to someone because we disagreed with their views, heard them say something we knew wasn’t quite right, or simply didn’t like what they said?
The reason for covering the impact of our ego on “Not Invented Here” is so that we can make it conscious.
Unconscious Made Conscious
Mental models are powerful in their control of us – up to the point that they’re made explicit. The more conscious you make your mental models, the more you can shape and change them into alignment with the way you want to be. In the case of “Not Invented Here,” a healthy skepticism helps make sure that the idea is really the best thing for your situation. It’s important to verify that the results are repeatable across scenarios – and across practitioners. We must be sure that it’s the right answer and not just blindly accept something from somewhere else.
However, it’s equally important to ensure that the reason for resisting an idea isn’t our ego expecting that only we can come up with good ideas. Good luck sorting out which thing is stopping you – and your colleagues – from accepting ideas that were “Not Invented Here.”