[N]ot only do Boomers not get complexity, they are suspicious of it, thanks to their early cultural training, which deifies simplicity. The result of this difference is that Boomer management models rely too much on simplistic ideological-vision-driven ideas. Consider, for instance, the classic Boomer idea of creating “communities of practice” with defined “Charters” and devoted to identifying “Best Practices.” No Gen X’er or Millennial would dare to reduce the complexity of real-world social engineering to a fixed “charter” or presume to nominate any work process as “best.” … I suspect, as Gen X’ers and Millennials take over, that the idea of vision and mission statements will be quietly retired in favor of more dynamic corporate navigation constructs.
I don’t know that this is strictly about the generations, but it certainly is insightful. I’d like to link it to the debate between what are called reductionism and complexity. Reductionists believe that “the truth is simple” and try to explain things by reducing everything to a narrow set of definable objects and a set of laws for their interactions. It’s as if there were a warehouse full of categorized objects (e.g., sub-atomic particles, genes) and a certain set of rules, grounded in these things’ intrinsic, mathematizable qualities, about how they are to interact.
To me, it’s obvious that nothing in the universe is like that. First of all, laws are not forces but mathematical conceptions of patterns of interaction. The idea that you can produce and explain things through the placement of objects and the application of laws leaves out a lot of information. For instance, in nature, when does a process start, and with what event or thing? What are the boundaries and parameters of a system of interactions? (How do you know what factors will be, or become, relevant?) Does one particle have the same properties as a number of them, and if not, what is the “minimum”?
In recent years, complexity and complex systems theory have undercut a lot, though certainly not all (or even most), of reductionist thinking. The physicist Robert B. Laughlin, for instance, argues that laws are not what cause certain interactions but the reverse: laws are what emerge from interactions. The complexity theorist Stuart Kauffman observes that reductionists simply take parameters as givens, while showing that in living systems, life creates its own parameters (e.g., cell walls). Nature is self-organizing.
The point is that the old Newtonian paradigm is now crumbling and a new paradigm is well on its way to becoming the reigning one, a point that Gilles Paquet has made to good effect. So how does this tie to mission statements, charters, and what-not?
Methodologies parameterize: they define limits of operation, often without generating any awareness of how those limits are drawn. This is why target-setting will predictably produce “perverse effects”; transaction quotas, for instance.
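The perverse effect of quotas can be sketched with a toy model (every name and number here is hypothetical, invented purely for illustration, not taken from the text): score agents on a proxy metric (transactions completed) that only partly tracks real value, and the quota itself pushes them away from the high-value behavior.

```python
# Toy sketch of a "perverse effect" of target-setting.
# All behavior names and payoffs are hypothetical, chosen only to make the point.

# Each behavior yields (transactions_per_week, real_value_per_week).
BEHAVIORS = {
    "careful_service": (5, 100),   # few transactions, high real value
    "quick_churn":     (25, 20),   # many transactions, little real value
}

def choose_behavior(quota):
    """Agents pick the highest-value behavior that still meets the quota;
    if none meets it, they maximize transactions regardless of value."""
    meeting = {b: v for b, (t, v) in BEHAVIORS.items() if t >= quota}
    if meeting:
        return max(meeting, key=meeting.get)
    return max(BEHAVIORS, key=lambda b: BEHAVIORS[b][0])

for quota in (0, 10, 30):
    b = choose_behavior(quota)
    t, v = BEHAVIORS[b]
    print(f"quota={quota:2d} -> {b}: {t} transactions, real value {v}")
```

With no quota, agents choose the high-value behavior; once the quota exceeds what careful work can deliver, the metric is satisfied only by churning transactions, and real value collapses. The parameter (the quota) silently redraws the limits of what counts as good work.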
Programmed procedures might be great when you can reasonably identify all of the things that might affect the outcome and control for them, but outside of a laboratory, or perhaps some staple factories, life’s not like that. It’s putting the cart before the horse to think that the methods, processes, templates, and formalized procedures we’ve come up with so far provide adequate parameters of possible activity. Process templates and toolkits, initially touted as panaceas, are all eventually discarded, not always because of the shocking unforeseen, but because they can never capture all of the complexity of normal operations. And besides, most people prefer to think rather than follow instructions, as long as they have the time to do so.
An over-reliance on methodology cuts off possibilities, prohibits learning, risk-taking, and the use of good judgment, and sets up blind, impersonal systems as abstract authorities. The result of over-programming organizational processes is that you’ll lose the engagement of your more risk-tolerant and innovative employees and wind up entangled in a web of rules.