Tuesday, December 29, 2009

Protocols enable innovation


David Brooks recently had an op-ed in which he expounded on something he called "The Protocol Society", a term he used to capture what authors Arnold Kling and Nick Schultz call the 'software layer' in their idea of 'Economics 2.0', as expressed in their 2009 book "From Poverty to Prosperity: Intangible Assets, Hidden Liabilities, and the Lasting Triumph over Scarcity".

Here's the idea: protocols, something every program and project manager lives by, are the rules of the game: some written, and some existing only as understandings built from experience and practice. Protocols give us the law--particularly contract law, copyright law, and trademark law--that sets up predictable pay-and-receive-value obligations and liabilities, supply chains, contractor opportunities, and protection of invention and intellectual property.

Kling and Schultz call protocols the 'software layer' of modern economies, and the enabler behind the innovation and productivity revolution of the last 100 years.

Brooks makes this interesting observation: the classical laws of supply and demand--which regulate and balance the tangible economy because real goods are scarce and can't be in the hands of more than one person at a time--don't apply to the protocol world.

There are no supply-demand forces working on ideas, and certainly there is no scarcity of intangibles--there can be uncounted users logged on to all manner of cloud services at one time. There is no practical supply-and-demand regulation of the protocol society. Indeed, Brooks makes the point that protocol-centric cultures tend toward inequalities: those that are culturally attuned to adaptation and adoption of new ideas, as distinct from new physical stuff, will prosper. In fact, culture is perhaps the most important ingredient in the mix of what makes protocols successful.

And so in the project domain, we see many of these forces at work. Projects are in many ways enabled by the presence of effective protocols. Think of virtual teams as essentially enabled by protocols but reliant on cultural commonality for success.

Project managers are not constrained by conventional supply-and-demand allocations for any number of project services, although perhaps the most important resource--people--is balanced by supply and demand.

And what about innovation and governance? Where do the protocols of governance fit in? We in the program and project world certainly have experience with governance systems.


Again, Kling and Schultz cite the Chicago school and the Harvard school as sources of opposing thought: the former being market-oriented and small-government, the latter being governance-oriented and market-regulating. In the Kling-Schultz formulation, it's an idea from both camps--to wit: market-market. That is, the greatest source of innovation is the freedom to try and fail in the market, but the most effective regulator of failure is failure itself.



Saturday, December 19, 2009

Extreme Risk Management--The One Percent Doctrine

Ideas about extreme risk management have been around a long time. Extreme risks are those for which the consequences are irreversible, and the impact is near-catastrophic. In most cases, the likelihood of the event is low.

Insurance from high-risk underwriters--most famously Lloyd's of London--has been a traditional mitigation.

But for some projects and some circumstances, insurance is not practical.

There are a couple of principles that guide action, and it's no surprise that utility theory--which takes into account the nonlinear, sometimes irrational reactions of people, in this case risk managers--is involved.

Probably the oldest is something called the Precautionary Principle. In a few words, it means that the burden of proof about consequences shifts to the advocate of the action and away from the pessimist who is blocking it. That is to say, for impacts so horrific and irreversible that the outcome is unaffordable in every sense of the word, the advocate of an action that might lead to such an outcome must prove the consequences can be avoided; the pessimist need not prove that the consequences are the most likely outcome.

One project example is the decision in Houston regarding the return of Apollo 13 after the explosion that damaged the spacecraft. Gene Kranz, the lead Flight Director, essentially turned back the advocates of a quick return and directed an orbit around the moon for the return. The consequences of an early return, if not successful, were fatal, since the lunar-lander lifeboat had to be abandoned if the early-return option was selected. A good description of the decision-making process is found in Kranz's book, "Failure Is Not an Option".

Tom Friedman, writing in the New York Times, described the One Percent Doctrine, a phrase made famous by Ron Suskind in his book of the same title. It describes the precautionary principle as espoused by Dick Cheney: if there is even a 1% chance of a horrific event happening, then treat the 1% as a certainty.

The impact of the 1% doctrine is to make the impact-times-probability result so high that it subsumes all other risks. In the face of the 1% doctrine, all possible measures must be undertaken to avert the risk event--failure is not an option!
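The arithmetic of the doctrine can be sketched in a few lines. This is a minimal illustration, not anyone's published method: the risk names, probabilities, impact figures, and the 10,000-unit "extreme" threshold are all invented assumptions for the example.

```python
# Sketch of the 1% doctrine applied to a toy risk register.
# All names and numbers below are illustrative assumptions.

risks = [
    # (name, probability, impact in arbitrary cost units)
    ("schedule slip",        0.30,      50),
    ("key vendor default",   0.10,     200),
    ("catastrophic failure", 0.01, 100_000),  # extreme, irreversible
]

def expected_impact(p, impact, one_percent_doctrine=False):
    """Conventional risk score is p * impact; under the doctrine, any
    extreme risk (here, impact >= 10,000 units) with p >= 0.01 is
    scored as a certainty."""
    if one_percent_doctrine and p >= 0.01 and impact >= 10_000:
        p = 1.0
    return p * impact

for name, p, impact in risks:
    conventional = expected_impact(p, impact)
    doctrine = expected_impact(p, impact, one_percent_doctrine=True)
    print(f"{name:22s} conventional={conventional:9.1f} doctrine={doctrine:9.1f}")
```

Under the conventional score the catastrophic risk (1,000 units) sits near the ordinary risks; under the doctrine it scores at its full 100,000-unit impact and, as the post says, subsumes everything else on the register.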





Saturday, December 12, 2009

Project Management the Agile Way

It's almost here! "It" is my new book, "Project Management the Agile Way: Making It Work in the Enterprise", most likely in Amazon by January 2010 if everything continues on the path with the publisher-Gods.

In this book, I expound on my top five for agile, and actually blow it out to 12 major themes, from a quick overview of four agile methodologies, through the business case and test strategy, and eventually ending with benefit capture.

You know, on this last point: the NPV of the typical agile project is better than that of the traditional plan-driven methodology, at least for the first few periods, because the early deliveries start earning benefits early.
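The NPV effect of early delivery is easy to demonstrate with a toy cash-flow model. The cost and benefit figures and the discount rate below are invented for illustration; the only point is that identical totals, earned earlier, discount to a better NPV.

```python
# Illustrative NPV comparison: an agile project that releases early
# and starts earning benefits each period vs a plan-driven project
# that delivers everything at the end. All figures are assumptions.

def npv(cash_flows, rate):
    """Discount a list of per-period net cash flows back to period 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

rate = 0.02  # assumed per-period discount rate

# Both projects spend 100 per period for four periods; each release
# earns 80 per period once it is in service.
agile       = [-100, -100 + 80, -100 + 80, -100 + 80, 80, 80, 80, 80]
plan_driven = [-100, -100, -100, -100, 80, 80, 80, 80]

print(f"agile NPV       = {npv(agile, rate):8.2f}")
print(f"plan-driven NPV = {npv(plan_driven, rate):8.2f}")
```

With these numbers the agile stream turns NPV-positive while the single-big-release stream stays negative, even though the benefit per in-service period is the same.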

If you work in a business environment where the executives need to be persuaded to do projects, and a business case becomes a contract for performance, this may be the book for you. This is about agile in a business situation where projects may not be a core competency, but simply a means to an end.

I hope you enjoy the read as much as I enjoyed the write!


Monday, December 7, 2009

The science of complexity

Warren Weaver wrote one of the classic papers on complexity in 1948. Even though that was more than 60 years ago, some things are timeless.
Entitled "Science and Complexity", it postulates two views of complexity: 'disorganized complexity' and 'organized complexity'.

Disorganized complexity deals with situations of many elements that interfere with each other in difficult-to-predict or even random ways. Weaver writes what is ".... meant by a problem of disorganized complexity. It is a problem in which the number of variables is very large, and one in which each of the many variables has a behavior which is individually erratic, or perhaps totally unknown. However, in spite of this helter-skelter, or unknown, behavior of all the individual variables, the system as a whole possesses certain orderly and analyzable average properties."

Wow! That sounds like a software system! On average, we know what is going to happen, but moment to moment, the odds are that something strange might occur.

This is certainly the case for latency on the Web, order frequency at an online store, and even the interactions of the tens of thousands of objects that make up large-scale systems. The curious thing is that even as any particular actor becomes more difficult to predict, the precision of the average response tends to improve with the square root of the number of actors in the sample.
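That square-root effect is easy to check empirically. The sketch below is a small Monte Carlo demonstration (the uniform "erratic actor" and the trial counts are assumptions chosen for the example): the spread of the sample mean shrinks roughly as one over the square root of the number of actors.

```python
# Monte Carlo check of Weaver's point: individual actors are erratic,
# but the sample average gets more precise roughly as 1/sqrt(n).
import random
import statistics

random.seed(1)

def std_error_of_mean(n, trials=2000):
    """Empirical standard deviation of the mean of n erratic actors,
    each modeled here as a uniform draw from [0, 1]."""
    means = [statistics.fmean(random.random() for _ in range(n))
             for _ in range(trials)]
    return statistics.stdev(means)

for n in (10, 100, 1000):
    print(f"n={n:5d}  std error of mean ~= {std_error_of_mean(n):.4f}")
# Each 10x increase in n shrinks the spread by about sqrt(10), i.e. ~3.2x.
```

Any individual `random.random()` draw is unpredictable, yet the average of a thousand of them is tight around 0.5--exactly the "orderly and analyzable average properties" Weaver describes.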

Organized complexity, on the other hand, presents equally vexing problems, but they are problems of relatively few variables. Hence, many of the statistical techniques do not apply, because the average response is not an appropriate way to observe the situation.

Weaver again writes that problems of organized complexity ".... are all problems which involve dealing simultaneously with a sizable number of factors which are interrelated into an organic whole." The behavior of a handful of objects is a problem of organized complexity.

One strategy heartily endorsed by Weaver in his paper is what he calls 'mixed teams', which today we call multi-disciplinary teams. He writes with amazing foresight: "It is tempting to forecast that the great advances that science can and must achieve in the next fifty years will be largely contributed to by voluntary mixed teams, somewhat similar to the operations analysis groups of war days, their activities made effective by the use of large, flexible, and highspeed computing machines. However, it cannot be assumed that this will be the exclusive pattern for future scientific work, for the atmosphere of complete intellectual freedom is essential to science. There will always, and properly, remain those scientists for whom intellectual freedom is necessarily a private affair."

One only has to look at the current literature on agile teams and the role of the SME outsider as an individual, even eccentric, contributor to see that there is not much new in the world.




