
Tuesday, December 29, 2009

Protocols enable innovation


David Brooks recently had an op-ed in which he expounded on something he called "The Protocol Society," a term he used to capture what authors Arnold Kling and Nick Schulz call the 'software layer' in their idea of 'Economics 2.0', as expressed in their 2009 book "From Poverty to Prosperity: Intangible Assets, Hidden Liabilities, and the Lasting Triumph over Scarcity".

Here's the idea: protocols, something every program and project manager lives by, are the rules of the game: some written, and some existing only as understandings built up by experience and practice. Protocols give us the law--particularly contract law, copyright law, and trademark law--that sets up predictable pay-and-receive-value obligations and liabilities, supply chains, contractor opportunities, and protection of invention and intellectual property.

Kling and Schulz call protocols the 'software layer' of modern economies, and the enabler behind the innovation and productivity revolution of the last 100 years.

Brooks makes this interesting observation: the classical laws of supply and demand, which regulate and balance the tangible economy because real goods are scarce and can't be in the hands of more than one person at a time, don't apply to the protocol world.

There are no supply-demand forces working on ideas, and certainly there is no scarcity of intangibles--there can be uncounted users logged on to all manner of cloud services at one time. There is no practical supply-and-demand regulation of the protocol society. Indeed, Brooks makes the point that protocol-centric cultures tend toward inequalities: those that are culturally attuned to adaptation and adoption of new ideas, as distinct from new physical stuff, will prosper. In fact, culture is perhaps the most important ingredient in the mix of what makes protocols successful.

And so in the project domain, we see many of these forces at work. Projects are in many ways enabled by the presence of effective protocols. Think of virtual teams as essentially enabled by protocols but reliant on cultural commonality for success.

Project managers are not constrained by conventional supply-and-demand allocations for any number of project services, although perhaps the most important resource, people, is still balanced by supply and demand.

And what about innovation and governance? Where do the protocols of governance fit in? We in the program and project world certainly have experience with governance systems.


Again, Kling and Schulz cite the Chicago school and the Harvard school as sources of opposing thought: the former being market-oriented and small-government, the latter being governance-oriented and market-regulating. In the Kling-Schulz formulation, it's an idea from both camps, to wit: market-market. That is, the greatest source of innovation is the freedom to try and fail in the market, but the most effective regulator of failure is failure itself.



Saturday, December 19, 2009

Extreme Risk Management--The One Percent Doctrine

Ideas about extreme risk management have been around a long time. Extreme risks are those for which the consequences are irreversible, and the impact is near-catastrophic. In most cases, the likelihood of the event is low.

Insurance from high-risk underwriters--most famously Lloyd's of London--has been a traditional mitigation.

But for some projects and some circumstances, insurance is not practical.

There are a couple of principles that guide action, and it's no surprise that 'utility theory', which takes into account the nonlinear, sometimes irrational reactions of people--in this case, risk managers--is involved.

Probably the oldest is something called the Precautionary Principle. In a few words, what it means is that the burden of proof about consequences is shifted to the advocate of the action and away from the pessimist who is blocking the action. That is to say, for impacts so horrific and irreversible that such an outcome is unaffordable in every sense of the word, the advocate of an action that might lead to such an outcome must prove the consequences can be avoided; the pessimist need not prove that the consequences are the most likely outcome.

One project example is the decision in Houston regarding the return of Apollo 13 after the explosion that damaged the spacecraft. Gene Kranz, the lead Flight Director, essentially turned back the advocates for a quick return and directed an orbit around the moon for the return. The consequences of an early return, if unsuccessful, were fatal, since the lunar-lander lifeboat would have had to be abandoned if the early-return option were selected. A good description of the decision-making process is found in Kranz's book, "Failure Is Not an Option".

Tom Friedman, writing in the New York Times, described the One Percent Doctrine, a phrase made famous by Ron Suskind in his book of the same title. It describes the precautionary principle as espoused by Dick Cheney: if there is even a 1% chance of a horrific event happening, then treat the 1% as a certainty.

The impact of the 1% doctrine is to make the impact x probability result so high that it subsumes all other risks. In the face of the 1% principle, all possible measures must be undertaken to avert the risk event--failure is not an option!






Saturday, December 12, 2009

Project Management the Agile Way

It's almost here! "It" is my new book, "Project Management the Agile Way ... Making It Work in the Enterprise", most likely on Amazon by January 2010 if everything continues on the path with the publisher gods.

In this book, I expound on my top five for agile, and actually blow that out to 12 major themes, from a quick overview of four agile methodologies, through the business case and test strategy, and eventually ending with benefit capture.

You know, on this last point, the NPV of the typical agile project is better than that of the traditional plan-driven project, at least for the first few periods, because the early deliveries start earning benefits early--the quick sketch below illustrates the effect.
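To make that concrete, here is a minimal sketch in Python. The cash flows, the quarterly release cadence, and the 10% annual discount rate are all invented for illustration, not taken from the book; it simply compares the same total benefit delivered incrementally versus all at once at year end.

```python
# Hypothetical comparison of discounted benefits: incremental (agile) vs. big-bang delivery.
# All figures are illustrative only.

QUARTERLY_RATE = 0.10 / 4  # assumed 10% annual discount rate, applied quarterly

def npv(cash_flows, rate=QUARTERLY_RATE):
    """Net present value of benefits indexed by quarter (quarter 1 is the first period)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Agile: four releases, each earning 25 units of benefit starting in quarter 1
agile_benefits = [25, 25, 25, 25]

# Plan-driven: the same 100 units of benefit, but only after delivery in quarter 4
plan_driven_benefits = [0, 0, 0, 100]

print(f"Agile NPV:       {npv(agile_benefits):.1f}")
print(f"Plan-driven NPV: {npv(plan_driven_benefits):.1f}")
# The earlier the benefits arrive, the less they are discounted,
# so the incremental stream shows the higher NPV for the same total benefit.
```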

If you work in a business environment where the executives need to be persuaded to do projects, and a business case becomes a contract for performance, this may be the book for you. This is about agile in a business situation where projects may not be a core competency, but simply a means to an end.

I hope you enjoy the read as much as I enjoyed the write!


Monday, December 7, 2009

The science of complexity

Warren Weaver wrote one of the classic papers on complexity in 1948. Even though that was more than 60 years ago, some things are timeless.
Entitled "Science and Complexity", it postulates two views of complexity: 'disorganized complexity' and 'organized complexity'.

Disorganized complexity deals with situations of many elements that interfere with each other in difficult-to-predict or even random ways. Weaver writes about what is "... meant by a problem of disorganized complexity. It is a problem in which the number of variables is very large, and one in which each of the many variables has a behavior which is individually erratic, or perhaps totally unknown. However, in spite of this helter-skelter, or unknown, behavior of all the individual variables, the system as a whole possesses certain orderly and analyzable average properties."

Wow! That sounds like a software system! On average, we know what is going to happen, but moment to moment, the odds are that something strange might occur.

This is certainly the case for latency on the Web, order frequency at an online store, and even the interactions of the tens of thousands of objects that make up large-scale systems. The curious thing is that even as the behavior of any particular actor becomes more difficult to predict, the precision of the average response tends to improve with the square root of the number of actors in the sample.
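That square-root effect is easy to demonstrate. Here's a minimal Python sketch with purely made-up "erratic actors" (uniform random responses); the point is only that the wander of the group average shrinks roughly as one over the square root of the number of actors.

```python
import random
import statistics

random.seed(42)

def average_of_erratic_actors(n_actors):
    """Each actor responds erratically (uniform 0..100); return the group average."""
    return statistics.mean(random.uniform(0, 100) for _ in range(n_actors))

# Repeat the experiment many times and see how much the average itself wanders
for n in (10, 100, 1000, 10000):
    averages = [average_of_erratic_actors(n) for _ in range(200)]
    spread = statistics.stdev(averages)
    print(f"{n:>6} actors: average wanders by about +/- {spread:.2f}")
# Each 100x increase in actors shrinks the wander by roughly 10x -- the square-root effect.
```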

Organized complexity, on the other hand, describes equally vexing problems, but they are problems of relatively few variables. Hence, many of the statistical techniques do not apply, because the average response is not an appropriate way to observe the situation.

Weaver again writes that problems of organized complexity "... are all problems which involve dealing simultaneously with a sizable number of factors which are interrelated into an organic whole." The behavior of a handful of interrelated objects is a problem of organized complexity.

One strategy heartily endorsed by Weaver in his paper is what he calls 'mixed teams', which today we call multi-disciplinary teams. He writes with amazing foresight: "It is tempting to forecast that the great advances that science can and must achieve in the next fifty years will be largely contributed to by voluntary mixed teams, somewhat similar to the operations analysis groups of war days, their activities made effective by the use of large, flexible, and highspeed computing machines. However, it cannot be assumed that this will be the exclusive pattern for future scientific work, for the atmosphere of complete intellectual freedom is essential to science. There will always, and properly, remain those scientists for whom intellectual freedom is necessarily a private affair."

One only has to look at the current literature on agile teams and the role of the SME outsider as an individual, even eccentric, contributor to see that there is not much new in the world.







Sunday, November 29, 2009

Open workspaces and communications

In the December 2009 issue of the Harvard Business Review--the theme of which is 'spotlight on innovation'--James B. Stryker has an interesting item in the magazine's Forethought section reporting on his research into the effectiveness of open workspaces. He titles his work "In Open Workplaces, Traffic Headcount Matters".

Everyone who's been around a while remembers the mid-80s push for quiet, private workspaces, many with hard walls and a door! Then came the dot-com boom and the age of osmosis, as Alistair Cockburn puts it, and the open workspace was ushered in. That's more or less where we are today, 10 or 15 years later, with low-rise cubicles, open areas, and lots of face time with our co-workers--that is, if you go to where the work is done, and you do if you are on a SCRUM, XP, or Crystal team.

Mr. Stryker cites three parameters that are key to productivity in the open workplace:

  • Visibility

Here's a wow! If the space is on a main traffic route that gets lots of notice, 60% reported increased face time with team members. Lesson: don't locate in corners!

  • Density

More people, more communication--somewhat obvious, but Stryker says 16 people in a 25-foot radius really works. That's about 120+ square feet per person, counting all the public areas--generous, in my experience. (The arithmetic is sketched after this list.)

  • Oasis

Now this is one I really like: 22 meeting spaces within 75 feet is the recommended figure. My last project was rich with meeting places, and it really made a difference.
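A quick back-of-the-envelope check of the density arithmetic (a small Python sketch; the only inputs are Stryker's radius and headcount figures quoted above):

```python
import math

radius_ft = 25          # Stryker's 25-foot radius
people = 16             # with 16 people inside it

area_sq_ft = math.pi * radius_ft ** 2      # area of the circle, about 1,963 sq ft
per_person = area_sq_ft / people           # about 123 sq ft per person

print(f"{area_sq_ft:.0f} sq ft / {people} people = {per_person:.0f} sq ft per person")
```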

Overall, Stryker's article is worth a view, even at a coffee shop or bookstore where HBR is on the shelf!



Wednesday, November 18, 2009

Glen's Quote of the Day: schedule risk!

Glen Alleman comes up with some nice quotes of the day. One of my favorites is about the swooshing sound of milestones as they fly by:

I love deadlines; I especially like the SWOOSHING sound they make as they fly past
— Douglas Adams

Glen has a nice little post on this, and he mentions another favorite of mine: the Monte Carlo simulation to get a handle on schedule risk. The fact is, the statistical math is either not closed form or impractical to evaluate for any real schedule, so the only way to really see what is going on is with a simulation--and it's fast!

The Central Limit Theorem tells us the outcome distribution of a long-term schedule with a lot of activities is going to be nearly symmetric--that is, about as many things are going to go right as not over the long term--but the confidence interval, in real numbers, for any particular schedule is only knowable with simulation. And simulation is the best way to get a fast read on the project manager's best friend in statistics: expected value.
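Here's a minimal sketch of what such a simulation looks like, written in Python with a made-up ten-task serial schedule and triangular three-point estimates. Real schedule-risk tools handle networks, merge points, and correlations; this only shows the mechanics of sampling and reading off an expected value and a confidence figure.

```python
import random
import statistics

random.seed(7)

# Ten serial tasks, each estimated as (optimistic, most likely, pessimistic) days -- illustrative values
tasks = [(4, 5, 9)] * 10

def one_trial():
    """One possible project outcome: sample every task duration and add them up."""
    return sum(random.triangular(low, high, mode) for low, mode, high in tasks)

trials = sorted(one_trial() for _ in range(10_000))

expected = statistics.mean(trials)            # the PM's best friend: expected value
p80 = trials[int(0.80 * len(trials))]         # an 80th-percentile confidence figure

print(f"Sum of 'most likely' estimates: {sum(m for _, m, _ in tasks)} days")
print(f"Expected value (simulated):     {expected:.1f} days")
print(f"80% confidence finish:          {p80:.1f} days")
```

Even this toy example shows why the 'most likely' sum is optimistic: the pessimistic tails pull the expected value to the right.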

Give it a try!





Saturday, November 14, 2009

The People Puzzle: non-linear devices!

One of the most prolific writers of thought-provoking ideas in the agile space, and the author of one of my favorite articles, is Alistair Cockburn, inventor of the Crystal family of agile methods.

[Some of these links are a little slow to load, so patience is advised]

One of Cockburn's favorite subjects, at least as measured by his passion, is people, and a noteworthy paper he wrote is entitled "Characterizing People as Non-linear, First Order Components in Software Development".

In this paper, Cockburn's premise is that people are 'active devices' in software development, and like all active devices, there are success and failure modes, primarily these four:


  1. People are communicating beings, doing best face-to-face, in person, with real-time question and answer.
  2. People have trouble acting consistently over time.
  3. People are highly variable, varying from day to day and place to place.
  4. People generally want to be good citizens, are good at looking around, taking initiative, and doing “whatever is needed” to get the project to work.
He also suggests that people have these general characteristics:

  • People need both think time and communicating opportunities 
  • People work well from examples 
  • People prefer to fail conservatively rather than risk succeeding differently; they prefer to invent rather than research; they can keep only a small amount in their heads; they do make mistakes; and they find it hard to change their habits.
  • Individual personalities easily dominate a project.
  • A person’s personality profile strongly affects their ability to perform specific assignments.



There's actually a lot more in the paper that makes for thoughtful reading.









Friday, November 6, 2009

Project Managers: who owns the content?

In the course of a project, a lot of things get developed, much of it renderings of ideas in all manner of media.

Question: who owns the content?

Answer: it depends! If you are working under the rules of 'works made for hire', your employer probably owns the content--lock, stock, and barrel.

But if you are not on a 'works made for hire' agreement, and you are the author, most likely you own the content under the principle of "original works of authorship" unless you have signed it over with a written document to someone else--at least that is true in the United States.

In the U.S., an idea can't be copyrighted, but an idea expressed in media is automatically copyrighted to the author, whether or not published, and no declaration or registration is required, ever!

Want to know more? There is a really good document, written clearly and in plain language, at the copyright office of the United States. Just click here to get it.

What about creative commons? What is that? Well, it is a means to share copyrighted material for purposes of innovation and content sharing. There are several sharing ideas under creative commons. Here's a good slideshare on the topic from Jennifer Dorman.

And what about 'fair use'? Well, the courts have generally agreed that you can copy up to 250 contiguous words, or 400 words in total from an entire work, without getting a copyright release. But there are exceptions, especially when the 400 words are the essence of the copyrighted work. Read more at the government website on copyrights.



Tuesday, November 3, 2009

The Subsidiarity Principle

What does the Catholic church have to do with project management? Well, do you remember the encyclical “Rerum Novarum” of 1891 by Pope Leo XIII? If not, then to make a quick point, among other things, it postulated the concept of subsidiary function, also called subsidiarity, to differentiate responsibilities between the Vatican and other units of the church. However, the idea has spread far and wide and is embraced in modern business thinking and progressive project management--like agile methods.

In a word, what it means is: push things down; more importantly, don't interfere with subordinates. Have faith that they know what they are doing, and more likely than not they will do the right thing.

There are rights and responsibilities that come with this principle. The central authority has a right to expect responsible behavior of its subordinate, but retains the right to verify performance--to trust, but with verification--and to intervene to impose corrective action.

The subordinate unit has a right to expect a degree of autonomy, with reasonable inspection and verification, so long as the subordinate acts responsibly. The subordinate has a responsibility to act in its own interests and in the interests of the central authority, taking care not to over-optimize at a low level.

When the principle of subsidiary function is extended to project planning, the first planning criterion is that plans should not be unnecessarily obtrusive; in particular, an agile plan should not direct, prescribe, or otherwise limit maneuverability and activity beyond the establishment of acceptable norms and conventions.

In other words, planning is to be done by the lowest decentralized project unit that is competent and responsible.

For progressives, it's not hard to buy into subsidiarity!



Friday, October 23, 2009

Ideas for Managing at the Milestone

Nothing really good happens at a milestone because too many things have to come together to have success.  At a milestone, seemingly independent activities have their fate joined.  At the milestone, everyone is in, or else everyone is stymied when the gate is not opened to pass through.  

Here's one idea: the fact is, joining together at a common event, like a milestone, is hazardous because of the concept of 'merge bias'. Merge bias simply means there is a strong tendency to slip to the right at a milestone, thereby stretching the schedule. In other words, when you look at a schedule and you see a milestone, think immediately that there is a risk of a shift to the right; that is, milestones are hazards to on-time performance and represent the first weakness to look at when assessing the schedule for risk.

Here's another: It’s common sense that the more independent an activity is, the more freedom of action there is.  After all, there is minimum need to coordinate outcomes and processes if no one else is depending on your production.  So, by corollary, dependencies limit choices, limit agile methods, and place constraints where there would not otherwise be inhibitions.  Dependencies increase the effort that must go into coordination—team of teams, staff meetings, and the like—and this effort must come from some budget, and potentially the distraction of more coordination could impact other value-add work.


It’s no accident that milestones tend to shift to the right on the schedule. There is actually a mathematical explanation for this phenomenon that arises from the statistical behavior of somewhat risky activities. In a project, no activity's duration can be known for certain--there is always some risk that things will take longer than planned, or in some cases less time than planned.


In statistics, the explanation comes from behavior of intersections and unions of events.  A union is an ‘or’ case; an intersection is an ‘and’ case.  A milestone is an intersection of two or more joining tasks.  

Statistics defines the intersection of independent events by the product of their probabilities.  There is no general formulation for the statistics of an intersection unless the events are independent.  If they are not, then the only practical way to determine the performance of the intersection—in our case, the milestone—is to simulate the project by running many trials.  Simulation for this purpose is the subject of another presentation.  

In similar fashion, we can multiply out all the other cases of one late/early and the other either late/early or on time.  For two joining paths, there is one success case—both on time—and three failure cases—one or the other or both are late.  The sum of the success case and all of the failure cases must account for all the possibilities—100%.  If the sum of all cases is not 100%—either more or less than—then the analyst has made some error in accounting for the possibilities.
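A minimal numerical sketch (Python, with assumed on-time probabilities) makes the point: if each of two independent joining paths is 80% likely to be on time, the milestone itself is on time only 0.8 x 0.8 = 64% of the time, and the four cases together account for 100%.

```python
# Two independent paths joining at a milestone, each with an assumed 80% chance of being on time
p_a_on_time = 0.80
p_b_on_time = 0.80

success     = p_a_on_time * p_b_on_time              # both on time -- the only success case
only_a_late = (1 - p_a_on_time) * p_b_on_time
only_b_late = p_a_on_time * (1 - p_b_on_time)
both_late   = (1 - p_a_on_time) * (1 - p_b_on_time)

print(f"Milestone on time: {success:.0%}")                                           # 64%
print(f"Failure cases:     {only_a_late + only_b_late + both_late:.0%}")             # 36%
print(f"All cases sum to:  {success + only_a_late + only_b_late + both_late:.0%}")   # must be 100%
```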

For the project manager, the quick takeaway is that by glancing at the schedule and picking out milestones defined by joining paths, the weak points of the schedule are immediately seen. At every such milestone, there is a bias toward shifting to the right--an obvious hazard to on-time performance.

Take a look at this slideshare presentation for more information




Monday, October 12, 2009

My Agile Top Five

Somebody asked me the other day for my definition of agile--I think they heard about my new book coming in January, "Project management the agile way ... making it work in the enterprise".

From the book, I say this:
'Agile' means small teams, working collectively and collaboratively, with this mission:

To deliver frequent, incremental releases of innovative functions and features, prioritized for need and affordability; evolved iteratively from a vision according to user reflection and feedback; and produced at the best possible value.

Below are my top five points; I guess it is my version of the Agile Manifesto





Friday, October 2, 2009

Top Five Ideas -- Statistics for Project Management

Do you love statistics? Probably not!

Nevertheless, they are present in every project and so a few minutes to grasp the top five ideas is time well spent.

In my presentation (at the slideshare link), adapted from my book "Quantitative Methods in Project Management", I talk about Monte Carlo simulation, the Central Limit Theorem, the Law of Large Numbers, probability distributions, and the merge bias problems in schedules.

Click here for a slideshare to show you the ropes on the top five ideas for statistics that help the project manager.


Thursday, October 1, 2009

Driving Adoption -- Benefits realization in the agile space

There's no time like the present--Benefit tracking begins with the first release.

Adoption may be slow. Because of the natural reluctance to change, embracing new capabilities may not be automatic. To encourage adoption, competing or legacy capabilities should be withdrawn as soon as practical.

Remove the legacy
It may be possible to help adoption by having the first few iterations produce product increments that are naturally attractive and capable of creating a buzz.

Early adopters
There are, of course, early adopters who will eagerly grab new capabilities, especially technology capabilities rich in software features and functions, and especially those that are user-configurable. But early adopters are only one of five personalities in the body of knowledge known as diffusion of innovations.

The five are:
1. Innovators: Those who are anxious to work with the product in a preproduction or beta status and take risks with immature product; usually very personable and networked individuals, well connected with technology, and able to handle a high degree of uncertainty;
2. Early adopters: Those with opinion leadership eager to put product through its paces and be first on the block to have the advantage of a new capability;
3. Early majority: Those willing to adopt after visible proof that the bugs have been worked out and operational effectiveness has been proven;
4. Late majority: The reluctant but willing, not too comfortable giving up what they know best; and
5. Laggards: Those that might never adopt and so drop out of the pool of users.


Innovators often make their own decisions to engage with new ideas; they are often in at the beginning and may be drivers behind the original vision. Early adopters may wait for official sanction before taking up a new product; later adopters may be forced by decision makers to get involved. Regardless, Everett Rogers, one of the early academics in the theory of the diffusion of innovations, posits that everyone passes through a five-stage decision-making process, albeit on different timelines.

Rogers' paradigm is:

1. Seek knowledge: Seek basic information to become familiar and acquainted with a new idea, product, or service;
2. Accept persuasion: Evaluate benefits in context of personal use and application;
3. Decide: Decide to adopt or reject;
4. Implement: Begin to apply the product or service to the everyday routine; and
5. Confirm: Accept the product as a fully qualified alternative to the prior capability.

Agile impact
In the agile space, this five-step process repeats with every release, although the steps begin to merge and the timeline is shorter as each release builds upon the past. The mission of business preparation is to smooth this decision process as much as possible; to prepare the knowledge base; and to prepare persuasive information so that moving to implementation and confirmation is as rapid as possible.



Monday, September 28, 2009

Agile Earned Value--Part I

There's a common misperception that agile and earned value don't mix: that earned value is too heavy and anti-agile. Not so! Agile projects earn value by delivering product incrementally, periodically, affordably, and according to the priority of the customer.


If the customer is not satisfied, he may not want to pay for our efforts. If the customer is not successful, he may not be able to pay. If he is not more successful than he already was, why should he pay?
Niels Malotaux
EVO Consulting Project Manager


The agile rule for earned value:
Each release is a value earning; without a go-live to production, there is no earned value.

In project terms, earned value means planning for an outcome and then achieving it, applying only the intended resources. When the project is completed and expectations are met, the entire value is earned—all requirements are rendered in production and equated to benefits.
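As a minimal sketch of this rule in bookkeeping terms (Python, with hypothetical release values): value is earned only when a release goes live; work in progress on an unshipped release earns nothing yet.

```python
# Hypothetical releases: planned value of each, and whether it has gone live to production
releases = [
    {"name": "Release 1", "planned_value": 100, "live": True},
    {"name": "Release 2", "planned_value": 150, "live": True},
    {"name": "Release 3", "planned_value": 200, "live": False},  # built and tested, but not deployed
]

# The agile rule: no go-live, no earned value
earned_value  = sum(r["planned_value"] for r in releases if r["live"])
planned_value = sum(r["planned_value"] for r in releases)

print(f"Planned value of all releases: {planned_value}")
print(f"Earned value (live releases):  {earned_value}")
```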


Agile methods change the bookkeeping a bit, but by Agile Principle 1, delivering value is at the top of the list:

Our highest priority is to satisfy the customer
through early and continuous delivery of valuable software.








Thursday, September 24, 2009

W. Edwards Deming invents PDCA

W. Edwards Deming and the PDCA cycle
Deming--working in Japan and elsewhere in the mid-20th century--introduced very practical ideas of process control as a means to limit variations in product quality. Today, it is called defined process control.

Deming was a product guy; he came at quality from the point of view of the product: make the product the same way each time and make it work within limits that are acceptable to the customer. The modern poster child for defined process control is Six Sigma.


Six Sigma
  • Six Sigma is a problem solving methodology and defect control strategy with the purpose of identifying and mitigating error sources in defined process control.
  • The control limits are established such that production yields fewer than approximately 3.4 errors in one million opportunities, either above or below the control limits. This figure is derived from the error possibilities beyond six standard deviations of a bell-shaped curve, after allowing a 1.5-standard-deviation drift of the long-term average defect rate. (A quick numerical check follows this list.)
  • The process derives its name from the Greek lower case s, called sigma and denoted σ; σ is the symbol used by statisticians for the standard deviation of a probability density function such as the bell curve.
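As promised above, here is a quick numerical check of the 3.4 figure (a Python sketch using the standard library's erfc; the only assumption is the conventional one-sided tail beyond six sigma less the 1.5-sigma drift allowance):

```python
from math import erfc, sqrt

sigma_level = 6.0
long_term_drift = 1.5
z = sigma_level - long_term_drift   # effective distance of the limit: 4.5 standard deviations

# One-sided tail area of the normal (bell-shaped) curve beyond z standard deviations
tail_probability = 0.5 * erfc(z / sqrt(2))

print(f"Defects per million opportunities: {tail_probability * 1e6:.1f}")  # about 3.4
```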

 Agile Thinking
Ken Schwaber—a leading SCRUM methodologist—objecting to defined process control, puts it this way, “[defined process control] is based on processes that work only because their degree of imprecision is acceptable… When defined process control cannot be achieved because of the complexity of the intermediate activities; something called empirical process control has to be employed.”

In Schwaber’s view, software is too complex to expect defects to be contained within predefined error limits. Empirical control is the answer; empirical control is derived from observed facts, adapted to the situation, and not determined by preplanned limits from previous projects.

Deming contributes PDCA to agile projects
In spite of the fact that software projects offer little opportunity for statistical process control in the Six Sigma and Deming way, perhaps Deming's most noteworthy accomplishment from the perspective of project management and agile methodologies is his famous plan-do-check-act cycle that he originally adopted from Walter Shewhart. Plan-do-check-act (PDCA) envisions planning for what is to be done, then doing it—that is the plan-do. Next, measure results—measuring is the check activity—and then act on the measurement results. To act in the PDCA sense means to reflect upon lessons learned and provide feedback for corrective actions to the next iteration of the plan.


Walter A. Shewhart
Deming was influenced by the work of the process statistician Walter A. Shewhart, who is credited with identifying that process variation has two sources: assignable cause and chance cause.
The former is systemic and capable of being corrected and maintained at an economical minimum; the latter occurs randomly in frequency and intensity, is not always present in the process, and is mitigated by establishing performance limits for a given process.


Edwards Deming's impact on agile projects
A project management tip: Deming introduced the PDCA cycle, which is wholly embraced by the EVO method.
• The cycle really applies to all agile iterations. The plan-do is equivalent to the planning session followed by development, test, and integration.
• Especially relevant is the check-act that provides measurement and feedback for continuous improvement.
• Deming focused on eliminating unsatisfactory results before they reached the customer. In agile parlance, every object must pass its unit, functional, and system test.


Wednesday, September 23, 2009

Test Driven Development -- TDD

New to the agile scene? Curious about XP -- Extreme Programming? One practice born in XP and now widely dispersed in the agile community is Test Driven Development, TDD.


The main idea
Here are the main ideas of TDD, and they are a mind-bender for the traditionalist: Requirements are documented in the form of test scripts, and test scripts are the beginning point for product design.


TDD works this way: detailed design and development, after consideration for architecture, is a matter of three big steps:

Step 1: document development requirements with test scripts. Beginning with functional requirements in the iteration backlog, and in collaboration with users, the developer writes technical design requirements in the form of a test script, and writes the script in a form to be run with test-automating tools. Automated tools enable quick turnaround. Automated tests run fast, run on-demand, run repeatedly the same way, and run under various data and system conditions.


Step 2: run the test, modifying the object design until it passes. If the test fails, as it often will, the developer implements the quickest and simplest solution that will likely pass. Step 2 is repeated until the test passes.


Step 3: refine the detail design of the object. After the solution passes the test, the developer iterates the detail design to be compliant with quality standards. In doing so, only the internal detail is changed and improved; no change is made to the object’s external characteristics. To verify no external changes and continued functionality, the modified solution is retested internally.
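Here's a minimal, hypothetical illustration of that rhythm in Python with the standard unittest module. The shopping-cart function and its requirement are invented for the example: the test script (Step 1) documents the requirement and is written first; the simplest passing implementation follows (Step 2); refactoring then improves the internals without changing the external behavior (Step 3).

```python
import unittest

# Step 2/3: the simplest implementation that passes, later refined without changing its behavior
def cart_total(prices, discount_rate=0.0):
    """Return the sum of item prices, less an optional fractional discount."""
    subtotal = sum(prices)
    return round(subtotal * (1 - discount_rate), 2)

# Step 1: the requirement is documented as an automated test script, written before the code
class TestCartTotal(unittest.TestCase):
    def test_total_without_discount(self):
        self.assertEqual(cart_total([10.00, 5.50]), 15.50)

    def test_total_with_ten_percent_discount(self):
        self.assertEqual(cart_total([10.00, 10.00], discount_rate=0.10), 18.00)

if __name__ == "__main__":
    unittest.main()   # runs fast, on demand, repeatedly the same way
```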


A project management tip: TDD, unit tests, and acceptance tests
  • TDD was invented as a design practice; it is commonly applied to the lowest level design units
  • “TDD's origins were a desire to get strong automatic regression testing that supported evolutionary design.”
  • Unit tests, distinct from TDD tests, are postimplementation design verification tests. In most respects, unit tests and TDD tests are very similar, but the purposes are very different: one drives design and the other verifies implementation.
  • The concept of automated tests is also applicable to higher-level tests, such as product integration tests and user acceptance tests, but these are not for the purpose of driving design.
  • At higher levels, designs that pass unit tests are being tested in a larger context and by independent testers, including end users.



Tuesday, September 22, 2009

Governance in the Agile space

We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten. Don't let yourself be lulled into inaction.
Bill Gates


Governance and agile methods may seem like an oxymoron--but not so. A means to govern is essential for orderly project functioning. Without governance, the advantages of adaptive and evolutionary methods could be overwhelmed by functions bolted together haphazardly and rendered operationally ineffective, expensive to maintain, and disadvantageous to customers and stakeholders.


Governance 'shoulds'
  • A governance program should be purposeful about maximizing the business potential of a project.
  • Governance should be dedicated toward minimizing the risks to business performance.
  • A governance program should enable and promote innovative and imaginative solutions, and
  • Governance should deter behavior that strays too far from norms.
In short, a governance program exists for five reasons that are in effect the governance mission statement:

Governance Mission
1. To oversee and approve investment on behalf of business beneficiaries;
2. To codify decision-making rights to make it possible for teams to have autonomy and freedom of maneuver;
3. To enable and promote innovation, evolution, and technical excellence within the framework of architecture and operating norms;
4. To be the ultimate arbiter of risks that affect business performance and accountability; and
5. To provide accountability for compliance to mandatory standards.
Governance is built on quality principles.

Four principles guide an effective governance implementation:

Governance Principles
1. Governance should be applied proportionately to the amount at stake.
2. Governance should provide clarity for mission and purpose, scope boundaries, decision-making authority, and decision rights.
3. Governance should respect the principle of subsidiary function: governance should not intrude into the management of functions that are best left to functional and project managers.
4. Governance must be lean, timely, and responsive, respecting agile principles to provide enough, but ‘just enough’, oversight and control to accomplish the governance mission.


A project management tip: Decision policy for the project manager
• The simplest policy for decision making is... always make a best-value decision based on the collective value of the risk-weighted factors
• When deciding among alternatives, pick the alternative that informs the business most favorably, even if there is a suboptimal result for the project



Wednesday, September 2, 2009

Agile Benefits and DCF

In my musings on agile, I assert that one advantage is that benefits come sooner, and therefore are less susceptible to the risks of future uncertainties, since the future in question is more near-term.

The finance guys have a term for this: discounted cash flow, DCF.

So what is DCF and how does it work? In a word, the idea of a 'discount' is to value a benefit in the future less than a benefit in the present--the financial equivalent of a bird in the hand versus a bird in the bush.

The future is where uncertainties lurk. It's not just a deflated dollar; it's also market uncertainties, the vagaries of customer delight, and competitive effects.
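A minimal sketch of the mechanics (Python, with an illustrative 10% discount rate, not a recommendation): each year further out, the same benefit is worth less in today's terms.

```python
def present_value(future_benefit, annual_rate, years):
    """Discount a future benefit back to today's value."""
    return future_benefit / (1 + annual_rate) ** years

# A 100-unit benefit, discounted at an assumed 10% per year
for years in (1, 2, 3, 5):
    print(f"100 received in year {years} is worth {present_value(100, 0.10, years):.1f} today")
# The further out the benefit (and its bird-in-the-bush uncertainty), the deeper the discount.
```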

Below is a Slideshare presentation that gives you some pictures of how this works:



Thursday, August 27, 2009

What is this thing called Agile?

I've been working on a book about agile project management for enterprise projects. My publisher just gave me a title! "Project Management the Agile Way -- Making It Work in the Enterprise". Look for it in January, 2010.

Until then, the article below is a pretty good introduction to some of the top-level ideas. You can download "What is this Thing Called Agile" from my library at slideshare.com

There are some other materials there you may find interesting. Let me know!


Wednesday, August 26, 2009

Being Agile with Kano Analysis

Kano analysis is a new-product requirements tool that visualizes the relative merits, or value, of requirements. Agile is a project method that focuses on intimate customer involvement in the development of new user value.

The slideshow below gives some of the highpoints of the Kano tool in the context of the Agile method.

You will see that one quadrant of the Kano chart is the 'customer delight' quadrant. Here, latent requirements often emerge and become product 'ah-hah!'s. Ah-hah!s are the real discriminators. These are the ones that will fascinate users, heighten their interest in the project, and become the grist for early adopters.

It's around the ah-hah!s that you can build the project theme -- the real value proposition for the customer.



Tuesday, August 25, 2009

The Top Five Ideas for Project Management

Someone asked me the other day what top five ideas I would give to an accidental project manager--someone who has the job but not the training. What do you pass along, without much jargon, that will get them started? I put my ideas in the slideshare presentation you see below.




Friday, July 24, 2009

Why Teams Don't Work

Well, actually teams do work, but not automatically. In fact, good high-performance teamwork is hard to come by. Here are some potholes to avoid:


  • Teams are made too large. Teams are a way to organize small groups; they are not the antidote to bureaucracy.
  • Boundaries are too often left fuzzy – confusion is a productivity killer. What's in, what's out?
  • The mission is not made compelling; boredom and disinterest don't set the stage for inspired work.
  • Team members too often are selected by making the easy choices--selected by position and availability and not by rigorous evaluation.
  • There is no allowance for a 'nemesis' member to neutralize 'group think'.
  • Team membership is allowed to turn over too rapidly, thereby diluting cohesion and squandering productivity that depends upon personal relationships.


  • There is bad decision-making process or inadequate decision-making skills.
  • The membership includes difficult people--talented eccentrics that don't share and work collectively.
  • There is competition among members, often leading to secrecy and compartmentalization, quite the opposite of collaboration.
  • There are empowerment uncertainties, awkward and untimely decision chains, and confusion about roles, rights, and responsibilities.
  • Many personnel issues are left unresolved: "What do I have to give up?"



For more reading and insight, take a look at these references:
Robbins, H. and Finley, M., "The New Why Teams Don't Work: What Goes Wrong and How to Make It Right", Berrett-Koehler, San Francisco, 2000

Some ideas from this blog were inspired by: Coutu, D. "Why Teams Don't Work", Interview with Dr. J. Richard Hackman, Harvard Business Review, Boston, May 2009



Sunday, June 14, 2009

CRACK Customers

Barry Boehm on the CRACK Customer

Dr. Barry Boehm, a noted software methodologist with a long and illustrious career at TRW, DARPA, and USC, and the creator of the COCOMO model and the Spiral methodology, writes about the ideal customers for agile projects. They are:

-- Collaborative: they will engage with their customer peers and with the development team
-- Representative: they know the product or system requirements and can represent their constituents accurately
-- Authorized: they can make the decisions needed to keep velocity up, and their decisions stick!
-- Committed: they take their job seriously and make every effort to advance project success
-- Knowledgeable: they can understand what the developers are telling them in the context of the business or market situation.

Take a look at other Boehm'isms on agile in his book, with Richard Turner, "Balancing Agility and Discipline: a guide for the perplexed", published by Addison-Wesley in 2004


Monday, June 8, 2009

Quotations worth a moment




I enjoy a bit of wit and humor. Here are a few of my favorite quotes:






Instead of trying to make the trains run on time, it might be better to do away with the trains!
Anonymous


There is no undo button for our oceans of time.
Tom Pike





Brooks' Law: Adding manpower to a late software project will make it later!
Fred P. Brooks

About sequencing:
Nine women can't have a baby in one month

Anonymous


Somebody's sitting in the shade today because someone planted a tree a long time ago
Warren Buffett

Lies, damn lies, and statistics!
Benjamin Disraeli

There are no facts about the future
David Hulett

Value increases when the satisfaction of the customer augments and the expenditure of resources diminishes
Robert Tassinari

We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.
Bill Gates

If the customer is not satisfied, he may not want to pay for our efforts. If the customer is not successful, he may not be able to pay. If he is not more successful than he already was, why should he pay?
Niels Malotaux

It is very difficult to make a vigorous, plausible, job-risking defense of an estimate that is derived by no quantitative method, supported by little data, and certified chiefly by the hunches of the managers
Fred P. Brooks

A requirements paradox:
Requirements must be stable for predictable results
However, the requirements always change
Niels Malotaux



Saturday, June 6, 2009

Agile Virtual Teams

Somebody asked: can a virtual team do Agile? Of course, with some adjustments. Here are my thoughts on this.

The communications channel:

Virtual teams often begin by emulating the behavior and circumstances of real teams. The first thought is communications. Real teams can handle a much greater N2 communication intensity because much of person-to-person communication is non-verbal.

What is N2? It's really N-squared. It's the approximate number of ways objects can communicate. The real formula is N(N-1), counting each direction between a pair separately. There are 5*4 = 20 ways 5 people can talk among themselves.
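A minimal sketch of that arithmetic (Python):

```python
def communication_channels(n):
    """Directed person-to-person channels in a team of n: each of n people can talk to n-1 others."""
    return n * (n - 1)

for team_size in (5, 7, 10):
    print(f"{team_size} people -> {communication_channels(team_size)} channels")
# 5 people -> 20, 7 -> 42, 10 -> 90: intensity grows roughly with the square of team size.
```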

Non-verbal is a very high bandwidth channel, capable of communicating a large information message instantly, although the messages are often highly encoded and subject to inaccurate decoding. It's much easier to sort out the cacophony of discussion if you can put face and voice and context together.

Consequently, when planning for virtual teams, bear in mind that virtual teams don't have the luxury of infinite bandwidth

Velocity impacts:

Some teams relish the hubbub of real-time communications, and others do not. Good practice is to benchmark the virtual velocity before beginning the first iteration.

Assigning team work:

Assigning work to virtual teams should follow this simple rule: partition work according to its natural boundaries that minimize and simplify interfaces. Albert Einstein has been quoted to the effect: "Make everything as simple as possible, but not too simple". But over simplification is hazardous also. The solution can lose cohesion and the bigger picture becomes so obscured that effective solutions are not possible to build from the too-small parts.

Iteration Planning and tracking:

The iteration planning meeting is the agile mechanism for assigning work. All the team's complement attends. The same applies to a virtual team.

Tracking progress and identifying problems:

Only the daily stand-up meeting is affected by the communications constraints unique to virtual teams. The less efficient electronic channels may have to be compensated for by extending the time-box of the daily stand-up.

The burn-down and trending data is part of the team scorecard posted electronically as opposed to marking a whiteboard in a team space.


Wednesday, April 22, 2009

An Old Question Revisited: How do you manage cost?

I was asked recently in a way that begged for an 'elevator speech' reply: "How do you manage cost?" My goodness! How much time do I have to reply? Isn't that question even older than project management itself? Is there any more to be said?

The best way to answer is to have a conversation, because the response is all tied up in values: is cost the dominant leg in the cost-schedule-scope-quality trade space? Or, is the proposition to always manage to a 'best-value' formulation? If value is what we are willing to pay for, then presumably best-value is also lowest-buck for what we get, but it may not be the lowest cost solution for the minimum requirements. So, we need to get the value thing decided straight away.

The mistake most often made is to manage cost like it is an input, rather than a result. Managing at the input means watching the cash flow in comparison to the budget plan for cash consumption. This will provide comfort to the CFO, but it does little for the other stakeholders. The cash flow could be right on plan, cash consumed exactly as planned, and yet the project could be churning pointlessly and producing nothing. There is no value proposition at the input, so continuing to focus on the input side of the project process is strictly a CFO task.

Value is produced at the output. By now, everyone should know this. To manage cost is to manage three variables: budgeted cash flow at the input, actual cost in the process, and value of the actual outcomes.

> The budget is managed with the requirements process and the estimating skills of the team. Each of these, the requirements and the estimates, is an independent source of error, and then there are co-variances between them that stretch the errors.

> The actual cost is managed by attention to effective practices, efficient procedures, and look-ahead risk management. Effectiveness is a measure of impact, efficiency is a measure of the effort to produce impact, and risk management is foreseeing problems in time to reduce their impact.
> Value is managed by insisting on an outcome that meets the quality standards of the stakeholders, but does not overreach on attributes that aren't valuable, to wit: better is the enemy of good!

If you are into an earned value management mind-set, then you recognize the EVM concepts in the foregoing:
> the budgeted cost of the work scheduled, aka BCWS, aka the Budget or Planned Value [PV],
> the actual cost of the work performed, aka ACWP, aka Actual Cost [AC], and
> the budgeted cost of the work performed, aka BCWP, aka Earned value [EV].

There are several ratios and other formulas that you can make out of these three parameters to explain history and forecast the future, but the one that is missing from all the EVM systems is AC/PV. This is the infamous input view we spoke of before. It carries no value; it only measures adherence to a spending plan.
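For illustration, here is a minimal sketch (Python, with hypothetical status numbers) of the common ratios built from those parameters, alongside the input-only AC/PV view discussed above:

```python
# Hypothetical status at the end of a reporting period (all in the same currency units)
PV = 100.0   # BCWS: budgeted cost of the work scheduled (planned value)
EV = 80.0    # BCWP: budgeted cost of the work performed (earned value)
AC = 95.0    # ACWP: actual cost of the work performed

cpi = EV / AC            # cost performance index: value earned per unit of cost
spi = EV / PV            # schedule performance index: value earned vs. value planned
spend_plan = AC / PV     # input-only view: adherence to the spending plan, no value content

print(f"CPI   = {cpi:.2f}  (below 1.0: over cost for the value earned)")
print(f"SPI   = {spi:.2f}  (below 1.0: behind the value plan)")
print(f"AC/PV = {spend_plan:.2f}  (looks 'almost on budget' -- but says nothing about value)")
```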

By the way, forecasts are the weakest link! Does history ever repeat? Milton Friedman, distinguished economist, opined: "if you are going to predict, then predict often!" Meaning: hey, things change.

It's a lucky project manager that actually gets to control the labor AC to only that which is needed to produce the EV. This is another of those conversations we need to have. How many times have people been placed on projects because they would otherwise be on overhead and vulnerable to release? How many times have people been retained on projects, the so-called marching army, because PM's have been told they can't get them back if they release them? Smart management anticipates labor redundancies, but also labor losses that might impact the EV: vacation, sick leave, other duties as assigned!

It's a lucky project manager that actually controls the overhead expenses of their project. In point of fact, many projects are encumbered with liquidating enterprise expenses disproportionately to use. It's part of the game, so smart cost managers anticipate the impact of changing overhead positions.

Now the challenge remains to get this all into 30 seconds so that it will sound good on the elevator!


Saturday, April 18, 2009

Quality thinker Frederick Taylor is a lean machine

The ideas of F.W. Taylor


How many project managers are still laboring with the aftermath of Frederick Winslow Taylor, more popularly known as F.W. Taylor? You might ask: who was Taylor? F.W. Taylor was one of the first to study business systematically. He brought 'Taylorism' into the business culture in the years leading up to World War I. By 1915, his ideas were considered quite advanced, and they had significant impact well into the mid-20th century.

Taylor was a mechanical engineer who worked early on in a metal products factory. Appalled at the seemingly disorganized and informal management of the time, and equally distressed by the costly throughput of poorly motivated workers laboring at inefficient processes, Taylor set about to invent "scientific management", a revolutionary movement that proposed the reduction of waste through the careful study of work.

Taylor came up with the original 'time-and-motion' studies, perhaps one of the first attacks on non-value work. Peter Drucker, a management guru par excellence who coined the term 'knowledge worker', has ranked Taylor, along with Darwin and Freud, as one of the seminal thinkers of modern times. ["Frederick Taylor, Early Century Management Consultant", The Wall Street Journal Bookshelf, June 13, 1997 pg. A1].

The essence of Taylorism is an antithesis to agile principles but nonetheless instructive. Counter to what we know today, Taylor believed that workers are not capable of understanding the underlying principles and science of their work; they need to be instructed step-by-step what to do and how to do it; and nothing is left to chance or decision. Rigid enforcement is required.

However, Taylor was close to the mark with his doctrine about value-adding work. According to Taylor, managers must accept that they have a responsibility to design efficient and effective processes and procedures. Waste must be eliminated! Every action requires a definition and a means to measure results.

Taylor was not well liked by workers, and it's not hard to see why. But Taylor's ideas and practices brought great efficiencies and profitability while providing customers with products of predictable quality. Taylor's most important legacy is perhaps his idea of scientific management and the importance of process definition and process management as a means to control product and productivity.

I like what Steve McConnell says about quality and the software relationship. Building off Taylor's idea of 'do it once right', though he does not mention Mr. Taylor, McConnell, author of the respected book "Code Complete", states that the "general principle of software quality is ... that improving quality reduces development costs ... the best way to improve productivity is to reduce the time reworking..."

Kent Beck, writing in his book "Extreme Programming Explained, Second Edition", has a pretty strong idea about the legacy of Taylorism and its lingering effects on the knowledge industry. He says of Taylor that he brought a social structure we continue to unconsciously apply, and he warns against the message that Taylorism implies: workers are interchangeable; workers only work hard enough to not be noticed; quality is an external responsibility.
A project management tip
Frederick Taylor was the first to study and quantify non-value work and to put emphasis on eliminating wasteful and time-consuming processes, procedures, and environmental impediments.


Sunday, March 29, 2009

A bit about Governance

Governance and agile methods may seem like an oxymoron. But not so. A means to govern is essential for orderly project functioning.

Without the elements of governance in place, at the moment of need there is only a vacuum which sucks in ad hoc solutions. Agile methodologists would not argue that governance is unnecessary or without value. Agile methodologists have gone so far as to build some governance practices into the process, although the not-Agile COBIT and ITIL programs are a bit over the top for agile projects.

As an example, at the team level, the daily time-boxed stand-up meeting is a governance mechanism. This meeting regulates the daily activity and points resources to problem areas where matters need resolution. And another: once a requirements set drawn from the backlog has been assigned to a team and to a sprint, that requirements set isn't allowed to change. This practice regulates the requirement drivers for the sprint and stabilizes the scope for the duration of the sprint.

When designing a governance program, two principles guide an effective implementation:


--Every process step is lean, meaning it is as efficient of resources as possible and it is as conservative of inventory as practical, inventory being the decisions to be made.

--Each step or activity adds some value to the outcome for both those that govern and those that are governed. Adding value means bringing the decision elements into focus and then deciding.


Consider a couple of points: first, governance can be the vehicle to lift the hand of top-down supervision by conveying the right to make the call to those close to the issues; second, good governance can be the reward system for disciplined behavior, for with discipline, trust follows.

Trust is an absolutely necessary condition for lean governance. Without faith in the actions of others, many resources and steps will be added to check and inspect. And think about this: the best leaders are trusted by their followers, and the best teams are trusted by their leaders. Leadership inspires inquisitive minds to question the status quo, to search for a better way, and motivates action to turn opportunity into business and organizational value.

A trusting organization is a 'safe' organization. Safety to go about doing really innovative and interesting tasks is a real benefit of trustfulness and discipline. The concept of a safe organization is an important one in the so-called human-powered methodologies, the Crystal methods. Alistair Cockburn has written much about safe organizations; see the link to his material in the link list on this blog site. In the agile space, 'safety' translates into more generous boundaries, and this generosity often provides the stimulus to find challenging and competitive returns from opportunity.

Although governance can be the containment mechanism for downside risks, and is very often used just to constrain actions--even going so far as to put risk management somewhat on auto-pilot by means of workflow--having a governing framework in place and operating does put a tool in the hands of the project manager. This tool, skillfully employed, can be the process to unleash actions to find new and better means to accomplish goals, provide some degree of risk management, and bring orderly and timely decisions to bear on priorities and alternatives.

I've encapsulated some ideas for a governance program in the slideshow below. Take a look.
