
Monday, September 28, 2009

Agile Earned Value--Part I

There's a common misconception that agile and earned value don't mix: that earned value is too heavy and anti-agile. Not so! Agile projects earn value by delivering product incrementally, periodically, affordably, and according to the customer's priorities.


If the customer is not satisfied, he may not want to pay for our efforts. If the customer is not successful, he may not be able to pay. If he is not more successful than he already was, why should he pay?
Niels Malotaux
EVO Consulting Project Manager


The agile rule for earned value:
Each release earns value; without a go-live to production, there is no earned value.

In project terms, earned value means planning for an outcome and then achieving it, applying only the intended resources. When the project is completed and expectations are met, the entire value is earned: all requirements are delivered to production and translated into benefits.
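The agile bookkeeping can be sketched in a few lines. This is an illustrative example with hypothetical release names and dollar figures, not a formula from any earned-value standard; it simply encodes the rule that value is earned only when a release goes live to production.

```python
# Hypothetical agile earned-value bookkeeping: value is booked only
# when a release goes live to production.
releases = [
    {"name": "Release 1", "planned_value": 100_000, "live": True},
    {"name": "Release 2", "planned_value": 150_000, "live": True},
    {"name": "Release 3", "planned_value": 250_000, "live": False},  # still in development
]

earned_value = sum(r["planned_value"] for r in releases if r["live"])
total_planned = sum(r["planned_value"] for r in releases)

print(f"Earned value to date: ${earned_value:,}")                     # $250,000
print(f"Percent of plan earned: {earned_value / total_planned:.0%}")  # 50%
```

Two releases in production earn their full planned value; the third, however far along in development, earns nothing until it ships.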


Agile methods change the bookkeeping a bit, but by Agile Principle 1, delivering value is at the top of the list:

Our highest priority is to satisfy the customer
through early and continuous delivery of valuable software.


Are you on LinkedIn? Share this article with your network by clicking on the link.






Thursday, September 24, 2009

W. Edwards Deming popularizes PDCA

W. Edwards Deming and the PDCA cycle
Deming--working in Japan and elsewhere in the mid-20th century--introduced very practical ideas of process control as a means to limit variation in product quality. Today, this is called defined process control.

Deming was a product guy; he came at quality from the point of view of the product: make the product the same way each time and make it work within limits that are acceptable to the customer. The modern poster child for defined process control is Six Sigma.


Six Sigma
  • Six Sigma is a problem solving methodology and defect control strategy with the purpose of identifying and mitigating error sources in defined process control.
  • The control limits are established such that production yields less than approximately 3.4 errors in one million opportunities either above or below the control limits. This figure is derived from the error possibilities within six standard deviations of a bell-shaped curve, after allowing 1.5 standard deviations drift of the long-term average defect rate.
  • The method takes its name from the lowercase Greek letter sigma, denoted σ; σ is the symbol statisticians use for the standard deviation of a probability density function such as the bell curve.
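The 3.4-per-million figure in the bullets above can be checked directly. This sketch assumes a normally distributed process: six sigma of headroom to the specification limit, minus the allowed 1.5-sigma drift of the long-term mean, leaves 4.5 sigma, and the upper tail beyond 4.5 sigma is about 3.4 in a million.

```python
import math

def upper_tail(z):
    """P(Z > z) for a standard normal variable, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# Six sigma of design margin, less the allowed 1.5-sigma drift of the
# long-term mean, leaves 4.5 sigma of headroom on the near side.
defects_per_million = upper_tail(6.0 - 1.5) * 1_000_000
print(round(defects_per_million, 1))  # ~3.4
```

This treats defects on one side of the distribution only, which is the convention behind the published Six Sigma figure.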

Agile Thinking
Ken Schwaber—a leading Scrum methodologist—objects to defined process control, putting it this way: “[defined process control] is based on processes that work only because their degree of imprecision is acceptable… When defined process control cannot be achieved because of the complexity of the intermediate activities; something called empirical process control has to be employed.”

In Schwaber’s view, software is too complex to expect defects to be contained within predefined error limits. Empirical control is the answer; empirical control is derived from observed facts, adapted to the situation, and not determined by preplanned limits from previous projects.

Deming contributes PDCA to agile projects
Software projects offer little opportunity for statistical process control in the Six Sigma and Deming way. Even so, from the perspective of project management and agile methodologies, perhaps Deming's most noteworthy contribution is the famous plan-do-check-act cycle, which he originally adopted from Walter Shewhart. Plan-do-check-act (PDCA) envisions planning for what is to be done, then doing it—that is the plan-do. Next, measure results—measuring is the check activity—and then act on the measurement results. To act in the PDCA sense means to reflect on lessons learned and feed corrective actions into the next iteration of the plan.
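The cycle described above can be sketched as an iteration loop. This is a toy illustration, not any framework's API; the function names and the sample plan/do/check/act callables are all hypothetical.

```python
def pdca(backlog, iterations, plan, do, check, act):
    """Run the plan-do-check-act cycle; lessons from each pass feed the next plan."""
    lessons = []
    for _ in range(iterations):
        planned = plan(backlog, lessons)   # Plan: select and scope the work
        result = do(planned)               # Do: build, test, integrate
        metrics = check(result)            # Check: measure what happened
        lessons = act(metrics)             # Act: turn measurement into lessons learned
    return lessons

# Toy usage: each pass completes the stories and records a lesson.
lessons = pdca(
    backlog=["story-1", "story-2"],
    iterations=3,
    plan=lambda b, l: {"stories": b, "carried_lessons": len(l)},
    do=lambda p: {"done": p["stories"]},
    check=lambda r: {"defects": 0, "done": r["done"]},
    act=lambda m: [f"{len(m['done'])} stories done, {m['defects']} defects"],
)
print(lessons)  # ['2 stories done, 0 defects']
```

The essential point is the feedback edge: the output of act is an input to the next plan, which is exactly the continuous-improvement loop agile iterations implement.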


Walter A. Shewhart
Deming was influenced by the work of the process statistician Walter A. Shewhart, who is credited with identifying that process variation has two sources: assignable cause and chance cause.
The former is systemic, and can be corrected and held to an economical minimum; the latter occurs randomly in frequency and intensity, is not always present in the process, and is mitigated by establishing performance limits for a given process.
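Shewhart's distinction is the basis of the control chart. A minimal sketch, with hypothetical measurements: chance-cause variation stays inside limits set at the mean plus or minus three standard deviations, while a point outside those limits flags a likely assignable cause worth investigating.

```python
import statistics

# Hypothetical process measurements; the last one is an outlier.
samples = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 13.5]

baseline = samples[:-1]                      # establish limits from stable history
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma  # upper/lower control limits

out_of_control = [x for x in samples if not lcl <= x <= ucl]
print(out_of_control)  # [13.5] -- investigate for an assignable cause
```

Points inside the limits are attributed to chance cause and left alone; chasing them individually (over-adjustment) only adds variation.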


Edwards Deming's impact on agile projects
A project management tip: Deming introduced the PDCA cycle, which is wholly embraced by the EVO method.
• The cycle really applies to all agile iterations. The plan-do is equivalent to the planning session followed by development, test, and integration.
• Especially relevant is the check-act that provides measurement and feedback for continuous improvement.
• Deming focused on eliminating unsatisfactory results before they reached the customer. In agile parlance, every object must pass its unit, functional, and system tests.


Wednesday, September 23, 2009

Test Driven Development -- TDD

New to the agile scene? Curious about XP -- Extreme Programming? One practice born in XP and now widely adopted across the agile community is Test-Driven Development, TDD.


The main idea
Here are the main ideas of TDD, and they are a mind-bender for the traditionalist: requirements are documented in the form of test scripts, and test scripts are the starting point for product design.


TDD works this way: after architecture is considered, detailed design and development come down to three big steps:

Step 1: document development requirements with test scripts. Beginning with functional requirements in the iteration backlog, and in collaboration with users, the developer writes technical design requirements in the form of a test script, written to run with test-automation tools. Automated tools enable quick turnaround: automated tests run fast, run on demand, run the same way every time, and run under various data and system conditions.


Step 2: run the test, modifying the object design until it passes. If the test fails, as it often will, the developer implements the quickest and simplest solution that will likely pass. Step 2 is repeated until the test passes.


Step 3: refine the detailed design of the object. After the solution passes the test, the developer iterates the detailed design until it complies with quality standards. Only the internal detail is changed and improved; no change is made to the object’s external characteristics. To verify that external behavior and functionality are unchanged, the modified solution is retested.
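The three steps above can be made concrete with a small sketch. The requirement (a shipping-fee rule) and all names here are hypothetical, and the example uses Python's standard unittest module rather than any particular shop's tooling.

```python
import unittest

# Step 1: the requirement, captured as an automated test script -- written
# before any implementation exists.
class TestShippingFee(unittest.TestCase):
    def test_orders_over_fifty_ship_free(self):
        self.assertEqual(shipping_fee(order_total=60.00), 0.00)

    def test_small_orders_pay_flat_fee(self):
        self.assertEqual(shipping_fee(order_total=20.00), 4.95)

# Step 2: the quickest, simplest solution that passes, iterated until green.
def shipping_fee(order_total):
    return 0.00 if order_total > 50.00 else 4.95

if __name__ == "__main__":
    unittest.main()  # Step 3 would refactor internals, then rerun these tests
```

Note the order: the test class is the design requirement, the function is the response to it, and Step 3's refactoring is safe precisely because the tests rerun unchanged.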


A project management tip: TDD, unit tests, and acceptance tests
  • TDD was invented as a design practice; it is commonly applied to the lowest level design units
  • “TDD's origins were a desire to get strong automatic regression testing that supported evolutionary design.”4
  • Unit tests, distinct from TDD tests, are post-implementation design verification tests. In most respects, unit tests and TDD tests are very similar, but their purposes differ: one drives design and the other verifies implementation.
  • The concept of automated tests is also applicable to higher-level tests, such as product integration tests and user acceptance tests, but these are not for the purpose of driving design.
  • At higher levels, designs that pass unit tests are being tested in a larger context and by independent testers, including end users.



Tuesday, September 22, 2009

Governance in the Agile space

We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten. Don't let yourself be lulled into inaction.
Bill Gates


Governance and agile methods may seem like an oxymoron—but not so. A means to govern is essential for orderly project functioning. Without governance, the advantages of adaptive and evolutionary methods could be overwhelmed by functions bolted together haphazardly and rendered operationally ineffective, expensive to maintain, and disadvantageous to customers and stakeholders.


Governance 'should's'
  • A governance program should be purposeful about maximizing the business potential of a project.
  • Governance should be dedicated toward minimizing the risks to business performance.
  • A governance program should enable and promote innovative and imaginative solutions, and
  • Governance should deter behavior that strays too far from norms.
In short, a governance program exists for five reasons that are in effect the governance mission statement:

Governance Mission
1. To oversee and approve investment on behalf of business beneficiaries;
2. To codify decision-making rights, making it possible for teams to have autonomy and freedom of maneuver;
3. To enable and promote innovation, evolution, and technical excellence within the framework of architecture and operating norms;
4. To be the ultimate arbiter of risks that affect business performance and accountability; and
5. To provide accountability for compliance to mandatory standards.
Governance is built on quality principles.

Four principles guide an effective governance implementation:

Governance Principles
1. Governance should be applied proportionately to the amount at stake.
2. Governance should provide clarity for mission and purpose, scope boundaries, decision-making authority, and decision rights.
3. Governance should respect the principle of subsidiarity: governance should not intrude into the management of functions that are best left to functional and project managers.
4. Governance must be lean, timely, and responsive, respecting agile principles to provide enough, but ‘just enough’, oversight and control to accomplish the governance mission.


A project management tip: Decision policy for the project manager
• The simplest policy for decision making is... always make a best-value decision based on the collective value of the risk-weighted factors
• When deciding among alternatives, pick the alternative that most favorably benefits the business, even if the result is suboptimal for the project
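The best-value rule in the tip above can be sketched as a weighted score. The factor names, weights, and scores here are hypothetical, and all scores are on a "higher is better" scale (so a high risk score means favorable risk, not more risk).

```python
# Hypothetical risk-weighted decision factors; weights sum to 1.
factors = {"business_value": 0.5, "risk": 0.3, "cost": 0.2}

# Each alternative scored 1-10 on each factor, higher is better.
alternatives = {
    "A": {"business_value": 8, "risk": 6, "cost": 7},
    "B": {"business_value": 9, "risk": 4, "cost": 6},
}

def best_value(alternatives, factors):
    """Return the alternative with the highest risk-weighted collective score."""
    score = lambda alt: sum(weight * alt[f] for f, weight in factors.items())
    return max(alternatives, key=lambda name: score(alternatives[name]))

print(best_value(alternatives, factors))  # A
```

Here B wins on raw business value, but A's better risk and cost profile gives it the higher weighted score: 7.2 versus 6.9.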



Wednesday, September 2, 2009

Agile Benefits and DCF

In my musings on agile, I assert that one advantage is that benefits come sooner and are therefore less susceptible to the risks of future uncertainties, since the future in question is more near term.

The finance guys have a term for this: discounted cash flow, DCF.

So what is DCF and how does it work? In a word, the idea of a 'discount' is to value a future benefit less than the same benefit in the present -- the financial equivalent of a bird in the hand versus a bird in the bush.

The future is where uncertainties lurk. It's not just a deflated dollar; it's also market uncertainty, the vagaries of customer delight, and competitive effects.
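The discounting idea reduces to one standard formula: present value PV = FV / (1 + r)^n, where FV is the future benefit, r the annual discount rate, and n the years until the benefit arrives. A short sketch with hypothetical numbers shows why earlier agile releases are worth more:

```python
# Present value of a future benefit: PV = FV / (1 + r) ** n
def present_value(future_benefit, rate, years):
    return future_benefit / (1 + rate) ** years

# The same $100,000 benefit, discounted at 10% per year: the sooner a
# release delivers it, the more it is worth today.
for years in (1, 2, 5):
    print(years, round(present_value(100_000, 0.10, years), 2))
```

At a 10% rate, the benefit is worth about $90,909 if delivered in one year but only about $62,092 if delivered in five -- the quantitative case for incremental delivery.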

Below is a Slideshare presentation that gives you some pictures of how this works: