Be always at war with your vices, at peace with your neighbors, and let each new year find you a better man. ~Benjamin Franklin
Friday, December 31, 2010
It's New Year's Eve!
Labels:
Quotations
Wednesday, December 29, 2010
The Closer!
Michael Young has a posting on PMHut about closing the project. It's aggressively titled "A complete guide to closing projects".
Except I don't think it's complete, or complete enough.
It's a good discussion as far as it goes, but here's what he says about archiving data:
Following delivery of the Post Implementation Review Report, the project database is archived. Building a repository of past projects serves as both a reference source and as a training tool for project managers. Project archives can be used when estimating projects and in developing metrics on probable productivity of future teams.
Now, that's necessary, but not sufficient. What you really have to do is update the enterprise estimating models, not just archive the project. Maintaining models is really the only way to improve estimating. If I say that a "hard" specification requires "X" hours with "Y" skills, the credibility of X and Y are on the line in proposals and in execution.
Too many organizations "build a repository of past projects" without putting the effort into mining the information in that repository to refine a model of how the organization really works.
And of course, the outliers have to be dealt with, either in the footnotes or in the distribution of possible outcomes. After all, X and Y should be expected values if they are single numbers; if not, they should be ranges, or better yet, percentile rankings. To wit: a hard specification requires X hours at the 95th percentile. Now we have something we can work with.
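To make the percentile idea concrete, here's a minimal sketch of how archived actuals could back up a statement like "X hours at the 95th percentile." The task history is invented for illustration; your repository supplies the real numbers.

# Minimal sketch: turning archived actuals into a percentile-based estimate.
# The task history below is invented; swap in data mined from your own repository.
import numpy as np

# Actual hours recorded for past "hard" specifications (illustrative numbers)
actual_hours = np.array([120, 95, 140, 160, 110, 180, 130, 150, 125, 210])

average = actual_hours.mean()           # plain average of the history
p95 = np.percentile(actual_hours, 95)   # "X hours at the 95th percentile"

print(f"Average of history: {average:.0f} hours")
print(f"95th percentile:    {p95:.0f} hours")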
My advice: don't just close; be a Closer!
Labels:
close,
Project Management
Monday, December 27, 2010
Schedule heresy
Here's a little heresy on schedules, just before the holiday break:
I don't like, and don't recommend, MSProject and similar tools for managing a project!
Why?
Plan v Manage
There's too much administration, and there are too many faux assumptions, in managing dependencies day-to-day for it to be an effective management tool, especially for smaller projects. There's always a mad scramble, and a lot of time and effort taken up evaluating ad-hoc task-level interactions, most of which can be worked out by other means.
On the other hand, it's a great planning tool to get a project started, including a consideration for dependencies, resource conflicts, and other artifacts. As a planning tool, so long as dependencies are restricted to finish-to-start, and there are no fixed dates, it's a good tool to host Monte Carlo simulations. It's just that once a project is under way, milestone charts, gated criteria, and earned value spreadsheets are better tools on account of their efficiency, even on large projects.
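To illustrate the planning use, here's a minimal Monte Carlo sketch for a plan restricted to a finish-to-start chain with no fixed dates. The three-point durations are invented; commercial simulation add-ins do the equivalent against a full plan.

# Minimal Monte Carlo sketch: a finish-to-start chain of tasks with no fixed dates.
# Each task gets a triangular (low, most likely, high) duration -- invented numbers.
import random

tasks = [(8, 10, 15), (4, 5, 9), (12, 15, 25)]   # (low, most likely, high) in days
trials = 10_000

finishes = []
for _ in range(trials):
    finish = 0.0
    for low, mode, high in tasks:
        finish += random.triangular(low, high, mode)   # finish-to-start: durations simply add
    finishes.append(finish)

finishes.sort()
print("Mean finish:", round(sum(finishes) / trials, 1), "days")
print("80th percentile finish:", round(finishes[int(0.80 * trials)], 1), "days")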
Walk the talk
I finished an engagement a couple of years ago that went several years, consumed multiple hundreds of millions of dollars, involved hundreds of project staff, and had four blocks of deliveries over two years.
And, we never had a task-level project schedule.
What we had were swim lane milestones and major dependencies identified between swim lanes. Within the lanes, there were one or more teams and planned interteam dependencies, each team with their milestone schedules, and within teams, there were small working groups, also with milestone schedules.
We set up pipelines for sequential delivery, and gates to frame the pipelines. We managed the pipelines with Excel. [Look for more on pipelines in a future post]
We also did resource leveling with Excel.... that's a good thing! The algorithms in schedule tools can return some silly answers. I always marvel at Excel--it's truly amazing what you can do quantitatively with that tool!
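For the curious, here's a minimal sketch [in Python rather than Excel] of the kind of leveling rule that fits comfortably in a spreadsheet: slide each task to the right until its demand fits under capacity. The task data and capacity are invented.

# Minimal resource-leveling sketch: delay each task until its demand fits under capacity.
# Task list and capacity are invented for illustration.
tasks = [  # (name, earliest_start_week, duration_weeks, people_needed)
    ("Design", 0, 3, 2),
    ("Build",  1, 4, 3),
    ("Test",   2, 2, 2),
]
CAPACITY = 4      # people available per week
usage = {}        # week -> people already committed

def fits(start, duration, demand):
    return all(usage.get(w, 0) + demand <= CAPACITY for w in range(start, start + duration))

for name, earliest, duration, demand in tasks:
    start = earliest
    while not fits(start, duration, demand):   # slide the task right until it fits
        start += 1
    for w in range(start, start + duration):
        usage[w] = usage.get(w, 0) + demand
    print(f"{name}: weeks {start} through {start + duration - 1}")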
We successfully delivered business value. The first block was late--and that was a value bummer--but the next three hit their milestones on time. We did not successfully earn the intended project EV package of cost-performance-schedule. It was an ERP project [Oracle business systems] for a multi-billion$ enterprise. Re EV: ERP says it all!
In any event: Plan with MSProject [or Primavera, or other similar tools], but manage with milestones, gates, and EV.
Saturday, December 25, 2010
Happy Holidays
From all the folks that make "Musings" possible:
Photo credit: http://dreamscansing.blogspot.com/2010_09_01_archive.html
Labels:
Quotations
Thursday, December 23, 2010
The Theory of the Mosaic
And so we have the "Theory of the Mosaic", which, roughly told, goes like this:
From many disparate and small bits, hiding in public and knowable to those who seek, comes a revelation of the larger idea and greater knowledge
This is much more than just seeing the forest in the presence of trees; this is finding the trees in the first place, and then placing them in context, in juxtaposition, and in relationship so that not only the forest, but all the attributes and nuances of the forest are revealed.
Project management?
You betcha!
The proposal project
Let's begin with the competition to win new business. Bidding competitively is a project in itself; it's only after you win that the execution project begins.
One of the tools used by practitioners of the Theory of the Mosaic is the "expert network". These are the relationships that extend in myriad directions and trade "bits" in the marketplace of knowledge [sometimes rumor or conjecture]. In competition, the network extends not only to the potential customer, but to the customer's customer, regulator, appropriator, and suppliers. It even extends to the direct competition. How many of us have sat down with our direct competitor for a chat about the 'opportunity'?
Assembling all this information [in many cases, just bits of data, not even information] into a narrative that can be then transformed--through the proposal process--into a WBS, cost, schedule, and performance promise for a winning offer is no small task and requires all the discipline and commitment to an objective that is the mark of successful project management.
Of course, one of the tenets of the Theory is that information is hiding in public. Expert networks that ferret out this public information are the secret to success. Usually, there is a bright line--that is, no peek at proprietary competitive information that is unethical or illegal is permitted--but often the line gets blurred, the rules change [sometimes in mid-stream], or international ambiguity [read: culture] mislevels the playing field.
Guarding against such corruption is no small matter. Just refer to the infamous USAF refueling tanker competition for many "don't do this" examples.
The execution project
No project of any scale operates without interpersonal relationships, a dollop of politics, and the obscure actions of many people working on their part. Enter: "expert networks". The project manager for sure, work stream managers, and cost account managers all 'work their networks': up and out to the sponsors, and down and in to the worker-bees.
Assembling the mosaic
For those experienced in brainstorming, assembling a mosaic from an expert network is really no different.
First, the science:
Like items are grouped
Relationships are labeled
Sequencing is labeled
Small items are grouped under bigger items
Gaps are identified; gap filler plans are formulated
Narrative headlines are written
Then, the art:
Interpretations are made; the 'big picture' emerges from the 'pixels'
Importance and urgency are weighed
Actionable 'intelligence' is separated for execution plans
Finally: act on the intelligence!
Labels:
marketing,
Project Management
Wednesday, December 22, 2010
Quotation for project managers
A word from Henry Ford:
"Quality means doing it right when no one is looking".
Thanks to Luis Coehlo at "ah-ha-moments.net" for this bit of wisdom.
In later years, this went on to "Quality is Job One", but then Ford lost the recipe. Now, in a resurgence of Henry's guidance, Ford is regaining the high ground, in part because of a commitment to quality, and part because of another piece of ageless advice:
Keep it simple, stupid!
Of course, simple and complex are not two sides of the same coin. The simplest idea that gets the job done can still be quite complex. My definition of simple is that it's the least complexity that is functionally complete and meets 'quality' measures in the large sense of the word: fitness to form, function, effectiveness, efficiency, availability, etc.
In the December 9th 2010 print edition of "The Economist", there is a great interview--"Epiphany in Detroit"--with CEO Alan Mulally on the quality and management turn-around at Ford. It's certainly no secret that many of the things that made Boeing a great aircraft innovator, developer, and production house are being applied at Ford.
There's a lesson to be read about in this interview for all program managers tagged with turning around a project with quality problems.
It's no secret that the first thing Mulally did to get on top of quality was insist on candid discussion from his functional managers and to instill a culture in which it was safe to raise a problem. The second thing he did was tune in to what outside objective evaluators have to say; again, he changed the culture from defense to offense.
So, communicating was a big stick. There's no doubt that "communication in a commonly understood language is the key that unlocks the culture". [This I paraphrase from the US Ambassador to China, from a recent interview on Charlie Rose]
Labels:
complexity,
Quality,
Quotations
Tuesday, December 21, 2010
Risk informed decision making
At NASA's risk management homepage you can navigate to a paper on risk informed decision making. As defined there:
Risk-informed decision making, as described in this paper, is the formal process of analyzing various decision alternatives with respect to their impact on the PMs, of assessing uncertainty associated with their degree of impact, and of selecting the optimal decision alternative using formal decision theory and taking into consideration program constraints, stakeholder expectations, and the magnitude of uncertainties.
Probably the most useful idea in this paper is that risk-informed is different from risk-based. The former takes risk into consideration; the latter adjusts all values for risk and makes a utility decision.
In effect, although some quantitative models are introduced and suggested, the main idea is that risk informs decision making but there are other factors that may intervene and override. Just common sense, really.
Nevertheless, the paper proposes three big steps that are useful to review:
1. Formulation and Selection of Decision Alternatives -- In this step, the decision alternatives are generated by quantitative and qualitative analyses, past experience, and engineering judgment. Unacceptable alternatives are removed after deliberation.
2. Analysis and Ranking of Decision Alternatives -- In this step, the screened alternatives are ranked; a minimal ranking sketch follows this list.
3. Actual Decision making -- The final decision can be made only after a deliberation takes place (that is, we are describing a risk-informed rather than risk-based process). Deliberation is necessary because there may be aspects of the particular decision that cannot be considered in a formal way.
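As an illustration of step 2 only, here's a minimal ranking sketch. I'm reading the paper's "PMs" as performance measures; the alternatives, scores, and weights are invented, and step 3's deliberation still follows--the top score is an input to the decision, not the decision itself.

# Minimal sketch of step 2: rank screened alternatives against weighted performance measures.
# Alternatives, scores, and weights are invented for illustration.
alternatives = {
    #            cost  schedule  technical   (0-10, higher is better)
    "Option A": (7,    6,        8),
    "Option B": (9,    5,        6),
    "Option C": (6,    8,        7),
}
weights = (0.4, 0.3, 0.3)   # stakeholder-agreed importance of each performance measure

def weighted_score(scores):
    return sum(w * s for w, s in zip(weights, scores))

ranked = sorted(alternatives.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores):.1f}")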
Labels:
decision,
risk decision,
Risk Management
Sunday, December 19, 2010
Top 50 Industrial Engineering and Project Management Blogs
This is not my list, but we here at "Musings" made the list of 50 Industrial Engineering and Project Management blogs, actually placing 10th in the project management list, as compiled by Masters of Engineering.com
And, I am happy to report that we are in good company with many blogs listed that we follow here.
Of course, in a list as long as 50 there's always something to discover.
One for me was under the category of Operations Research Blogs: arandomforest.com. One post on that site caught my eye: published this past September, it is entitled "The Flaw of Averages and why everything is late" and refers to a similarly titled book, "The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty" by Sam Savage.
Savage talks about the 7 Deadly Sins of Averages in his paper published in OR/MS Today, from the Institute for Operations Research and the Management Sciences.
The Seven Deadly Sins of Averaging
1. The Family with 1 1/2 Children
2. Why Everything is Behind Schedule
3. The Egg Basket
4. The Risk of Ranking
5. Ignoring Restrictions
6. Ignoring Optionality
7. The Double Whammy
Of course for the risk astute project manager, "expected value" is the statistic of choice, not an arithmetic average. Expected value is a risk adjusted average of all the possibilities that go into an estimate. Thus, it is a richer piece of information, incorporating more of the information in the distribution of possibilities than just an average. Nevertheless, it does no harm to understand Savage's 7 points--just don't throw away useful information by not understanding expected value as well.
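To make Savage's scheduling sin concrete, here's a minimal simulation [numbers invented] of why a plan built on average task durations understates the average project duration when tasks run in parallel:

# Minimal "flaw of averages" sketch: two parallel tasks, each averaging 10 days.
# The project finishes when BOTH finish, so the average project duration exceeds
# 10 days even though each task averages exactly 10. Numbers are invented.
import random

trials = 100_000
total = 0.0
for _ in range(trials):
    a = random.uniform(5, 15)     # task A duration, mean 10 days
    b = random.uniform(5, 15)     # task B duration, mean 10 days
    total += max(a, b)            # project is done when the later task finishes

print("Average of each task:     10.0 days")
print(f"Average project duration: {total / trials:.1f} days")   # roughly 11.7 days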
Friday, December 17, 2010
More about process improvement project risks
In "Process Part I", I put it to you that business' are run from a vertical--that is, functional--perspective but most process improvement projects attempt to improve the business by improving cross-functional--that is, horizontal--process performance.
Trust and verify
The issue for project managers is to convince stakeholders--most of whom have a vested interest in the functional metrics oriented vertically--to trust process metrics that are 'invented' or put in place by the project outputs. Without trust, there will be no meaningful outcomes to justify the effort to develop the outputs.
Schema?
One issue is that, to support process metrics, business data has to be reorganized. Data gathered functionally--that is, vertically--during the normal course of business activity has to be reported horizontally. That requires reorganizing the data schema. The 'get it in' schema is often too inefficient and ineffective to support the 'report it out' needs of the process.
Enter the data warehouse:
One use of a data warehouse is to store the vertical data from the P&L database in a horizontal form so it can be read out in a process dimension. However, what appears simple on paper--changing the view of the data--is not simple in practice.
Just the facts!
One principle of system engineering of which project managers are well aware is that "view" doesn't change the underlying facts. One example familiar to project managers is the WBS: the sum of the horizontals [which is one 'view'] equals the sum of the verticals [which is another view].
The same applies to business data. In the case of a DW, the horizontal sums should reconcile with what the P&L database reports vertically.
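Here's a minimal sketch of that reconciliation check, with invented records and pandas standing in for the DW query layer: the same cost facts summed by function [the vertical view] and by process [the horizontal view] must give the same total.

# Minimal sketch: "view" doesn't change the facts -- vertical and horizontal sums must match.
# Records, column names, and categories are invented for illustration.
import pandas as pd

records = pd.DataFrame({
    "function": ["Sales", "Sales", "Ops",   "Ops",     "Finance"],
    "process":  ["Order", "Quote", "Order", "Fulfill", "Order"],
    "cost":     [100,      40,      250,     300,       60],
})

vertical   = records.groupby("function")["cost"].sum()   # the P&L view
horizontal = records.groupby("process")["cost"].sum()    # the process (DW) view

assert vertical.sum() == horizontal.sum() == records["cost"].sum()
print(vertical, horizontal, sep="\n\n")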
DW project risks
The risk arises in the validation task. The P&L data is 'certified' and 'validated'; the horizontal process view is not. Therein is the risk: data transformations, data timing, and query logic all bear upon results.
Project managers are well advised to take this risk seriously.
It took one of my projects about a year to validate the DW so that it would add up to the P&L reliably. It's not just a matter of arithmetic. It takes time, and repeated success, to obtain the trust of stakeholders whose livelihood may depend on the results.
Labels:
process,
Risk Management
Wednesday, December 15, 2010
Acquisition reform
Last month, I had a post on EVM reform. Now, that is actually set in the context of acquisition reform in the DoD, a Secretary Gates initiative.
In the October 2010 issue of "Air Force Magazine", there is a description of the five reforms the Air Force is putting in place to prevent debacles like the solicitation for the replacement airborne refueling tanker--twice canceled, and only this month in some more hot water--and some other big ticket programs in trouble, most notably the F-35 fighter aircraft.
Here is what the Air Force is calling reform:
1. Add 7,000 uniformed and civilian procurement specialists, all being new hires, and most being interns with no experience. Get 'em young, I guess, is the idea. About half of these folks are already on-board. [Note: within all of DoD, Gates is planning on 20,000 new hires for acquisition....that's a whole agency in most places!]
2. Resist change and requirement volatility by elevating to executive level the approval needed to make a change. The Air Force plans to "....insinuate acquisition pros into the requirements process" early, and then block up requirements such that IOC is at an 80% level with 'block 1'. Haven't we been doing that for decades? How is this a reform? Perhaps we should try the flip side: provoke change while there is still time to deal with it, and be ever open to common sense.
3. Stabilize the budget. A noble objective to be sure, but good luck with that one in the political climate we'll have for the next generation.
4. Improve the quality of the source selection. Again, a good idea always to work on process improvement. In a somewhat shocking statement, the article says that the "....Air Force will execute the source selection exactly like we said we would" in the source selection rules. What a concept!
5. Align authority with responsibility, the bane of all large command and control bureaucracies. The first step is to increase the number of PEOs from 6 to 17 to allow a better spread of command. I hope it works, but we have been working on the A&R problem for 50 years, at least.
Labels:
Project Management
Monday, December 13, 2010
Quotation for project managers
Be eternally suspicious, take nothing for granted, investigate everything. Program success is obtained only by enormous attention to detail everywhere
Quoted on HerdingCats
Labels:
Project Management,
Quotations
Saturday, December 11, 2010
Microtasking
Microtasking. Is microtasking something that is coming to a project near you? Should you be the first on your block to do it?
In a recent article, microtasking was explained as subdividing a task into tasks so small that they could be executed in a few seconds, perhaps a minute.
What kind of tasks are these? Mostly repetitive tasks. One task is filling in blanks on a form; another is transcribing sentences in a document; another could be loading a database.
Is this practical? Well, there are a couple of companies providing microtasking as a service, and other companies building tools, primarily web-based tools, to make it possible for others to do it for themselves, or contract to have it done.
Could it work in a project? Is it the ultimate Agile, or the ultimate 'federalism' of project management? I don't know. I'm not aware that it's been tried.
But like social networking, cloud computing, virtual teams, and a host of other technology driven ideas, microtasking, or perhaps its bigger cousin, millitasking, could be coming to a project near you!
Labels:
Project Management,
schedule
Thursday, December 9, 2010
Grady Booch on DoD software management
Back to this month's "Crosstalk", the issue dedicated to architecture. This issue also has an interview with Grady Booch, now with IBM but probably best known as one of the three main authors of the Unified Modeling Language, UML. [Historical note: IBM bought Rational a few years ago, and Booch was a key guy at Rational inventing UML]
Booch gave his opinion about the DoD software community:
It really used to be, decades ago, that the DoD was leading the marketplace in the delivery of software-intensive systems. The harsh reality is that the commercial sector is leading best practices and really pushing the arc relative to software engineering and software development. So, in that regard, the DoD is behind the times. That is not to say that they are not pushing the limits in some areas. The kind of complexity we see in certain weapons systems far exceeds anything one would see commercially, but ultimately, there are a lot of things that the DoD can learn from the commercial world.
I wonder if the commercial "best practices" he refers to are Agile, or something from the CMMI? I'm not sure I would put Agile under "best practices" for general development, but certainly for cases of evolving and emerging requirements that are not fixed in anyone's mind, Agile is a risk management solution for that dilemma.
And, I certainly wouldn't call Agile "software engineering". Test driven design--TDD--might qualify as an engineering practice, so also refactoring, but most of the rest of Agile is management not engineering.
Those are my opinions. Booch had something different to offer:
Three recommendations for large scale organizations
Booch has three recommendations for DoD software managers and practitioners, but really these apply to any large scale organization doing software intensive systems:
1. Increase leverage of open-source tools and resources. This is not just a cost issue; it's an issue of leverage for innovation and transparency. Specifically, Booch says push 'Forge.mil' along, the services arm of 'SourceForge'
2. Build more elaborate and deployed infrastructure for collaboration. [I think he means: It's not that programs should be managed from Facebook or Twitter....they shouldn't...] There should be more emphasis on leveraging the worldwide expertise of DoD and its contractors.
3. Go beyond functional modeling and get down to modeling the system itself. Make architecture an artifact of the project. To this, I say: Amen!
The interview also raises the DoDAF [DoD architecture framework] as a means to bring more emphasis on, and utility of, architecture to DoD programs. Booch thinks DoDAF is effective for modeling the 'enterprise of the warfighter', but is less effective in modeling software intensive systems.
Perhaps so. However, the DoDAF is certainly in evidence in the battlefield robots now under development and being deployed. We'll see how that works out. Advantage: USA [for now]
Labels:
architecture,
complexity,
crosstalk,
dod
Tuesday, December 7, 2010
Of infamy and innovation
Today is December 7th, "...a date which will live in infamy," said FDR a day later in an address to Congress. The events of December 7th, 1941 ushered the United States into WW II as an armed belligerent. In turn, WW II ushered profound changes into the culture and society of the United States.
The enormous industrialization of WW II all but put defined process control on the map. Decades later, "six sigma" emerged, but only after TPM and other quality movements that had their roots in the projects to arm millions of service men and women.
The scope of WW II projects was unprecedented, leading to the military-industrial complex that defined and codified program management, system engineering, risk management, analog simulation, and a host of other project practices heretofore unknown or undefined.
WW II unleashed innovation as no other world event. The modern research university was empowered. During the war, the laboratories at MIT and CalTech and Stanford were at the forefront of new ideas, inventions, and applications. Since then, a multitude of research universities have been drivers of the innovation explosion in the United States.
Although the war drove atomic science, atomic science drove quantum mechanics, an understanding of the sub-atomic structure. From this we have all manner of semiconductors that have in turn been the underpinning of the information age.
And, let us not forget that WW II empowered 50% of our workforce for the first time. Women entered the workforce in large numbers doing jobs never open to them before. They have never looked back.
And finally, WW II begat the 'GI Bill' that sent millions to college and all but invented the modern middle class, from which yet more innovation, inventiveness, and entrepreneurship has arisen.
It sounds like "...there's nothing like a good war". But that's not the case. The emergency of warfare has always raised the bar. Before the U.S. civil war in the mid 19th century, railroads as a means of tactical support for forces were unheard of; so also electronic messaging...the telegraph in those days. Innovation, as a consequence of great national emergency, is the sidebar that always gets a boost.
Labels:
Innovation
Sunday, December 5, 2010
Crosstalk reviews architecture
"Crosstalk" has a new online website that presents the magazine in a truly online format. I personally like the "flip through the magazine" functionality....a really easy way to see the whole issue quickly.
Architecture
This month's edition is dedicated to architecture, ordinarily the domain of system engineering, but a discipline I firmly believe PM's should embrace as necessary in every project.
Why?
Architecture is the overarching narrative that pulls the whole WBS together. And since the WBS is the object of the schedule, architecture helps to integrate all the project pieces. Architecture is an abstraction of the WBS; it's that level of detail that's usually of interest and important to sponsors, stakeholders, and beneficiaries .... therefore, it's important to PMs.
And, here's the clincher for me: architecture plays directly into risk management. To see why, consider these properties of architecture:
Topology and protocols:
Architecture tells us the topology of the system, product, or process. Topology tells us about hierarchy, interconnectedness, and whether nodes are reached by point-to-point, hub-and-spoke, or some mesh circuitry. In some cases, architecture gives the protocols--that is, the rules--by which elements of the system tie together. Architecture gives form to requirements.
Cohesion, coherence, and coupling:
Cohesion is a measure of "stickiness", the degree to which elements of the project outputs will hang together under stress, work together well in the environment, and not do chaotic or disparate things when stimulated differently. High cohesion is generally good and lowers risk.
Coherence is a measure of sympathetic reinforcement. Coherence gives rise to the adage: "the sum is greater than the parts". High coherence is generally good and lowers risk.
Coupling is a measure of interference or dependency between units, subsystems, and modules. In general good architecture respects natural boundaries; disrespect leads to strong coupling and propagation of errors, stress, and failures. Loose coupling that traps effects before they propagate to other components is generally good, and lowers risk.
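One minimal way to get an early read on coupling is to count fan-out per component from the architecture's dependency map and flag the outliers. The map and threshold below are invented for illustration.

# Minimal sketch: make coupling visible by counting fan-out (outgoing dependencies).
# The dependency map and threshold are invented.
dependencies = {
    "billing":   ["orders", "customers", "ledger", "tax", "audit"],
    "orders":    ["customers", "inventory"],
    "customers": [],
    "inventory": ["audit"],
}

THRESHOLD = 3   # arbitrary flag level for "tightly coupled"
for module, uses in sorted(dependencies.items(), key=lambda kv: -len(kv[1])):
    flag = "  <-- tightly coupled?" if len(uses) > THRESHOLD else ""
    print(f"{module}: fan-out {len(uses)}{flag}")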
Summary: pay attention to architecture!
Labels:
architecture,
crosstalk,
dod
Friday, December 3, 2010
WBS yet again
From time to time, the debate reemerges about the definition of the WBS. And so it happened again last month with a series of exchanges about 'work' vs 'the product of the work'.
This time, the fireworks began with a posting by Mike Clayton, followed by several responses from readers and critics.
In my response to Mike's post, I said:
Hey Mike: On this side of the pond, the WBS originated in the Defense Department, going back into the ’60s at least, as now given in MIL HDBK 881A, now in its umpteenth upgrade and reprinting. PMI is a very late comer to the ‘definition’ game. DoD has always defined the WBS in terms of the product of the work, not the work itself. The 881A definition is: “A product-oriented family tree composed of hardware, software, services, data, and facilities. The family tree results from systems engineering efforts during the acquisition of a defense materiel item. ” You can read all about it at http://www.acq.osd.mil/pm/currentpolicy/wbs/MIL_HDBK-881A/MILHDBK881A/WebHelp3/MILHDBK881A.htm
But really, I think all the controversy can be reduced to one word: "Microsoft".
Microsoft can be blamed for everything. Microsoft begat MSProject, and MSProject captured the market for an inexpensive and easy to use scheduling tool decades ago. Being mostly a database of tables and fields, with some application code written around it, MSProject allows the user to expose certain fields that have a built-in data definition. One of these fields is entitled "WBS".
Verbs, nouns, and narrative
However, schedules are the world of 'verbs': actions that are to be scheduled. The WBS, on the other hand, is the world of the 'nouns', things that are memorials to completed actions.
The project 'narrative' is just the verbs from the schedule put into sentences where the nouns from the WBS are the objects of the verbs [hopefully, everyone recalls sentence diagramming from the 5th grade]
Application smarts
MSProject's application is not smart enough to distinguish between the 'nouns' and the 'verbs'. So, even if you have been diligent by making the summary row a noun with the subordinate rows containing the scheduled verbs, when the WBS column is exposed all records [rows] in the database [schedule] acquire a WBS number in an indentured and sequential order. The numbering is part of the application functionality.
So, naturally there is a confusion between schedule and WBS if you do not give the field [aka 'column' in the application] a user-defined 'title'. I like 'index', as shown in the figure below, but pick your own. Caution: do not rename the 'field name' since the field name is sacrosanct in the database.
Labels:
wbs
Wednesday, December 1, 2010
Prospect Theory
Prospect Theory is an explanation of choosing among alternatives [aka "prospects"] under conditions of risk. Amos Tversky and Daniel Kahneman are credited with the original thinking and coined the term "prospect theory".
Prospect Theory postulates several decision making phenomena, a couple of which were discussed in the first posting. Here are two more:
The Isolation Effect
If there is a common element to both choices in a decision, decision makers often ignore it, isolating the common element from the decision process. For instance, if there is a bonus or incentive tied to outcomes, for which there is a choice of methods, the bonus is ignored in most cases.
Here's another application: a choice may have some common elements that affect the order in which risks are considered; the ordering may isolate a sure-thing, or bury it in a probabilistic choice.
Consider these two figures taken from Tversky and Kahneman's paper. In the first figure, two probabilistic choices are given, and they are independent of each other. The decision is between an expected value of $750 in one choice and $800 in the other. The decision making is straightforward: take the $800.
In the second figure, the choice is a two-step process. In the first step, the $3000 is given as a certainty, with a choice to take the other path that has an EV of $3200. This decision must be made before the consequences are combined with the chance of $0.
The decision outcome [at the square box] is either the sure thing $3000 or the expected value $3200. But there is then a probabilistic activity that weights this decision, such that at the far left chance node the prospect is either ($0, $750) or ($0, $800).
So, the EV of the prospect is the same in both figures. However, the tree in Figure 2 has the 'certainty' advantage over the tree in Figure 1: at its decision node there is a choice available to pick the sure-thing $3000.
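Here's a minimal calculation of the two framings, assuming the standard numbers from the 1979 paper: a 25% chance of reaching the second stage, where the choice is $3000 for sure versus an 80% chance of $4000.

# Figure 1: one-stage prospects, chosen directly
ev_a1 = 0.25 * 3000            # expected value $750
ev_b1 = 0.20 * 4000            # expected value $800

# Figure 2: two-stage framing -- the choice is made before knowing whether stage 2 is reached
ev_a2 = 0.25 * (1.00 * 3000)   # the sure thing inside stage 2 -> $750 overall
ev_b2 = 0.25 * (0.80 * 4000)   # EV $3200 inside stage 2       -> $800 overall

print(ev_a1, ev_b1)   # 750.0 800.0
print(ev_a2, ev_b2)   # 750.0 800.0 -- identical prospects, yet observed choices typically differ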
The Value Function
Quoting Tversky and Kahneman: "An essential feature of the ... theory is that the carriers of value are changes in wealth or welfare, rather than final states. ... Strictly speaking, value should be treated as a function in two arguments: the asset position that serves as reference point, and the magnitude of the change (positive or negative) from that reference point."
The point here is that the authors postulate that every prospect has to be weighted with a factor that represents this value idea. The weightings do not have to sum to 1.0 since they are not probabilities; they are utility assignments of value. Weightings give rise to the apparent violations of rational decision making; they account for overweighting certainty; taking risks to avoid losses and avoiding risks to protect gains; and ignoring small probabilities, among other sins.
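For a feel of the shape of such a value function, here's a minimal sketch. The functional form and parameters [alpha of 0.88, loss aversion of 2.25] come from Tversky and Kahneman's later cumulative prospect theory work and are used here only as an illustration.

# Minimal sketch of a prospect-theory value function over CHANGES from a reference point.
# Parameters are illustrative (alpha = 0.88, loss aversion = 2.25).
def value(change, alpha=0.88, loss_aversion=2.25):
    if change >= 0:
        return change ** alpha
    return -loss_aversion * ((-change) ** alpha)

for change in (1000, -1000, 5000, -5000):
    print(f"change {change:+6d}: value {value(change):9.1f}")
# Losses weigh roughly twice as heavily as equal gains -- the asymmetry behind
# taking risks to avoid losses and avoiding risks to protect gains.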
Labels:
decision,
Risk Management,
utility
Monday, November 29, 2010
Quotation: the mind is an argument
The mind is not a single voice but an argument, a chamber of competing voices, and a [problem] occurs when we listen to the wrong side
Jonah Lehrer
If you don't believe this, read "Against the Gods", "How we decide", or "The irrational economist".
Bookmark this on Delicious
Are you on LinkedIn? Share this article with your network by clicking on the link.
Labels:
decision,
Quotations
Saturday, November 27, 2010
Prospect Theory: Decisions under Risk
Daniel Kahneman and Amos Tversky may be a project manager's best friends when it comes to understanding decision making under conditions of risk.
Of course, they've written a lot of good stuff over the years.....my favorite is "Judgment under Uncertainty: Heuristics and Biases". You can find more about this paper in a posting about the key points at HerdingCats.
The original prospect thinking
Tversky and Kahneman are the original thinkers behind prospect theory. Their 1979 paper in Econometrica is perhaps the best original document; it's entitled "Prospect Theory: An Analysis of Decision under Risk". It's worth a read [about 28 pages] to see how it fits project management.
What's a prospect? What's the theory?
A prospect is an opportunity--or possibility--to gain or lose something, that something usually measured in monetary terms.
Prospect theory addresses decision making when there is a choice between multiple prospects, and you have to choose one.
A prospect can be a probabilistic chance outcome, like the roll of dice, where there is no memory from one roll to the next. Or it can be a probabilistic outcome where there is context and other influences, or it can be a choice to accept a sure thing.
A prospect choice can be between something deterministic and something probabilistic.
The big idea
So, here's the big idea: The theory predicts that for certain common conditions or combinations of choice, there will be violations of rational decision rules.
Rational decision rules are those that say "decide according to the most advantageous expected value [or the expected utility value]". In other words, decide in favor of the maximum advantage [usually money] that is statistically predicted.
Violations driven by bias:
Prospect theory postulates that violations are driven by several biases:
- Fear matters: Decision makers fear losing their current position more than they value an uncertain opportunity for gain; and they fear a sure loss enough to gamble on an uncertain chance to recover [if the gamble can avoid the sure loss]
- % matters: Decision makers assign more value to the "relative change in position" than to the "end state of their position"
- Starting point matters: The so-called "reference point" from which gain or loss is measured is all-important. The reference point can be either the actual present situation or the situation to which the decision maker aspires. Depending on the reference point, the entire decision might be made differently.
- Gain can be a loss: Even if a relative loss is an absolute gain, it affects decision making as though it were a loss
- Small probabilities are ignored: If the probabilities of a gain or a loss are very, very small, they are often ignored in the choice; the choice is made on the opportunity value rather than the expected value.
- Certainty trumps opportunity: In a choice between a certain payoff and a probabilistic payoff, even one that is statistically more generous, the bias is for the certain payoff.
- Sequence matters: Depending upon the order or sequence of a string of choices, even if the statistical outcome is invariant to the sequence, the decision may be made differently.
Quick example
Here's a quick example to get everyone on the same page: the prospect is a choice [a decision] between receiving an amount for certain or taking a chance on receiving a larger amount.
Let's say the amount for certain is $4,500, and the chance is an even bet on getting $10,000 or nothing. The expected value of the bet is $5,000.
In numerous experiments and empirical observations, it's been shown that most people will take the certain payout of $4,500 rather than risking the bet for more.
The Certainty Effect: Tversky and Kahneman call the effect described in the example the "Certainty effect". The probabilistic outcome is underweighted in the decision process; a lesser but certain outcome is given a greater weight.
The Reflection Effect: Now, change the situation from a gain to a loss: In the choice between a certain loss of $4,500 and an even bet on losing $10,000 or nothing, most people will choose the bet, again an expected value violation. In other words, the preference....certain outcome vs probabilistic outcome...is changed by the circumstance of either holding onto what you have, or avoiding a loss.
These two effects are summarized in their words:
....people underweight outcomes that are merely probable in comparison with outcomes that are obtained with certainty. This tendency, called the certainty effect, contributes to risk aversion in choices involving sure gains and to risk seeking in choices involving sure losses.
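For the record, the expected-value arithmetic behind both effects is trivial; the sketch below [Python] computes it for the gain and loss versions of the example. The "observed" choices in the comments are the empirical pattern Tversky and Kahneman report, not anything the expected-value rule produces.

# Expected value says: take the gamble on the gain side, take the sure loss
# on the loss side. Observed behavior is the reverse of both.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

sure_gain   = [(1.0, 4500)]
gamble_gain = [(0.5, 10000), (0.5, 0)]
print(expected_value(sure_gain), expected_value(gamble_gain))    # 4500.0 5000.0
# EV rule: take the gamble.  Observed: most take the sure $4,500 [certainty effect].

sure_loss   = [(1.0, -4500)]
gamble_loss = [(0.5, -10000), (0.5, 0)]
print(expected_value(sure_loss), expected_value(gamble_loss))    # -4500.0 -5000.0
# EV rule: take the sure loss.  Observed: most take the gamble [reflection effect].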
Other Effects: There are two other effects described by prospect theory, but they are for Part II....coming soon!
Labels:
decision,
Risk Management,
utility
Thursday, November 25, 2010
The process guys
Rather than make the trains run on time, it may be more beneficial to do away with trains!
Anonymous
To which I add my own:
Efficiency is a matter of getting the most outcome for the least effort. Effectiveness is getting valuable outcome for every effort applied. Value-add is effectiveness achieved efficiently!
Labels:
process
Tuesday, November 23, 2010
Programs and Projects
Greg Githens has a post on programs and projects with a nice comparison of the mindset and objectives that go with each.
I'll pick on only two points:
- Greg draws a distinction between outputs and outcomes [that's good, and see my earlier posting on this], ....but then he puts 'outputs' in the projects column and 'outcomes' in the programs column. I would put it this way: all successful projects produce outputs that beget outcomes, else there is no benefit stream to offset the investment in the outputs. Without benefits, a project is really not successful at the business level. [See: "New Coke"]
- He says project objectives are tactical and program objectives are strategic: Well, it depends on the project, doesn't it? An ERP project is certainly a tactical achievement, but it's often justified on its contribution to strategic business performance and capabilities, especially capabilities directed to customers.
- Projects regard “risk” as a threat that will undermine performance. Project managers focus on reducing uncertainty. Programs regard “risk” as an opportunity that brings with it threats and obstacles that will be managed. Program managers focus first on managing ambiguity and then on managing uncertainty.
- Projects are typically led by people who have good knowledge of the technology and system. Programs are typically led by people who appreciate the politics and culture as well as the technology. Stated differently, they tend to function more as executives than as technocrats.
Greg cites the NASA space shuttle as an example of a program. NASA agrees: From their website on program management, NASA states:
Program management lies between strategic planning and project management, since a strategic plan proposes programs that the organization will undertake. Each program is made up of several different projects. For instance, the Pioneer program included over a dozen spacecraft, ranging from simple test craft like Pioneer 4 to deep space probes like Pioneer 10, the first spacecraft from Earth to head on an interstellar journey
Sunday, November 21, 2010
Earned value reform
Earned value management [EVM] reform is in the air--actually, it's been in the air for some time with effort in Congress to correct some of the problems, as reported by Glen Alleman and Paul Solomon.
In the Nov/Dec 10 online edition of the magazine Defense AT&L, Paul Solomon reports on initiatives to close three big gaps in the ANSI/EIA-748 standard on earned value, in an article entitled "Earned Value Management Acquisition Reform".
[Note to reader: Paul Solomon is one of the co-authors of ANSI/EIA-748 and maintains a website on performance-based earned value at pb-ev.com]
By Solomon's reckoning, the gaps to be closed are these:
Quality gap: There is no explicit provision to measure quality achievement--or shortcomings--in the formulation of a claim for earned value by a cost account or work package manager.
Technical performance gap: Although all technical projects have some kind of a technical performance objective and most have some sort of time-phased technical performance achievement plan, again the EVM system is not required to take achievement objectives into account explicitly.
Solomon believes that the fact that '748 is work-oriented [work: the schedule] and not also product-oriented [product: the WBS] leaves both quality and TPM--these more generally associated with product than work--in the shadows.
Risk management gap: Solomon says this: "The 32 guidelines in ANSI/EIA-748 fail to address the integration of risk management with EVM". Among others, the standard provides no guidance for risk-adjusting EVM's linear equations used to calculate forecasts.
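For reference, the linear equations at issue are the familiar EVM identities. The risk adjustment at the end of this sketch is only one illustrative possibility [inflating the remaining work by an expected-value allowance from the risk register]; it is not a provision of '748, nor a recommendation from Solomon's article.

# Standard EVM identities [the well-known textbook formulas].
def evm_forecast(bac, ev, ac, pv):
    cpi = ev / ac                   # cost performance index
    spi = ev / pv                   # schedule performance index
    eac = ac + (bac - ev) / cpi     # linear forecast: past cost performance persists
    return cpi, spi, eac

# One *illustrative* way to risk-adjust the forecast: add an expected-value
# allowance from the risk register. An assumption for discussion, not '748.
def risk_adjusted_eac(bac, ev, ac, risk_register):
    cpi = ev / ac
    allowance = sum(p * impact for p, impact in risk_register)
    return ac + (bac - ev) / cpi + allowance

cpi, spi, eac = evm_forecast(bac=1_000_000, ev=400_000, ac=500_000, pv=450_000)
print(cpi, round(spi, 2), round(eac))                                 # 0.8 0.89 1250000
print(round(risk_adjusted_eac(1_000_000, 400_000, 500_000,
                              [(0.3, 100_000), (0.1, 250_000)])))     # 1305000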
Other EVM problems
Here are the other problems, according to information quoted by Solomon: "DoD has reported that EVM, based on the earned value management standard, no longer serves its purpose", and about that standard, Solomon says: "EVM is still recognized as an international, commercial best practice, but ANSI/EIA-748 has been largely ignored by commercial companies. When there is no government mandate to use EVM, the Project Management Institute (PMI) Guide to the Project Management Body of Knowledge (PMBOK® Guide) is a widely used standard for project management."
PMBOK® Guide?
Well, the PMBOK® Guide may be the go-to for non-Defense projects that employ EVM, but it's been my experience of two decades using EVM in DoD programs and then more than a decade in commercial IT that few non-Defense projects use any version of EVM, especially back-office IT projects. And it's not because there's no government mandate. But if they want EVM, and if they go to the PMBOK® Guide, they'll find it's a subset of '748 that simply leaves out some of the process and reporting that weighs down '748.
PMBOK® Guide has gaps
Solomon asserts that the PMBOK® Guide has started down the road to integrate risk, TPM, and quality with EVM.
I don't agree.
In spite of what Paul says in the article and on his website, neither the PMBOK® Guide nor the companion PMI Practice Standard for Earned Value Management directly addresses the three gaps. The fact that TPM, Risk Management, and Quality are all addressed under the same cover, and that TPM appears as a practice in Quality Management and Risk Management, does not integrate these practices with EVM.
Indeed, in Chapter 7 on cost management where the PMBOK® Guide discusses EVM--an improvement over its earlier positioning as a communications tool buried in Chapter 10--the PMBOK® Guide says the measure of earned value is based on work completed. There is not a hint that product quality and performance should be considered. To be sure, in Chapter 8, work performance and quality are tied together, but it's a reach to then tie that connection to Chapter 7.
And, although there are dotted planning lines from the 'cost' to the 'quality' knowledge areas, there are no such lines to risk management.
In short, the PMBOK® Guide is not currently the answer to the three big gaps.
Managers should step up:
When I do DoD programs, I set up a performance review board to evaluate and approve claims of EV from the WP and CA managers. It's the job of the board to hold the EV claimants accountable for TPM, quality, and risk. Done right, the standard can work.
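A gate of this kind can be sketched almost as a checklist. The example below [Python] is hypothetical: the field names and thresholds are made up for illustration and are not drawn from '748, Solomon's article, or the PMBOK® Guide.

# A hypothetical earned-value claim gate: credit is granted only if the work
# package's technical performance, quality, and risk criteria are satisfied.
from dataclasses import dataclass

@dataclass
class EVClaim:
    work_package: str
    claimed_ev: float            # $ value of work claimed complete
    tpm_achieved: float          # measured technical performance
    tpm_planned: float           # planned technical performance at this point
    open_defects: int
    open_risks_past_trigger: int

def board_review(claim: EVClaim, tpm_tolerance: float = 0.95, max_defects: int = 0) -> float:
    """Return the EV the board actually credits for this period."""
    if claim.tpm_achieved < tpm_tolerance * claim.tpm_planned:
        return 0.0                       # technical performance gap: no credit yet
    if claim.open_defects > max_defects:
        return 0.0                       # quality gap: rework before credit
    if claim.open_risks_past_trigger > 0:
        return 0.5 * claim.claimed_ev    # risk gap: partial credit pending mitigation
    return claim.claimed_ev

print(board_review(EVClaim("WP-101", 50_000, tpm_achieved=0.97, tpm_planned=1.0,
                           open_defects=0, open_risks_past_trigger=0)))   # 50000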
Labels:
earned value,
Quality,
Risk Management,
tpm
Friday, November 19, 2010
ISO 31000
I was drifting through the final draft of ISO 31000 looking for nuggets. [Found none]
This is the first of the ISO's three documents on Risk Management. Still to come:
- ISO 31000: Principles and Guidelines on Implementation
- IEC 31010: Risk Management - Risk Assessment Techniques
- ISO/IEC 73: Risk Management - Vocabulary
Labels:
ISO,
Risk Management
Wednesday, November 17, 2010
Boots vs process
Since it's November, I'll return one more time to the November Harvard Business Review that has an interesting set of articles on military leadership.
In an article entitled "Which of These People Is Your Future CEO?: The Different Ways Military Experience Prepares Managers for Leadership", authors Boris Groysberg, Andrew Hill, and Toby Johnson make the following observations...paraphrased for PM:
Where there are highly integrated, complex systems for which the consequences are difficult to predict or control if managers and doers go "off book", process is king. Everything is 'by the book'.
Where there are close encounters in local situations with local tools and capabilities, more or less self-contained, then agile, evolutionary, and even emergent responses may be the best approach, indeed the required approach.
The Air Force and the Navy more often encounter the former: in general, their officers highly value process. The Army and the Marine Corps are more often in the latter situation, with 'boots on the ground', and they value personal initiative and local maneuver.
The authors say this:
To generalize, Navy and Air Force .... take a process-driven approach to management; personnel are expected to follow standard procedures without any deviation. This allows the [them] to excel in highly regulated industries and, perhaps surprisingly, in innovative sectors. Army and Marine Corps .... embrace flexibility and empower people to act on their vision. They excel in small firms, where they are better able to communicate a clear direction and identify capable subordinates to execute accordingly
On a lighter note:
As one former Army captain, a combat veteran of Operation Iraqi Freedom, put it: “Misplace a bolt in the Army and you might have a broken-down truck. Misplace a bolt in the Navy or the Air Force, and you might lose a $100 million piece of machinery.”