David Pratten is passionate about leading IT-related change projects for social good.

Cone of Uncertainty Bibliography


There still seems to be confusion among some in the #NoEstimates community about what the Cone of Uncertainty means. I work in a domain where the CoU is baked into the Integrated Program Performance Management (IPPM) processes flowed down from the buyer, in this case the Federal Government.

The CoU paradigm defines the needed reduction in uncertainty of some performance metric. This can be the confidence in the estimate for any variable. It can be the needed performance of a measure: a Measure of Effectiveness, a Measure of Performance, a Key Performance Parameter, or a Technical Performance Measure. Here's a summary of these elements.

The specifics of the Technical Performance Measures applied to inform Physical Percent Complete, and the Cone of Uncertainty around the TPM, are shown here. Page 22 is an example of a cone that needs to be adhered to for the program to stay on schedule and within cost.

No past data is needed to use the CoU. The CoU is a build-to paradigm, where measures of the program's cumulative-to-date performance are used to inform the risk to future performance - in this case, Mean Time To Failure. But it can be ANY variable for the program, including confidence in the estimates of future performance: cost, schedule, or technical performance.
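
To make the build-to idea concrete, here is a minimal sketch in Python - not taken from any IPPM toolset - of a cone expressed as planned upper and lower bounds around a target TPM (MTTF in this example) that narrow as the program progresses, with the cumulative-to-date measurement checked against those bounds. The linear narrowing, the band percentages, and the 2,000-hour target are assumptions invented for the illustration.

```python
# Hypothetical sketch of the CoU as a build-to paradigm; all numbers are invented.
# The planned band around the target narrows linearly from +/-40% to +/-5%.

def cone_bounds(pct_complete: float, target: float,
                initial_band: float = 0.40, final_band: float = 0.05) -> tuple[float, float]:
    """Planned lower/upper bounds on the TPM at a given fraction of program completion."""
    band = (initial_band + (final_band - initial_band) * pct_complete) * target
    return target - band, target + band

def assess_tpm(pct_complete: float, measured: float, target: float) -> None:
    """Compare the cumulative-to-date measure against the cone and flag a breach."""
    lower, upper = cone_bounds(pct_complete, target)
    if lower <= measured <= upper:
        print(f"{measured:.0f} h is inside the cone [{lower:.0f}, {upper:.0f}] h - on plan")
    else:
        print(f"{measured:.0f} h is OUTSIDE the cone [{lower:.0f}, {upper:.0f}] h - "
              "take preventive or corrective action")

# Example: MTTF target of 2,000 hours, measured 1,450 hours at 60% complete.
assess_tpm(0.60, measured=1450.0, target=2000.0)
```

The point survives the toy example: the cone comes from the plan, and the only data it consumes comes from the executing program itself.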

Here's a summary of how to understand the Cone of Uncertainty from the Incremental Commitment Spiral Model.
[Figure: the Cone of Uncertainty, from the Incremental Commitment Spiral Model]
Understanding the Cone of Uncertainty, shown in the figure above, helps to appreciate commitments in this context of change. Early in a system's lifecycle, there are many possible system capabilities and solutions to consider, leading to a wide range of system costs. Particularly in competitive procurements, it is tempting to make bids and proposals at the lower edge of the Cone of Uncertainty. This is often rationalized through optimistic assumptions such as "The A team will perform on the project," "The COTS products will do everything right," or "It shouldn't take that much effort to turn the prototypes into products." Often, such temptations are mirrored by similar behavior from acquirers who are trying to sell their programs, resulting in a "conspiracy of optimism." This usually results in a project's actual costs far outrunning the optimistic estimates and creating a large overrun.
The data shown in the figure above was collected by the Software Engineering Institute.
 
Bibliography 
Here is a sample of reference papers used in our domain that may help further the understanding of how the CoU is used. Google will find almost all of this material. The original idea came in 1981, but much has been done since then, and referring only to the 1981 book is a bad research method.
  • "Sizing Challenges," Victor Fuster & Taylor Putnam-Majarian, Software and IT-CASR Proceedings, 22-24 August 2017
  • "Uncertainty in through-life costing-review and perspectives," Goh, Yee M., Newnes, Linda B., Mileham, Antony R., McMahon, Chris, Saravi, Mohammad E., IEEE Transactions on Engineering Management, 57 (4), pp. 689-701
  • "Application of Real Options Theory to Software Engineering for Strategic Decision making in Software Related Capital Investments," Albert O. Olagbemiro, Naval Postgraduate School Dissertation, December 2008.
  • "Software Projects Estimation & Control: Versatility & Contributions of COSMIC Function Points," Alain Abran, ICEAA 2017 ProfessionaDevelopme t& Training Workshop, Portland Oregon, June 6-9, 2017.
  • "Using NESMA Function Point Analysis in an Agile Context," Roel van Rijswijck, Thesis, Radboud Universiteit Nijmegen
  • Incremental Commitment Spiral Model: Principles and Practices for Successful Systems and Software, Barry Boehm and Jo Ann Lane, Addison Wesley, 2014 - Instead of quoting a 1981 book, read this book for the current processes for using the CoU and the Incremental Commitment approach.
  • "Future Challenges and Rewards for Software Engineers," Barry Boehm, Data & Analysis Center for Software (DACS), STN 10-3, October 2007, Volume 10, Number 3

There are hundreds more papers and books on the Cone of Uncertainty, or containing the CoU concept, that Google will find for you.

So here's the get-off-the-stage message:


The Cone of Uncertainty is a program performance management paradigm, where the outline of the cone defines the planned reduction in the variance of some program variable as the project progresses. When the measurement of the variable is NOT inside the cone, that is a call to take preventive or corrective action to get back inside the cone. Past data is NOT needed to show the CoU has value. The CoU is a build-to paradigm; the needed data comes from the executing program. This has been stated numerous times, but attempts to redefine the CoU still persist.


(Virtually) No one should ever own an Echo or any other "voice assistant" product


If you buy one of those intrinsically insecure, always-on "smart speakers" from Google, Amazon, Apple or other players, you're installing a constantly listening presence in your home that by design listens to every word you say. It is very likely to suffer at least one catastrophic breach that allows hackers (possibly low-level dum-dums like the ransomware creeps who took whole hospitals hostage this year, then asked for a mere $300 to give them back because they were such penny-ante grifters) to gain access to millions of these gadgets, along with machine-learning-trained models that will help them pluck blackmail material, credit card numbers, and humiliating disclosures out of the stream of speech they're capturing.


Autonomy and Authority



These days I speak extensively about how we designed Lunar Logic as an organization. After all, going through a transition from a traditional management model to a situation where the company has no managers at all is quite an achievement. One of the pillars of managerless organizational design is autonomy.

After all, decisions won't just make themselves. Someone has to call the shots. Once we got rid of managers, who would normally make almost all the decisions, we needed everyone else to embrace decision making. For that to happen, we needed to distribute autonomy.

Interestingly enough, when Don Reinertsen, whom I respect a lot, talks about decentralizing control, he uses somewhat different wording.

Decentralizing control requires decentralizing both the authority to make decisions and the information required to make these decisions correctly.

Don Reinertsen

Authority refers to the formal power to make a decision. However, I tend to make a clear distinction between authority and autonomy. Ultimately, as a manager, I can give my team the authority to make a decision. At the same time, though, I can instill fear in or put pressure on the decision-makers, so that before they actually make their call they ask me what I think about the topic and go with my advice. This means that even if authority was distributed, autonomy is not there.

As a corollary, I may not have formal authority, but I can feel courageous enough to make a decision anyway. If that is an acceptable part of the organizational culture, it means that I may have autonomy without authority. By the way, the latter case is interesting, as it pictures an attitude I'm very fond of: ask forgiveness, not permission.

I'm not going to fundamentally disagree with Don Reinertsen, though. As a matter of fact, we are on the same page, as he follows up with his train of thought.

To enable lower organizational levels to make decisions, we need to give them authority, information, and practice. Without practice and the freedom to fail upon occasion, they will not take control of these decisions.

Don Reinertsen

In the first quote Don is talking about prerequisites to decentralize control. In the second he focuses on enabling it. He adds a crucial part: people need to practice. This, as a consequence, means that occasionally they will fail, a.k.a. make bad decisions.

And that’s exactly what autonomy is in its core.

In the vast majority of cases, autonomy is derived from authority. It doesn't work the other way around, though. In fact, the situation of having formal authority but no real autonomy to make a decision is fairly common. It is also the worst thing we can do if we want people to feel more accountable for the organization they're with.

Not only do they realize that the power they got is virtual, but once that happens they're not even back to square one. It's worse. They got burned. So they're not jumping on that autonomy bandwagon again when they are asked to get more involved in decision making.

That's, by the way, another case showing that cultural changes are not safe to fail.

Long story short, don't confuse authority with autonomy. If you really care about your organization, take care of distributing both, not only the former.


Some people can hear this GIF


A lot of people can apparently hear this GIF. I can feel it.

The GIF was created by HappyToast.


How Bacteria Help Regulate Blood Pressure


Some years ago, when Jennifer Pluznick was nearing the end of her training in physiology and sensory systems, she was startled to discover something in the kidneys that seemed weirdly out of place. It was a smell receptor, a protein that would have looked more at home in the nose. Given that the kidneys filter waste into urine and maintain the right salt content in the blood, it was hard to see how a smell receptor could be useful there. Yet as she delved deeper into what the smell receptor was doing, Pluznick came to a surprising conclusion: The kidney receives messages from the gut microbiome, the symbiotic bacteria that live in the intestines.

In the past few years, Pluznick, who is now an associate professor of physiology at Johns Hopkins University, and a small band of like-minded researchers have put together a picture of what the denizens of the gut are telling the kidney. They have found that these communiqués affect blood pressure, such that if the microbes are destroyed, the host suffers. The researchers have uncovered a direct, molecular-level explanation of how the microbiome conspires with the kidneys and the blood vessels to manipulate the flow of blood.

The smell receptor, called Olfr78, was an orphan at first: It had previously been noticed in the sensory tissues of the nose, but no one knew what specific scent or chemical messenger it responded to. Pluznick began by testing various chemical possibilities and eventually narrowed down the candidates to acetate and propionate. These short-chain fatty acid molecules come from the fermentation breakdown of long chains of carbohydrates — what nutritionists call dietary fiber. Humans, mice, rats and other animals cannot digest fiber, but the bacteria that live in their guts can.

As a result, more than 99 percent of the acetate and propionate that floats through the bloodstream is released by bacteria as they feed. “Any host contribution is really minimal,” Pluznick said. Bacteria are therefore the only meaningful source of what activates Olfr78 — which, further experiments showed, is involved in the regulation of blood pressure.

Our bodies must maintain a delicate balance with blood pressure, as with electricity surging through a wire, where too much means an explosion and too little means a power outage. If blood pressure is too low, an organism loses consciousness; if it’s too high, the strain on the heart and blood vessels can be deadly. Because creatures are constantly flooding their blood with nutrients and chemical signals that alter the balance, the control must be dynamic. One of the ways the body exerts this control is with a hormone called renin, which makes blood vessels narrower when the pressure needs to be kept up. Olfr78, Pluznick and her colleagues discovered, helps drive the production of renin.

How did a smell receptor inherit this job? The genes for smell receptors are present in almost every cell of the body. If in the course of evolution these chemical sensors hooked up to the machinery for manufacturing a hormone rather than to a smell neuron, and if that connection proved useful, evolution would have preserved the arrangement, even in parts of the body as far from the nose as the kidneys are.

Olfr78 wasn’t the end of the story, however. While the team was performing these experiments, they realized that another receptor called Gpr41 was getting signals from the gut microbiome as well. In a paper last year, Pluznick’s first graduate student, Niranjana Natarajan, now a postdoctoral fellow at Harvard University, revealed the role of Gpr41, which she found on the inner walls of blood vessels. Like Olfr78, Gpr41 is known to respond to acetate and propionate — but it lowers blood pressure rather than raising it. Moreover, Gpr41 starts to respond at low levels of acetate and propionate, while Olfr78 kicks in only at higher levels.

Here’s how the pieces fit together: When you — or a mouse, or any other host organism whose organs and microbes talk this way — have a meal and dietary fiber hits the gut, bacteria feed and release their fatty-acid signal. This activates Gpr41, which ratchets down the blood pressure as all the consumed nutrients flood the circulation.

If you keep eating — a slice of pie at Thanksgiving dinner, another helping of mashed potatoes — Gpr41, left to itself, might bring the pressure down to dangerous levels. “We think that is where Olfr78 comes in,” Pluznick said. That receptor, triggered as the next surge of fatty acids arrives, keeps blood pressure from bottoming out by calling for renin to constrict the blood vessels.

The new understanding of how symbiotic bacteria manipulate blood pressure is emblematic of wider progress in linking the microbiome to our vital statistics and health. While vague statements about the microbiome’s effect on health have become commonplace in recent years, the field has moved beyond simply making associations, said Jack Gilbert, a microbiome researcher at the University of Chicago.

“Everybody goes on about the promise,” he said. But in fact, studies full of mechanistic details, like the ones Pluznick, her collaborators and other researchers have published, have been growing more and more numerous.

In June of last year, the National Institutes of Health convened a working group on the microbiome’s control of blood pressure. Researchers met in Maryland to thrash out what important questions still need to be answered, including what role the host’s genetic background plays — whether, for instance, the microbiome matters more for some hosts than for others.

“There’s a lot of excitement getting more data,” said Bina Joe, a professor of physiological genomics and the director of the Center for Hypertension and Personalized Medicine at the University of Toledo. If you look at PubMed, she continued, there are more reviews of the microbiome literature than research papers. The review articles get new researchers interested — but there are still more details to hammer out.

Understanding those details is key to knowing whether transplanting a certain set of microbes into someone can reshape the recipient’s biology enough to cure a health problem, as some proponents of personalized medicine hope. One famous early study showed that giving lean mice the microbiome of an obese human made them obese too, while the microbiome of lean humans kept the mice lean. “There’s one paper that came out earlier this year … that showed that maybe this can happen with blood pressure as well,” Pluznick said, though she cautioned that the study was small and needed follow-up. But theoretically, even if swapping in new bacteria could only slightly lower the blood pressure of those with a genetic tendency toward hypertension, it could make a difference over the course of a lifetime.

“It might be something that’s easier to manipulate than your genes, or my genes. Those are much harder to change,” she said.




Aleatory and Epistemic Uncertainty in Software Development Projects


All software development projects operate in the presence of uncertainty.

This uncertainty is unavoidable. The design and development of software must rely on estimates, forecasts, and predictions based on an idealized model of an unknown (but knowable) reality.

If your reality is unknowable you've got a much bigger problem and are headed for failure.

There are two broad types of uncertainty on all projects, and on software projects, these two types drive very different responses.

  • There is uncertainty associated with the natural randomness of the underlying processes of writing software.
  • There is uncertainty associated with the model of the real world the software operates in because of insufficient or imperfect knowledge of reality.

These two types have fancy names:

  • Aleatory uncertainty.
  • Epistemic uncertainty.

The two types of uncertainty may be combined and analyzed as a total uncertainty or treated separately. In either case, the principles of probability and statistics apply equally.

The alea in aleatory is Latin for dice.

This means there is inherent randomness.

This is a data-based uncertainty associated with the inherent variability of the basic information about the real-world processes of development. These uncertainties cannot be reduced; they are just part of the process of development. They are irreducible, and the only approach to dealing with them is to have margin: schedule margin, cost margin, performance margin.

By data-based, it means that the randomness is in the data generated by statistical processes. For example, the duration of a work activity is a statistical process. That duration can take on many values depending on the underlying model of the work. We can have a narrow range of values for the duration, or a wide range of values, depending on the underlying processes.

Many software project phenomena or processes of concern to developers contain randomness. The expected outcomes are unpredictable (to some degree). Such phenomena can be characterized by field or experimental data that contain significant variability that represents the natural randomness of an underlying phenomenon. The observed measurements are different from one experiment (or one observation) to another, even if conducted or measured under identical conditions.

There is a range of measured or observed values in these experimental results; and, within this range, certain values may occur more frequently than others. The variability inherent in this data or information is statistical in nature, and the realization of a specific value (or range of values) involves probability.

This is why measures like velocity are very sporty, since past performance is rarely like future performance in the presence of the aleatory uncertainties (as well as the epistemic uncertainties) of actual project work.
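
As a rough illustration of handling this irreducible variability with margin (my sketch, not from the post), the Python snippet below treats each activity's duration as a random draw from an assumed triangular distribution and sizes the schedule margin as the gap between an 80th-percentile completion and the sum of the most-likely durations. The activities and their ranges are made up.

```python
# Sketch: sizing schedule margin for aleatory (irreducible) duration variability.
# The activities and their (min, most likely, max) durations in days are invented.
import random

ACTIVITIES = [(3, 5, 9), (8, 10, 15), (4, 6, 12), (2, 3, 5)]  # hypothetical work items

def simulate_totals(trials: int = 20_000) -> list[float]:
    """Monte Carlo totals of the project duration with triangular activity durations."""
    return sorted(
        sum(random.triangular(low, high, mode) for low, mode, high in ACTIVITIES)
        for _ in range(trials)
    )

totals = simulate_totals()
most_likely = sum(mode for _, mode, _ in ACTIVITIES)   # 24 days if nothing varies
p80 = totals[int(0.80 * len(totals))]                  # duration with 80% confidence
print(f"most-likely plan: {most_likely} days")
print(f"P80 completion:   {p80:.1f} days")
print(f"schedule margin:  {p80 - most_likely:.1f} days")
```

More simulation trials sharpen the estimate of the distribution, but they do not shrink the distribution itself; that is why margin, not more information, is the response to aleatory uncertainty.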

The term epistêmê in Greek means knowledge.

Epistemic uncertainty reflects our lack of knowledge.

This lack of knowledge is expressed as a probabilistic assessment of some outcome, usually an event-based outcome.

"There is a 40% chance of rain in the forecast area for tomorrow" is an epistemic uncertainty.

We assign probabilities to events, probabilities to the work activities that create the knowledge needed to assess the uncertainty, and probabilities of the residual uncertainties after our new knowledge has been acquired.

In practice, we can assign a mean or a median value to this uncertainty. That's what the weather forecast does. That 40% chance of rain is usually a mean value. Where we live, when we hear a 40% chance in Boulder County, we know we have a lower probability because of our micro-climate. That weather forecast is over the forecast area and may be much different depending on where you live in that area.

This forecast also includes inaccuracies and imprecisions in the prescribed forms of the probability distributions and in the parameters of the estimates. This is why forecasting the weather in some parts of the world is a very sporty business. In places like Los Angeles, it's easy - as shown in the movie L.A. Story, where Steve Martin is the bored weatherman. Here in Colorado, with our mountain weather, making a forecast for a few days from now is not likely to be very successful. As they say: don't like Colorado weather? Wait a few hours, it'll change.
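
To illustrate the defining property that epistemic uncertainty shrinks as knowledge is acquired (my example, not the post's), here is a tiny Bayesian update in Python: an initial probability that an interface design is flawed is revised after a passing prototype test. The prior and the likelihoods are invented numbers.

```python
# Sketch: epistemic uncertainty being reduced by new information (Bayes' rule).
# The prior and likelihood values are hypothetical, chosen only for illustration.

def bayes_update(prior: float, p_evidence_given_h: float, p_evidence_given_not_h: float) -> float:
    """Posterior P(H | evidence) from a prior and the two likelihoods."""
    numerator = p_evidence_given_h * prior
    evidence = numerator + p_evidence_given_not_h * (1.0 - prior)
    return numerator / evidence

# H = "the interface design is flawed"; before any testing we assess P(H) = 0.40.
# Assume a flawed design passes a prototype test 20% of the time, a sound one 90%.
prior = 0.40
posterior = bayes_update(prior, p_evidence_given_h=0.20, p_evidence_given_not_h=0.90)
print(f"P(flawed) before the test: {prior:.2f}")
print(f"P(flawed) after a passing test: {posterior:.2f}")  # roughly 0.13
```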

Some Challenges to Managing in the Presence of Uncertainty


The primary issue with all uncertainties is the communication of the accuracy and precision of the risk created by the aleatory and epistemic uncertainty. 

  • What is the scope of the uncertainty?
  • What risks does it create to the success of the software development effort?
  • Is the uncertainty time-dependent?
  • At what level of decomposition of the project is the uncertainty applicable?

This is a Risk Communication issue. So let's restate the two forms of uncertainty:

  • Aleatory uncertainty: the uncertainty inherent in a nondeterministic (stochastic, random) phenomenon… is reflected by modeling the phenomenon in terms of a probabilistic model… Aleatory uncertainty cannot be reduced by the accumulation of more data or additional information.
  • Epistemic uncertainty: the uncertainty attributable to the incomplete knowledge about a phenomenon that affects our ability to model it… is reflected in ranges of values for parameters, a range of viable models, the level of model detail, multiple expert interpretations, and statistical confidence. Epistemic uncertainty can be reduced by the accumulation of additional information.

What Does This Mean for Software Development Working in the Presence of Uncertainty?


If you accept that all software development work operates in the presence of Aleatory and Epistemic uncertainty, then ...

No decisions can be made in the presence of these two types of uncertainties without estimating the impact of your decision on the project.

This is a simple, clear, concise principle of managing in the presence of uncertainty. Anyone suggesting that decisions can be made without estimating has to willfully ignore this principle, OR the project is de minimis - meaning it's of no consequence to those paying if the project is late, over budget, or the delivered outcomes don't meet the needed performance level for the project to earn its Value in exchange for the Cost to produce that Value.
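
As a hedged sketch of what "estimating the impact of your decision" can look like in practice (my example, not the post's), the snippet below compares two hypothetical options by their expected cost, mixing an epistemic probability that a risk occurs with an aleatory range for how bad it is if it does. Every number is invented.

```python
# Sketch: estimating the cost impact of a decision before committing to it.
# The options, probabilities, and cost ranges are all hypothetical.
import random

def expected_cost(base_cost: float, risk_prob: float,
                  overrun_range: tuple[float, float], trials: int = 50_000) -> float:
    """Monte Carlo expected cost: a base cost plus a possible overrun."""
    total = 0.0
    for _ in range(trials):
        cost = base_cost
        if random.random() < risk_prob:             # epistemic: does the risk occur?
            cost += random.uniform(*overrun_range)  # aleatory: how large is the hit?
        total += cost
    return total / trials

option_a = expected_cost(base_cost=100_000, risk_prob=0.30, overrun_range=(20_000, 80_000))
option_b = expected_cost(base_cost=120_000, risk_prob=0.10, overrun_range=(10_000, 30_000))
print(f"Option A expected cost: ${option_a:,.0f}")   # about $115,000
print(f"Option B expected cost: ${option_b:,.0f}")   # about $122,000
```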

For Those Interested in the Underlying Mathematics, Here are Some Gory Details


[Poster: the underlying mathematics of aleatory and epistemic uncertainty]
