2019 Annual Conference Abstracts

Abstracts (in order of appearance)

Virtual SIPs
Sam L. Savage, Executive Director of ProbabilityManagement.org and author of The Flaw of Averages

The open SIPmath™ Standard represents uncertainties as arrays of outcomes plus metadata called SIPs (Stochastic Information Packets). SIPs play a role in probabilistic calculations analogous to the role of Arabic numerals in deterministic calculations. This allows simulations running on diverse platforms to be networked into enterprise-wide systems. The simplicity of the approach allows for implementation in native Excel.
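
A minimal sketch of the idea in Python (the names, distributions, and trial counts below are illustrative assumptions, not part of the standard):

    import numpy as np

    # A SIP is an array of simulated trials plus metadata. SIPs that share
    # trial indexing can be combined with ordinary trial-by-trial arithmetic,
    # much as Arabic numerals support ordinary column arithmetic.
    rng = np.random.default_rng(42)
    unit_cost = {"name": "unit_cost", "units": "USD",
                 "trials": rng.lognormal(3.0, 0.4, 1_000)}
    volume = {"name": "volume", "units": "units/yr",
              "trials": rng.normal(500.0, 50.0, 1_000)}

    # "Networking" two simulations: total spend is the trial-wise product.
    total = unit_cost["trials"] * volume["trials"]
    print(np.percentile(total, [10, 50, 90]))   # a range, not a single average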

Two recent breakthroughs, the HDR™ Random Number Framework and the Metalog™ System for analytically matching probability distributions to data, extend the SIPmath Standard to virtual SIPs, which are generated in the client environment. This can reduce storage requirements to a tiny fraction of what was previously required.

As an analogy, just as you can add water to powdered milk or powdered Kool-Aid to get milk or Kool-Aid, you can add a stream of uniform random numbers to a Metalog to get virtually any continuous probability distribution. And what is the HDR Framework in this analogy? Powdered Water.
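
In code, reconstituting a distribution is one function application. The sketch below uses a four-term unbounded metalog with made-up coefficients, and NumPy's generator stands in for the HDR generator:

    import numpy as np

    def metalog_quantile(y, a):
        """Four-term unbounded metalog quantile function (Keelin, 2016)."""
        L = np.log(y / (1.0 - y))              # logit of the cumulative probability
        return a[0] + a[1] * L + a[2] * (y - 0.5) * L + a[3] * (y - 0.5)

    rng = np.random.default_rng(2019)          # stand-in for the HDR generator
    u = rng.uniform(size=10_000)               # the uniform "water"
    a = [10.0, 2.0, 0.5, 1.0]                  # illustrative metalog "powder"
    virtual_sip = metalog_quantile(u, a)       # 10,000 trials, nothing stored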

The Mission Assurance Portfolio (MAP) Model: A SIPmath Application of Scenario-Based Portfolio Planning with Uncertainty
Shaun Doheney, Innovative Decisions, Inc., and Chair of Resources and Readiness Applications at ProbabilityManagement.org

Presentation based on a paper co-authored by LtCol (ret) Shaun Doheney (USMC), LCDR Connor McLemore (USN) & Dr. Sam Savage.

To get the most warfighting capability out of a limited operating budget, the military must make cost-benefit tradeoffs between various programs. To do so, the military traditionally uses a capability-based assessment (CBA) process that begins with the examination of strategic and operational guidance and is followed by numerous scenario-based studies, analyses, wargames, and field experiments. This process leads to the identification of capabilities the military requires to carry out its responsibilities across a range of military operations. The final step is the development of a draft budget that balances the costs and benefits of the various solutions required to fund the development and sustainment of the best possible fighting force. This is a complex process fraught with difficult-to-measure quantities, strong advocacy for existing programs, and political sensitivities. In this talk, we present an alternative scenario-based portfolio planning model that leverages uncertainty to assess various funding profiles in a form senior military leadership can easily visualize. Our proposed stochastic model provides interactive decision support to facilitate the budgeting discussion. While this model has a clear military framework, the methodology can be applied to any organization making fiscally constrained cost-benefit trade decisions in the context of scenario-based portfolio planning.

Using Probability Management for System Design Tradespace Exploration
Gregory S. Parnell, Ph.D., Department of Industrial Engineering, University of Arkansas

System analysts perform Tradespace Exploration (TSE) in early system design to identify affordable systems designs that meet complex stakeholder requirements. For defense systems, TSE is an essential task in the Analysis of Alternatives that supports the decision to proceed into development. This presentation uses Probability Management with an integrated trade-off analytics framework to explore 100,000 potential systems design concepts and architectures in near real-time. We demonstrate with an Army Unmanned Aerial Vehicle case study using Set-Based Design with our integrated Model-Based Engineering framework to identify potential solutions, assess feasibility, and evaluate solutions displayed in sets with common design choices. This TSE analysis informs future systems requirements and helps DoD decision makers select the most promising systems design(s) for development.
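
As a hypothetical illustration of the workflow (the design variables, surrogate performance models, and constraints below are invented stand-ins, not the case study's):

    import numpy as np

    # Enumerate candidate UAV designs, score them with simple surrogate
    # models, filter to the feasible set, and compare sets that share a
    # common design choice.
    rng = np.random.default_rng(5)
    n = 100_000
    wing_span = rng.choice([2.0, 3.0, 4.0], n)        # illustrative choices (m)
    battery_kwh = rng.uniform(1.0, 5.0, n)
    payload_kg = rng.uniform(0.5, 3.0, n)

    endurance = 2.0 * battery_kwh / (1.0 + 0.3 * payload_kg)  # stand-in model (hr)
    cost = 10_000 * wing_span + 4_000 * battery_kwh           # stand-in model ($)

    feasible = (endurance >= 3.0) & (cost <= 60_000)
    for span in (2.0, 3.0, 4.0):               # sets with a common design choice
        subset = feasible & (wing_span == span)
        print(span, subset.sum(), "feasible designs")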

Metalog Distributions: Uses and Applications To Strengthen Your Capabilities
Tom Keelin, Managing Partner, Keelin Reeds Partners and Chief Research Scientist, ProbabilityManagement.org

As the most novel and generally applicable advance in probability in over a century, the metalog distributions promise to change the practice of data science, statistics, probability, and decision analysis. By combining unlimited shape flexibility, choice of bounds, closed-form data parameterization, and simple closed-form equations, the metalogs are better than classical distributions for a wide range of uses and applications. These include representing discrete expert assessments and simulation outputs with continuous distributions, better and more convenient fit to data than classical distributions, adding independent uncertainties in closed form, Bayesian updating of prior distributions with new data in closed form, and convenient assessment of multivariate distributions. Use of such tools can strengthen your capabilities and those of your organization.
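
Because the metalog quantile function is linear in its coefficients, the closed-form data parameterization amounts to ordinary least squares. A minimal sketch (four terms only, with feasibility checks omitted):

    import numpy as np

    def fit_metalog4(data):
        """Fit a 4-term unbounded metalog to data by ordinary least squares."""
        x = np.sort(np.asarray(data, dtype=float))
        n = len(x)
        y = (np.arange(1, n + 1) - 0.5) / n            # empirical CDF positions
        L = np.log(y / (1.0 - y))
        basis = np.column_stack([np.ones(n), L, (y - 0.5) * L, y - 0.5])
        a, *_ = np.linalg.lstsq(basis, x, rcond=None)  # the closed-form fit
        return a

    # Example: coefficients fit to 200 hypothetical observations
    print(fit_metalog4(np.random.default_rng(0).normal(10.0, 2.0, 200)))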

Using Risk Analysis to Make Better Strategic Decisions
Alex Sidorenko, Institute for Strategic Risk Analysis in Decision Making

Alex will present a case study applying Monte Carlo simulation during business planning at a large manufacturing company. The case study will provide a step-by-step guide covering the points below (a minimal simulation sketch follows the list):

  • How to set better, more realistic KPIs and performance targets for management, business units or subsidiaries

  • How to move away from single point estimates to start planning and forecasting in ranges and confidence intervals

  • How to significantly reduce the potential for fraud and performance sugarcoating

  • How to improve the performance management conversation

  • How to stop companies from making unreasonable and risky decisions each December when bonus payments are calculated

  • How to improve management's chances of receiving fair performance rewards
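
A minimal sketch of planning in ranges rather than points (the drivers and distributions are invented for illustration):

    import numpy as np

    # Simulate next year's revenue from uncertain drivers, then set KPI
    # targets at chosen percentiles instead of a single point estimate.
    rng = np.random.default_rng(7)
    units = rng.triangular(900, 1_000, 1_200, 10_000)   # hypothetical ranges
    price = rng.normal(50.0, 4.0, 10_000)
    revenue = units * price

    base = np.percentile(revenue, 25)      # met in roughly 75% of trials
    stretch = np.percentile(revenue, 75)   # met in roughly 25% of trials
    print(f"Plan range: {base:,.0f} to {stretch:,.0f}")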

Stochastic Econometrics
Katlin Bass and Madeline Minchillo, Lone Star Analysis

Presentation co-authored by Katlin Bass, Beth Goode, and Madeline Minchillo, Lone Star Analysis.

During the 20th century, most economic and econometric studies were deterministic. Even when statistics were used, determinism was ushered back in as soon as possible. 21st-century computing, with data sets of unimaginable richness, has shifted the boundaries of econometrics.

We present two examples of stochastic econometrics, useful precisely because they are not deterministic.

The first is a promising application for SIPmath. We present a broadly applicable method which others can replicate. It leverages the power of machine learning methods in R or Python to create stochastic econometric ensemble models. The trained model then operates in a simulation environment like SIPmath.
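
One plausible reading of this pipeline, sketched with scikit-learn (the library choice and the data are assumptions for illustration): the spread across an ensemble's individual learners becomes a SIP of predictions.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))                        # stand-in driver history
    y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.3, 500)

    model = RandomForestRegressor(n_estimators=1_000).fit(X, y)
    x_new = np.array([[0.2, -0.1, 1.0]])                 # one future scenario
    sip = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
    # `sip` now holds 1,000 trials usable like any other SIP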

The second is a new method and toolset for predicting economic competitions. The behavior of organizations competing to buy or to sell is inherently uncertain. Uncertainties exist in the rational economics for each participant and in the differences in their rational economics. More uncertainties are found in Prospect Theory; most of the time the participants will stray from purely rational behavior. To make matters more complex, there is uncertainty about differences in the biases and heuristics among the participants. In short, uncertainty abounds in economic competition.

How to Supplement Safety Requirements to Prevent Major Technological Catastrophes?
Stan Uryasev, Risk Management and Financial Engineering Lab, University of Florida

Presentation based on a paper co-authored by Stan Uryasev, University of Florida (uryasev@ufl.edu) and Giorgi Pertaia, University of Florida.

This paper discusses a new probabilistic characteristic called Buffered Probability of Exceedance (bPOE) for evaluating the tails of probability distributions. bPOE is a tail probability with a known tail mean (i.e., it is the probability of the tail whose mean equals some specified value). The objective of the paper is to show how to upgrade safety requirements based on Probability of Exceedance (POE) with bPOE.

Let us explain the definition of bPOE with a simple example. Four percent of land-falling hurricanes in the US have cumulative damage exceeding $50 billion (i.e., POE = 0.04 for a threshold of $50 billion). It is estimated that the average damage from the worst 10% of hurricanes is $50 billion; in terms of bPOE, we say bPOE = 0.1 for the threshold of $50 billion. bPOE shows that the largest damages, with magnitudes averaging around $50 billion, occur with frequency 10%. bPOE can be considered an important supplement to POE.
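
The empirical calculation is short. A minimal sketch (the sample data are assumed, not the hurricane record):

    import numpy as np

    def bpoe(sample, threshold):
        """Empirical bPOE: size of the upper tail whose mean equals the threshold."""
        x = np.sort(np.asarray(sample, dtype=float))[::-1]    # worst outcomes first
        tail_means = np.cumsum(x) / np.arange(1, len(x) + 1)  # mean of worst k outcomes
        # tail means shrink as the tail grows; count how many still meet the threshold
        k = np.searchsorted(-tail_means, -threshold, side="right")
        return k / len(x)

    # With simulated damages whose worst 10% average $50 billion,
    # bpoe(damages, 50e9) would return approximately 0.1.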

The paper considers two application areas: 1) Materials strength regulations (A-basis, B-basis); 2) Ratings of Financial Companies (such as AAA, AA, …). We demonstrate that these safety requirements can be efficiently managed/optimized with convex and linear programming algorithms. In particular, we discuss how to formulate and solve a Collateralized Debt Obligations (CDOs) structuring problem. We also show that these applications can run directly off of standard SIP libraries.

Utilization of an Attrition Model to Explore Warfighting Utility of Non-Kinetic Attacks
CAPT Brian Morgan, Naval Postgraduate School, and Harrison Schramm, Center for Strategic and Budgetary Assessments

Campaign analysis forms the basis of operational and strategic budget decisions and has a long track record of success. The Department of Defense's enterprise risk concerns the ability of the joint force to protect and advance our national interests (i.e., the defense of our nation). To gain a comprehensive understanding of the military's ability to achieve the imperatives articulated in the President's National Security Strategy, military planners can use campaign analysis to develop a broad understanding of risk under various assumptions. Quantitative campaign analysis focuses decision-makers on investment priorities because it provides a stable baseline for comparative analysis of platforms and systems. Although current models are well suited to examining traditional forms of kinetic warfare, credibly incorporating non-kinetic effects such as cyber attacks remains elusive. In this presentation, we modify a deterministic attrition model used for both ground and air combat to explore the uncertain, variable effects of offensive cyber attacks. These mission-level effects can then be used to calibrate a campaign-level model such as the Synthetic Theater Operations Research Model (STORM), the de facto standard campaign-level model used by the Department of Defense.
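
A minimal sketch under stated assumptions (a Lanchester-style square-law model stands in for the talk's attrition model, and the cyber effect is an invented Beta-distributed degradation of Red's effectiveness):

    import numpy as np

    rng = np.random.default_rng(1)
    trials = 2_000
    cyber_degrade = rng.beta(2, 5, trials)        # uncertain non-kinetic effect

    blue_survivors = np.empty(trials)
    for i in range(trials):
        blue, red = 100.0, 120.0
        kb, kr = 0.010, 0.008                     # notional attrition rates
        kr_eff = kr * (1.0 - cyber_degrade[i])    # cyber attack degrades Red
        while blue > 1.0 and red > 1.0:
            blue, red = blue - kr_eff * red, red - kb * blue
        blue_survivors[i] = max(blue, 0.0)
    print(np.percentile(blue_survivors, [10, 50, 90]))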

The 60 Cycle 120 Volt Standard for the Probability Power Grid: A Combination of SIPmath, Metalogs and the HDR Generator
Doug Hubbard, Hubbard Decision Research; Tom Keelin, Keelin Reeds Partners / ProbabilityManagement.org; and Sam Savage, ProbabilityManagement.org (moderator)

Just as a standard for distributing electricity allowed people who did not know how to generate the stuff to use it in all sorts of appliances, the goal of the open SIPmath Standard has been to allow people without statistical knowledge to access the proven benefits of probabilistic analysis. In this analogy, libraries of SIPs, representing thousands or millions of potential futures, are like the direct current of the first power grids. Stretching this line of thought close to the breaking point, Doug Hubbard's Random Number Framework and Tom Keelin's Metalog System are like the vastly more efficient alternating current and transformers of today's power grid. By incorporating these two breakthrough technologies, the SIPmath Standard has greatly expanded its applicability and diminished its storage requirements. In this panel discussion, meet with the three inventors involved and help them shape the future of probability management with your comments.  

The Failure of Risk Management: Why It's *Still* Broken and How to Fix It
Doug Hubbard, Hubbard Decision Research and author of The Failure of Risk Management: Why It’s *Still* Broken and How to Fix It

Doug Hubbard wrote the first edition of The Failure of Risk Management: Why It's Broken and How to Fix It a decade ago, just before and during the Great Recession. He is now working on the second edition and finds even more to criticize in popular methods. Since the first edition, the growing interest in Enterprise Risk Management has been hampered by the use of faulty methods. In this session, he will explain where improvements have not been made, why some attempts at improvement have failed, and how this time we can fix what didn't work the first time.

Promoting Predictive Analytics in a Large Organization
Andrew Abranches and Ujvalla Gupta, Pacific Gas and Electric Company

PG&E Gas Operations has embarked on an initiative to drive accuracy into forecasting of cost and work volume across its portfolio of programs, to improve resource planning and become more affordable. To do so, PG&E is implementing process improvement methodologies as well as probability management principles across the Gas Ops organization, using an open-source wiki platform. Simulating risk with SIPmath has demonstrated the ability to quickly roll up risk on individual assets and generate meaningful, actionable outcomes, and the resulting SIPs can be stored and shared through the wiki. Having one language for communicating model parameters, sharing data and results rapidly, and refreshing models with live data on a periodic basis is the key to data-driven decision making for financial and work-volume planning. PG&E will share insights gained in making this process a success, some of the tools used, and where PG&E is headed next.
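
The roll-up itself is one line once asset-level SIPs share trial indexing. A minimal sketch with invented numbers:

    import numpy as np

    rng = np.random.default_rng(3)
    # 10,000 trials (rows) for each of 50 hypothetical assets (columns)
    asset_sips = rng.lognormal(1.0, 0.8, size=(10_000, 50))
    portfolio = asset_sips.sum(axis=1)      # trial-wise sum: one SIP of total risk
    print(np.percentile(portfolio, 95))     # e.g., a 95th-percentile exposure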

Long-range Probabilistic Forecasts of US Energy Use and Prices
Max Henrion, PhD, Lumina Decision Systems, Inc.

When investing in projects to produce (or conserve) oil, gas, electricity, and other forms of energy, we want to know the future demand and price for that form of energy over the project lifetime. We would also like to know future energy prices when we invest in vehicles, buildings, data centers, or just about anything that uses significant amounts of energy. Many US organizations use the Annual Energy Outlook (AEO) projections from the Department of Energy, at least as a baseline. AEO's Retrospective comparison of projections with actual values shows substantial errors. We have found that extreme errors (surprises) were larger in the last decade than in previous decades (Sherwin, Henrion, and Azevedo, Nature Energy, 2018). Given the accelerating changes in our energy systems, such as the adoption of wind and solar generation, electric vehicles, and new policies on greenhouse gas emissions, we should not expect future forecasts to be more accurate. So we should select decisions that are resilient in the face of these uncertainties, expressed as probability distributions. We present probabilistic 20-year forecasts of key energy production, consumption, and price quantities, based on relatively simple forecast methods, such as exponential smoothing with damped trend. The median forecasts are at least as accurate as AEO Reference scenarios for historical values. Unlike AEO and other sources, these forecasts are probabilistic, with serial dependency over time. They will be available in standard formats (SIPs and SLURPs) for Monte Carlo evaluation of energy investments.
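
For reference, a minimal sketch of the named point-forecast method, exponential smoothing with damped trend (Gardner-McKenzie); the probabilistic layer with serial dependency is built on top of forecasts like this. The history values are hypothetical.

    import numpy as np

    def damped_trend_forecast(y, alpha, beta, phi, horizon):
        """Exponential smoothing with damped trend:
        level    l_t = alpha*y_t + (1-alpha)*(l_{t-1} + phi*b_{t-1})
        trend    b_t = beta*(l_t - l_{t-1}) + (1-beta)*phi*b_{t-1}
        h-step forecast = l_t + (phi + phi^2 + ... + phi^h)*b_t
        """
        l, b = y[0], y[1] - y[0]
        for t in range(1, len(y)):
            l_prev = l
            l = alpha * y[t] + (1 - alpha) * (l + phi * b)
            b = beta * (l - l_prev) + (1 - beta) * phi * b
        damp = np.cumsum(phi ** np.arange(1, horizon + 1))
        return l + damp * b

    history = np.array([100, 104, 109, 112, 118, 121, 125], float)
    print(damped_trend_forecast(history, alpha=0.5, beta=0.3, phi=0.9, horizon=20))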

The Problem with Big Data
Harry Markowitz, Harry Markowitz Co.

The trend is towards larger and larger databases. But anyone who has gone through the agony of trying to purge financial data of everything from “survival bias” to just plain bugs must cringe at the thought of tons of numbers without suitable provenance.

Beyond Risk: Applying SIPmath Across the Enterprise
Raj Dev, Credit Sesame and Matthew Raphaelson, Chair of Banking Applications at ProbabilityManagement.org

SIPmath has made its mark on the discipline of risk management. Today, we show how to apply SIPmath to other disciplines such as marketing, FP&A, and, specifically, human resources. Yes, HR. Traditionally data-averse and unreceptive to quantitative frameworks, the field of HR is brimming with potential for probability management. We will focus on one key area of HR, “the war for talent”, as represented by the recruiting funnel: Applications, Interviews, Offers, Acceptances, and finally, On-boards (new hires). The output of the recruiting funnel is the product of five (or more) probability distributions; you definitely can't do this in your head! We will also mention other areas of HR that greatly benefit from SIPmath modeling, including performance management, where most employee performance can be modeled with some form of bell curve, but in certain types of jobs performance is better represented by a power law.
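
A minimal simulation of the funnel (the Beta distributions are invented stand-ins for assessed uncertainty in each conversion rate):

    import numpy as np

    rng = np.random.default_rng(11)
    n = 10_000
    applications = rng.poisson(2_000, n)
    interview_rate = rng.beta(30, 70, n)     # ~30% of applicants interviewed
    offer_rate = rng.beta(25, 75, n)
    accept_rate = rng.beta(60, 40, n)
    onboard_rate = rng.beta(90, 10, n)

    hires = (applications * interview_rate * offer_rate
             * accept_rate * onboard_rate)
    print(np.percentile(hires, [10, 50, 90]))   # the hiring plan as a range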

Risk-Aware Planning for City Finances: How Much of a Rainy Day Fund is Enough?
Shayne Kavanagh, Government Finance Officers Association and Dan Matusiewicz, City of Newport Beach

How much money do cities need to have in reserve to be prepared to respond to extreme events? Too little could leave the community exposed to unexpected losses. Too much could meet with disapproval from the public. Getting the amount of reserves right requires cities to know their risks. At this session, you will see a real-life Probability Management risk model used by multiple cities in the United States and Canada to help them think about the risks they are exposed to and how that has shaped their decision-making about their financial reserves.
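
The underlying question has a direct simulation answer. A minimal sketch (the event frequencies and severities are invented, not any city's data):

    import numpy as np

    rng = np.random.default_rng(9)
    trials = 10_000
    n_events = rng.poisson(1.2, trials)     # extreme events per simulated year
    losses = np.array([rng.lognormal(15.0, 1.0, k).sum() for k in n_events])

    reserve = np.percentile(losses, 95)     # covers losses in 95% of years
    print(f"Reserve covering 95% of simulated years: ${reserve:,.0f}")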

Multi-objective Optimization for Sustainable Development of Transportation Infrastructure Networks
Michael Lepech, Stanford Incheon Global Campus Research Center

Based on the most widely recognized definition of sustainable development, from the United Nations World Commission on Environment and Development, this work formulates sustainable development as a multi-objective problem with short-term and long-term objectives and presents a multi-objective optimization framework for sustainable development of transportation infrastructure networks. Maintenance and seismic retrofit of a network of highway bridges in Santa Clara County, California, is presented as a case study. Network-level measures are considered in the multi-objective optimization problems. Traffic delay in the San Francisco Bay Area is adopted as an important social sustainability indicator and short-term minimization objective. Life-cycle greenhouse gas emissions and travel time delays after a seismic event are considered as long-term environmental and social sustainability objectives, respectively. A high-performance computing cluster is used to overcome the computational cost of multi-objective optimization of this large-scale transportation infrastructure network. The NSGA-II genetic algorithm is used to search for the optimal maintenance and retrofit decisions for the network. To efficiently solve for the Pareto-optimal retrofit policy considering the stochastic network performance after a seismic event, a new fitness memoization technique is developed to improve the NSGA-II algorithm, providing robust evaluation of solutions for a range of multi-objective stochastic optimization problems.
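
The abstract does not detail the memoization technique, but the basic idea can be sketched as caching expensive fitness evaluations keyed by the decision vector, so candidates repeated across NSGA-II generations are not re-simulated (the simulation below is a hypothetical stand-in):

    from functools import lru_cache

    def expensive_network_sim(decisions):
        """Stand-in for the costly post-earthquake traffic simulation."""
        retrofit_cost = 3.0 * sum(decisions)
        expected_delay = 1_000.0 / (1.0 + sum(decisions))
        return retrofit_cost, expected_delay

    @lru_cache(maxsize=None)
    def fitness(decisions):
        # decisions: tuple of 0/1 retrofit choices per bridge (hashable)
        return expensive_network_sim(decisions)

    print(fitness((1, 0, 1, 1)))
    print(fitness((1, 0, 1, 1)))   # second call served from the cache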

SIPs and SLURPs of Water: Applied Probability Management
Thomas W. Chesnutt, Ph.D., PStat®, CAP®, A & N Technical Services, Inc.

Presentation based on research co-authored by Thomas W. Chesnutt and John Marc Thibault, SM Pro, marc@smpro.ca

Sustainability requires addressing risks and uncertainties. Water Research Foundation project 4742 presents a Primer on the principles and tools of Probability Management that can help address threats to sustainability.

Water Research Foundation Research Project No. 4742 provides applied research to teach standards for analyzing, calculating, and combining uncertainties. The principles, tools, and standards of Probability Management can reduce the cost, augment the validity, and improve the clarity of analytics applied to water demand/sales uncertainty.

Decisions involving uncertainty and risk are a routine activity in water resource management. Demand/sales models rely on uncertain information about future population, economic cycles, rates, passive conservation, and weather. Decisions about the timing and size of capital investments, built on flawed estimates of future levels of demand and supply, can therefore go astray. Similarly, uncertainty about future sales can confound financial planning. Because information about uncertainty and risk is seldom expressed in concrete terms, ambiguity abounds. Meanwhile, the effect of uncertainty on decisions tends to be downplayed at best and completely disregarded at worst. Accordingly, an adequate accounting for uncertainty and risk requires not only new analytic approaches, but also advances in the way analysts communicate with policy makers and in the way policy makers communicate with each other.

Research Objectives

  1. Clearly explain the Principles of Probability Management (PM) for application to water demand/sales forecasting.

  2. Illustrate the use of PM tools in depicting uncertainty and informing the understanding of risk using at least three water industry case studies.

  3. Review the SIPmath™ v2 standards of PM and explain their relevance for the water industry.

When It Hits Home!
Deborah C. Gordon, Director, City/County Association of Governments, San Mateo County (2002-2018), and Former Mayor, Woodside, CA

Local community governments are considered the lowest in the government hierarchy, and by extension the local officials the most limited in their authority. But we live in these communities, and these local officials can have the largest and most direct impact on our lives and families. This type of local governing body was born in the United States in the early 1600s, and over the next nearly four hundred years, processes and policies changed to meet emerging challenges. But these changes were made at a pace that could keep up with the pace of change in the challenges. Today, local officials are faced with exponentially accelerating change in risk, landscape, connectivity, and community values: aging gas lines, expansion of the urban-wildland interface, digitalization and connection of everything, and an emerging population that does not want to drive a car alongside an aging one that does but should not, to name a few. They are asked to plan, deliberate, and decide on solutions for incredibly complex problems. Collaboration with other jurisdictions and engagement with local communities are required to solve these challenges. But to do so effectively, with an acceptable level of credibility, and in a timeframe that matches the persistence of the problem, these bodies need tools that can help them calculate uncertainty, understand its drivers, and communicate it accurately, quickly, and interactively. SIPs and SLURPs are just the thing, and you don't need a Ph.D. in probability theory to benefit from them.

Applications of Probability Management: Best Practices, Simplicity, Practicality, Legalities
Katlin Bass, Lone Star Analysis; Tom Chesnutt, A & N Technical Services; Tom Keelin, Keelin Reeds Partners / ProbabilityManagement.org; and Steve Roemerman, Lone Star Analysis (moderator)

The panel will review the presentations of the conference in light of best practices for the age of algorithms. The conference speakers demonstrate how the arithmetic of uncertainty addresses real-world problems with simplicity and practicality. But best-practice research shows that few practitioners are dealing correctly with uncertainty, even in the face of legal jeopardy. The panel will discuss the applications presented at the conference and how similar success could be achieved in other organizations. The panel aims to help both seasoned, sophisticated analytics professionals and those taking their first steps toward dealing correctly with uncertainty and risk.