2018 Annual Conference Abstracts

Standardizing Risk

March 27 - 28, 2018
San Jose, CA

An Integrated Approach to Managing Uncertainty around Cybersecurity Risks - Eng-Wee Ethan Yeo, Manager, Information Security Governance, Surescripts LLC
Best Practices in Risk Analytics - Steve Roemerman, Chairman and CEO of Lone Star Analysis, Inc.
Powering Risk Management with Metalog Distributions - Tom Keelin, Managing Partner, Keelin Reeds Partners
Probability Management Overview - Sam Savage, Executive Director of ProbabilityManagement.org and author of The Flaw of Averages
Progress Towards the Game of Life - Harry M. Markowitz, Harry Markowitz Co. 
Quantifying Information Risk - A FAIR way from Fear, Uncertainty and Doubt (FUD) - Christopher T. Carlson, CISSP, President, C T Carlson LLC
Risk Analysis: From Chaos to Consistency - Jack Jones, Co-Founder and EVP R&D, RiskLens
SIPmath Modeler Tools for Excel - Sam Savage
Standardizing Risk at PG&E Gas Operations - Christine Cowsert Chapman, Senior Director, Asset Management and System Operations, Pacific Gas and Electric Company
Technology Risk in the Public Sector: Can it be Managed? - Ann Dunkin, CIO of the County of Santa Clara and former CIO of the United States Environmental Protection Agency
The Components of Risk and How to Measure Them - Doug Hubbard, President of Hubbard Decision Research and author of How to Measure Anything in Cybersecurity Risk
The Open Factor Analysis of Information Risk, a Standard for Cyber Risk - Mike Jerbic, Lecturer in the Department of Economics at San Jose State University
Withdrawal Flexibility: The Missing Link for Sustainable Retirement Income - John Scruggs, Vice President of Research, Loring Ward

An Integrated Approach to Managing Uncertainty around Cybersecurity Risks
Eng-Wee Ethan Yeo, Manager, Information Security Governance, Surescripts LLC

This presentation will provide a concrete example of how SIPmath integrates the management of probability distributions for factors in the Factor Analysis of Information Risk (FAIR) model, using Analytica by Lumina Decision Systems and Microsoft Excel with the SIPmath Modeler Tools. This provides a user-centric mechanism to communicate cybersecurity and operational risk at various levels of an organization, using familiar tools, while preserving the statistical integrity of the underlying risk model.

The presentation will begin with a brief overview of the mock SIPmath architecture, followed by a live demonstration, and will end with a Q&A session.

The demonstration will convey the following concepts: 1) Using Analytica to analyze cybersecurity risks using the FAIR model; 2) Exporting and importing analysis results between Analytica and a SIP library; and 3) Creating an interactive model in Microsoft Excel, using results in the SIP library, to illustrate how different security control sets influence overall risk given a set of risk scenarios.

Using Microsoft Excel for the interactive model provides accurate information that matters to the decision maker, without requiring them to learn new tools or exposing them to the details of the analyses.
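
To make concept 2 concrete, here is a minimal sketch, with a hypothetical variable name and distribution, of how a SIP library can be passed between tools as nothing more than named columns of simulation trials in a CSV file (an illustration only, not Analytica's actual export mechanism):

    # Illustrative only: a SIP library as named columns of trials in a CSV file.
    # The variable name and distribution are hypothetical, not from the demo.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    trials = 10_000

    # "Export": write a SIP of simulated annual losses from the analysis tool.
    pd.DataFrame({"annual_loss": rng.lognormal(11.0, 1.2, trials)}) \
        .to_csv("sip_library.csv", index=False)

    # "Import": a downstream model (Python standing in for Excel here) reads
    # the same trials, preserving the statistical integrity of the analysis.
    lib = pd.read_csv("sip_library.csv")
    print("P(annual loss > $1M):", (lib["annual_loss"] > 1_000_000).mean())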

Best Practices in Risk Analytics
Steve Roemerman, Chairman and CEO of Lone Star Analysis, Inc.

The Modeling Best Practices Benchmarking Project (MBP2) has been a multi-year international effort to find best practices in modeling, simulation, and analysis. MBP2 findings related to risk and uncertainty will form the springboard for a panel discussion. Two topics will be explored, each asking how risk analytics can be made more trustworthy. The first is best practices for dealing with uncertainty, which MBP2 explored extensively. There is no single "best" method, and the panel will discuss why organizations choose the methods they do (for better or worse). Second, MBP2 identified several risk factors that indicate an organization is producing and relying on flawed analytics. The panel will discuss cases from the public domain in which risk analysis failed and was itself a source of risk.

Powering Risk Management with Metalog Distributions
Tom Keelin, Managing Partner, Keelin Reeds Partners

The metalogs are a new family of continuous probability distributions that are ideally suited to representing and simulating broad classes of data. Traditional distributions (normal, lognormal, triangular, beta, etc.) lack the flexibility to do this well; typically require complex curve-fitting procedures; and often require look-up tables for simulation. In contrast, the metalogs obviate the need for curve fitting by taking data as their parameters; are more flexible because they have an unlimited number of shape parameters; and are ideal for simulation because they have simple, closed-form equations. In risk management, it is common to represent total damages from N events as the sum of N IID (independent, identically distributed) lognormal or triangular distributions over the damages from a single event. Given that there are no closed-form equations for the sum of N IID lognormal or triangular distributions and that the number of events N is itself uncertain, such calculations quickly become both intractably complex and slow to simulate. Metalog distributions enable and streamline such analysis by providing closed-form equations that represent sums of IID lognormal and triangular distributions to any required degree of accuracy. The same method applies more broadly to most other probability distributions and to operations beyond addition, including multiplication, division, and minimum or maximum extreme values.
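
As a hedged illustration of these properties (simplified, and not Keelin's reference implementation), the sketch below fits a k-term metalog to data by linear least squares, then simulates from its closed-form quantile function by inverse transform with no look-up table:

    # Sketch of a k-term metalog: basis terms of the quantile function M(y),
    # fit by ordinary least squares at the data's empirical CDF positions.
    import numpy as np

    def metalog_basis(y, k):
        L = np.log(y / (1 - y))       # logit term
        c = y - 0.5                   # centered cumulative probability
        cols = [np.ones_like(y), L, c * L, c]
        i = 2
        while len(cols) < k:          # higher terms: powers of c, with/without L
            cols.append(c ** i)
            if len(cols) < k:
                cols.append((c ** i) * L)
            i += 1
        return np.column_stack(cols[:k])

    def fit_metalog(x, k=5):
        x = np.sort(np.asarray(x, dtype=float))
        y = np.arange(1, len(x) + 1) / (len(x) + 1)   # empirical CDF positions
        a, *_ = np.linalg.lstsq(metalog_basis(y, k), x, rcond=None)
        return a

    rng = np.random.default_rng(0)
    damages = rng.lognormal(10.0, 1.2, 500)           # stand-in single-event data
    a = fit_metalog(damages, k=5)
    u = rng.uniform(1e-9, 1 - 1e-9, 10_000)
    samples = metalog_basis(u, len(a)) @ a            # closed-form simulation
    print(f"data mean {damages.mean():,.0f}  metalog mean {samples.mean():,.0f}")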

Probability Management Overview
Sam Savage, Executive Director of ProbabilityManagement.org and author of The Flaw of Averages

Probability Management is the communication of uncertainties as standardized arrays of data called SIPs (Stochastic Information Packets). Just as Hindu-Arabic numerals serve as the basis of our accounting systems, SIPs are being used as the basis of risk management systems. A major advantage is that uncertainties may be rolled up across platforms throughout the enterprise to create a consolidated risk statement. Several examples will be presented at this conference.
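
As a minimal sketch of the roll-up idea (hypothetical numbers, not from any talk): because trial i of every SIP describes the same simulated future, uncertainties combine by simple element-wise arithmetic:

    # Two hypothetical business units' annual-loss SIPs: same trial count,
    # same trial order, so trial i of each describes the same simulated year.
    import numpy as np

    rng = np.random.default_rng(2018)
    trials = 10_000
    sip_unit_a = rng.lognormal(12.0, 0.8, trials)
    sip_unit_b = rng.lognormal(11.0, 1.1, trials)

    # Enterprise roll-up: an element-wise sum, no closed-form math required.
    sip_enterprise = sip_unit_a + sip_unit_b

    # A consolidated risk statement, e.g., the chance losses exceed $1M.
    print("P(enterprise loss > $1M):", (sip_enterprise > 1_000_000).mean())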

Progress Towards the Game of Life
Harry M. Markowitz, Harry Markowitz Co.

This talk argues that financial decisions for the individual or family should be considered as part of the "game as a whole" which the individual or family plays out. Even reduced to its essentials, this game is surely too complex to solve analytically and therefore requires computer simulation to think through. The object to be analyzed is the nuclear family, consisting of an unattached individual, a couple, or a family with children and perhaps a residing elder. Typically, in the course of events, the residing elder (if any) dies or is placed in a nursing facility; the children leave home to set up their own nuclear families; the original family then consists of husband and wife. One spouse dies; one survives. When the remaining spouse dies, the subject family's wealth is distributed to heirs and charity, and the game of life is over for the subject family.

The question is: How would a rational family play this game?

Quantifying Information Risk - A FAIR way from Fear, Uncertainty and Doubt (FUD)
Christopher T. Carlson, CISSP, President, C T Carlson LLC

The information security field is fraught with obfuscation. There is not even consensus on definitions of terms like risk, threat and vulnerability. So, it is no surprise that alarming graphics and risk ratings have been used to influence executives to spend more money on security.
The presentation begins with illustrations of common qualitative risk communication, then shows a way to develop defensible quantitative risk information. The solution is based on the Open FAIR risk analysis tool that leverages SIPmath. Topics covered include:

Qualitative risk analysis - examples to set the stage
Quantitative risk analysis - what is FAIR
Performing FAIR analyses - using probability distributions with calibrated estimates
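
To ground the last topic above, here is a hedged sketch of a FAIR-style calculation, with calibrated estimates replaced by hypothetical stand-ins: annualized loss exposure is simulated from a loss event frequency and a per-event loss magnitude:

    # Illustrative FAIR-style simulation; all parameters are hypothetical
    # stand-ins for calibrated estimates, not the presenter's figures.
    import numpy as np

    rng = np.random.default_rng(7)
    trials = 10_000

    rate = rng.uniform(0.5, 4.0, trials)   # uncertainty about the true annual rate
    events = rng.poisson(rate)             # loss events in each simulated year
    annual_loss = np.array(
        [rng.lognormal(11.0, 1.0, n).sum() for n in events])  # sum per-event losses

    print(f"Mean annualized loss exposure: ${annual_loss.mean():,.0f}")
    print(f"P(annual loss > $1M): {(annual_loss > 1_000_000).mean():.1%}")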

Risk Analysis: From Chaos to Consistency
Jack Jones, Co-Founder and EVP R&D, RiskLens

Risk measurement practices commonly used in the cyber and technology landscape today are broken.  As a result, organizations struggle to find the signal within the noise, which leaves them drowning in "critical" concerns and unable to prioritize effectively.  In this session, Jack will discuss the problems associated with common cyber risk measurement practices, and then share an approach that has helped organizations drain their cyber risk-related swamp.  

SIPmath Modeler Tools for Excel
Sam Savage, Executive Director of ProbabilityManagement.org and author of The Flaw of Averages

The SIPmath Tools make it easy to create interactive simulation models that run in native Excel without macros or add-ins. They may also be used to read or write SIP libraries in Excel, XML, CSV, and other formats, which may be shared with Matlab, R, and many other environments. Unlike models built with other simulation tools, which require the tool to be loaded whenever they are run, models built with the SIPmath tools may be shared with any of the roughly half billion Excel users. Visit the Models page of ProbabilityManagement.org to experiment with examples. A free version of the tools is available for download.

Standardizing Risk at PG&E Gas Operations
Christine Cowsert Chapman, Senior Director, Asset Management and System Operations, Pacific Gas and Electric Company

PG&E Gas Operations has embarked on an initiative to improve risk quantification, the accounting of uncertainty, and the implementation of probabilistic assessment techniques, all in support of better decision making.  The implementation of probability management principles has demonstrated value to both leaders and technical experts and has shown what is possible using these modeling techniques.  Using a rapid prototyping approach to modeling with SIPmath, PG&E has been able to fail fast and find quick wins that can be built upon to improve decision making in the business.  PG&E will share some of the insights gained through this process and where these tools are taking them next.

Technology Risk in the Public Sector: Can it be Managed?
Ann Dunkin, CIO of the County of Santa Clara and former CIO of the United States Environmental Protection Agency

This talk provides one perspective in the ongoing debate in the information technology community about how to best protect our enterprises. Is the conventional wisdom about cyber hygiene and risk mitigation correct? Or are we fooling ourselves by trying to build moats around our enterprises and asking our users to create and change complex passwords frequently?  Is it time for a new approach?
 
Drawing on experience at the federal and local levels, this presentation will discuss efforts to quantify and mitigate risk in information technology operations and security within the public sector.  Major risk vectors will be identified, including suppliers, end users, and external actors.  Various risk mitigation strategies will be discussed, including traditional security planning, testing, and user education.  These traditional solutions will be evaluated alongside newer, more quantified risk-based approaches, including the recently developed FAIR framework.

The Components of Risk and How to Measure Them
Doug Hubbard, President of Hubbard Decision Research and author of How to Measure Anything in Cybersecurity Risk

There is an "Ivory Tower of Babel" about risk.  Understanding risk has been plagued with multiple inconsistent definitions across many professions.  Even major standards organizations promote definitions that are inconsistent with each other and confusing in practice.  The lack of clarity on this may have contributed to the development of many ambiguous "soft" assessments of risks which are insufficient to make real decisions.  But clear, quantitative uses of the term and how to measure it have been well established for decades or even over a century in some fields..  We will define the components of risk and how they have been unambiguously quantified in a ways that have supported practical decisions in insurance, portfolio management, and engineering risk analysis.
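
As one hedged illustration (our notation, not the speaker's), the long-established quantitative treatment describes risk as a probability distribution over loss L, summarized for decisions by quantities such as expected loss and the loss exceedance curve:

    % Illustrative notation: risk as a distribution over loss L
    \[
      \mathbb{E}[L] = \sum_{i} p_i \,\ell_i
      \qquad\text{and}\qquad
      G(x) = \Pr(L > x),
    \]
    % where p_i is the probability of loss event i, \ell_i its magnitude,
    % and G(x) is the chance that total loss exceeds a threshold x.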

The Open Factor Analysis of Information Risk, a Standard for Cyber Risk
Mike Jerbic, Lecturer in the Department of Economics at San Jose State University

Definitions of cyber risk have often been inconsistent with other managed risks such as market risk, credit risk, and operational risk.  This inconsistency can lead managers to say, “cyber risk cannot be measured or quantified.”  However, as its significance rises to the level of senior management and the Board of Directors, cyber risk must be addressed holistically with other risks.  How else can senior management and the board make effective decisions between mitigation alternatives?
 
The Open Group’s Open FAIR standard defines cyber risk commensurately with other enterprise risks.  I will explain that standard and demonstrate a new Open FAIR spreadsheet risk calculator. The calculator was developed by The Open Group, San Jose State University’s Economics Department, and Probability Management using the SIPmath Modeler Tools for Excel. It quantifies cyber risk in the same terms as other enterprise risks, enabling management to integrate it with other operational risks.

Withdrawal Flexibility: The Missing Link for Sustainable Retirement Income
John Scruggs, Vice President of Research, Loring Ward

Retirement outcomes are largely determined by three key investor choices:  portfolio allocation, initial withdrawal level, and withdrawal flexibility.  While the first two choices have received much attention, the effect of withdrawal flexibility on retirement outcomes remains relatively unexplored.  Using Monte Carlo simulation, this paper investigates the joint effects of the three choices on retirement outcomes in terms of income and terminal wealth.  Withdrawal flexibility effectively transfers market risk from a household’s wealth to its income (consumption). Allowing for even modest withdrawal flexibility dramatically reduces the probability of every retiree’s fear: “running out of money.” Since retiree goals in terms of lifestyle and legacy can easily be related to income and terminal wealth, a compelling case can be made for featuring Monte Carlo simulation in the retiree decision-making process.
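
A minimal sketch of this kind of Monte Carlo comparison, under hypothetical return assumptions and a deliberately crude flexibility rule (not the paper's model), illustrates the mechanism:

    # Hypothetical assumptions throughout; not the paper's model or results.
    import numpy as np

    rng = np.random.default_rng(42)
    trials, years = 10_000, 30
    rets = rng.normal(0.05, 0.12, size=(trials, years))  # assumed real returns

    def ruin_probability(flex=0.0):
        """Chance of depleting a $1M portfolio within 30 years; `flex` is the
        spending cut taken in any year wealth is below its starting value."""
        wealth = np.full(trials, 1_000_000.0)
        for t in range(years):
            draw = np.where(wealth < 1_000_000.0, 40_000.0 * (1 - flex), 40_000.0)
            wealth = np.maximum(wealth - draw, 0.0) * (1 + rets[:, t])
        return (wealth <= 0.0).mean()

    print("fixed withdrawals: P(ruin) =", ruin_probability(0.0))
    print("10% flexibility:   P(ruin) =", ruin_probability(0.10))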