White Paper

A Comparison between Real Options and Traditional Analysis in Layman's Terms

This paper provides a contrast between traditional valuation methods and the new generation of strategic decision analytics, namely real options analysis (with Monte Carlo simulation, stochastic forecasting, and optimization). In addition, it briefly illuminates the advantages as well as the pitfalls of using each methodology. It should be clear to the reader that the new analytics described here do not completely replace the traditional approaches. Rather, the new analytics complement and build upon the traditional approaches – in fact, one of the precursors to performing real options and simulation is the development of a traditional model. Thus, do not ignore or discard what is tried and true – traditional approaches are not incorrect, they are simply incomplete when modeled under actual business conditions of uncertainty and risk – but complement it with more advanced analytics to obtain a much clearer view of the business reality.

A Quick Primer on Risk and Decision Analysis for Everyone: Applying Monte Carlo Simulation, Real Options, Forecasting, and Portfolio Optimization

In this quick primer, advanced quantitative risk-based concepts are introduced, namely, the hands-on applications of Monte Carlo simulation, real options analysis, stochastic forecasting, and portfolio optimization. These methodologies rely on existing techniques (e.g., return on investment, discounted cash flow, cost-based analysis, and so forth) and complement them by pushing the envelope of analytics rather than replacing them outright. This is not a complete change of paradigm, and we are not asking the reader to throw out what has been tried and true, but rather to shift one’s paradigm, move with the times, and improve upon what has been tried and true. These new methodologies help decision makers make the best possible decisions, allocate budgets, and so forth, when the conditions surrounding those decisions are risky or uncertain. They can be used to identify, analyze, quantify, value, predict, hedge, mitigate, optimize, allocate, diversify, and manage risk. Find out how multinationals like 3M, Airbus, Boeing, BP, Chevron, GE, Motorola, Pfizer, Johnson & Johnson, and many others are relying on these advanced analytical techniques.

A Quick Risk Primer for Actuaries, Insurance Companies, and Banks

Actuaries are typically tasked with analyzing and managing the risks of financial contracts, and an actuary’s work is based on the application of mathematical, statistical, economic, and financial analysis applied to a wide range of practical problems in long-term financial planning and management. They act as financial advisors to many commercial organizations such as insurance companies, retirement funds, banks, and stockbrokers. This quick primer is targeted at actuaries to help them better understand the application of various risk-based tools and techniques that will complement their abilities to better serve their clients. The approach is to select a small set of applications from a plethora of possible cases an actuary might come across, and show how the concepts of Monte Carlo simulation, stochastic forecasting, real options analysis, and portfolio optimization can be applied. The methodologies discussed and examples illustrated in this white paper are solved using the Risk Simulator software (Monte Carlo simulation, stochastic forecasting, and stochastic optimization) and Real Options SLS software (real options analysis and modeling), both by Real Options Valuation, Inc., and the theoretical discussions are based on the author’s several books.

A Quick Risk Primer for the Military

In this quick primer, advanced quantitative risk-based concepts are introduced, namely, the hands-on applications of Monte Carlo simulation, real options analysis, stochastic forecasting, portfolio optimization, and knowledge value added. These methodologies rely on common metrics and existing techniques (e.g., return on investment, discounted cash flow, cost-based analysis, and so forth) and complement them by pushing the envelope of analytics rather than replacing them outright. This is not a complete change of paradigm, and we are not asking the reader to throw out what has been tried and true, but rather to shift one’s paradigm, move with the times, and improve upon what has been tried and true. These new methodologies help decision makers make the best possible decisions, allocate budgets, predict outcomes, create portfolios with the highest strategic value and returns on investment, and so forth, when the conditions surrounding those decisions are risky or uncertain. They can be used to identify, analyze, quantify, value, predict, hedge, mitigate, optimize, allocate, diversify, and manage risk for military options.

Basel II and III Credit, Market, Operational, and Liquidity Risks with Asset Liability Management

It is often said that the Basel Committee Standards, formally called Capital Accords, constitute the bible for banking regulators (Central Banks) everywhere. In addition to the Accords, the Basel Committee has also framed 29 principles for effective banking supervision known as the Core Principles for Effective Banking Supervision. These standards encompassed by the Capital Accord and the Core Principles have become the source of banking regulation in every country in the world. As is widely known, these standards have evolved from Basel I to Basel II and III, reflecting the evolution of the financial industry (from Basel I to II) and the lessons from the financial crisis of 2008 (from Basel II to III). The most noticeable financial regulation paradigm changes captured and fostered by the Basel standards’ evolution are risk management and capital allocation. These most important changes in the international standards, and, therefore, in virtually every country’s financial regulatory framework, relate to the manner in which risks are managed and capital is calculated. By the general definition, as stated in Core Principle 15, Risk Management is the process to be used to “identify, measure, evaluate, monitor, report and control or mitigate all material risks on a timely basis and to assess the adequacy of their capital and liquidity in relation to their risk profile.” This process has been presented as the IMMM process: Identify, Measure, Monitor, and Mitigate each risk.
This case study looks at the practical tools—quantitative models, Monte Carlo risk simulations, credit models, and business statistics—utilized to model and quantify regulatory and economic capital, measure and monitor key risk indicators, and report all the obtained data in a clear and intuitive manner. It relates to the modeling and analysis of asset liability management, credit risk, market risk, operational risk, and liquidity risk for banks or financial institutions, allowing these firms to properly identify, assess, quantify, value, diversify, hedge, and generate periodic regulatory reports for supervisory authorities and Central Banks on their credit, market, and operational risk areas, as well as for internal risk audits, risk controls, and risk management purposes.

Dynamic Decision Trees

One major misunderstanding that analysts tend to have about real options is that they can be solved using decision trees alone. This is untrue. Instead, decision trees are a great way of depicting strategic pathways that a firm can take, showing graphically a decision road map of management’s strategic initiatives and opportunities over time. However, to solve a real options problem, it is better to combine decision‐tree analytics with real options analytics, rather than solely relying on decision trees. When used in framing real options, these trees should be more appropriately called strategy trees (used to define optimal strategic pathways).
In summary, decision‐tree analysis is incomplete as a stand‐alone analysis in complex situations. Both the decision‐tree and real options methodologies discussed approach the same problem from different perspectives. However, a common ground could be reached. Taking the advantages of both approaches and melding them into an overall valuation strategy, decision trees should be used to frame the problem, real options analytics should be used to solve any existing strategic optionalities (either by pruning the decision tree into subtrees or solving the entire strategy tree at once), and the results should be presented back on a decision tree. These so‐called option strategy trees are useful for determining the optimal decision paths the firm should take.

ERM at Eletrobrás

This white paper is intended to describe the methodology applied in automating Enterprise Risk Management (ERM) for Eletrobras Furnas, the largest utility company in Brazil. The ERM approach uses Real Options Valuation, Inc. (ROV) PEAT software’s ERM module, and adapts the Risk Matrix model currently used by the Eletrobras group to the concept of expected value of risk, pushing the envelope from qualitative risk assessment to more quantitative risk management.

ESO Valuation Detailed White Paper on FAS 123R

The goal of this case study is to provide the reader a better understanding of the valuation applications of the preferred methodology under FAS 123 (Financial Accounting Standard 123), the binomial lattice, through a systematic and objective assessment of the methodology and a comparison of its results with the Black-Scholes model (BSM). This case study shows that, with care, FAS 123 valuation can be implemented accurately. The analysis uses a customized binomial lattice that takes into account real-life conditions such as vesting, employee suboptimal exercise behavior, forfeiture rates, and blackouts, as well as changing dividends, risk-free rates, and volatilities over the life of the employee stock option (ESO).
This case study introduces the FAS 123 concept, followed by the different ESO valuation methodologies (closed-form BSM, binomial lattices, and Monte Carlo simulation) and their impacts on valuation. It is shown here that, by using the right methodology that still conforms to the FAS 123 requirements, firms can potentially reduce their expenses by millions of dollars a year by avoiding the unnecessary overvaluation of the naïve BSM, using instead a modified and customized binomial lattice model that takes into account suboptimal exercise behavior, forfeiture rates, vesting, blackout dates, and changing inputs over time.
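
As a rough illustration of the mechanics described above, the following Python sketch builds a simple customized binomial lattice with vesting, a constant forfeiture rate, and a suboptimal-exercise price multiple. It is a minimal sketch under illustrative assumptions (the parameter values, the barrier-style exercise rule, and the function name eso_binomial are ours), not the lattice implemented in the Real Options SLS software.

```python
import numpy as np

# Hedged sketch: a customized binomial lattice for an employee stock option with
# vesting, a continuous forfeiture rate, and suboptimal exercise modeled as a
# price barrier. All parameter values are illustrative assumptions.
def eso_binomial(S0, K, T, r, sigma, q, steps, vest_frac, forfeit_rate, exercise_multiple):
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp((r - q) * dt) - d) / (u - d)
    disc = np.exp(-r * dt)
    pf = 1.0 - np.exp(-forfeit_rate * dt)       # per-step forfeiture probability
    vest_step = int(np.ceil(vest_frac * steps))

    j = np.arange(steps + 1)
    S = S0 * u**j * d**(steps - j)
    V = np.maximum(S - K, 0.0)                   # terminal payoff

    for i in range(steps - 1, -1, -1):
        j = np.arange(i + 1)
        S = S0 * u**j * d**(i - j)
        cont = disc * (p * V[1:i + 2] + (1.0 - p) * V[:i + 1])
        intrinsic = np.maximum(S - K, 0.0)
        if i >= vest_step:
            # Vested: exercise early once the stock hits the behavioral barrier;
            # otherwise hold, while forfeiture forces exercise of intrinsic value.
            hold = np.where(S >= exercise_multiple * K, intrinsic, cont)
            V = (1.0 - pf) * hold + pf * intrinsic
        else:
            # Unvested: forfeiture means the option is lost with zero value.
            V = (1.0 - pf) * cont
    return V[0]

# Example: 10-year ESO, 4-year vesting, 5% annual forfeiture, 1.8x exercise multiple
print(round(eso_binomial(S0=100, K=100, T=10, r=0.035, sigma=0.45, q=0.0, steps=1000,
                         vest_frac=0.4, forfeit_rate=0.05, exercise_multiple=1.8), 2))
```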

Employer Sponsored Healthcare Economics and U.S. Affordable Care Act

The United States passed major healthcare reform legislation in March 2010, hereafter referred to as the Affordable Care Act (ACA). It is complicated and has many moving parts. Figure 14.149 identifies the key healthcare finance stakeholders and their moving parts. What makes this so important is that public and private healthcare spending within the United States is 17.6% of a $15.8 trillion gross domestic product, which equates to $8,233 per person. As such, the changes brought about by this legislation have created a monumental disruption opportunity as employers strategize what to do with respect to their employer‐sponsored insurance offerings. Our focus here is on the employer‐sponsored health insurance portion of the ACA because even though this system is currently subsidized through the tax code with the deductibility of the employers’ contributions from the corporation and the exclusion from the taxable income of the employee for these employers’ contributions, employers need to assess and determine what option is optimal in creating value as they define it. Figure 14.150 illustrates a sampling of strategic real options an employer may review when managing its healthcare exposure.

Extreme Value Theory and Application to Market Shocks

Economic Capital is highly critical to banks (as well as central bankers and financial regulators who monitor banks) as it links a bank’s earnings and returns on investment tied to risks that are specific to an investment portfolio, business line, or business opportunity. In addition, these measurements of Economic Capital can be aggregated into a portfolio of holdings. To model and measure Economic Capital, the concept of Value at Risk (VaR) is typically used in trying to understand how the entire financial organization is affected by the various risks of each holding as aggregated into a portfolio, after accounting for pairwise cross-correlations among various holdings. VaR measures the maximum possible loss given some predefined probability level (e.g., 99.90%) over some holding period or time horizon (e.g., 10 days). Senior management and decision makers at the bank usually select the probability or confidence interval, which reflects the board’s risk appetite, or it can be based on Basel III capital requirements. Stated another way, we can define the probability level as the bank’s desired probability of surviving per year. In addition, the holding period usually is chosen such that it coincides with the time period it takes to liquidate a loss position.
VaR can be computed several ways. Two main families of approaches exist: structural closed-form models and Monte Carlo risk simulation approaches. We showcase both methods later in this case study, starting with the structural models. The second and much more powerful of the two approaches is the use of Monte Carlo risk simulation. Instead of simply correlating individual business lines or assets as in the structural models, entire probability distributions can be correlated using more advanced mathematical Copulas and simulation algorithms in Monte Carlo risk simulation methods by using the Risk Simulator software. In addition, tens to hundreds of thousands of scenarios can be generated using simulation, providing a very powerful stress-testing mechanism for valuing VaR. Distributional fitting methods are applied to reduce thousands of historical data points into their appropriate probability distributions, allowing their modeling to be handled with greater ease.
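
To make the simulation approach concrete, here is a minimal Python sketch of a correlated Monte Carlo VaR calculation that uses a Cholesky decomposition to induce correlations among holdings. The positions, volatilities, correlations, and confidence level are illustrative assumptions, and Risk Simulator's Copula-based methods are far richer than this normal-returns example.

```python
import numpy as np

# Hedged sketch: portfolio Value at Risk via correlated Monte Carlo simulation.
# Positions, volatilities, and the correlation matrix are illustrative assumptions.
rng = np.random.default_rng(42)

positions = np.array([5e6, 3e6, 2e6])           # market value of each holding
mu = np.array([0.0002, 0.0001, 0.0003])         # daily expected (log) returns
sigma = np.array([0.012, 0.020, 0.015])         # daily volatilities
corr = np.array([[1.0, 0.3, 0.1],
                 [0.3, 1.0, 0.4],
                 [0.1, 0.4, 1.0]])

horizon_days = 10
confidence = 0.999            # e.g., a Basel-style 99.9% level
n_trials = 100_000

# Build the covariance matrix and its Cholesky factor to induce correlations
cov = np.outer(sigma, sigma) * corr
L = np.linalg.cholesky(cov)

# Simulate correlated daily log returns and aggregate over the holding period
z = rng.standard_normal((n_trials, horizon_days, len(positions)))
daily_returns = mu + z @ L.T
pnl = (positions * (np.exp(daily_returns.sum(axis=1)) - 1)).sum(axis=1)

var = -np.quantile(pnl, 1 - confidence)
print(f"{horizon_days}-day VaR at {confidence:.1%}: {var:,.0f}")
```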

Global Risk Standards Compliance in ROV Methods

The Enterprise Risk Management (ERM) methods deployed by any organization should at least consider compliance with global standards if not exactly mirroring COSO (Committee of Sponsoring Organizations of the Treadway Commission, with respect to their organizing committees at AAA, AICPA, FEI, IMA, and IIA), International Standards ISO 31000:2009, the U.S. Sarbanes–Oxley Act, the Basel III requirements for Operational Risk (from the Basel Committee through the Bank of International Settlements), and NIST 800‐37. The parallels and applications of ROV methodologies closely mirror these regulatory and international standards and, at times, exceed these standards. Figures 1‐10 illustrate some examples of compliance with ISO 31000:2009, and Figures 11‐20 show compliance with Basel II and Basel III requirements. These figures and the summary lists assume that the reader is already familiar with the IRM methodology employed by Real Options Valuation, Inc.

Impacts of FAS 123

The goal of this case study is to provide the reader a better understanding of the valuation applications of the preferred methodology under FAS 123 (Financial Accounting Standard 123), the binomial lattice, through a systematic and objective assessment of the methodology and a comparison of its results with the Black-Scholes model (BSM). This case study shows that, with care, FAS 123 valuation can be implemented accurately. The analysis uses a customized binomial lattice that takes into account real-life conditions such as vesting, employee suboptimal exercise behavior, forfeiture rates, and blackouts, as well as changing dividends, risk-free rates, and volatilities over the life of the employee stock option (ESO).
This case study introduces the FAS 123 concept, followed by the different ESO valuation methodologies (closed-form BSM, binomial lattices, and Monte Carlo simulation) and their impacts on valuation. It is shown here that, by using the right methodology that still conforms to the FAS 123 requirements, firms can potentially reduce their expenses by millions of dollars a year by avoiding the unnecessary overvaluation of the naïve BSM, using instead a modified and customized binomial lattice model that takes into account suboptimal exercise behavior, forfeiture rates, vesting, blackout dates, and changing inputs over time.

Options to Abandon

The Abandonment Option looks at the value of a project or asset’s flexibility in being abandoned over the life of the option.
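
For intuition, the sketch below prices a simple American abandonment option on a binomial lattice, where management may abandon the project at any step for a fixed salvage value. All inputs (project value, salvage value, volatility, and so forth) are illustrative assumptions rather than a specific SLS model.

```python
import numpy as np

# Hedged sketch: an American abandonment option on a binomial lattice, where at
# each step management may abandon the project for a salvage value.
V0, salvage = 100.0, 90.0        # present value of the project and salvage value
T, r, sigma, steps = 5.0, 0.05, 0.30, 100

dt = T / steps
u = np.exp(sigma * np.sqrt(dt))
d = 1 / u
p = (np.exp(r * dt) - d) / (u - d)
disc = np.exp(-r * dt)

j = np.arange(steps + 1)
values = np.maximum(V0 * u**j * d**(steps - j), salvage)    # terminal: keep or abandon
for i in range(steps - 1, -1, -1):
    j = np.arange(i + 1)
    cont = disc * (p * values[1:i + 2] + (1 - p) * values[:i + 1])
    values = np.maximum(cont, salvage)                       # abandon if salvage is worth more

print("Project value with abandonment flexibility:", round(values[0], 2))
print("Option value of abandonment:", round(values[0] - V0, 2))
```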

PEAT Enterprise Risk Management

Enterprise Risk Management (ERM) in an organization includes the business processes and methods used to identify and manage risks, as well as to seize upside opportunities, in order to achieve its objectives. ERM, therefore, provides a methodological risk-management framework for identifying risky events or conditions relevant to the organization’s objectives, risks, and opportunities; assessing these conditions in terms of the Likelihood or frequency of occurrence as well as the risk condition’s magnitude of Impact; determining risk-mitigation and post-risk response strategies; and monitoring the progress of these risk controls. When organizations identify and proactively address risks and opportunities, they are able to protect and create value for their stakeholders (e.g., owners, employees, shareholders, executives, customers, regulators, nations, and society in general).
ERM is usually also described as a risk‐based approach to strategic planning as well as for managing an organization by integrating internal risk controls and external risk‐compliance requirements (e.g., COSO, ISO 31000:2009, Basel III, and Sarbanes–Oxley Act). It applies to a broad spectrum of risks facing an organization to ensure that these risks are properly identified and managed. Investors, government regulators, banks, and debt rating agencies, among others, tend to scrutinize the risk‐management processes of an organization as a key metric to its potential success.

PEAT Project Management Schedule and Cost Risk

In the world of project management, there are essentially two major sources of risks: schedule risk and cost risk. In other words, will the project be on time and under budget, or will there be a schedule crash and budget overrun, and, if so, how bad can they be? To illustrate how quantitative risk management can be applied to project management, we use ROV PEAT to model these two sources of risks.

Real Options Analysis in Layman's Terms

This model illustrates how to use Risk Simulator for running a Monte Carlo simulation on a multi-phased project to forecast costs and cost overruns, simulating discrete events (success/failure and go/no-go outcomes), linking these events, and linking the input parameters in a simulation assumption to calculated cells.
Requirements: Modeling Toolkit, Risk Simulator.

Sales Forecast and Pipeline Analysis

The need for sales of a company’s products is continual (i.e., necessary every day, week, month, quarter, year), but no sale is actually attained until a customer has decided to buy the products or services presented and the company has received payment in exchange for said products or services. The customer decision is ultimately beyond the control of the company and can only be solicited, cajoled, and requested by sales staff. Nevertheless, every company needs assurances of what sales level will be attained and if that level will be sufficient to support the costs of the company. There is, by definition, an element of uncertainty and risk associated with the forecasts reported for what sales will occur within a specified future period, and every organization needs to understand the risks that the reported sales forecasts will not be met. No company likes to receive the bad news that forecasts have not been met, especially when no thought was given to the possibility of underperformance prior to its occurrence. End‐of‐period actions are too late to provide assistance in making sales goals. In a world of uncertainty, knowing the level of risks contained within sales forecasts is essential for management to be able to tactically act in time to have a significant effect on final sales results.

Application White Paper

Advanced Data Diagnostics

This example model provides a sample data set on which we can run Risk Simulator’s Diagnostic Tool so that we can determine the econometric properties of data. The diagnostics run include checking the data for heteroskedasticity, nonlinearity, outliers, specification errors, micronumerosity, stationarity and stochastic properties, normality and sphericity of errors, and multicollinearity. Each test is described in more detail in its respective report.

Data and Statistical Analysis

Brief description: Applying the Statistical Analysis Tool to determine the key statistical characteristics of your data set, including linearity, nonlinearity, normality, distributional fit, distributional moments, forecastability, trends, and stochastic nature of the data.

Requirements: Modeling Toolkit, Risk Simulator

Forecasting - ARIMA (autoregressive integrated moving average)

This sample model illustrates how to run an econometric model called Box-Jenkins ARIMA, which stands for autoregressive integrated moving average, an advanced forecasting technique that takes into account historical fluctuations, trends, seasonality, cycles, prediction errors, and nonstationarity of the data.
Requirements: Modeling Toolkit, Risk Simulator.
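
As an illustration of the idea (Risk Simulator runs ARIMA through its own interface), a minimal sketch using the open-source statsmodels library on synthetic monthly data might look like this; the ARIMA order (1, 1, 1) and the data are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hedged sketch: Box-Jenkins ARIMA fit on a synthetic trending, seasonal series.
rng = np.random.default_rng(0)
t = np.arange(120)
y = pd.Series(100 + 0.5 * t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 120),
              index=pd.date_range("2015-01-31", periods=120, freq="M"))

model = ARIMA(y, order=(1, 1, 1))     # illustrative choice of p, d, q
fit = model.fit()
print(fit.summary())
print(fit.forecast(steps=12))         # 12-period-ahead forecast
```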

Forecasting - Cubic and Nonlinear Splines

This sample model illustrates how to compute linear and nonlinear interpolation and extrapolation of missing values in a time-series model for forecasting.
Requirements: Modeling Toolkit, Risk Simulator.
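
For intuition, a minimal sketch of cubic-spline interpolation and extrapolation using SciPy is shown below; the yield-curve-style maturities and rates are illustrative assumptions, not Risk Simulator's own spline routine.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hedged sketch: cubic-spline interpolation/extrapolation of missing points in a
# yield-curve-like series (maturities and rates are illustrative assumptions).
known_x = np.array([0.25, 0.5, 1, 2, 5, 10, 30])          # maturities in years
known_y = np.array([1.2, 1.4, 1.7, 2.1, 2.8, 3.3, 3.9])   # observed rates in %

spline = CubicSpline(known_x, known_y, extrapolate=True)

missing_x = np.array([0.75, 3, 7, 20, 40])   # points to interpolate/extrapolate
print(dict(zip(missing_x, np.round(spline(missing_x), 3))))
```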

Forecasting - Exponential J and Logistic S Curves

The first sample model illustrates how to use Risk Simulator to run a simulation model on the Exponential J-growth curve, where the growth rate is uncertain over time, to determine the probabilistic outcomes at some point in the future. Additionally, the second sample model illustrates how to run a simulation model on the logistic S-growth curve, where the growth rate is uncertain over time, to determine the probabilistic outcomes at some point in the future.
Requirements: Modeling Toolkit, Risk Simulator.
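
A minimal Python sketch of the second case, the logistic S-growth curve with an uncertain growth rate, is shown below; the carrying capacity, starting level, and growth-rate distribution are illustrative assumptions rather than the Risk Simulator model itself.

```python
import numpy as np

# Hedged sketch: Monte Carlo simulation of a logistic S-growth curve where the
# growth rate is uncertain (all parameter values are illustrative assumptions).
rng = np.random.default_rng(1)

n_trials, years = 10_000, 10
capacity, start = 5_000.0, 100.0             # carrying capacity and initial level

# Uncertain growth rate: normal around 60% per year with 10% standard deviation
growth = rng.normal(0.60, 0.10, size=(n_trials, years))

level = np.full(n_trials, start)
for t in range(years):
    level = level + growth[:, t] * level * (1 - level / capacity)   # logistic step

print("Mean level after 10 years:", level.mean().round(1))
print("90% interval:", np.percentile(level, [5, 95]).round(1))
```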

Forecasting - Markov Switching Regime

This sample model illustrates how to run a simulation model on Markov chains to determine path-dependent movements of market share and the long-term steady-state market share.
Requirements: Modeling Toolkit, Risk Simulator.
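
To illustrate the underlying idea, the following sketch iterates a simple two-state Markov chain of customers switching between two stores and confirms the long-run steady state analytically; the transition probabilities are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: a two-store market-share Markov chain and its long-run steady state.
P = np.array([[0.90, 0.10],    # customers of Store A: 90% stay, 10% switch to B
              [0.20, 0.80]])   # customers of Store B: 20% switch to A, 80% stay

share = np.array([0.50, 0.50])           # initial market shares
for _ in range(100):                     # path-dependent evolution, period by period
    share = share @ P

print("Steady-state market share:", share.round(4))

# The same answer follows from the left eigenvector of P with eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
print("Analytic steady state:", (pi / pi.sum()).round(4))
```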

Forecasting - Nonlinear Extrapolation

This sample model illustrates how to forecast with nonlinear extrapolation and how to compare the results against more conventional time-series decomposition forecasts with trend and seasonality effects.
Requirements: Modeling Toolkit, Risk Simulator.

Forecasting - Regression and Basic Econometrics

This sample model illustrates how to use Risk Simulator to run a multiple regression analysis and basic econometric models.
Requirements: Modeling Toolkit, Risk Simulator.
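
For readers who want to see the mechanics outside of Risk Simulator, a minimal multiple-regression sketch using statsmodels on synthetic data is shown below; the variables and coefficients are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Hedged sketch: multiple regression of sales on price and advertising.
rng = np.random.default_rng(7)
n = 60
price = rng.uniform(8, 12, n)
advertising = rng.uniform(1, 5, n)
sales = 200 - 9 * price + 14 * advertising + rng.normal(0, 8, n)

X = sm.add_constant(np.column_stack([price, advertising]))
model = sm.OLS(sales, X).fit()
print(model.params)       # intercept and slope estimates
print(model.rsquared)     # goodness of fit
```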

Forecasting - Stochastic Processes (random walk, mean reversion, jump diffusion)

This simple model illustrates how to simulate Stochastic Processes (Brownian Motion Random Walk, Mean-Reversion, Jump-Diffusion, and Mixed Models).
A stochastic process is a sequence of events or paths generated by probabilistic laws. That is, random events can occur over time but are governed by specific statistical and probabilistic rules. The main stochastic processes include Random Walk or Brownian Motion, Mean-Reversion, and Jump-Diffusion. These processes can be used to forecast a multitude of variables that seemingly follow random trends yet are restricted by probabilistic laws. Risk Simulator’s Stochastic Process module can be used to simulate and create such processes for a multitude of time-series data, including stock prices, interest rates, inflation rates, oil prices, electricity prices, commodity prices, and so forth.

Requirements: Modeling Toolkit, Risk Simulator.
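
A minimal sketch of the three main processes in plain Python/NumPy follows; the drift, volatility, mean-reversion, and jump parameters are illustrative assumptions, not Risk Simulator's calibrated parameters.

```python
import numpy as np

# Hedged sketch of the three main stochastic processes described above
# (all parameter values are illustrative assumptions).
rng = np.random.default_rng(3)
T, steps, n_paths = 1.0, 252, 1000
dt = T / steps
z = rng.standard_normal((n_paths, steps))

# 1) Geometric Brownian motion (random walk), e.g., a stock price
mu, sigma, S0 = 0.08, 0.25, 100.0
gbm = S0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

# 2) Mean reversion (Ornstein-Uhlenbeck), e.g., an interest rate
kappa, theta, vol, r0 = 3.0, 0.05, 0.02, 0.03
r = np.full(n_paths, r0)
for t in range(steps):
    r = r + kappa * (theta - r) * dt + vol * np.sqrt(dt) * z[:, t]

# 3) Jump diffusion: GBM plus Poisson-arriving jumps, e.g., an electricity price
lam, jump_mean, jump_sd = 2.0, 0.0, 0.10
jumps = rng.poisson(lam * dt, (n_paths, steps)) * rng.normal(jump_mean, jump_sd, (n_paths, steps))
jd = S0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z + jumps, axis=1))

print("GBM mean terminal value:", gbm[:, -1].mean().round(2))
print("Mean-reverting rate mean:", r.mean().round(4))
print("Jump-diffusion mean terminal value:", jd[:, -1].mean().round(2))
```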

Forecasting - Time-Series Decomposition

This simple model illustrates how to run a time-series analysis forecast, which takes into account historical base values, trends, and seasonality to project the future.
Time-series forecasting decomposes the historical data into a baseline, trend, and seasonality, if any. The models then apply an optimization procedure to find the alpha, beta, and gamma parameters for the baseline, trend, and seasonality coefficients and recompose them into a forecast. In other words, this methodology first applies a backcast to find the best-fitting model and the best-fitting parameters that minimize forecast errors, and then proceeds to forecast the future based on the historical data that exist. This, of course, assumes that the same baseline growth, trend, and seasonality hold going forward. Even if they do not (say, when there is a structural shift because the company goes global, merges, or spins off a division), the baseline forecasts can be computed and the required adjustments then made to the forecasts.
Requirements: Modeling Toolkit, Risk Simulator.
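
The same idea can be sketched with the open-source statsmodels Holt-Winters implementation, which optimizes the alpha, beta, and gamma parameters automatically; the synthetic monthly data below are an illustrative assumption.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hedged sketch: Holt-Winters forecast with additive trend and seasonality;
# the smoothing parameters (alpha, beta, gamma) are found by the fit.
rng = np.random.default_rng(5)
t = np.arange(48)
y = pd.Series(120 + 2 * t + 15 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 4, 48),
              index=pd.date_range("2020-01-31", periods=48, freq="M"))

fit = ExponentialSmoothing(y, trend="add", seasonal="add", seasonal_periods=12).fit()
print(fit.params)         # optimized smoothing parameters
print(fit.forecast(12))   # next 12 periods
```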

Optimization - Continuous Portfolio Optimization

This model illustrates how to run an optimization on continuous decision variables and how to view and interpret the optimization results.
This model shows 10 asset classes with different risk and return characteristics. The idea here is to find the best portfolio allocation such that the portfolio’s bang for the buck, or returns-to-risk ratio, is maximized. That is, the goal is to allocate 100% of an individual’s investment portfolio among several different asset classes (e.g., different types of mutual funds or investment styles: growth, value, aggressive growth, income, global, index, contrarian, momentum, and so forth).
Requirements: Modeling Toolkit, Risk Simulator.
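
As a simplified illustration of the returns-to-risk objective (using four asset classes instead of ten, with illustrative returns, volatilities, correlations, and allocation bounds), a SciPy-based sketch might look like this.

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch: allocate 100% across asset classes to maximize the
# returns-to-risk ratio. All inputs are illustrative assumptions.
returns = np.array([0.06, 0.09, 0.12, 0.15])
vols = np.array([0.08, 0.12, 0.18, 0.25])
corr = np.array([[1.0, 0.2, 0.1, 0.0],
                 [0.2, 1.0, 0.3, 0.2],
                 [0.1, 0.3, 1.0, 0.4],
                 [0.0, 0.2, 0.4, 1.0]])
cov = np.outer(vols, vols) * corr

def neg_return_to_risk(w):
    return -(w @ returns) / np.sqrt(w @ cov @ w)

n = len(returns)
result = minimize(neg_return_to_risk, x0=np.full(n, 1 / n),
                  bounds=[(0.05, 0.35)] * n,                         # min/max per asset class
                  constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}])
print("Weights:", result.x.round(4), "Return-to-risk:", round(-result.fun, 4))
```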

Optimization - Discrete Project Selection

This sample model illustrates how to run an optimization on discrete integer decision variables in project selection in order to choose the best projects in a portfolio given a large variety of project options, subject to risk, return, budget and other constraints.
Requirements: Modeling Toolkit, Risk Simulator.
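
Conceptually this is a knapsack-style problem. The brute-force sketch below (with illustrative project names, values, costs, and budget) conveys the idea for a small portfolio; the discrete integer optimization described above handles much larger problems.

```python
from itertools import combinations

# Hedged sketch: choose the best subset of candidate projects under a budget
# constraint (project values and costs are illustrative assumptions).
projects = {            # name: (expected NPV, implementation cost) in $M
    "A": (45, 30), "B": (30, 20), "C": (25, 15),
    "D": (60, 45), "E": (35, 25), "F": (20, 10),
}
budget = 80

best_value, best_mix = 0, ()
for k in range(1, len(projects) + 1):
    for mix in combinations(projects, k):
        value = sum(projects[p][0] for p in mix)
        cost = sum(projects[p][1] for p in mix)
        if cost <= budget and value > best_value:
            best_value, best_mix = value, mix

print("Best portfolio:", best_mix, "NPV:", best_value)
```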

Optimization - Efficient Frontier

This sample model illustrates how to run an optimization on discrete binary integer decision variables in project selection and mix, viewing and interpreting optimization results, creating additional qualitative constraints to the optimization model, and generating an investment Efficient Frontier by applying optimization on changing constraints.
Requirements: Modeling Toolkit, Risk Simulator.

Optimization - Inventory Optimization

This sample model illustrates how to run an optimization on manufacturing inventory to maximize profits and minimize costs subject to manufacturing capacity constraints.
Requirements: Modeling Toolkit, Risk Simulator.

Optimization - Optimal Pricing with Elasticity

This sample model illustrates how to run an optimization on product pricing, taking into account the price elasticity of demand, to maximize profits.
Requirements: Modeling Toolkit, Risk Simulator.

Optimization - Optimal Timing

This sample model illustrates how to find the optimal harvest rate for a population that maximizes the return on investment. This is a population dynamics and harvest model where, given an initial population and its growth rate, we can determine the optimal harvest rate that maximizes the return on investment, given a specific cost of capital and the carrying capacity of the population. For instance, this can be a model of a herd of cattle or a forest of trees, where there is a carrying or maximum capacity of the population.
Requirements: Modeling Toolkit, Risk Simulator.
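
A minimal sketch of such a population dynamics and harvest model is shown below, using logistic growth and a one-dimensional optimizer over the harvest rate; the growth rate, carrying capacity, price, and discount rate are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hedged sketch: find the harvest rate that maximizes the present value of the
# harvested population under logistic growth (all inputs are illustrative).
growth, capacity, start = 0.35, 10_000.0, 2_000.0
price, discount, years = 5.0, 0.10, 50

def npv_of_harvest(h):
    pop, npv = start, 0.0
    for t in range(1, years + 1):
        harvest = h * pop
        npv += price * harvest / (1 + discount) ** t
        pop = pop + growth * pop * (1 - pop / capacity) - harvest   # growth less harvest
    return npv

res = minimize_scalar(lambda h: -npv_of_harvest(h), bounds=(0.0, growth), method="bounded")
print("Optimal harvest rate:", round(res.x, 4), "NPV:", round(npv_of_harvest(res.x), 0))
```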

Optimization - Stochastic Optimization

This sample model illustrates how to run a stochastic optimization on continuous decision variables with simulation and how to interpret the optimization results. This model shows four asset classes with different risk and return characteristics. The idea here is to find the best portfolio allocation such that the portfolio’s bang for the buck, or returns-to-risk ratio, is maximized. That is, the goal is to allocate 100% of an individual’s investment among several different asset classes (e.g., different types of mutual funds or investment styles: growth, value, aggressive growth, income, global, index, contrarian, momentum, and so forth).
Requirements: Modeling Toolkit, Risk Simulator.

Project Management (PERT, CPM, Time and Cost Estimation)

This model illustrates how to use Risk Simulator for running a Monte Carlo simulation on a multi-phased project to forecast costs and cost overruns, simulating discrete events (success/failure and go/no-go outcomes), linking these events, and linking the input parameters in a simulation assumption to calculated cells.
Requirements: Modeling Toolkit, Risk Simulator.

Risk Simulation Basics

This section provides the novice risk analyst an introduction to the Risk Simulator software for performing Monte Carlo simulation; a 30-day trial version of the software is included on the book’s DVD. The section starts off by illustrating what Risk Simulator does and what steps are taken in a Monte Carlo simulation, as well as some of the more basic elements of a simulation analysis. It then continues with how to interpret the results from a simulation and ends with a discussion of correlating variables in a simulation as well as applying precision and control. Software versions with new enhancements are released continually.

Six Sigma - CPK Capability, Control Charts, Specification Models

Capability Measures (Cp and Cpk) and Specification Levels (USL and LSL):
Capability measures in Six Sigma are the central analytics in quality control. Defective proportion units (DPUs) are the number of defects observed with respect to the total opportunities or outcomes and can be measured with respect to a million opportunities (defective parts per million, DPPM, also known as defects per million opportunities, DPMO). The process capability index, or Cpk, takes into account an off-centered distribution and analyzes it as a centered process producing a similar level of defects (it is defined as the ratio between the permissible deviation, measured from the mean value to the nearest specification limit of acceptability, and the actual one-sided 3-sigma spread of the process).
These two Six Sigma models illustrate how to obtain the various capability measures such as the capability indexes (Cp and Cpk), yields, defects per million opportunities (DPMO), and others.
Requirements: Modeling Toolkit, Risk Simulator.
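
The core calculations can be sketched directly from the definitions above; the process mean, standard deviation, and specification limits used here are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Hedged sketch: process capability (Cp, Cpk) and DPMO from a process mean,
# standard deviation, and specification limits (illustrative values).
mean, sd = 10.02, 0.05           # observed process mean and standard deviation
LSL, USL = 9.85, 10.15           # lower and upper specification limits

cp = (USL - LSL) / (6 * sd)
cpk = min((USL - mean) / (3 * sd), (mean - LSL) / (3 * sd))

# Expected fraction outside the specification limits, scaled to a million opportunities
p_defect = norm.cdf(LSL, mean, sd) + (1 - norm.cdf(USL, mean, sd))
dpmo = p_defect * 1_000_000

print(f"Cp = {cp:.3f}, Cpk = {cpk:.3f}, DPMO = {dpmo:,.0f}")
```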

Six Sigma - Parametric and Nonparametric Hypothesis Tests

This chapter is a follow-up to the last chapter on hypothesis testing. Here we continue the quest for quality control and sampling, testing one and two variables to see if they are statistically significantly similar to or different from some hypothesized value. The methodologies introduced in this chapter are more advanced than those presented in the last chapter but build on similar concepts. All the statistical tests shown in this chapter are available in the Statistical Tools section of Modeling Toolkit.
This chapter illustrates how to use Modeling Toolkit’s Statistical Tools to run single and two variable hypothesis tests using parametric and nonparametric techniques.
Requirements: Modeling Toolkit, Risk Simulator.
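
As a rough illustration of the parametric versus nonparametric distinction (outside of Modeling Toolkit), the sketch below runs a Welch two-sample t-test and a Mann-Whitney U test on synthetic data; the two samples are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Hedged sketch: parametric (two-sample t-test) and nonparametric (Mann-Whitney U)
# tests of whether two production lines differ.
rng = np.random.default_rng(11)
line_a = rng.normal(50.0, 2.0, 40)
line_b = rng.normal(50.8, 2.2, 35)

t_stat, t_p = stats.ttest_ind(line_a, line_b, equal_var=False)   # Welch's t-test
u_stat, u_p = stats.mannwhitneyu(line_a, line_b)

print(f"t-test: statistic={t_stat:.3f}, p-value={t_p:.4f}")
print(f"Mann-Whitney U: statistic={u_stat:.1f}, p-value={u_p:.4f}")
```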

Six Sigma - Sample Size Determination

In performing quality controls and hypothesis testing, the size of the sample collected is of paramount importance. Theoretically, it would be impossible or too expensive and impractical to collect information and data on the entire population to be tested (e.g., all outputs from a large manufacturing facility). Therefore, statistical sampling is required. The question is: What size sample is sufficient?
These five Six Sigma models illustrate how to obtain the required sample sizes when performing hypothesis tests on means, standard deviations, and proportions.
Requirements: Modeling Toolkit, Risk Simulator.
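
A minimal sketch of the standard sample-size formulas for estimating a mean and a proportion is shown below; the confidence level, margins of error, and assumed standard deviation are illustrative assumptions.

```python
from math import ceil
from scipy.stats import norm

# Hedged sketch: minimum sample sizes for estimating a mean and a proportion at a
# given confidence level and margin of error.
confidence, error = 0.95, 0.5            # 95% confidence, +/-0.5 units for the mean
sigma = 3.0                               # assumed population standard deviation
z = norm.ppf(1 - (1 - confidence) / 2)

n_mean = ceil((z * sigma / error) ** 2)

p, error_p = 0.5, 0.03                    # worst-case proportion, +/-3% margin
n_prop = ceil(z**2 * p * (1 - p) / error_p**2)

print("Sample size for the mean:", n_mean)
print("Sample size for the proportion:", n_prop)
```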

Six Sigma - Statistical Probability Determination

This chapter illustrates how to use Risk Simulator’s Distributional Analysis tool and Modeling Toolkit’s probability functions to obtain exact probabilities of events, confidence intervals, and hypothesis tests for quality control. It also illustrates how to use Risk Simulator to run hypothesis tests after a simulation run, generate a hypothesis test with raw data, understand the concept of random seeds, and run a nonparametric bootstrap simulation to obtain the confidence intervals of the statistics.
Requirements: Modeling Toolkit, Risk Simulator.

References

Biography and Adoption Rate

Dr. Johnathan C. Mun is the founder, Chairman and CEO of Real Options Valuation, Inc. (ROV), a consulting, training, and software development firm specializing in strategic real options, financial valuation, Monte Carlo Risk Simulation, stochastic forecasting, optimization, business intelligence analytics, and risk analysis located in northern Silicon Valley, California.

Testimonials

CLIENT REFERENCES:
Sample companies that have been through our training seminars or obtained our software and consulting services: 3M, Accenture, AIG, Allstate Insurance, Airbus, Alexion, Aquiva Trading, AT&T, Boeing, Chevron Texaco, Duke Energy, Eli Lilly, GE, GE Capital, GlaxoSmithKline, Goodyear, Halliburton, Intel, Johnson & Johnson, Lloyds Bank, Motorola, Northrop Grumman, Phillips, Pioneer, Roche Molecular Diagnostics, Seagate, Schlumberger, Shell, Sprint, State of California, Sunoco, Syngenta, Timken, Total Elf Fina, U.S. Department of Defense (Air Force, Army, Navy, Marines), Washington Gas, over 100 universities globally, and many other multinationals...

Training

Detailed Course Outlines

Table of Contents: MODULE 1: Introduction to Risk Analysis. MODULE 2: Monte Carlo Simulation with Risk Simulator. MODULE 3: Advanced Simulation Techniques. MODULE 4: Simulation and Analytical Tools. MODULE 5: Forecasting. MODULE 6: Real Options Analysis: Theory and Background. MODULE 7: Real Options Analysis: Application with SLS Software. MODULE 8: Optimization with Risk Simulator.

Live Seminars Information

We offer several hands-on training courses and seminars in the areas of risk analysis and real options analysis, taught by world-leading experts on these topics. Our classes are generally kept to a small size to cultivate a better learning atmosphere and to provide the opportunity for more one-to-one attention from the instructors. Courses are typically conducted in regional locations around the world (check our website for the latest schedule) in state-of-the-art computer-equipped seminar rooms, where each participant will have his or her own computer terminal.

Training DVD Detailed Modules List

Training DVD contents:
MODULE 1: Introduction to Risk Analysis (56 min). MODULE 2: Monte Carlo Simulation with Risk Simulator (1 hour). MODULE 3: Advanced Simulation Techniques (1 hour 20 min). MODULE 4: Simulation and Analytical Tools (1 hour 10 min). MODULE 5: Forecasting (1 hour 33 min). MODULE 6: Real Options Analysis: Theory and Background (2 hours 40 min). MODULE 7: Real Options Analysis: Application with SLS Software (1 hour 30 min). MODULE 8: Optimization with Risk Simulator (45 min).

Training DVD Set

Training DVD contents:
MODULE 1: Introduction to Risk Analysis (56 min). MODULE 2: Monte Carlo Simulation with Risk Simulator (1 hour). MODULE 3: Advanced Simulation Techniques (1 hour 20 min). MODULE 4: Simulation and Analytical Tools (1 hour 10 min). MODULE 5: Forecasting (1 hour 33 min). MODULE 6: Real Options Analysis: Theory and Background (2 hours 40 min). MODULE 7: Real Options Analysis: Application with SLS Software (1 hour 30 min). MODULE 8: Optimization with Risk Simulator (45 min).

Applied Examples

Multinomial Options

This white paper shows: (1) American and European mean-reversion options using trinomial lattices; (2) jump-diffusion options using quadranomial lattices; and (3) dual-variable rainbow options using pentanomial lattices.

Options to Choose

The Contraction, Expansion, and Abandonment Option applies when a firm has three competing and mutually exclusive options on a single project to choose from at different times up to the time of expiration. Be aware that this is a mutually exclusive set of options: you cannot execute any combination of expansion, contraction, or abandonment at the same time, and only one option can be executed at any time. For mutually exclusive options, use a single model to compute the option value. However, if the options are not mutually exclusive, calculate them individually in different models and add up the values for the total value of the strategy.

Options to Contract

A Contraction Option evaluates the flexibility value of being able to reduce production output or to contract the scale and scope of a project when conditions are not as amenable, thereby reducing the value of the asset or project by a Contraction Factor, but at the same time creating some cost Savings.

Options to Expand

The Expansion Option values the flexibility to expand from a current existing state to a larger or expanded state. Therefore, an existing state or condition must first be present in order to use the expansion option. That is, there must be a base case on which to expand. If there is no base case state, then the simple Execution Option (calculated using the simple Call Option) is more appropriate, where the issue at hand is whether to execute a project immediately or to defer execution.

Sequential Compound Options

Sequential Compound Options are applicable for research and development investments or any other investments that have multiple stages. The Multiple Super Lattice Solver (MSLS) is required for solving Sequential Compound Options.