This document discusses sensitivity analysis in SuperDecisions. It provides examples of sensitivity analysis in a car hierarchy model and a complex BOCR (Benefits, Opportunities, Costs, Risks) model assessing options for the US President in Baghdad. Sensitivity analysis allows varying the priority of criteria or nodes to see how it impacts the overall results. The document demonstrates sensitivity with respect to criteria priorities in both models, and sensitivity to BOCR node priorities in the complex model.
This document provides a tutorial for using the SuperDecisions software to build multi-criteria decision models. It explains how to install the software, build a decision hierarchy, make pairwise comparisons between criteria and alternatives, view results, and perform sensitivity analysis. The tutorial uses a sample model to select the best car out of three alternatives based on criteria like price, miles per gallon, prestige, and comfort. It demonstrates how to construct the hierarchy in SuperDecisions, enter pairwise comparison judgments, and view the resulting supermatrix before and after synthesis.
This document describes the Analytic Network Process (ANP) model for complex decision making. The ANP model includes the following key elements:
1. A top-level network with four merit nodes: benefits, opportunities, costs, and risks.
2. Subnetworks below each merit node containing control criteria hierarchies to evaluate each merit.
3. Additional subnetworks for high priority control criteria containing decision alternatives.
4. Pairwise comparisons to obtain weights for criteria, alternatives, and influences between elements. Limit matrices converge the results.
5. Sensitivity analysis identifies the best alternative for different priorities of the merit nodes like benefits, costs, and risks. The document provides an example ANP model and
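The limit-matrix step in item 4 can be sketched numerically. The 3x3 column-stochastic supermatrix below is an invented illustration, not taken from the document's model:

```python
import numpy as np

# A small column-stochastic supermatrix over three elements
# (illustrative values only).
W = np.array([
    [0.2, 0.5, 0.3],
    [0.5, 0.3, 0.4],
    [0.3, 0.2, 0.3],
])

# Raise the weighted supermatrix to successive powers until it stops
# changing; the columns of the limit matrix converge to the same
# vector, which gives the long-run priorities.
L = W.copy()
for _ in range(100):
    nxt = L @ W
    if np.allclose(nxt, L, atol=1e-12):
        break
    L = nxt

print(L[:, 0])  # limit priorities (every column is this same vector)
```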
This document provides a tutorial on using the SuperDecisions software for Analytic Hierarchy Process (AHP) and Analytic Network Process (ANP) models. It outlines the basic process of creating clusters, nodes, links between nodes, making pairwise comparisons, and obtaining results. It demonstrates how to build a simple 3-level hierarchy to choose the best car based on criteria like prestige, price, and miles per gallon. It also discusses features like different comparison modes, improving inconsistency, sensitivity analysis, and building a ratings model instead of a relative model.
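The pairwise-comparison step can be illustrated with a small numerical sketch. The judgment matrix below is hypothetical, using Saaty's 1-9 scale for three car criteria (prestige, price, miles per gallon):

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Priorities are the principal right eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency ratio: CI = (lambda_max - n)/(n - 1); random index RI = 0.58 for n = 3.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58

print(w)         # priority vector
print(CR < 0.1)  # judgments acceptably consistent
```

A CR below 0.1 is the usual acceptance threshold for pairwise judgments.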
ANP can be used to model market share by creating a network of factors that influence alternatives' market share. The model connects marketing, product, and other factors to shoe brand alternatives like Nike, Reebok, and Adidas. Pairwise comparisons of factors and alternatives produce priorities that estimate each brand's relative market share. Validating ANP results against external data demonstrates the model's ability to incorporate judgment.
Recommender systems have become an important personalization technique
on the web and are widely used especially in e-commerce applications.
However, operators of web shops and other platforms are challenged by
the large variety of available algorithms and the multitude of their
possible parameterizations. Since the quality of the recommendations that are
given can have a significant business impact, the selection
of a recommender system should be made based on well-founded evaluation
data. The literature on recommender system evaluation offers a large
variety of evaluation metrics but provides little guidance on how to choose
among them. The paper which is presented in this presentation focuses on the often neglected aspect of clearly defining the goal of an evaluation and how this goal relates to the
selection of an appropriate metric. We discuss several well-known
accuracy metrics and analyze how these reflect different evaluation goals. Furthermore we present some less well-known metrics as well as a variation of the area under the curve measure that are particularly suitable for the evaluation of
recommender systems in e-commerce applications.
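As a minimal sketch of the area-under-the-curve idea applied to a ranked recommendation list (the scores and relevance labels below are invented): AUC is the probability that a randomly chosen relevant item is ranked above a randomly chosen irrelevant one.

```python
def auc(scores, labels):
    """Mann-Whitney formulation of AUC; labels: 1 = relevant."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    # Count pairwise "wins" of relevant over irrelevant items; ties count half.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.2]  # predicted relevance, descending
labels = [1,   0,   1,   0,   0]    # ground-truth relevance
print(auc(scores, labels))  # 0.8333...
```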
This document provides an overview of advanced analytics capabilities in the Necto training Module 6. It discusses performing analytics using formulas, exceptions, and properties. Formulas allow users to define custom members using predefined formulas or MDX functions. Exceptions highlight outlier values based on defined criteria. Properties control the visual attributes and functionality of workboards and their components. The document demonstrates how to create formulas, exceptions, and edit properties to enhance analytics and workboards in Necto.
The document provides 9 steps to configure Bill Redirect software to redirect package dimension readings from a Dimmer III device via serial communication to a computer's keyboard buffer, allowing the dimensions to be input as keyboard strokes into any shipping/packaging software. The steps cover downloading and installing the software, configuring serial port settings, creating a virtual button to request dimensions, and testing.
The document provides step-by-step instructions for repairing and enhancing a low-resolution digital photograph using layers and filters in Photoshop. It describes isolating noise in the blue channel, creating blurred layers to smooth skin tones and add soft focus, and using layer masks to selectively sharpen details. It then explains how to separate the foreground subject from the background to add depth of field by blurring the background layer.
Lead Scoring Group Case Study Presentation.pdf (KrishP2)
The document describes a case study to build a logistic regression model to predict lead conversion for an online education company. Key steps included:
- Preparing the data by handling missing values, outliers, encoding categorical variables.
- Using recursive feature elimination to select the top 20 predictive features.
- Building a logistic regression model and selecting the final 16 features based on p-values and VIF.
- Choosing a probability threshold of 0.33 based on model evaluation metrics to classify leads as converted or not.
- Calculating lead scores by multiplying the conversion probability by 100.
The top three predictive categorical variables were Tags_Lost to EINS, Tags_Closed by Horizzon,
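The thresholding and scoring steps above can be sketched as follows; the coefficients and feature values are made up, standing in for a fitted logistic regression:

```python
import numpy as np

def conversion_probability(x, coef, intercept):
    """Sigmoid of the linear predictor."""
    z = np.dot(x, coef) + intercept
    return 1.0 / (1.0 + np.exp(-z))

coef = np.array([1.2, -0.8, 0.5])  # hypothetical feature weights
x = np.array([1.0, 0.3, 2.0])      # one lead's feature vector

p = conversion_probability(x, coef, intercept=-1.0)
converted = p >= 0.33              # threshold chosen from evaluation metrics
lead_score = round(p * 100)        # lead score on a 0-100 scale
print(p, converted, lead_score)
```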
This document discusses goal seek and sensitivity analysis in Excel. It provides examples of using goal seek to find unknown values that satisfy a desired result. Sensitivity analysis determines how changes in input values affect the output. The document explains how to perform one-variable and two-variable sensitivity analysis in Excel using data tables. Examples are provided to illustrate goal seek and sensitivity analysis for business scenarios like determining break-even points and analyzing profit impacts.
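The goal-seek and one-variable data-table workflow can be mirrored in a short script; all cost and price figures are illustrative:

```python
# Break-even analysis: profit as a function of units sold.
fixed_cost = 10_000.0
price = 250.0
variable_cost = 150.0

def profit(units):
    return units * (price - variable_cost) - fixed_cost

# "Goal seek" solved in closed form: profit(units) = 0 at break-even.
break_even = fixed_cost / (price - variable_cost)  # 100 units

# One-variable sensitivity table: profit at different sales levels.
for units in range(50, 151, 25):
    print(units, profit(units))
```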
Accuracy assessment compares a classified image to ground truth data by creating random points and a confusion matrix. It determines overall accuracy and producer's and user's accuracies. Thresholding identifies incorrectly classified pixels statistically based on classification measures. An accuracy assessment was performed on a classified image using 100 random points to generate a confusion matrix and accuracy report showing the image was 75% accurate overall. Post-classification correction was then applied using neighborhood statistics to improve classification accuracy.
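The accuracy measures can be computed directly from a confusion matrix; the 3-class counts below are invented for illustration (rows = classified map classes, columns = reference data):

```python
import numpy as np

cm = np.array([
    [50,  5,  5],
    [10, 40, 10],
    [ 5,  5, 70],
])

overall = np.trace(cm) / cm.sum()
producers = np.diag(cm) / cm.sum(axis=0)  # producer's accuracy per reference class
users = np.diag(cm) / cm.sum(axis=1)      # user's accuracy per mapped class

print(overall)    # 0.8
print(producers)
print(users)
```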
The part is axisymmetrically modeled in SolidWorks (2D) before being imported into ANSYS Workbench, where the boundary zones are identified and appropriate mesh settings are applied. The model is then imported into Fluent for analysis. Significant solver settings include the density-based solver, the enhanced eddy-viscosity model with near-wall treatment, solution steering, and FMG initialization.
- A normal modes analysis was performed on a finite element model of a clamping set to determine its vibration mode shapes. The model was imported into HyperMesh and material properties and constraints were applied.
- An eigenvalue extraction was specified to calculate the first 6 modes. The results were viewed in HyperView and showed the component deforming in different patterns for each mode.
Acm Tech Talk - Decomposition Paradigms for Large Scale Systems (Vinayak Hegde)
This document discusses decomposition approaches for solving large-scale systems problems. It begins with an overview of general decomposition strategies and approaches to decomposition. It then discusses specific decomposition paradigms like model coordination and goal coordination. It provides examples of how decomposition can be applied to problems in areas like optimization, identification, and control. The document concludes with some illustrative examples and case studies.
Conjoint Analysis Part 3/3 - Market Simulator (Minha Hwang)
The marketing simulator document provides an overview of how to build a market simulator model to forecast market shares and profits for new product options. It describes the key components of the model including defining the profile space, competitors, customers and choice rules. The document also explains how to compute utilities, predict choice, and aggregate choices to forecast market shares and profits under different scenarios.
The document provides information on using the BEX Query Designer in SAP BW. It describes key components of the Query Designer including info providers, query elements, variables, reusable structures, formulas, and calculated key figures. The Query Designer allows users to define queries, filters, and calculations not available directly in the info providers to retrieve and analyze data from SAP BW.
A BI-OBJECTIVE MODEL FOR SVM WITH AN INTERACTIVE PROCEDURE TO IDENTIFY THE BE... (gerogepatton)
A support vector machine (SVM) learns the decision surface from two different classes of input points; in several applications, some of the input points are misclassified. In this paper, a bi-objective quadratic programming model is utilized, and different feature quality measures are optimized simultaneously using the weighting method to solve the bi-objective quadratic programming problem. An important contribution of the proposed model is that different efficient support vectors are obtained by changing the weighting values. The numerical examples give evidence of the effectiveness of the weighting parameters in reducing the misclassification between the two classes of input points. An interactive procedure is added to identify the best compromise solution from the generated efficient solutions.
A BI-OBJECTIVE MODEL FOR SVM WITH AN INTERACTIVE PROCEDURE TO IDENTIFY THE BE... (ijaia)
A support vector machine (SVM) learns the decision surface from two different classes of input points; in several applications, some of the input points are misclassified. In this paper, a bi-objective quadratic programming model is utilized, and different feature quality measures are optimized simultaneously using the weighting method to solve the bi-objective quadratic programming problem. An important contribution of the proposed model is that different efficient support vectors are obtained by changing the weighting values. The numerical examples give evidence of the effectiveness of the weighting parameters in reducing the misclassification between the two classes of input points. An interactive procedure is added to identify the best compromise solution from the generated efficient solutions.
The document provides an overview of credit scoring and scorecard development. It discusses:
- The objectives of credit scoring in assessing credit risk and forecasting good/bad applicants.
- The types of clients that are categorized for scoring, including good, bad, indeterminate, insufficient, excluded, and rejected.
- The research objectives and challenges in building statistical models to assign risk scores and monitor model performance.
- The research methodology involving data partitioning, variable binning, scorecard modeling using logistic regression, and scorecard evaluation metrics like KS, Gini, and lift.
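The KS statistic mentioned above can be computed from the cumulative good and bad distributions over score; the scores and good/bad flags below are toy data:

```python
import numpy as np

# KS = maximum gap between cumulative bad and good distributions.
# (Gini would similarly follow as 2*AUC - 1.)
scores = np.array([600, 650, 700, 550, 620, 680, 590, 710])
bad = np.array([1, 0, 0, 1, 1, 0, 1, 0])  # 1 = bad applicant

order = np.argsort(scores)                 # low score first
bad_sorted = bad[order]
cum_bad = np.cumsum(bad_sorted) / bad.sum()
cum_good = np.cumsum(1 - bad_sorted) / (1 - bad).sum()
ks = np.max(np.abs(cum_bad - cum_good))

print(ks)  # 1.0 here: the toy scores separate goods and bads perfectly
```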
This document discusses goal seek and sensitivity analysis in Excel. It defines goal seek as a tool that finds an unknown value from known values to achieve a desired result. Sensitivity analysis determines how changes in inputs impact the output. The document provides examples of using goal seek to find the number of generators that must be sold to reach the break-even point. It also demonstrates sensitivity analysis by creating a data table to analyze how profit changes with different sales levels, costs, and prices.
This document summarizes a project to predict home insurance policy purchases from customer data using machine learning models. The authors explored a dataset of 260,753 customers to select meaningful features from 297 initial ones. They tested logistic regression, support vector machines, and gradient boosted trees on subsets of the data. Gradient boosted trees showed the best performance, with its accuracy on the test set increasing as more training data was used. The authors concluded this model was generalizing well to new data compared to the other algorithms tested.
The document discusses linear programming models and optimization techniques. It covers sensitivity analysis and duality analysis to determine parameter values where a linear programming solution remains valid. It also discusses solving linear programming problems with integer constraints and using network models to solve transportation problems. The document then provides an example of using the simplex method and sensitivity analysis to solve a linear programming problem to maximize profit based on production capacity constraints.
This document provides a tutorial and examples for implementing the CreditRisk+ model in a spreadsheet. It describes 9 examples of applying the model to different portfolio structures and assumptions. For each example, it provides the input assumptions, how to set up the model parameters, and describes the output results. The examples illustrate how to model single or multiple sectors, correlated and uncorrelated sectors, and incorporate severity variations.
This document provides a practical guide for using support vector machines (SVMs) for classification tasks. It recommends beginners follow a simple procedure of transforming data, scaling it, using a radial basis function kernel, and performing cross-validation to select hyperparameters. Real-world examples show this procedure achieves better accuracy than approaches without these steps. The guide aims to help novices rapidly obtain acceptable SVM results without a deep understanding of the underlying theory.
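The recommended procedure (scale the features, use an RBF kernel, cross-validate over hyperparameters) can be sketched with scikit-learn; the dataset and parameter grid here are illustrative, not from the guide:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scaling + RBF kernel + cross-validated grid search over C and gamma.
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = GridSearchCV(
    pipe,
    {"svc__C": [0.1, 1, 10, 100], "svc__gamma": [0.01, 0.1, 1]},
    cv=5,
)
grid.fit(X_train, y_train)
print(grid.best_params_)
print(grid.score(X_test, y_test))
```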
Predict Backorder on a supply chain data for an Organization (Piyush Srivastava)
The document discusses predicting backorders using supply chain data. It defines backorders as customer orders that cannot be filled immediately but the customer is willing to wait. The data analyzed consists of 23 attributes related to a garment supply chain, including inventory levels, forecast sales, and supplier performance metrics. Various machine learning algorithms are applied and evaluated on their ability to predict backorders, including naive Bayes, random forest, k-NN, neural networks, and support vector machines. Random forest achieved the best accuracy of 89.53% at predicting backorders. Feature selection and data balancing techniques are suggested to potentially further improve prediction performance.
COMPUTATIONAL ENGINEERING OF FINITE ELEMENT MODELLING FOR AUTOMOTIVE APPLICAT... (IAEME Publication)
Models with complicated geometry, complex loads, and boundary conditions are difficult to analyse and evaluate in terms of strain, stress, displacement, and reaction forces using theoretical methods. Such a model can be analysed easily with the Finite Element Method using the software ABAQUS CAE to obtain approximate solutions. This report covers two-dimensional and three-dimensional analyses with ABAQUS CAE for plane stress, plane strain, shell, beam, and 3D solid model elements.
This document presents a model for profiling mobile telecom subscribers based on their credit behavior. It discusses using subscriber profiling for credit management, monitoring payments, and targeting promotions. Key attributes for credit profiling include network tenure, payment delay, payment gap, and revenue. A fuzzy c-means clustering algorithm is used to segment subscribers into clusters based on these attributes. The cluster centroids are analyzed to identify valuable subscriber segments to retain and opportunity segments to positively influence.
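A minimal fuzzy c-means sketch (fuzzifier m = 2) on invented two-dimensional attribute data, standing in for the subscriber attributes described above:

```python
import numpy as np

# Two well-separated toy "subscriber" blobs in 2-D attribute space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(4, 0.5, (20, 2))])
c, m = 2, 2.0

# Seed one centroid from each end of the data, then alternate the
# standard FCM membership and centroid updates until convergence.
centroids = np.array([X[0], X[-1]])
for _ in range(100):
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
    inv = 1.0 / d ** (2.0 / (m - 1.0))
    U = inv / inv.sum(axis=1, keepdims=True)   # memberships sum to 1 per point
    Um = U ** m
    new_centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]
    if np.allclose(new_centroids, centroids, atol=1e-8):
        centroids = new_centroids
        break
    centroids = new_centroids

print(centroids)  # one centroid near (0, 0), the other near (4, 4)
```

Analyzing the resulting centroids per attribute is what the document describes as identifying valuable versus opportunity segments.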
City of Pittsburgh decision about Penguins Arena (Elena Rokou)
The document discusses options for what the city of Pittsburgh should do regarding the aging Mellon Arena, home of the Pittsburgh Penguins hockey team. It presents an analytic network process model with alternatives of building a new arena, refurbishing the existing one, or keeping it as is. The model evaluates criteria such as financial, image, social, and competitive factors. Across different calculation methods, the results consistently recommend building a new arena as the optimal choice to retain the team and maximize benefits for the city and community. Sensitivity analysis confirms this recommendation.
Int'l policy US tariffs on imported steel (Elena Rokou)
President Bush imposed temporary tariffs of up to 30% on steel imports to provide relief for the struggling U.S. steel industry for three years. However, U.S. trading partners criticized the tariffs as protectionist and threatened retaliation. The document analyzes three alternatives - imposing tariffs, not imposing tariffs, or addressing steel overproduction through the WTO. It uses an ANP model to rate each alternative based on benefits, opportunities, costs, and risks to local and global priorities. The model concludes the best approach is to resolve steel overproduction issues under WTO rules rather than impose tariffs.
Best time to withdraw from Iraq BOCR model (Elena Rokou)
The document provides background information on the US invasion and occupation of Iraq, including a timeline of key events from 1990 to 2006. It then presents 5 alternatives for withdrawal of US troops from Iraq: 1) Upon Iraqi government request, 2) After 5 years, 3) Within 2-5 years, 4) Within 1 year, 5) Within 3 months with troop redeployment. Metrics are provided to evaluate the strategic criteria and costs/benefits of each alternative.
The document discusses the Arctic National Wildlife Refuge (ANWR) in Alaska. It notes that ANWR is 19 million acres in size, with 1.5 million acres of coastal plain that various administrations have debated opening to oil and gas exploration. A table then provides a cluster matrix overview that rates the potential benefits, costs, opportunities and risks of opening ANWR to exploration across economic, political, and social dimensions.
Here are the two structures:

Structure 1:
- Marketing Strategy (50%)
  - West Side (25%)
    - Store 1 (12.5%)
    - Store 2 (12.5%)
  - City Centre (25%)
    - Store 3 (25%)
  - East Side (25%)
    - Store 4 (12.5%)
    - Store 5 (12.5%)

Structure 2:
- Marketing Strategy (50%)
  - Store 1 (10%)
  - Store 2 (10%)
  - Store 3 (10%)
  - Store 4 (10%)
  - Store 5 (10%)
He incorrectly assumes the criteria weights are arbitrary when they depend on the assumed results and structure. No wonder he gets different results - AHP is being misapplied. The
1. The document provides instructions for creating a ratings model in Expert Choice software. It involves building a hierarchical criteria model, adding alternatives to evaluate, editing the criteria to appear as column headings in the ratings spreadsheet, adding the alternatives, establishing categories and priorities for qualitative criteria through pairwise comparisons, and rating each alternative on the quantitative criteria.
2. The ratings are completed by creating categories for the remaining criteria, rating each alternative, and computing the total priorities for each alternative through synthesis to obtain the results.
The document introduces the Analytic Network Process (ANP), which is an extension of the Analytic Hierarchy Process (AHP) that allows for inner and outer dependence relationships between decision elements. In ANP, criteria are prioritized based on their importance to actual alternatives rather than to an abstract goal. Feedback comparisons are made between alternatives and criteria to establish priority vectors. A network supermatrix incorporates these dependencies to arrive at a final synthesis. The results may differ from AHP because ANP accounts for interdependencies between decision elements that influence priorities.
This curriculum vitae provides contact information and an overview of the educational and professional experience of Elena Rokou. She holds a PhD candidate degree from the National Technological University of Athens in Mechanical Engineering, an MSc from the University of the Aegean in Financial Engineering, and a BSc from the University of Ioannina in Computer Science. Her professional experience includes positions in IT project management, software engineering, and teaching. She is fluent in English, Italian, and Greek.
2. Contents
(to make hyperlinks active use slide show mode)
Sensitivity in hierarchies
Sensitivity in networks
Sensitivity in BOCR complex models
Sensitivity with respect to the BOCR priorities
Sensitivity with respect to control criteria
Sensitivity with respect to judgments
3. SENSITIVITY IN HIERARCHIES
Sensitivity in hierarchies requires picking an
independent top node (for example, the goal), a
with respect to node in a level below it (for
example, a criterion), varying the values of that
node and seeing how it affects the best outcome
(for example, the alternatives).
The following shows sensitivity with respect to
the Price criterion for the car hierarchy sample
model from SuperDecisions. Sample models
are under the Help command.
5. Car Hierarchy Sensitivity
1. Select Computations>Sensitivity to get into sensitivity analysis. Select
Edit to pick your independent variable.
6. Car Hierarchy Sensitivity (contd)
2. The first node in the model, alphabetically, will appear in the Input Selector dialogue box. Change it to the goal node by selecting it and pressing Edit to get into the Parameter Selector dialogue box.
7. Car Hierarchy Sensitivity (contd)
3. The Edit Parameter dialogue box appears, Fig. 1. Select parameter type 2
(SuperMatrix), Network 0, the wrt Node is the Independent Variable, the Goal, and
the 1st other node is the one we are interested in, Price. Six steps are adequate for
this linear type of sensitivity. Press Done to get back to the Sensitivity Input
selector, Fig. 2, to see your selections. They are correct here. Press Update to
see the graph, next page.
Fig. 1 Fig. 2
8. Car Hierarchy Sensitivity (contd)
4. The dotted vertical line for the priority of the what-if node is initially set to 50%, that is, at x = .5. When x = 0 its priority is 0; when x = 1 its priority is 1. As you change its priority, the priorities of the other criteria change proportionately.
The Avalon is the cheap car, the Babylon the mid-price car, and the Carryon the expensive luxury car. The vertical line here is at the priority Price has in the model: about 49%. At this priority for Price the Avalon gets 31% of the priority, the Babylon about 26%, and the Carryon about 43%. Grab the dotted line with your mouse and drag it to the right on the x-axis to see that after about 70% the Avalon becomes the best choice. Sensitivity can also be done for the other criteria in the model: Prestige, etc.
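The proportional redistribution described above can be sketched in a few lines of code. This is an illustrative reconstruction, not SuperDecisions' actual implementation; the criteria weights and local alternative priorities below are hypothetical, not taken from the sample model.

```python
# Sketch of linear sensitivity: when the what-if criterion is forced to
# priority x, the other criteria keep their mutual proportions and are
# rescaled to sum to 1 - x; the synthesis is then a weighted sum.
# All numbers below are illustrative, not from the car model.

def synthesize(x, crit_weights, varied, local):
    """Overall alternative priorities when criterion `varied` has priority x."""
    rest = {c: w for c, w in crit_weights.items() if c != varied}
    scale = (1.0 - x) / sum(rest.values())      # proportional renormalization
    weights = {c: w * scale for c, w in rest.items()}
    weights[varied] = x
    alts = next(iter(local.values())).keys()
    return {a: sum(weights[c] * local[c][a] for c in weights) for a in alts}

# Hypothetical criteria weights and local alternative priorities:
crit = {"Price": 0.49, "Prestige": 0.21, "MPG": 0.15, "Comfort": 0.15}
local = {
    "Price":    {"Avalon": 0.6, "Babylon": 0.3, "Carryon": 0.1},
    "Prestige": {"Avalon": 0.1, "Babylon": 0.3, "Carryon": 0.6},
    "MPG":      {"Avalon": 0.5, "Babylon": 0.3, "Carryon": 0.2},
    "Comfort":  {"Avalon": 0.2, "Babylon": 0.3, "Carryon": 0.5},
}

scores = synthesize(0.7, crit, "Price", local)   # Price forced to 70%
```

With these made-up local priorities, pushing Price to 70% favors the cheap Avalon, which is the qualitative behavior the graph shows.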
9. Car Hierarchy Sensitivity (contd)
To see the values used to plot the current graph select File>Save
in Sensitivity and save to a .txt file. Start Excel, select
File>Open and enter the name you gave the .txt file. The Data
Import Wizard will appear. Keep clicking Next to import the
data. Below are the values for the Price graph with 6 steps.
The more steps you have, the closer you can get to the Price
value that corresponds to the synthesized results.
Input  Value (Goal → 1Price)  Avalon    Babylon   Carryon
0.0    1.00E-04               2.14E-01  1.79E-01  6.07E-01
0.2    2.00E-01               2.53E-01  2.11E-01  5.36E-01
0.4    4.00E-01               2.91E-01  2.43E-01  4.66E-01
0.6    6.00E-01               3.30E-01  2.75E-01  3.95E-01
0.8    8.00E-01               3.68E-01  3.07E-01  3.25E-01
1.0    1.00E+00               4.07E-01  3.39E-01  2.54E-01
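Because this parameter type produces straight lines, the crossover noted earlier (after about 70% the Avalon becomes the best choice) can be estimated directly from the table by linear interpolation between the two bracketing steps:

```python
# Estimate where the Avalon's line overtakes the Carryon's, using the
# tabulated six-step values above. The curves are linear for this
# parameter type, so interpolating between bracketing points suffices.

x       = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
avalon  = [0.214, 0.253, 0.291, 0.330, 0.368, 0.407]
carryon = [0.607, 0.536, 0.466, 0.395, 0.325, 0.254]

def crossover(xs, a, b):
    """First x where series a rises above series b, by linear interpolation."""
    for i in range(1, len(xs)):
        if a[i - 1] <= b[i - 1] and a[i] > b[i]:
            d0 = b[i - 1] - a[i - 1]   # gap before the crossing
            d1 = a[i] - b[i]           # gap after the crossing
            return xs[i - 1] + (xs[i] - xs[i - 1]) * d0 / (d0 + d1)
    return None

print(round(crossover(x, avalon, carryon), 2))  # about 0.72
```

This agrees with dragging the vertical line in the graph: more steps in the Edit Parameter box would tighten the estimate further.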
10. Sensitivity in a simple network
This section is under construction.
11. Sensitivity in a Complex Model
By a complex model we mean a BOCR model that
has a top level network containing the BOCR
nodes, control criteria hierarchies in the
subnets attached to the BOCR and decision
subnets attached to the selected control
criteria. Sensitivity can be done 3 ways, with
respect to:
1. Priorities of the BOCR nodes
2. Priorities of the Control Criteria nodes
3. Judgments in the pairwise comparison matrices
12. Doing Sensitivity for BOCR nodes
In a complex BOCR model, one can examine what effect
changing the priority of a merit node, for example,
Benefits, does to the ranks and synthesized priorities of
the alternatives. To see the priorities of the BOCR go to
Assess>Ratings and turn on the View>Priorities
command in Ratings. These are the b, o, c and r
priorities for Benefits, Opportunities, Costs and Risks,
used in the formula to combine the B, O, C, and R
synthesized vectors that are being passed up from the
control criteria subnets. The Raw values are passed up
to be used in the formula. The Additive (negative)
formula is bB + oO − cC − rR;
the Multiplicative formula is (bB × oO) / (cC × rR).
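Applied per alternative, the two formulas can be sketched as below. The merit priorities b, o, c, r and the raw vectors B, O, C, R are hypothetical placeholders, not values from any sample model.

```python
# Sketch of the two BOCR synthesis formulas named on the slide, applied
# elementwise to the alternative vectors. b, o, c, r are the merit
# priorities from Ratings; B, O, C, R are the synthesized vectors passed
# up from the control-criteria subnets. All numbers are illustrative.

def additive_negative(b, o, c, r, B, O, C, R):
    """bB + oO - cC - rR, per alternative."""
    return [b*Bi + o*Oi - c*Ci - r*Ri for Bi, Oi, Ci, Ri in zip(B, O, C, R)]

def multiplicative(b, o, c, r, B, O, C, R):
    """(bB x oO) / (cC x rR), per alternative."""
    return [(b*Bi * o*Oi) / (c*Ci * r*Ri) for Bi, Oi, Ci, Ri in zip(B, O, C, R)]

# Hypothetical merit priorities and raw alternative vectors:
b, o, c, r = 0.4, 0.2, 0.25, 0.15
B = [0.5, 0.3, 0.2]; O = [0.4, 0.4, 0.2]
C = [0.6, 0.3, 0.1]; R = [0.5, 0.3, 0.2]

print(additive_negative(b, o, c, r, B, O, C, R))
print(multiplicative(b, o, c, r, B, O, C, R))
```

Note that the additive (negative) formula can rank an alternative negatively when its costs and risks outweigh its benefits and opportunities, while the multiplicative formula always yields positive ratios.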
13. Bush in Baghdad Model
Load the NMD model for sensitivity.mod from the Sample models in SuperDecisions.
Select the Design>Standard Formula command which shows the selected formula is
Additive (negative). There are some issues with the numbers being displayed in the
Sensitivity module, but the general trend of the graph is correct. We are working on
these issues as of Sept. 2006 and hope to have them resolved soon.
14. Bush in Baghdad Model
Synthesized results with the Additive (probabilistic) formula are shown in Fig. 1. The priorities of Benefits, Opportunities, Costs and Risks are shown in the Ratings spreadsheet in Fig. 2 (select Assess>Ratings in the main model to get into the Ratings Module and turn on View>Priorities to see the priorities).
Fig. 1. Synthesized Results
Fig. 2. B, O, C, and R Priorities
15. Sensitivity for Costs node
Select Computations>Sensitivity from the main menu in the top level network. When the Sensitivity Analysis window opens, you must change the independent variable from the alphabetically first node, Economic, to the one you want: Costs. Select Edit>Independent Variable to get to the Sensitivity Input selector dialogue box shown on the next slide.
16. Sensitivity for Costs node (contd)
The Sensitivity input selector dialogue box appears as shown in Fig. 1.
Click on the node that is initially shown there and select either
the New or Edit command which brings up the Edit Parameter dialogue
box shown in Fig. 2 where you can change the node to Benefits.
Fig. 1 Fig. 2
17. Sensitivity for Costs node (contd)
Set up the parameters as shown in Fig. 1 with Parameter Type: 0 (for
Priorities); Network: 0 (for top-level network - click the empty space at the
bottom of drop-down list to make it 0 if it is not that way); Wrt Node: Costs
(select Costs from the drop-down list). The number of steps is okay at 6, but you
may make it more if you wish, say 10 or 20. Click the Done button to return to
the Input selector dialogue box as shown in Fig. 2. Click the Update button to
get the sensitivity graph for Costs shown on next slide.
Fig. 1 - Select parameter type 0, Fig. 2 Click Update for graph
Network 0, Costs, Done
18. Sensitivity for Costs node (contd)
The priorities of the
alternatives are read from the
projection on the y-axis of the
point at which the alternative
line intersects the vertical
dotted line. The priority for
Risks ranges from 0 to 1.0 on
the x-axis. Move the dotted
line by clicking on it and
dragging. The vertical line is
always shown initially at .5 on
the x-axis, or at 50% priority.
The best option is Deploy
NMD for Risks between 0 and
about 0.4. Between 0.4 and
0.55 R&D is best.
After Risks has a priority of
more than about 0.55 the
Terminate option is best.
19. Sensitivity for Costs node (contd)
The same process can be repeated for
Benefits, Opportunities and Costs.
Select Edit>Save and save the data points
that produced the graph to a .txt file.
Open it in Excel. Keep clicking Next in the
text import Wizard to get the values shown
on next slide.
20. Sensitivity for Costs node (contd)
Table of values leading to the graph on previous page.
Selecting more steps on the edit parameter dialogue box will
give more data points.
Alternatives: (1) Pre-emptive Attack on Iraq; (2) Attack Iraq only with Allied Help; (3) Work with UN to ensure Weapons Inspections.

Input  Value (Priority: Costs)  Alt 1     Alt 2     Alt 3
0.0    1.00E-04                 2.39E-01  2.73E-01  3.65E-01
0.2    2.00E-01                 1.77E-01  2.72E-01  3.67E-01
0.4    4.00E-01                 1.24E-01  2.71E-01  3.68E-01
0.6    6.00E-01                 7.70E-02  2.71E-01  3.69E-01
0.8    8.00E-01                 3.62E-02  2.70E-01  3.70E-01
1.0    1.00E+00                 1.70E-05  2.70E-01  3.71E-01
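The saved .txt of data points can also be inspected without Excel. The sketch below parses whitespace-separated numeric rows like those in the table above and reports the best alternative at each step; for a real export, replace the `data` string with the contents of your saved file (its exact layout may differ).

```python
# Parse sensitivity data points (whitespace-separated, scientific notation
# as exported by SuperDecisions) and report the top alternative per step.
# The embedded rows are the Costs-sensitivity values from the table above;
# for a real file, use data = open("yourfile.txt").read() instead.

data = """\
0   1.00E-04  2.39E-01  2.73E-01  3.65E-01
0.2 2.00E-01  1.77E-01  2.72E-01  3.67E-01
0.4 4.00E-01  1.24E-01  2.71E-01  3.68E-01
0.6 6.00E-01  7.70E-02  2.71E-01  3.69E-01
0.8 8.00E-01  3.62E-02  2.70E-01  3.70E-01
1   1.00E+00  1.70E-05  2.70E-01  3.71E-01
"""

rows = [[float(p) for p in line.split()] for line in data.splitlines()]

# For each step, report which alternative column has the highest priority.
for x, _value, *alts in rows:
    best = max(range(len(alts)), key=lambda i: alts[i])
    print(f"Costs priority {x:.1f}: best is alternative {best + 1}")
```

For this table, alternative 3 (Work with UN to ensure Weapons Inspections) stays on top at every Costs priority, while the Pre-emptive Attack option collapses as Costs gains weight.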