
UNIVERSITY OF MODERN SCIENCES

College of Business
Summer II-2016
ASSIGNMENT

Course Name: Risk Analysis and Modeling

Course Code: BUSN 415

Assessment: Summer II

Submission Date: 20 August, 2016

Student Name: Abdulla Alhammadi

Student ID: S0000000635

Section:

Total Marks: 20

Weightage: 20%

INSTRUCTIONS TO THE CANDIDATES:

This is an individual assessment. Group work is not allowed.

The University strictly enforces its plagiarism policy: a maximum similarity of 20% is permitted.

SUBMIT the assignment through the STUDENT PORTAL.

ONLY assignments submitted through the Student Portal will be marked.

Assignments submitted after the DUE DATE will not be ENTERTAINED.

QUESTION 1. COMPARE AND CONTRAST RISK AND UNCERTAINTY.

Risk is defined as a situation in which something of value may be won or
lost. Uncertainty is a condition in which there is no knowledge about
future events.
Risk can be measured and quantified through theoretical models.
Conversely, it is not feasible to measure uncertainty in quantitative
terms, because future events are unpredictable.
In risk, the potential outcomes are known, whereas in the case of
uncertainty, the outcomes are unknown.
Risk can be controlled if proper measures are taken to manage it.
Uncertainty, on the other hand, is beyond the control of the person or
organization, because the future is uncertain.
Risk can be minimized by taking the necessary precautions, whereas
uncertainty cannot be minimized.
In risk, probabilities are assigned to a set of outcomes, which is not
possible in the case of uncertainty.

QUESTION 2. WHAT IS THE MAJOR LIMITATION OF ONLY USING MONTE CARLO SIMULATION TO PERFORM RISK ANALYSIS?

Because one does not solve Newton's equations of motion, no dynamical
information can be gathered from a conventional Monte Carlo simulation.
One of the major difficulties of Monte Carlo simulations of proteins in
an explicit solvent is carrying out large-scale moves. Any move that
significantly alters the internal coordinates of the protein without
also moving the solvent particles will most likely result in a large
overlap of atoms and, as a result, the rejection of the trial
configuration. Simulations using an implicit solvent do not suffer from
these drawbacks, and, consequently, coarse-grained protein models are
the most popular systems in which Monte Carlo techniques are used.
There is also no general, well-tested, freely available program for the
Monte Carlo simulation of proteins, because the choice of which Monte
Carlo moves to apply, and the rates at which they are attempted, vary
with the specific problem one is interested in, although we
note that a Monte Carlo module has recently been added to CHARMM.
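The sampling idea behind Monte Carlo, and its key limitation, can be sketched in a few lines of Python: the method only produces a static distribution of outcomes from repeated random draws, with no dynamical information. The 8% mean return and 20% volatility below are illustrative assumptions, not values from the text.

```python
import random

def mc_loss_probability(mean, stdev, n_trials, seed=42):
    """Estimate the probability that a normally distributed annual
    return falls below zero, by plain Monte Carlo sampling."""
    rng = random.Random(seed)
    losses = sum(1 for _ in range(n_trials) if rng.gauss(mean, stdev) < 0)
    return losses / n_trials

# With mean 8% and volatility 20%, the estimate converges toward the
# true loss probability as n_trials grows -- but each trial is an
# independent draw, so nothing about the *path* of outcomes is learned.
est = mc_loss_probability(0.08, 0.20, 100_000)
```

Note that every trial is independent: the sketch says nothing about how a loss unfolds over time, which is exactly the "no dynamics" limitation described above.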

QUESTION 3. WHAT ARE THE DIFFERENCES BETWEEN TIME-SERIES FORECASTING TECHNIQUES AND NONLINEAR EXTRAPOLATION?

Time-series forecasting methods produce forecasts based entirely on
historical values. They are widely used in business situations where
forecasts of a year or less are required. The time-series techniques
used in ezForecaster are particularly suited to sales, marketing,
finance, and production planning. Time-series methods have the
advantage of relative simplicity, but certain factors need to be taken
into consideration.
The extrapolation method can be applied to the interior reconstruction
problem. Frequently there is not enough data available from
experimentation, and we need to extend, or extrapolate, from known data
to values beyond what is known. Often the extrapolation is linear.
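The linear case mentioned above can be sketched directly: extend the trend implied by the last two known observations. The data points below are made up for illustration.

```python
def extrapolate_linear(points, x_new):
    """Extend a trend beyond the last known observation by following
    the straight line through the final two data points."""
    (x0, y0), (x1, y1) = points[-2], points[-1]
    slope = (y1 - y0) / (x1 - x0)
    return y1 + slope * (x_new - x1)

# Known data ends at x = 3; extrapolate along the last segment to x = 5.
forecast = extrapolate_linear([(1, 10.0), (2, 12.0), (3, 14.0)], 5)  # -> 18.0
```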

QUESTION 4. EXPLAIN WHAT EACH OF THE FOLLOWING TERMS MEANS:

Time-series analysis
A time series is a series of data points indexed (or graphed) in time
order. Most commonly, a time series is a sequence taken at successive,
equally spaced points in time; thus it is a sequence of discrete-time
data. Examples of time series are heights of ocean tides, counts of
sunspots, and the daily closing value of the Dow Jones Industrial
Average.
Ordinary least squares
Ordinary least squares, or OLS, is one of the simplest methods of
linear regression. The goal of OLS is to closely "fit" a function to
the data. It does so by minimizing the sum of squared errors from the
data.
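For the simple one-regressor case, the minimization has a closed form, which can be sketched as follows (the sample data are illustrative):

```python
def ols_fit(xs, ys):
    """Closed-form simple OLS for y = a + b*x: the slope is the
    centered cross-product over the centered sum of squares, which
    minimizes the sum of squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# A perfect line y = 1 + 2x is recovered exactly.
a, b = ols_fit([0, 1, 2, 3], [1.0, 3.0, 5.0, 7.0])
```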
Regression analysis
Regression analysis is a statistical tool for investigating
relationships between variables. Usually, the investigator seeks to
establish the causal effect of one variable upon another: the effect of
a price increase upon demand, for example, or the effect of changes in
the money supply upon the inflation rate. To explore such issues, the
investigator assembles data on the underlying variables of interest and
employs regression to estimate the quantitative effect of the causal
variables upon the variable that they influence. The investigator also
typically assesses the statistical significance of the estimated
relationships, that is, the degree of confidence that the true
relationship is close to the estimated relationship.
Heteroskedasticity
Heteroskedasticity, in statistics, is when the standard deviations of a
variable, monitored over a specific period of time, are not constant.
Heteroskedasticity frequently arises in two forms: conditional and
unconditional. Conditional heteroskedasticity identifies non-constant
volatility when future periods of high and low volatility cannot be
identified. Unconditional heteroskedasticity applies when future
periods of high and low volatility can be identified.

Autocorrelation
Autocorrelation is a characteristic of data in which the correlation
between values of the same variable arises across related observations.
It violates the assumption of observation independence, which underlies
most of the conventional models. It typically exists in those kinds of
data sets in which the data, instead of being randomly selected, come
from the same source.
Multicollinearity
Multicollinearity is a state of very high intercorrelations or
interassociations among the independent variables. It is therefore a
type of disturbance in the data, and if it is present, the statistical
inferences made about the data may not be reliable.
ARIMA
ARIMA is a statistical analysis model that uses time-series data to
predict future trends. It is a form of regression analysis that seeks
to predict future movements along the seemingly random walk taken by
stocks and the financial markets by examining the differences between
values in the series rather than the actual data values. Lags of the
differenced series are referred to as "autoregressive" and lags of the
forecast errors are referred to as "moving average."
This model type is generally referred to as ARIMA(p, d, q), with the
integers referring to the autoregressive, integrated, and moving
average parts of the data set, respectively. ARIMA modeling can take
into account trends, seasonality, cycles, errors, and non-stationary
aspects of a data set when making forecasts.
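The mechanics of the "I" and "AR" parts can be sketched by hand for an ARIMA(1,1,0)-style forecast: difference the series to remove the trend, fit the AR(1) coefficient by least squares on the differences, then integrate back to the level. This is a hand-rolled illustration on made-up prices, not a substitute for a properly fitted ARIMA model.

```python
def diff(series):
    """First difference: the d = 1 'integrated' step of ARIMA."""
    return [b - a for a, b in zip(series, series[1:])]

def ar1_coefficient(series):
    """Least-squares AR(1) coefficient (no intercept) on a series."""
    x, y = series[:-1], series[1:]
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

prices = [100, 102, 105, 109, 114, 120]   # trending series
d1 = diff(prices)                          # differencing removes the trend
phi = ar1_coefficient(d1)                  # the 'autoregressive' lag
next_diff = phi * d1[-1]                   # forecast the next change
forecast = prices[-1] + next_diff          # integrate back to the level
```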

QUESTION 5. EXPLAIN WHY, IF EACH OF THE FOLLOWING IS NOT DETECTED PROPERLY OR CORRECTED FOR IN THE MODEL, THE ESTIMATED MODEL WILL BE FLAWED:

Heteroskedasticity
Several tests exist to check for the presence of heteroskedasticity.
These tests are also applicable for testing misspecifications and
nonlinearities. The simplest approach is to graphically plot each
independent variable against the dependent variable, as illustrated
earlier in the chapter. Another approach is to use one of the most
widely used models, White's test, where the test is based on the null
hypothesis of no heteroskedasticity against an alternative hypothesis
of heteroskedasticity of some unknown general form. The test statistic
is computed by an auxiliary or secondary regression, where the squared
residuals or errors from the first regression are regressed on all
possible (and non-redundant) cross products of the regressors.
Autocorrelation
One very simple approach to test for autocorrelation is to graph the
time series of a regression equation's residuals. If these residuals
exhibit some cyclicality, then autocorrelation exists. Another, more
robust approach to detect autocorrelation is the use of the
Durbin-Watson statistic, which estimates the potential for a
first-order autocorrelation. The Durbin-Watson test also identifies
model misspecification, that is, whether a particular time-series
variable is correlated to itself one period prior. Many time-series
data tend to be autocorrelated with their historical occurrences. This
relationship can arise for multiple reasons, including the variables'
spatial relationships (similar time and space), prolonged economic
shocks and events, psychological inertia, smoothing, seasonal
adjustments of the data, and so forth.
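The Durbin-Watson statistic itself is a one-line formula, sketched here on illustrative residual sequences:

```python
def durbin_watson(residuals):
    """DW = sum((e_t - e_{t-1})^2) / sum(e_t^2). Values near 2 suggest
    no first-order autocorrelation; near 0, positive autocorrelation;
    near 4, negative autocorrelation."""
    num = sum((b - a) ** 2 for a, b in zip(residuals, residuals[1:]))
    den = sum(e * e for e in residuals)
    return num / den

dw_neg = durbin_watson([1, -1, 1, -1])            # alternating signs -> near 4
dw_pos = durbin_watson([1, 1, 1, -1, -1, -1])     # long runs -> near 0
```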

Multicollinearity
Multicollinearity exists when there is a linear relationship between
the independent variables. When this occurs, the regression equation
cannot be estimated at all. In near-collinearity situations, the
estimated regression equation will be biased and provide inaccurate
results. This situation is especially true when a stepwise regression
approach is used, where the statistically significant independent
variables may be thrown out of the regression mix earlier than
expected, resulting in a regression equation that is neither efficient
nor accurate.
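A minimal screening for near-collinearity is to flag pairs of regressors with very high pairwise correlation (a simplification: full diagnostics would use variance inflation factors or the condition number). The threshold of 0.9 and the sample columns below are illustrative choices.

```python
def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def collinearity_flags(columns, threshold=0.9):
    """Flag regressor pairs whose |correlation| exceeds the threshold."""
    flags = []
    for i in range(len(columns)):
        for j in range(i + 1, len(columns)):
            r = pearson(columns[i], columns[j])
            if abs(r) > threshold:
                flags.append((i, j, r))
    return flags

x1 = [1, 2, 3, 4, 5]
x2 = [2.1, 3.9, 6.0, 8.1, 9.9]   # almost exactly 2*x1: near-collinear
x3 = [5, 1, 4, 2, 3]             # unrelated
flags = collinearity_flags([x1, x2, x3])  # only the (x1, x2) pair is flagged
```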

QUESTION 6. CRITICALLY EXPLAIN THE FOLLOWING ECONOMETRIC MODELS:

Vector Autoregression Model (VAR)

The vector autoregression (VAR) model is one of the most successful,
flexible, and easy-to-use models for the analysis of multivariate time
series. It is a natural extension of the univariate autoregressive
model to dynamic multivariate time series. The VAR model has proven to
be especially useful for describing the dynamic behavior of economic
and financial time series and for forecasting. It often provides
forecasts superior to those from univariate time-series models and from
elaborate theory-based simultaneous-equations models. Forecasts from
VAR models are quite flexible because they can be made conditional on
the potential future paths of specified variables in the model.
Generalized Autoregression Conditional Heteroskedasticity Model
(GARCH)
The building block is the autoregressive conditional heteroskedasticity
model of order one, ARCH(1): a time series given at each instant by
epsilon_t = sigma_t * w_t
where w_t is discrete white noise with zero mean and unit variance, and
the conditional variance sigma_t^2 is given by
sigma_t^2 = alpha_0 + alpha_1 * epsilon_{t-1}^2
where alpha_0 and alpha_1 are parameters of the model.
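The two equations above translate directly into a simulation loop; the parameter values alpha_0 = 0.2 and alpha_1 = 0.5 below are illustrative.

```python
import random

def simulate_arch1(alpha0, alpha1, n, seed=7):
    """Simulate an ARCH(1) series: eps_t = sigma_t * w_t, with
    sigma_t^2 = alpha0 + alpha1 * eps_{t-1}^2 and w_t ~ N(0, 1)."""
    rng = random.Random(seed)
    eps, sigma2 = [], []
    prev = 0.0
    for _ in range(n):
        s2 = alpha0 + alpha1 * prev ** 2   # conditional variance equation
        e = (s2 ** 0.5) * rng.gauss(0, 1)  # shock scaled by sigma_t
        sigma2.append(s2)
        eps.append(e)
        prev = e
    return eps, sigma2

eps, sigma2 = simulate_arch1(0.2, 0.5, 500)
```

A large shock raises the next period's variance, producing the volatility clustering the model is designed to capture.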

Exponential Generalized Autoregression Conditional
Heteroskedasticity Model (EGARCH)
EGARCH models benefit from having no parameter restrictions; hence the
possible instabilities of optimization routines are reduced. On the
other hand, the theoretical properties of QML estimators of EGARCH
models have not been clarified to a great extent.
Generalized Autoregression Conditional Heteroskedasticity in Mean
Model (GARCH-M)
In finance, the return of a security may also depend on its volatility
(risk). To model such phenomena, the GARCH-in-mean (GARCH-M) model adds
a heteroskedasticity term to the mean equation.

QUESTION 7. WHAT IS A STOCHASTIC PROCESS (E.G., BROWNIAN MOTION)? CRITICALLY EXPLAIN.

In probability theory, a stochastic process, or sometimes random
process (widely used), is a collection of random variables; it is often
used to represent the evolution of some random value, or system, over
time. It is the probabilistic counterpart to a deterministic process
(or deterministic system). Instead of describing a system that can
evolve in only one way (as in the case, for instance, of solutions of
an ordinary differential equation), in a stochastic or random process
there is some indeterminacy: even if the initial condition (or starting
point) is known, there are several (often infinitely many) directions
in which the process may evolve.
In probability theory, a stochastic process is a process involving the
operation of chance. For example, in radioactive decay every atom is
subject to a fixed probability of breaking down in any given time
interval. More generally, a stochastic process refers to a family of
random variables indexed against some other variable or set of
variables. It is one of the most widely studied objects in probability.
Some basic types of stochastic processes include Markov processes,
Poisson processes (such as radioactive decay), and time series, with
the index variable referring to time. This indexing can be either
discrete or continuous, the interest being in the nature of the changes
of the variables with respect to time.
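Brownian motion, the canonical example, can be sketched by accumulating independent Gaussian increments. The step size dt = 0.01 is an illustrative choice; note how two paths share the same initial condition yet evolve differently, which is the indeterminacy described above.

```python
import random

def brownian_path(n_steps, dt=0.01, seed=3):
    """Standard Brownian motion: start at 0 and add independent
    Gaussian increments with mean 0 and variance dt."""
    rng = random.Random(seed)
    path = [0.0]
    for _ in range(n_steps):
        path.append(path[-1] + rng.gauss(0.0, dt ** 0.5))
    return path

p1 = brownian_path(100, seed=3)
p2 = brownian_path(100, seed=4)   # same starting point, different evolution
```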

QUESTION 8. IF YOU KNOW THAT TWO SIMULATED VARIABLES ARE CORRELATED BUT DO NOT HAVE THE RELEVANT CORRELATION VALUE, SHOULD YOU STILL GO AHEAD AND CORRELATE THEM IN A SIMULATION?

The correlation coefficient is a measure of the strength and direction
of the relationship between two variables, and can take on any value
between -1.0 and +1.0; that is, the correlation coefficient can be
decomposed into its direction or sign (positive or negative
relationship between two variables) and the magnitude or strength of
the relationship (the higher the absolute value of the correlation
coefficient, the stronger the relationship). It is important to note
that correlation does not imply causation. Two completely unrelated
random variables might display some correlation, but this does not
imply any causation between the two (e.g., sunspot activity and events
in the stock market are correlated, but there is no causation between
the two).
There are two general types of correlations: parametric and
nonparametric. Pearson's correlation coefficient is the most common
correlation measure, and is usually referred to simply as the
correlation coefficient. However, Pearson's correlation is a parametric
measure, which means that it requires both correlated variables to have
an underlying normal distribution and requires the relationship between
the variables to be linear. When these conditions are violated, which
is often the case in Monte Carlo simulation, the nonparametric
counterparts become more important. Spearman's rank correlation and
Kendall's tau are the two nonparametric alternatives. The Spearman
correlation is most commonly used and is most appropriate when applied
in the context of Monte Carlo simulation: there is no dependence on
normal distributions or linearity, meaning that correlations between
variables with different distributions can be applied.
In order to compute the Spearman correlation, first rank all the x and
y variable values and then apply the Pearson correlation computation.
In the case of Risk Simulator, the correlation used is the more robust
nonparametric Spearman's rank correlation. However, to simplify the
simulation process and to be consistent with Excel's correlation
function, the correlation user inputs required are the Pearson
correlation coefficients. Risk Simulator will then apply its own
algorithms to convert them into Spearman's rank correlation, thereby
simplifying the process.
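The rank-then-Pearson recipe described above can be sketched directly (ties are not handled in this minimal version; the monotone sample data are illustrative):

```python
def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def ranks(values):
    """Rank each value from 1..n (ties not handled in this sketch)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    """Spearman rank correlation: Pearson applied to the ranks."""
    return pearson(ranks(xs), ranks(ys))

# Monotone but nonlinear data: Pearson falls below 1, Spearman is 1.
x = [1, 2, 3, 4, 5]
y = [1, 4, 9, 16, 25]
rho = spearman(x, y)
r = pearson(x, y)
```

The gap between `r` and `rho` on nonlinear data is exactly why the nonparametric measure is preferred in simulation.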

QUESTION 9. COMPARE AND CONTRAST BETWEEN A DISCRETE VERSUS CONTINUOUS DECISION VARIABLE WHEN USED IN AN OPTIMIZATION UNDER UNCERTAINTY.

Discrete variables are countable in a finite amount of time. For
example, you can count the change in your pocket. You can count the
money in your bank account. You could also count the amount of money in
everybody's bank account. It might take you a long time to count that
last item, but the point is that it is still countable. Continuous
variables would (literally) take forever to count. In fact, you would
get to forever and never finish counting them. For example, take age.
You could not count age. Why not? Because it would literally take
forever. For example, you could be 25 years, 10 months, 2 days, 5
hours, 4 seconds, 4 milliseconds, 8 nanoseconds, 99 picoseconds, and so
on. You could turn age into a discrete variable, and then you could
count it.
Some models only make sense if the variables take on values from a
discrete set, often a subset of integers, whereas other models contain
variables that can take on any real value. Models with discrete
variables are discrete optimization problems; models with continuous
variables are continuous optimization problems. Continuous optimization
problems tend to be easier to solve than discrete optimization
problems; the smoothness of the functions means that the objective
function and constraint function values at a point x can be used to
infer information about points in a neighborhood of x. However,
improvements in algorithms coupled with advancements in computing
technology have dramatically increased the size and complexity of
discrete optimization problems that can be solved effectively.
Continuous optimization algorithms are important in discrete
optimization because many discrete optimization algorithms generate a
sequence of continuous subproblems.
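The contrast can be sketched on a toy objective (the function and its minimizer at 2.3 are illustrative choices): a discrete decision variable is optimized by enumerating a finite candidate set, while a continuous one exploits smoothness to home in on the exact minimizer.

```python
def f(x):
    """Toy objective with a continuous minimizer at x = 2.3."""
    return (x - 2.3) ** 2 + 1.0

def discrete_minimise(candidates):
    """Discrete decision variable: enumerate the finite candidate set."""
    return min(candidates, key=f)

def continuous_minimise(lo, hi, tol=1e-8):
    """Continuous decision variable: ternary search shrinks the
    interval using function values, relying on f being unimodal."""
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

x_disc = discrete_minimise(range(0, 6))   # best integer candidate: 2
x_cont = continuous_minimise(0.0, 6.0)    # converges to about 2.3
```

The discrete search can do no better than the nearest candidate, while the continuous search reaches the true minimizer; this is the smoothness advantage described above.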

QUESTION 10. HOW TO GET THE RISK ANALYSIS ACCEPTED IN AN ORGANIZATION? EXPLAIN CRITICALLY.

Management is responsible for defining an organization's acceptable
level of risk; the security practitioner should understand this process
and be able to indicate to management how underlying security threats
can negatively affect business objectives.
Acceptance of the residual risks that result from risk treatment has to
take place at the level of the executive management of the
organization. To this extent, risk acceptance concerns the
communication of residual risks to the decision makers.
Once accepted, residual risks are considered as risks that the
management of the organization knowingly takes. The level and extent of
accepted risks comprise one of the major parameters of the risk
management process. In other words, the higher the accepted residual
risks, the less work is involved in managing risks (and inversely).
This does not mean, however, that once accepted, the risks will not
change in forthcoming iterations of the risk management life cycle. In
the regular phases and activities of the risk management processes, the
severity of these risks will be measured over time. In the event that
new assertions are made or changing technical conditions are
identified, risks that have been accepted need to be reconsidered.
Risk acceptance is considered to be an optional process, positioned
between risk treatment and risk communication. This process is seen as
an optional one because it can be covered by both the risk treatment
and risk communication processes. This can be accomplished by
communicating the outcome of risk treatment to the management of the
organization. One reason for explicitly mentioning risk acceptance is
the need to draw management's attention to this issue, which might
otherwise simply be a communicative activity.
In the attached inventories, risk acceptance has been included in the
evaluation of methods and tools, as it might be a decision criterion
for certain types of organizations (e.g., in the financial and
insurance sector, in critical infrastructure protection, etc.).
