Modeling Financial Time Series
We attempt to model reality so that we can predict it. Statistical modeling, in addition to being of
central importance in statistical decision making, is critical in virtually any
endeavor, since essentially every analysis rests on a model of reality. As such,
modeling has applications in such disparate fields as marketing, finance,
and organizational behavior. Econometric modeling is particularly compelling
since, unlike many disciplines (such as normative economics), econometrics
deals with empirically testable claims, not with beliefs and opinions.
Time series analysis is an integral part of financial
analysis. The topic is interesting and useful, with applications to the
prediction of interest rates, foreign currency risk, stock market volatility,
and the like. There are many varieties of econometric and multivariate
techniques. Specific examples are regression and multivariate regression;
vector autoregressions; and cointegration in tests of present
value models. The next section presents the underlying theory on which statistical
models are predicated.
Financial Modeling: Econometric modeling
is vital in finance and in financial time series analysis. Modeling is,
simply put, the creation of representations of reality. It is important
to be mindful that, despite the importance of the model, it is in fact
only a representation of reality and not the reality itself. Accordingly,
the model must adapt to reality; it is futile to attempt to adapt reality
to the model. As representations, models cannot be exact. Models imply
that action is taken only after careful thought and reflection. This can
have major consequences in the financial realm. A key element of financial
planning and financial forecasting is the ability to construct models
showing the interrelatedness of financial data. Models showing correlation
or causation between variables can be used to improve financial decision making.
For example, one would be more concerned about the consequences for the
domestic stock market of a downturn in another economy if it can be shown
that there is a mathematically demonstrable causal impact of that nation's
economy on the domestic stock market. However, modeling is fraught with
dangers. A model which heretofore was valid may lose validity due to changing
conditions, thus becoming an inaccurate representation of reality and
adversely affecting the ability of the decision maker to make good decisions.
The examples of univariate and multivariate regression,
vector autoregression, and present value cointegration illustrate the
application of modeling, a vital dimension in managerial decision making,
to econometrics, and specifically the study of financial time series.
The provable nature of econometric models is impressive; rather than proffering
solutions to financial problems based on intuition or convention, one
can mathematically demonstrate that a model is or is not valid, or requires
modification. It can also be seen that modeling is an iterative process,
as the models must continuously change to reflect changing realities.
The ability to do so has striking ramifications in the financial realm,
where the ability of models to accurately predict financial time series
is directly related to the ability of the individual or firm to profit
from changes in financial scenarios.
Univariate and Multivariate Models:
The use of regression analysis is widespread in examining financial time
series. Some examples are the use of forward exchange rates as optimal
predictors of future spot rates; conditional variance and the risk premium
in foreign exchange markets; and stock returns and volatility. A model
that has been useful for this type of application is the GARCH-M
model, which incorporates computation of the mean into the GARCH (generalized
autoregressive conditional heteroskedasticity) model. This sounds complex
and esoteric, but it only means that the serially correlated errors and
the conditional variance enter the mean equation, and that the conditional
variance itself depends on a vector of explanatory variables. The GARCH-M
model has been further modified, a testament to finance practitioners'
recognition of the necessity of adapting the model to a changing reality. For example,
the model can now accommodate exponential (nonlinear) functions, and
is no longer constrained by nonnegativity restrictions on its parameters.
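As a concrete illustration, the GARCH variance recursion and the GARCH-M mean equation can be sketched in a few lines. All parameter values below are illustrative assumptions, not estimates from any data set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative GARCH(1,1) parameters (assumed, not estimated):
# sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1}
omega, alpha, beta = 0.05, 0.10, 0.85   # alpha + beta < 1 for stationarity

T = 1000
eps = np.zeros(T)       # return innovations
sigma2 = np.zeros(T)    # conditional variances
sigma2[0] = omega / (1 - alpha - beta)  # start at the unconditional variance

for t in range(1, T):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# In a GARCH-M specification the conditional variance also enters the
# mean equation, e.g. r_t = mu + lam * sigma2_t + eps_t
mu, lam = 0.0, 0.5      # lam is a hypothetical risk-premium coefficient
r = mu + lam * sigma2 + eps
```

The loop makes the "conditional variance enters the mean" idea explicit: each period's variance is built from last period's shock and variance, and the return series `r` is shifted by a premium proportional to that variance.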
One application of this model is the analysis of stock
returns and volatility. Traditionally, the belief has been that the variance
of portfolio returns is the primary risk measure for investors. However,
using extensive time series data, it has been shown that the relationship
between mean returns and return variance or standard deviation is weak;
hence the traditional two-parameter asset pricing models appear to be
inappropriate, and mathematical proof replaces convention. Since decisions
premised on the original models are necessarily suboptimal because the
original premise is flawed, it is advantageous for the finance practitioner
to abandon the model in favor of one with a more accurate representation
of reality.
Correct specification of a model is of paramount importance,
and a battery of misspecification testing criteria have been established.
These include tests of normality, linearity, and homoskedasticity, and
can be applied to a variety of models. A simple example which yields surprising
results is the Capital Asset Pricing Model, one of the cornerstones of
elementary economics. Application of the testing criteria to data on
companies' risk premia shows significant evidence of nonlinearity, nonnormality,
and parameter nonconstancy. The CAPM was found to be applicable for only
three of seventeen companies that were analyzed. This does not mean, however,
that the CAPM should be summarily rejected; it still has value as a pedagogic
tool, and can be used as a theoretical framework. For the econometrician
or financial professional, for whom the misspecification of the model
can translate into suboptimal financial decisions, the CAPM should be
supplanted by a better model, specifically one that reflects the time-varying
nature of betas. The GARCH-M framework is one such model.
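A minimal sketch of the CAPM regression, with rolling-window estimates to illustrate why the time-varying nature of betas matters. The returns are simulated and the coefficients assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated excess returns (illustrative, not real data)
T = 500
mkt = rng.normal(0.005, 0.04, T)      # excess market return
true_beta = 1.2                       # assumed "true" beta for the simulation
asset = 0.001 + true_beta * mkt + rng.normal(0, 0.02, T)

# CAPM regression: asset_t = alpha + beta * mkt_t + e_t
X = np.column_stack([np.ones(T), mkt])
coef, *_ = np.linalg.lstsq(X, asset, rcond=None)
alpha_hat, beta_hat = coef

# Rolling-window betas: if these drift over time in real data, a
# constant-beta CAPM is misspecified
window = 100
rolling = [np.linalg.lstsq(X[i:i + window], asset[i:i + window], rcond=None)[0][1]
           for i in range(0, T - window, 50)]
```

In this simulation the rolling betas hover around the assumed value; in actual return data, systematic drift in such estimates is the parameter nonconstancy the misspecification tests detect.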
Multivariate linear regression models apply the same
theoretical framework. The principal difference is the replacement of
the dependent variable by a vector. The estimation theory is essentially
a multivariate extension of that developed for the univariate case, and as
such can be used to test models such as the stock and volatility model
and the CAPM. In the case of the CAPM, the vector introduced is excess
asset returns at a designated time. One application is the computation
of the CAPM with time-varying covariances. Although in this example the
null hypothesis that all intercepts are zero cannot be rejected, the misspecification
problems of the univariate model still remain. Slope and intercept estimates
also remain the same, since the same regression appears in each equation.
Vector Autoregression: General regression
models assume that the dependent variable is a function of past values
of itself and past and present values of the independent variable. The
independent variable, then, is said to be weakly exogenous, since its
stochastic structure contains no relevant information for estimating the
parameters of interest. While the weak exogeneity of the independent variable
allows efficient estimation of the parameters of interest without any
reference to its own stochastic structure, problems in predicting the
dependent variable may arise if "feedback" from the dependent
to the independent variable develops over time. (When no such feedback
exists, it is said that the dependent variable does not Granger-cause
the independent variable.) Weak exogeneity coupled with Granger noncausality
yields strong exogeneity which, unlike weak exogeneity, is directly testable.
To perform the tests requires utilization of the dynamic structural equation
model (DSEM) and the vector autoregressive process (VAR). The multivariate
regression model is thus extended in two directions: by allowing simultaneity
between the endogenous variables in the dependent-variable vector, and by
explicitly considering the process generating the exogenous
independent variables. Results of this testing are useful in determination
of whether an independent variable is strictly exogenous or is predetermined.
Strict exogeneity can be tested in DSEMs by expressing each endogenous
variable as an infinite distributed lag of the exogenous variables. If
the independent variable is strictly exogenous, attention can be limited
to distributions conditional on the independent variable without loss
of information, resulting in simplification of statistical inference.
If the independent variable is strictly exogenous, it is also predetermined,
meaning that all of its past and current values are independent of the
current error term. While strict exogeneity is closely related to the
concept of Granger noncausality, the two concepts are not equivalent
and are not interchangeable.
It can be seen that this type of analysis is helpful
in verifying the appropriateness of a model as well as proving that, in
some cases, the process of statistical inference can be simplified without
losing accuracy, thereby both strengthening the credibility of the model
while increasing the efficiency of the modeling process. Vector autoregressions
can be used to calculate other variations on causality, including instantaneous
causality, linear dependence, and measures of feedback from the dependent
to the independent and from the independent to the dependent variables.
It is possible to proceed further with developing causality tests, but
simulation studies which have been performed reach a consensus that the
greatest combination of reliability and ease can be obtained by applying
the procedures described.
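The Granger noncausality test described above can be sketched as a comparison of restricted and unrestricted lag regressions. The data-generating process and the lag order below are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate x Granger-causing y (illustrative coefficients)
T = 600
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.3 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def rss(dep, regressors):
    """Residual sum of squares from an OLS fit."""
    coef, *_ = np.linalg.lstsq(regressors, dep, rcond=None)
    resid = dep - regressors @ coef
    return resid @ resid

p = 1  # lag order (assumed)
dep = y[p:]
ones = np.ones(T - p)
restricted = np.column_stack([ones, y[:-1]])            # lags of y only
unrestricted = np.column_stack([ones, y[:-1], x[:-1]])  # add lags of x

# F-test of the restriction that lagged x has no explanatory power for y
rss_r, rss_u = rss(dep, restricted), rss(dep, unrestricted)
q = 1                                    # number of restrictions
df = len(dep) - unrestricted.shape[1]    # residual degrees of freedom
F = ((rss_r - rss_u) / q) / (rss_u / df)
# A large F rejects Granger noncausality of x for y
```

Comparing `F` against the appropriate F-distribution critical value decides whether lagged values of the "independent" variable help predict the dependent one, which is exactly the feedback question raised above.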
Cointegration and Present Value Modeling:
Present value models are used extensively in finance to formulate models
of efficient markets. In general terms, a present value model for two
variables y_t and x_t states that y_t is a linear function of the present
discounted value of the expected future values of x_t, where the constant
term, the constant discount factor, and the coefficient of proportionality
are parameters that are either known or need to be estimated. Many financial
time series are integrated, and the presence of integrated variables affects
standard regression results and procedures of inference. If integrated
variables are not cointegrated, the concept of equilibrium loses its
practical implications and spurious regressions may occur; when they are
cointegrated, cointegrating vectors must be imposed on the model.
In present value analysis, cointegration can be used to define
the "theoretical spread" and to identify comovements of variables.
This is useful in constructing volatility-based tests.
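Using the notation above, with c, theta, and delta denoting the constant term, coefficient of proportionality, and discount factor (the values below are illustrative assumptions), the present value relation can be sketched as:

```python
import numpy as np

# Present value model: y_t = c + theta * sum_i delta^i * E_t[x_{t+i}]
c, theta, delta = 0.0, 1.0, 0.95   # illustrative parameter values

# Known (perfect-foresight) path of x over a long truncation horizon
x_future = np.full(400, 2.0)       # x expected to stay at 2.0
discounts = delta ** np.arange(len(x_future))
y = c + theta * np.sum(discounts * x_future)

# With constant expected x the infinite sum collapses to a closed form:
# y = c + theta * x / (1 - delta)
closed_form = c + theta * 2.0 / (1 - delta)
```

The truncated sum and the closed form agree to high precision here because delta < 1 makes far-future terms negligible; with a genuinely random x_t, the expectation inside the sum is what the econometric model must estimate.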
One such test is stock market volatility. Assuming cointegration,
second-order vector autoregressions are constructed, which suggest
that dividend changes are not only highly predictable but are Granger-caused
by the spread. When the assumed value of the discount rate is increased,
certain restrictions can be rejected at low significance levels. This
yields results showing an even more pronounced "excess volatility"
than that anticipated by the present value model. It also illustrates
that the model is more appropriate in situations where the discount rate
is higher. The implications of applying a cointegration approach to stock
market volatility testing for financial managers are significant. Of related
significance is the ability to test the expectations hypotheses of interest
rate term structure.
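An Engle-Granger-style check for cointegration can be sketched as follows. The simulated series and the informal residual check are illustrative assumptions, not the volatility test itself:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two integrated series sharing a common stochastic trend (illustrative)
T = 1000
x = np.cumsum(rng.standard_normal(T))    # random walk: integrated of order 1
y = 2.0 * x + rng.standard_normal(T)     # cointegrated with x: y - 2x is stationary

# Step 1: estimate the cointegrating vector by OLS
X = np.column_stack([np.ones(T), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef

# Step 2 (informal): a residual AR(1) coefficient well below 1 suggests
# the residual is stationary, i.e. y and x are cointegrated; in practice
# a formal unit-root test on the residuals would be used instead
rho = np.linalg.lstsq(resid[:-1, None], resid[1:], rcond=None)[0][0]
```

If the residual instead behaved like a random walk (rho near 1), the levels regression would be spurious; the stationary residual here is the "theoretical spread" idea in miniature, a stationary combination of two integrated series.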
Mean absolute error is a robust measure of forecast error. However,
one may also use the sum of errors to compare the success of each forecasting
model relative to a baseline, such as a random walk model, which is commonly
used in financial time series modeling.
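A sketch of this kind of comparison against a random walk baseline, on simulated data (the series and the competing "model" are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative series: a random walk with drift
T = 300
series = np.cumsum(0.1 + rng.standard_normal(T))

actual = series[1:]
# Random-walk baseline: tomorrow's forecast is today's value
rw_forecast = series[:-1]
# Hypothetical competing model: random walk plus the (known) drift
model_forecast = series[:-1] + 0.1

mae_rw = np.mean(np.abs(actual - rw_forecast))
mae_model = np.mean(np.abs(actual - model_forecast))

# Ratio of summed absolute errors gives a simple relative-performance
# measure: values below 1 favor the model over the baseline
relative = np.sum(np.abs(actual - model_forecast)) / np.sum(np.abs(actual - rw_forecast))
```

Reporting forecast accuracy relative to the random walk, rather than in isolation, guards against being impressed by a model that merely tracks the persistence already present in the series.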
