
January Exam

SECTION A - COMPULSORY SECTION
Question 1: Provide brief answers to all parts of this question.

(a) Explain what is meant by the following assertion: in the Classical Regression Model, the OLS estimators are Best Linear Unbiased Estimators. [7 marks]

The assumptions of the classical regression model are:
1) E(u_t) = 0: the errors have zero mean.
2) Var(u_t) = σ² < ∞: the variance of the errors is constant and finite over all values of x_t.
3) Cov(u_i, u_j) = 0: the errors are statistically independent of one another.
4) Cov(u_t, x_t) = 0: there is no relationship between the error and the corresponding x variable.
5) Although this is not required for the OLS estimators to be BLUE, u_t is normally distributed, which allows us to use hypothesis testing to make inferences about the parameters α and β.

If assumptions 1-4 hold, then the estimators α̂ and β̂ determined by OLS are known as Best Linear Unbiased Estimators (BLUE):
- Best: the OLS estimators have minimum variance among the class of linear unbiased estimators.
- Linear: α̂ and β̂ are linear estimators (linear combinations of the observations).
- Unbiased: on average, the estimates α̂ and β̂ equal their true values α and β.
- Estimator: α̂ (respectively β̂) is an estimator of the true value α (respectively β).

(b) Give an example of a seasonal effect that is believed to occur in financial data and how we might deal with it in a regression model. [7 marks]

Seasonal effects in financial markets have been widely observed. Seasonality is a component of a time series, defined as repetitive and predictable movement around the trend line with a period of one year or less; such effects are often termed calendar anomalies. An example is the January effect, a calendar-related anomaly in which the prices of financial securities increase in January; it has been documented particularly in the US. The most common theory to explain this anomaly is that individual investors, who are income-tax sensitive and who disproportionately hold small stocks, sell stock for tax reasons (such as to claim capital losses) and reinvest after the first of the year.
You can deal with this in a regression model using dummy variables, either intercept or slope dummies: enter a 1 for every time period in which the anomaly occurs (and a 0 otherwise), and the dummy will capture the seasonal pattern being observed.
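A minimal sketch of the intercept-dummy approach, using simulated monthly returns with an artificial January premium (all numbers are illustrative, not real data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 50
months = np.tile(np.arange(1, 13), n_years)          # month index 1..12
returns = rng.normal(0.005, 0.02, size=months.size)  # baseline monthly returns
returns[months == 1] += 0.01                         # add a 1% "January effect"

# Intercept dummy: 1 in January, 0 otherwise
jan = (months == 1).astype(float)
X = np.column_stack([np.ones_like(jan), jan])

# OLS: regress returns on a constant and the January dummy
beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
print(beta)  # beta[1] should be close to the 0.01 January premium
```

The dummy coefficient picks up the average extra January return, leaving the intercept as the typical non-January return.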

(c) Explain the use of the Autocorrelation and Partial Autocorrelation functions in identifying a univariate Time Series model. [7 marks]

The characteristics of an ARMA process will be a combination of those from the autoregressive (AR) and moving average (MA) parts. The acf alone can distinguish between a pure AR and a pure MA process. However, an ARMA process will have a geometrically declining acf, as will a pure AR process. So the pacf is useful for distinguishing between an AR(p) process and an ARMA(p, q) process: the former will have a geometrically declining autocorrelation function but a partial autocorrelation function which cuts off to zero after p lags, while the latter will have both autocorrelation and partial autocorrelation functions which decline geometrically.
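These identification patterns can be checked by simulation. A numpy-only sketch (parameter values are illustrative): the sample acf of an AR(1) with coefficient 0.7 decays geometrically, while its pacf is close to zero beyond lag 1.

```python
import numpy as np

rng = np.random.default_rng(1)
n, phi = 5000, 0.7
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()   # AR(1): y_t = 0.7 y_{t-1} + u_t

def acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return (x[k:] * x[:-k]).sum() / (x * x).sum()

def pacf(x, k):
    """Partial autocorrelation at lag k: the last coefficient from
    regressing x_t on a constant and x_{t-1}, ..., x_{t-k}."""
    X = np.column_stack([x[k - j:len(x) - j] for j in range(1, k + 1)])
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

print([round(acf(y, k), 2) for k in (1, 2, 3)])    # roughly 0.7, 0.49, 0.34
print(round(pacf(y, 1), 2), round(pacf(y, 2), 2))  # roughly 0.7 and 0.0
```

The geometric decay of the acf (0.7, 0.7², 0.7³, ...) together with the pacf cutoff after lag 1 is exactly the AR(1) signature described above.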

(d) Why is autocorrelation in a regression model more serious if one of the regressors is a lagged dependent variable? [7 marks]

Inclusion of lagged values of the dependent variable violates the assumption that the explanatory variables are non-stochastic: if the errors are autocorrelated, the lagged dependent variable is correlated with the error term. A model with many lags may have solved a statistical problem (autocorrelated residuals) at the expense of creating an interpretational one. Note that if there is still autocorrelation in the residuals of a model including lags, then the OLS estimators will not even be consistent, let alone efficient.
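The inconsistency can be seen in a small Monte Carlo sketch (hypothetical parameter values). With y_t = 0.5 y_{t-1} + u_t and AR(1) errors u_t = 0.5 u_{t-1} + e_t, the OLS slope of y_t on y_{t-1} converges not to the true 0.5 but to (ρ + φ) / (1 + ρφ) = 0.8, no matter how large the sample:

```python
import numpy as np

rng = np.random.default_rng(2)
n, rho, phi = 100_000, 0.5, 0.5
u = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    u[t] = phi * u[t - 1] + rng.normal()   # autocorrelated errors
    y[t] = rho * y[t - 1] + u[t]           # lagged dependent variable model

# OLS slope of y_t on y_{t-1}
x, z = y[:-1], y[1:]
slope = np.cov(x, z)[0, 1] / np.var(x)
print(round(slope, 2))  # about 0.8, not the true rho = 0.5
```

Because y_{t-1} depends on u_{t-1}, and u_t is correlated with u_{t-1}, the regressor is correlated with the error, so no amount of data cures the bias.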

(e) What is heteroskedasticity and what are the estimation and inference problems associated with it?

If the errors do not have a constant variance (as stated in assumption 2 in part (a)), they are said to be heteroskedastic. To consider one illustration, suppose the residuals u_t have been calculated and then plotted against one of the explanatory variables, x2t. [DRAW GRAPH FROM SLIDE 110] It is clearly evident that the errors in that figure are heteroskedastic.

The OLS estimators will still give unbiased (and also consistent) coefficient estimates, but they are no longer BLUE; that is, they no longer have the minimum variance among the class of linear unbiased estimators. The error variance is used to calculate the coefficient standard errors, so the usual formulae no longer hold. If OLS is still used in the presence of heteroskedasticity, the standard errors could therefore be wrong and any inferences made could be misleading.
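Heteroskedasticity can also be tested formally rather than just eyeballed. A numpy sketch of a White-style test on simulated data (the regressor name x2 mirrors the x2t in the notes; all numbers are illustrative): regress the squared residuals on the regressor and its square, and compare n·R² with a chi-square critical value.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1000
x2 = rng.uniform(1, 10, n)
u = rng.normal(0, x2)            # error standard deviation grows with x2
y = 1.0 + 0.5 * x2 + u

# First-stage OLS and residuals
X = np.column_stack([np.ones(n), x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Auxiliary regression: squared residuals on a constant, x2 and x2^2
Z = np.column_stack([np.ones(n), x2, x2 ** 2])
g, *_ = np.linalg.lstsq(Z, resid ** 2, rcond=None)
fitted = Z @ g
u2 = resid ** 2
r2 = 1 - ((u2 - fitted) ** 2).sum() / ((u2 - u2.mean()) ** 2).sum()

lm = n * r2                      # ~ chi-square(2) under homoskedasticity
print(lm > 5.99)                 # 5.99 is the 5% critical value for chi2(2)
```

A large n·R² means the squared residuals are predictable from the regressors, i.e. the error variance is not constant, so the null of homoskedasticity is rejected.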

SECTION A - COMPULSORY SECTION
Question 1: Provide brief answers to all parts of this question.

(a) What is heteroskedasticity? What are the potential problems of the latter in estimation and inference and how can these problems be remedied?

Same as above. Solutions:
- Transforming the variables into logs, or deflating by some other measure of size.
- Using heteroskedasticity-consistent standard error estimates, thereby making hypothesis testing more conservative.
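The second remedy can be sketched directly in numpy: White's heteroskedasticity-consistent (HC0) "sandwich" standard errors versus the conventional OLS formula, on simulated data where the error variance grows with the regressor (all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
x = rng.uniform(1, 10, n)
u = rng.normal(0, x)             # heteroskedastic: sd of error grows with x
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
resid = y - X @ beta

# Conventional OLS variance: s^2 (X'X)^{-1}
s2 = resid @ resid / (n - 2)
se_ols = np.sqrt(np.diag(s2 * XtX_inv))

# HC0 sandwich: (X'X)^{-1} X' diag(u_i^2) X (X'X)^{-1}
meat = X.T @ (X * resid[:, None] ** 2)
se_hc0 = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

print(se_ols.round(3), se_hc0.round(3))
```

Here the HC0 slope standard error comes out larger than the conventional one, which is exactly why inference becomes more conservative: the usual formula understates the uncertainty when the error variance rises with x.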

[7 marks]

(b) Discuss the causes of serially correlated errors in financial time series data. [7 marks]

Serial correlation arises when working with time-series data. Examples of causes:
- Business-cycle inertia, which causes positive autocorrelation in macroeconomic time series.
- Overlapping effects of shocks: the effect of a shock at time t persists into t+1 and beyond.
- Model misspecification: omitting relevant variables which are themselves correlated across time (inertia).

(c) What is a random walk process? Illustrate the latter with an example. [7 marks]

A random walk is defined as a process where the current value of a variable is composed of its past value plus an error term defined as white noise (a normal variable with zero mean and variance one); for example, y_t = y_{t-1} + u_t.

(d) Explain what the autocorrelation function is and its use in time series modelling.

The autocorrelation function (acf) measures the correlation between the current observation and the observation k periods ago; unlike the pacf, it does not control for observations at intermediate lags (i.e. all lags < k). The characteristics of an ARMA model will be a combination of the AR and MA parts. The acf alone can distinguish between a pure autoregressive (AR) and a pure moving average (MA) process.

[7 marks]

(e) Describe the concept of seasonality. Give an example of seasonality in financial data and illustrate how to account for it in a model. Same as above. [7 marks] [Total - 35 marks]

Other things about ARMA and (Partial) Autocorrelation:

Characteristics of AR:
- A geometrically decaying acf
- Number of non-zero points of pacf = AR order

Characteristics of MA:
- Number of non-zero points of acf = MA order
- A geometrically decaying pacf

Characteristics of a combination of AR & MA (ARMA):
- A geometrically decaying acf
- A geometrically decaying pacf
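The MA row of the table can be verified by simulation (numpy, illustrative parameters): a simulated MA(1) has a non-zero sample acf at lag 1 only, matching "number of non-zero points of acf = MA order".

```python
import numpy as np

rng = np.random.default_rng(5)
n, theta = 20000, 0.6
e = rng.normal(size=n + 1)
ma = e[1:] + theta * e[:-1]      # MA(1): y_t = e_t + 0.6 e_{t-1}

def acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return (x[k:] * x[:-k]).sum() / (x * x).sum()

# Theoretical acf(1) = theta / (1 + theta^2) ≈ 0.44; acf(k) = 0 for k > 1
print(round(acf(ma, 1), 2), round(acf(ma, 2), 2))
```

The acf cuts off sharply after lag 1 (the MA order), whereas for the AR and ARMA rows it would instead decay geometrically.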

Possible exam questions:

(d) Explain what the autocorrelation function is and its use in time series modelling.

The partial autocorrelation function (pacf) measures the correlation between an observation k periods ago and the current observation, after controlling for observations at intermediate lags (i.e. all lags < k). The characteristics of an ARMA model will be a combination of the AR and MA parts. The pacf is useful for distinguishing between an AR(p) process and an ARMA(p, q) process, since only the former's pacf cuts off to zero after p lags.

(b) Discuss the causes of spatially correlated errors in financial time series data.

Spatial correlation occurs when working with cross-sectional data. Its main cause is omitted variables or shocks common to all groups (countries, companies, households).
