
PROJECT REPORT ON

TIME SERIES DATA ANALYSIS

Submitted to:
Prof. Sajal Ghosh
Submitted by
Group 7
Saswat Ota, 29NMP91
Sumit Mathur, 29NMP85
Swaraj Dhar, 29NMP87
Tushar Prakash, 29NMP90

TABLE OF CONTENTS
1) Co-integration based trading strategies
   a. Co-integration
   b. VAR
2) Prediction of Household Consumption
   a. Regression Analysis
3) Application of ARIMA and GARCH models for forecasting the future share price of EMAMI Ltd
   a. ARIMA and GARCH models

Co-integration based trading strategies: A new approach to enhanced index tracking and statistical arbitrage
Introduction
Pairs trading is a trading strategy commonly used by hedge funds in which the portfolio manager forms a portfolio from pairs of securities (such as stocks, bonds, or commodities) that move closely together over a period of time.
When the spread between the two securities widens (divergence), the manager takes positions: a long position in the relatively undervalued security and a short position in the relatively overvalued security.
If the spread converges back to its benchmark over time, the manager closes both positions and realises a profit.
Pairs trading is a market-neutral strategy, as it acts only on the divergence and convergence between pairs of securities and not on the general economic situation.

Business Objectives
In this report we examine whether pairs of stocks that share a true long-run equilibrium (co-integration) yield a higher return than pairs that rely only on a more spurious relationship (correlation) when Pairs Trading is applied over the trading period 3 Nov 15 to 24 Nov 16.

2. Methodology and Data Description:


First, we calculate the correlation between all pairs in the formation window; the correlation portfolio is represented by the pairs with the highest correlation. For this we have taken the last one year of closing prices of the BSE and NSE indices listed below.

In the next step, we tried to find the co-integrated pairs among those with the highest correlation. We took every possible combination of indices with more than 95% correlation and then ran a co-integration test on each pair, as summarised in the table below (a screening sketch follows the table).

Pair with >95% correlation                Co-integration
Nifty Energy  and BSE Metal               NO
Nifty Energy  and BSE Oil and Gas         NO
Nifty 50      and BSE FMCG                YES
Nifty 50      and BSE Auto                NO
Nifty 50      and BSE Bank                YES
Nifty 50      and BSE 500                 NO
BSE FMCG      and BSE 500                 NO
BSE Auto      and BSE Bank                NO
BSE Auto      and BSE 500                 NO
BSE Bank      and BSE 500                 YES
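A quick way to reproduce this screening outside E-Views is sketched below in Python with pandas and statsmodels. The file name indices.csv and its column names are assumptions, and the Engle-Granger coint() test is used here as a simple screen, whereas the report itself applies the Johansen test in E-Views (Step 3 below).

import numpy as np
import pandas as pd
from itertools import combinations
from statsmodels.tsa.stattools import coint

px = pd.read_csv("indices.csv", index_col="Date", parse_dates=True)   # one column per index (assumed file)
logpx = np.log(px)
corr = logpx.corr()                                                    # pairwise correlation matrix

for a, b in combinations(logpx.columns, 2):
    if corr.loc[a, b] > 0.95:                                          # keep only pairs with >95% correlation
        t_stat, p_value, _ = coint(logpx[a], logpx[b])                 # Engle-Granger co-integration test
        verdict = "co-integrated" if p_value < 0.05 else "not co-integrated"
        print(a, b, round(corr.loc[a, b], 3), verdict)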

Co-integration Strategy:
Before moving further into the process, let us analyse the data graphically. Although the series show a volatile pattern with correlated movements, we cannot rely on the graphical representation alone.

Step 1: Log-transform the series

Convert each series into its log series and check whether it is stationary or non-stationary, as the VAR model works only with stationary series. In E-Views:

series lnifty_energy=log(nifty_energy)
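The same transformation in Python (a minimal sketch; the file and column names are assumptions carried over from the screening sketch above):

import numpy as np
import pandas as pd

px = pd.read_csv("indices.csv", index_col="Date", parse_dates=True)
lnifty_energy = np.log(px["nifty_energy"])     # log series, mirroring the E-Views command above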

Step 2: Unit root test

To check for stationarity in the series.

2.a: At level, the unit root test has been carried out at (Level, Trend + Intercept); the series still have a unit root at level.

2.b: Stationarity is achieved on the first difference (in all 7 indices). As per the output, both series of the pair are of order I(1). A unit-root test sketch follows.
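The unit root (ADF) test for one series can be sketched as follows with statsmodels; the column name is an assumption, and a p-value above 0.05 at level together with one below 0.05 on the first difference is consistent with I(1).

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

px = pd.read_csv("indices.csv", index_col="Date", parse_dates=True)
s = np.log(px["nifty_50"])

adf_level = adfuller(s, regression="ct")                  # at level: trend + intercept
adf_diff = adfuller(s.diff().dropna(), regression="c")    # first difference: intercept only
print("level p-value:", adf_level[1])
print("first-difference p-value:", adf_diff[1])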

Step 3: Lag Length and Co-integration Test:

We also found that the optimal lag length is 4, so the VAR model was run again with the updated lag length before testing for the long-run relationship.

H0: There is no co-integration, i.e. there is no long-run relationship between the variables.

As the trace statistic is greater than the critical value, H0 is rejected by the trace test at 'None'.
Similarly, the Max-Eigen statistic is greater than its critical value, so H0 is rejected by the maximum-eigenvalue test as well. This means there is a long-run relationship between the two variables.
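A sketch of the lag-length selection and the Johansen test in Python; coint_johansen from statsmodels stands in for the E-Views dialog, the column names are assumptions, and k_ar_diff = 3 corresponds to a VAR lag length of 4.

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.vector_ar.vecm import coint_johansen

px = pd.read_csv("indices.csv", index_col="Date", parse_dates=True)
data = np.log(px[["nifty_50", "bse_fmcg"]])

print(VAR(data).select_order(maxlags=8).summary())        # AIC/BIC/HQ lag-length criteria

jres = coint_johansen(data, det_order=0, k_ar_diff=3)     # intercept; lags in differences = VAR lag - 1
print("trace statistics:", jres.lr1, "critical values:", jres.cvt)
print("max-eigen statistics:", jres.lr2, "critical values:", jres.cvm)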

Step 4: Vector Error Correction Model (VECM):
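A sketch of the corresponding vector error-correction estimation with statsmodels' VECM class; the co-integration rank of 1, the lag order, and the column names are assumptions based on the results above.

import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

px = pd.read_csv("indices.csv", index_col="Date", parse_dates=True)
data = np.log(px[["nifty_50", "bse_fmcg"]])

res = VECM(data, k_ar_diff=3, coint_rank=1, deterministic="co").fit()
print(res.alpha)        # speed-of-adjustment (error-correction) coefficients
print(res.summary())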

Step 5: Granger Causality Test:
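Short-run causality can be sketched with the Granger test on the differenced log series; 4 lags mirrors the lag length found earlier, and the test asks whether the second column Granger-causes the first.

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

px = pd.read_csv("indices.csv", index_col="Date", parse_dates=True)
d = np.log(px[["bse_fmcg", "nifty_50"]]).diff().dropna()

grangercausalitytests(d, maxlag=4)      # does Nifty 50 Granger-cause BSE FMCG?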

Conclusion
Long-run causality:
Co-integration exists between the two series Nifty 50 and BSE FMCG, and long-run causality runs from Nifty 50 to BSE FMCG with a speed of adjustment of 79%.
We have taken one year of data, so the co-integrating relationship is expected to pull the pair back together within about 9 months.
Short-run causality:
From the Granger causality test, no short-run causality exists between Nifty 50 and BSE FMCG.
From this demonstration, a hedger can take a position when the spread reaches 1/2 sigma, selling the overpriced index and buying the underpriced index, and should hold the position as a long-term investment of up to 9 months, closing it when mean reversion (convergence of the co-integrated pair) occurs; a sketch of this rule follows the conclusion.
In the same way, we also found that Nifty 50 and BSE Bank are co-integrated, with causality running from Nifty 50 to BSE Bank, and a position can be taken accordingly to earn a return.
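A minimal sketch of the 1/2-sigma spread rule described above. The hedge ratio from a simple OLS of one log index on the other is our own assumption, as are the file and column names; the report itself does not specify how the spread is constructed.

import numpy as np
import pandas as pd
import statsmodels.api as sm

px = pd.read_csv("indices.csv", index_col="Date", parse_dates=True)
y, x = np.log(px["bse_fmcg"]), np.log(px["nifty_50"])

ols = sm.OLS(y, sm.add_constant(x)).fit()
spread = y - ols.params["const"] - ols.params["nifty_50"] * x
z = (spread - spread.mean()) / spread.std()

signal = pd.Series(0, index=z.index)
signal[z > 0.5] = -1      # spread wide: short BSE FMCG, long Nifty 50
signal[z < -0.5] = 1      # spread narrow: long BSE FMCG, short Nifty 50
# positions are closed when z reverts towards zero (mean reversion)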

Regression Analysis: Prediction of Household Consumption

Objective: We are trying to ascertain whether household consumption depends on GDP, savings, spending, and capital investment.

Methodology & Data Description: We use regression analysis to find out on which of the variables (GDP, savings, spending, and capital investment) consumption depends. We have extracted data on Indian household consumption from 2000 to 2015 and test its dependency on GDP, savings, spending, and capital investment.

Empirical Analysis:
H0: Household consumption has no dependency on GDP, savings, spending, and capital investment.
The DW statistic is close to 1.7, i.e. near 2, so we can assume there is no autocorrelation.

Command 1: ls cn c gdp id sp sv
Output screen 1

Adding the lagged dependent variable cn(-1), the DW statistic is 2.22, which confirms no autocorrelation.
Command 2: ls cn c gdp id sp sv cn(-1)
Output screen 2

There is no autocorrelation in the data. The next step is to address multicollinearity.


After removing variables one by one, we found that two variables have significant p-values, which indicates that the multicollinearity among the explanatory variables has been removed.
Command 3: ls cn c gdp sp
Output screen 3
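The same flow in Python with statsmodels (a sketch mirroring the E-Views commands above; households.csv and its column names cn, gdp, id, sp, sv are assumptions):

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

df = pd.read_csv("households.csv")        # annual data, 2000-2015 (assumed file)

m1 = sm.OLS(df["cn"], sm.add_constant(df[["gdp", "id", "sp", "sv"]])).fit()   # Command 1: full model
print(durbin_watson(m1.resid))            # close to 2 => no autocorrelation

m3 = sm.OLS(df["cn"], sm.add_constant(df[["gdp", "sp"]])).fit()               # Command 3: reduced model
print(m3.summary())                       # p-values and R-squared of the reduced model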

Conclusion: Consumption depends on GDP and spending. The model fits the data with an R-squared of about 99.6%.

Application of ARIMA and GARCH models for forecasting the future share price of EMAMI Ltd
Introduction
The three main purposes of forecasting volatility are for risk management, for asset allocation, and for taking
bets on future volatility. A large part of risk management is measuring the potential future losses of a portfolio
of assets, and in order to measure these potential losses, estimates must be made of future volatilities and
correlations.
In asset allocation, the GARCH approach of minimizing risk for a given level of expected returns has become a
standard approach, and of course an estimate of the variance-covariance matrix is required to measure risk.
Perhaps the most challenging application of volatility forecasting, however, is to use it for developing a
volatility trading strategy.
Option traders often develop their own forecast of volatility, and based on this forecast they compare their
estimate for the value of an option with the market price of that option.

Business Objective:
Here we demonstrate a model which helps to describe financial markets in which volatility can change, becoming more volatile during periods of financial crises or other significant events and less volatile during periods of relative calm and steady economic growth.

Methodology & Data Description:

We have taken the stock market data of EMAMI Limited and carried out seasonal ARIMA (SARIMA) and GARCH modelling to forecast its future share price.

Empirical Analysis:
Step 1: Identification:
i) The data was extracted from the BSE website and imported into the E-Views software.

ii) The data was visualised through the Graph option of the software.

There seems to be a trend in the data and perhaps a hint of a seasonal pattern, but this is just a visual impression, so we go on to the next step.
iii) The correlogram of the data was plotted at level.
The ACF shows the presence of non-stationarity in the data and the PACF shows an AR(1) pattern, but for confirmation we also have to carry out a unit root test.

Here we can see that the data has a unit root, i.e. it is non-stationary, so we repeat the unit root test on the first difference.

The first difference shows that there is no unit root, but its correlogram does not yet show white noise, so instead of detrending we go for deseasonalising.

The deseasonalised correlogram shows an AR(1), SAR(5), and SMA(5) signature (an identification sketch follows).
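The identification step can be sketched in Python as follows; emami.csv with Date and Close columns is an assumption, and a lag-5 (trading-week) seasonal difference is our reading of the SAR(5)/SMA(5) signature.

import numpy as np
import pandas as pd
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
from statsmodels.tsa.stattools import adfuller

close = pd.read_csv("emami.csv", index_col="Date", parse_dates=True)["Close"]
y = np.log(close)

plot_acf(y, lags=36)                                      # correlogram at level
plot_pacf(y, lags=36)
print("level p-value:", adfuller(y, regression="ct")[1])  # unit root test at level
d1 = y.diff().dropna()
print("1st-difference p-value:", adfuller(d1)[1])         # stationary after differencing
plot_acf(d1.diff(5).dropna(), lags=36)                    # correlogram after lag-5 seasonal differencing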

Stage 2: Estimation: we model the series based on the above signatures.

This shows that SAR(5) is insignificant, so we remove it from the estimation and redo the modelling.

Here we see that AR(1) and MA(5) are significant.
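A sketch of the estimation with statsmodels' SARIMAX. A seasonal period of 5 trading days is an assumption on our part; the regular part carries the AR(1) term and the seasonal part a single MA term at lag 5, with the insignificant seasonal AR term dropped.

import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

close = pd.read_csv("emami.csv", index_col="Date", parse_dates=True)["Close"]
y = np.log(close)

model = SARIMAX(y, order=(1, 1, 0), seasonal_order=(0, 1, 1, 5))   # AR(1) plus seasonal MA at lag 5
fit = model.fit(disp=False)
print(fit.summary())                                               # check the significance of each term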

Stage 3: Diagnostic Checking: We carry out residual diagnostic checks.

Here we can see that we have reached a white-noise process, so we move on to forecasting.
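Residual diagnostics can be sketched with the Ljung-Box test; large p-values mean no remaining autocorrelation, i.e. the residuals behave like white noise. The model specification repeats the assumptions of the estimation sketch.

import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.graphics.tsaplots import plot_acf

close = pd.read_csv("emami.csv", index_col="Date", parse_dates=True)["Close"]
fit = SARIMAX(np.log(close), order=(1, 1, 0), seasonal_order=(0, 1, 1, 5)).fit(disp=False)

resid = fit.resid.dropna()
plot_acf(resid, lags=36)                                      # residual correlogram
print(acorr_ljungbox(resid, lags=[10, 20], return_df=True))   # large p-values => white noise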
Stage 4: Static and Dynamic Forecasting:

Our M.A.P.E. is 1.22, so our forecast is accurate and we can move on to the out-of-sample forecast.
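Static (one-step-ahead) and dynamic forecasts plus the MAPE can be sketched as follows; the model specification and file name repeat the assumptions above, and the MAPE is computed on the back-transformed price level.

import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

close = pd.read_csv("emami.csv", index_col="Date", parse_dates=True)["Close"]
y = np.log(close)
fit = SARIMAX(y, order=(1, 1, 0), seasonal_order=(0, 1, 1, 5)).fit(disp=False)

static = fit.get_prediction(dynamic=False).predicted_mean               # static (one-step-ahead) forecasts
dynamic = fit.get_prediction(start=len(y) // 2, dynamic=True).predicted_mean   # dynamic forecast

mape = (np.abs((np.exp(static) - close) / close)).mean() * 100
print("MAPE (%):", round(mape, 2))

future = np.exp(fit.get_forecast(steps=30).predicted_mean)              # out-of-sample forecast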

Stage 5: Test of Volatility: Heteroskedasticity (ARCH) test at lag 1.

Heteroskedasticity test at lag 5.

Similarly, the ARCH test was run at lag 15 and we found that there is no ARCH effect.
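The ARCH-LM test and, for completeness, a GARCH(1,1) fit can be sketched as follows; het_arch comes from statsmodels and arch_model from the separate arch package, and a large ARCH-LM p-value would confirm the absence of an ARCH effect found in the report.

import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.stats.diagnostic import het_arch
from arch import arch_model

close = pd.read_csv("emami.csv", index_col="Date", parse_dates=True)["Close"]
fit = SARIMAX(np.log(close), order=(1, 1, 0), seasonal_order=(0, 1, 1, 5)).fit(disp=False)

lm_stat, lm_pval, f_stat, f_pval = het_arch(fit.resid.dropna(), nlags=5)
print("ARCH-LM p-value at lag 5:", lm_pval)                  # p > 0.05 => no ARCH effect

returns = 100 * np.log(close).diff().dropna()
garch = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")   # GARCH(1,1) on daily log returns
print(garch.summary())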

Conclusion:
1. In the ARIMA-GARCH framework we could only work on the time series once we reached a white-noise process, whereas VAR/VECM modelling can be applied to non-stationary series as well.
2. The GARCH model only shows the size, asymmetry, and persistence of the volatility, but it does not show whether error correction exists within the series or, if it does, how it behaves. VAR/VECM modelling gives the speed of adjustment of the error correction.
3. In this case we found that there is no volatility (ARCH) effect in the stock price.
