
Analysis of the Indian Capital Market: Pre and Post Liberalization*

J. K. Nayak1
Abstract The new issue market, also known as the primary market, has undergone exponential growth in the last decade or so. The paid-up capital as well as the number of listed companies has risen sharply. Undoubtedly, this is an indication of a healthy trend in the development of the nation. But the moot question is whether the growth of the new issue market has been accompanied by a decline in investor grievances relative to the pre-liberalization period, or whether grievances have been on the rise. In this paper an attempt has been made to identify the common grievances and the regulatory measures undertaken to provide protection. The paper takes an empirical approach.

1. INTRODUCTION

The capital market is the barometer of any country's economy and provides a mechanism for capital formation. Across the world there has been a transformation in financial intermediation from a credit-based financial system to a capital market based system, partly due to a shift in financial policies from financial repression (credit controls and other modes of priority sector promotion) to financial liberalization. This led to an increasing significance of capital markets in the allocation of financial resources. The Indian capital market also went through a major transformation after 1992, and the Sensex was hovering around the 10,000 mark by the end of 2005, which seemed a dream just a few years back, although the beginning of such an initiative could be seen from the second half of the 1980s. Since then the market has been growing in leaps and bounds and has aroused the interest of investors. One reason for this development was the increasing uncertainty caused by liberalization and the standardization of the prudential requirements of the banking sector for global integration of the Indian financial system. Further, the rise in banks' non-performing assets led to a decrease in credit from banks to the commercial sector. Liberalization and the opening of the gates led to an expansion of three broad channels of financing for the private sector, namely: a) the domestic capital market, b) the international capital market (American depository receipts and global depository receipts) and c) foreign direct investment.

* Received May 13, 2006; Revised June 22, 2006.
1. Lecturer (Senior Grade), Regional College of Management, Bhubaneswar, India; e-mail: jognayak@yahoo.com


The efficiency of a capital market can be defined in terms of its ability to reflect the impact of all relevant information in the prices of securities through the actions of a large number of profit-driven individuals acting independently of one another, and in the Indian context this efficiency grew tremendously. The number of issues listed before and after 1991 has grown exponentially. Some of the major reasons for this growth are the advent of SEBI, the abolition of the Capital Issues (Control) Act, new regulations for the protection of investors, online trading, depositories and the credit rating system. Here, an attempt has been made to highlight the major problems linked with the new issue market and the problem-solving mechanisms built to take care of investors. This paper highlights and asserts that the domestic capital market, especially the new issue or primary market, became the predominant channel for financing corporate sector needs in India. The existing and past problems involved in the equity market have been examined through empirical research, and the steps taken by the government for protection, as well as the satisfaction level of investors, have been studied.
2. LITERATURE REVIEW

A developed securities market enables all individuals, no matter how limited their means, to share the increased wealth provided by competitive private enterprise (Jenkins 1991). The players in the capital market include small investors, mutual funds, banks, companies and financial institutions. Equity trading in India was dominated by floor-based trading on India's oldest exchange, the Bombay Stock Exchange (BSE), up to late 1994. This process had several problems. The floor was non-transparent and illiquid. The non-transparency of the floor led to rampant abuse, such as investors being charged higher prices for purchases than the prices actually traded on the floor, and it was not possible for investors to cross-check these prices. Investors were also forced to pay high brokerage fees to undercapitalized individual brokers, who had primitive order processing systems. Gupta (1992) concludes that a) the Indian stock market is highly speculative, b) Indian investors are dissatisfied with the services provided to them by brokers, c) margins levied by the stock exchanges are inadequate and d) liquidity in a large number of stocks in Indian markets is very low. This situation was transformed by the arrival of the new National Stock Exchange (NSE) in 1994. NSE was owned by a consortium of government-owned financial institutions. It built an electronic order-matching system, in which computers matched orders without human intervention, and used satellite communications to make this trading system accessible from locations all over the country. Trading in equities commenced at NSE in November 1994. From October 1995 onwards (11 months after commencement), NSE has been India's largest exchange. There are few other parallels to this episode


internationally, where a second exchange displaced the entrenched liquidity of an existing market within a year (Shah & Thomas 2000). The removal of the License Raj, especially in areas related to private sector financing options, led to a direct increase in market-based financing of industrial investments through an expansion of three broad channels: FDI, global depository receipts (GDRs) in the international market, and the capital market, which consists of the secondary market and the new issue market. One important factor that led to the growth of the new issue market was the growing significance of financial assets, with the increase in the saving rate and the monetisation of the economy. Recently the government and SEBI have initiated a number of healthy measures to develop the capital market, among them:
- grant of legal status to SEBI for protecting investors' interest and regulating the market;
- free pricing of issues;
- permission to foreign institutional investors (FIIs) to enter the primary and secondary markets;
- equity issues in foreign markets by Indian companies through ADRs and GDRs;
- dematerialization of shares;
- compulsory credit rating;
- promotion of the concept of corporate governance;
- permission for buy-back of shares;
- participation of foreign partners with equity in all industries; and
- reduction in interest rates.

The outcome of this revamping of the capital market for the new issue market (NIM) is that the total amount of proposed investments through the NIM rose to Rs. 23,357 crore in the 1980s from Rs. 992 crore in the 1970s and a mere Rs. 285 crore in the 1950s (see Table 1).

Table 1: New capital raised from the market by public limited companies
Sl. No.  Period   Capital raised (Rs. crore)  Yearly average  Growth rate (per cent)
1.       1951-60  285                         28.5            -
2.       1961-70  728                         72.8            155.4
3.       1971-80  992                         99.2            36.3
4.       1981-90  23,357                      2,335.70        2254.5
5.       1991-99  1,06,799                    13,349.80       457.2
Source: based on data in The Report on Currency and Finance, RBI, India, various years.

The Society for Capital Market Research and Development, which carries out periodic surveys to estimate the number of investors, found that the number rose steadily after 1990 (see Table 2).


Table 2: Number of investors
Sl. No.  Year  No. of investors (in lakh)
1.       1990  90-100
2.       1993  140-150
3.       1997  200
4.       1999  128
Source: The Report on Currency and Finance, RBI, India, 1990 to 1999.

One thing is clear from the table: the number of investors grew after 1990 but then declined. The free pricing regime which followed the abolition of the office of the Controller of Capital Issues in 1992 enabled issuers to access the market freely and produced a flurry of activity in the primary market, attracting a large number of households to invest in equity issues. But there was also a plethora of poor quality public issues, both at par and at premium. These issues saw a rapid decline in valuations on the stock market when trading commenced, and there was a substantial loss of wealth for the households who had invested in them. In some cases companies vanished completely after gobbling up people's hard-earned money; such companies were termed fly-by-night operators. By 1995-96 there was a worrisome erosion of investor confidence, and investors turned away from direct investment in equity shares to safer fixed income instruments and bank deposits. Primary market activity diminished significantly and the market remained dull till about the third quarter of 1999. The high interest rates prevailing since 1995-96 further encouraged this trend. In order to regain investor confidence a number of initiatives were taken and SEBI was bestowed with more power. Some of the major crises which occurred in the equity market during this period were the following:

In 1995, the BSE closed for three days in the context of payment problems on M.S. Shoes. In 1997 there was a scandal in which the CRB mutual fund defrauded its investors, casting doubt upon the supervisory and enforcement capacity of SEBI and the RBI. In the summer of 1998 there was an episode of market manipulation involving three stocks (BPL, Sterlite and Videocon); in this case a variety of questionable methods were employed at the BSE to avoid a failure of settlements, actions which partly led to the dismissal of the BSE president by SEBI. The most recent crisis, in March 2001, led to the second dismissal of a BSE president, the dismissal of all elected directors on the BSE and the Calcutta Stock Exchange (CSE), and payment failures on the CSE (Thomas 2001).

3. OBJECTIVES OF THE STUDY


The major objective of this study is to identify the changes that have occurred among investors after liberalization, and to examine whether the changes in capital market policies and the new protective measures have been effective in raising investor confidence. Specifically:
1. How risky do investors perceive the capital market to be after the strengthening of SEBI (1995-96)?
2. What have been the major changes in the problems associated with brokers?




3. How has the transaction system changed after the introduction of dematerialization of shares?
4. What has been the effect of SEBI on insider trading?
5. How do investors react to the premium charged on primary issues, and how well are they informed about it?
4. METHODOLOGY


The method followed for this study was the survey method. A questionnaire was prepared after an extensive literature review on investor grievances and protection. The questionnaire was sent for cross-checking of reliability and validity to experts, mostly academicians, and also to a few corporate respondents, including five bank employees and five employees of share-broking agencies. Some questions were reworded to improve validity and clarity, and the pretest questionnaires were not used for subsequent analysis. After final ratification, the survey instrument was administered to people who had made some investments in the equity market or had some knowledge about it. The samples were chosen by non-probabilistic convenience sampling. The sample size was ninety-nine, out of which nineteen questionnaires were rejected for lack of proper information. This size was maintained due to time and cost constraints. A five-point Likert scale was used, where 1 = not at all, 2 = slightly, 3 = moderately, 4 = much and 5 = very much.

After collection, the data were edited, coded and analyzed using the SPSS package. Most of the respondents were service holders (67.5%), followed by businessmen (25%), housewives (5%) and students (2.5%). By age group, 22.5% of respondents were 20-30, 37.5% were 30-40, 33.8% were 40-50, 5.0% were 50-60 and 1.3% were 60-70. By marital status, 80% of the people surveyed were married and the remaining 20% unmarried (see Table 3).

Table 3: Respondents' profile
                N   Minimum  Maximum  Sum  Mean  Std. Deviation
Occupation      80  1        4        148  1.85  .62
Age             80  1        5        180  2.25  .91
Marital status  80  0        1        64   .80   .40

The preferred modes of investment, in descending order, were equity, banks, mutual funds and others. The investment patterns show that a large number of people have invested in equities. This suggests that government policies after liberalization have been beneficial for the equity market: investors' faith has increased and so has their risk-taking ability. Investment in banks ranked second, which is a little surprising, since banks have been the largest avenue for investment in India for ages. Mutual funds followed, and lastly other avenues such as post office schemes (see Table 4).

Table 4: Preference of investment
             N   Minimum  Maximum  Mean  Std. Deviation
Equity       80  0        1        .80   .40
Bank         80  0        1        .74   .44
Mutual fund  80  0        1        .53   .50
Other        80  0        1        .39   .49

5. STATISTICAL ANALYSIS

The reliability of the scales for investor grievances and capital market issues was evaluated using Cronbach's alpha. If internal consistency is high (above 0.70), the scale items have a strong relationship with each other. It is desirable that alpha be above 0.70; however, alpha levels between 0.50 and 0.60 are acceptable for exploratory research (Churchill, 1979). For this study, coefficient alpha levels ranged between 0.54 and 0.69. The alpha value increased when some items, such as risk involved, amount of knowledge and degree of happiness, were deleted.
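As a point of reference, the reliability coefficient used here can be computed outside SPSS as well. The following is a minimal Python sketch of Cronbach's alpha; the Likert scores below are made-up illustrations, not the study's data.

```python
# Minimal sketch of Cronbach's alpha for a set of Likert-scale items.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert scores (e.g. 1-5)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: 6 respondents answering 4 grievance items.
scores = np.array([[4, 5, 4, 3],
                   [3, 3, 2, 3],
                   [5, 4, 4, 4],
                   [2, 2, 3, 2],
                   [4, 4, 5, 4],
                   [3, 2, 2, 3]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```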
5.1 Descriptive Statistics

Table 5 gives the mean and standard deviation of the variables used in the study. The mean responses for the 11 variables ranged from 0.91 to 3.55, though a higher mean cannot be interpreted as statistically more important than another. Surprisingly, issues such as transfer of share certificates, delay in receipt of dividends and insider trading, which used to be serious issues earlier, did not show up as the top ones. The findings were quite encouraging, since they depict a positive attitude of investors towards the equity market. One thing that could be drawn from this study is that the problems were mostly broker related, and therefore that is one area where reforms are required. The investors felt that the brokerage charged is still very high and that the amount of knowledge available on the equity market was not satisfactory; investors, it appears, need to be educated more (see Table 5).

Table 5: Investors' perception of the capital market
                 N   Minimum  Maximum  Mean  Std. Deviation
Brokerage        80  1        5        3.55  1.05
Premium          80  1        5        3.43  1.05
Broker problems  80  1        5        3.35  1.03
Transfer         80  1        5        3.10  1.05
Odd lot          80  1        5        3.09  1.07
Education        80  1        5        3.08  1.16
Risky            80  1        5        3.00  1.19
Insider trade    80  1        5        2.94  .90
Delay            80  1        5        2.84  .85
Non-receipt      80  1        5        2.60  1.13
Knowledge        80  0        1        .91   .28


5.2 Correlation Analysis

The correlation matrix was drawn to find the degree of association among the variables (see Table 6). From the correlation matrix it was evident that education, broker-related problems, insider trading, delay and non-receipt of dividends were positively correlated with all the other variables. Brokerage with premium, brokerage with odd lot, and broker problems with insider trading were strongly correlated. Non-receipt of dividends was correlated with delay and odd lots. Delay was also positively related to transfer of shares, insider trading and broker problems. Transfer of shares was negatively related with the premium charged on issues. Premium charged was positively related with insider trading and education of investors. These findings are consistent with previous studies on investor grievances.

Table 6: Correlation among capital market variables
(Correlation matrix of non-receipt, delay, odd lot, transfer, premium, insider trading, brokerage, broker problems and education. Among the significant coefficients, non-receipt correlates with delay at .314** and with odd lot at .312**; the remaining significant associations are those discussed above.)
Note: values less than 0.1 have been omitted. ** Correlation is significant at the 0.01 level (2-tailed). * Correlation is significant at the 0.05 level (2-tailed).
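For readers who want to reproduce this kind of matrix, the sketch below computes pairwise Pearson correlations with two-tailed p-values and suppresses coefficients below 0.1, as the paper's table does. The DataFrame contents and column names are illustrative assumptions, not the study's data.

```python
# Minimal sketch of the correlation analysis on synthetic Likert responses.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
cols = ["non_receipt", "delay", "odd_lot", "transfer", "premium",
        "insider_trading", "brokerage", "broker_problems", "education"]
df = pd.DataFrame(rng.integers(1, 6, size=(80, len(cols))), columns=cols)

corr = pd.DataFrame(index=cols, columns=cols, dtype=float)
for a in cols:
    for b in cols:
        r, p = stats.pearsonr(df[a], df[b])   # coefficient and 2-tailed p
        corr.loc[a, b] = round(r, 3) if abs(r) >= 0.1 else np.nan
print(corr)  # NaN where |r| < 0.1, as in Table 6
```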
5.3 Factor analysis

In this research, principal component analysis with eigenvalues greater than one was used to extract factors. The Bartlett test of sphericity (134.643) and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy (0.610) were used to validate the use of factor analysis. The rotated component matrix (varimax rotation) was used for the study (see Table 8). Three items loaded significantly on the first factor. All three items, brokerage, premium and odd lot, were mostly finance related; although the third item was not directly financial, it had an indirect effect on the overall money spent by an investor, so the first factor was named finance. The second factor was loaded with two items, insider trading and broker problems.
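The same steps can be sketched in Python with the factor_analyzer package (the study itself used SPSS). The DataFrame below is a synthetic stand-in for the 80 survey responses; only the procedure — Bartlett's test, KMO, and a varimax-rotated principal-component extraction — mirrors the paper.

```python
# Minimal sketch of the factor-analysis step on synthetic survey data.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer, calculate_bartlett_sphericity, calculate_kmo

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(1, 6, size=(80, 10)),
                  columns=["brokerage", "premium", "odd_lot", "insider_trade",
                           "broker_problems", "delay", "transfer", "non_receipt",
                           "experience", "knowledge"])

chi_square, p_value = calculate_bartlett_sphericity(df)  # 134.643 in the paper
kmo_per_item, kmo_overall = calculate_kmo(df)            # 0.610 in the paper

# Principal components, varimax rotation; eigenvalues > 1 gave 4 factors here.
fa = FactorAnalyzer(n_factors=4, rotation="varimax", method="principal")
fa.fit(df)
print(pd.DataFrame(fa.loadings_, index=df.columns))      # cf. Table 8
```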


Table 8: Factor analysis of investor grievances (rotated component matrix)
                 Financial (1)  Broker (2)  Communication (3)  Awareness (4)
Brokerage        .774
Premium          .760           .296        -.119
Odd lot          .711           .177        .164
Insider trade    .112           .871
Broker problems  .125           .728        .203
Delay                           .187        .799
Transfer         -.483          .111        .59
Non-receipt                     .470        .585
Experience       .189                       -.228               .825
Knowledge                       -.266                           .777
Extraction method: principal component analysis. Rotation method: varimax with Kaiser normalization. Rotation converged in 6 iterations.

Since both were broker-related issues, this component was termed broker; the broker problem includes the availability of, and knowledge about, a broker. The third factor included delay in receiving letters, share certificates, difficulty in the transfer of shares and, finally, non-receipt of dividends and interest. Since these were all communication-related issues, this component was termed communication. The last factor consisted of the experience of the investors and knowledge about a particular equity; as these were awareness-related items, this component was named awareness.

5.4 Effect of Pre and Post Liberalisation

In order to check the effect of government regulations and to find the change in investor mindset before and after liberalization, a regression analysis was done taking happiness as the dependent variable and liberalization, non-receipt of dividend, education, perceived risk, transfer, insider trading, investment, brokerage, delay in getting information, broker problems, odd lot and the premium charged on new issues as the independent variables (see Table 9).
Table 9: Regression analysis (dependent variable: happiness)
F = 4.017   Sig. of F = .000   R = .647   R-squared = .418
Independent variable  t       Sig. of t
Liberalization        2.345   .022
Non-receipt           -1.014  .314
Education             -1.688  .096
Risky                 2.937   .005
Transfer              1.068   .289
Insider trading       1.588   .117
Invest                .471    .640
Brokerage             -2.000  .050
Delay                 .276    .783
Broker problems       .349    .729
Odd lot               2.287   .025
Premium               -.747   .458


The results of the regression analysis suggest that the overall model is significant (F = 4.017; p = 0.000) and that these items explain nearly 42% of the variance (R-squared = 0.418). Further analysis indicates that, among the independent variables, only liberalization, risk, brokerage, insider trading, odd lot and education approach or reach significance at conventional levels.
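A minimal sketch of the regression reported in Table 9, using statsmodels OLS in place of SPSS; the data are synthetic stand-ins, so the coefficients will not reproduce the paper's values.

```python
# Minimal sketch: OLS of happiness on the grievance variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
cols = ["liberalization", "non_receipt", "education", "risky", "transfer",
        "insider_trading", "invest", "brokerage", "delay",
        "broker_problems", "odd_lot", "premium"]
df = pd.DataFrame(rng.integers(1, 6, size=(80, len(cols))), columns=cols)
df["happiness"] = rng.integers(1, 6, size=80)

model = sm.OLS(df["happiness"], sm.add_constant(df[cols])).fit()
print(model.summary())  # F, R-squared, and per-variable t and p, cf. Table 9
```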
6. CONCLUSION

The study revealed that the new issue market in the post-liberalization era is still embedded with numerous problems. Although the problems are fewer than in the pre-liberalization period, they still exist. Some of the major ones are as follows:
- The brokerage charged is still high, as is evident from the descriptive statistics: the mean value was highest for brokerage, followed by broker-related problems.
- Investors still consider the capital market highly risky; the t-value (2.937, p = 0.005) suggests that this perception is significant. At the same time, the investment pattern in the descriptive statistics suggests that the number of people willing to invest in the capital market has increased.

7. LIMITATIONS & SCOPE FOR FURTHER RESEARCH

The researcher faced several problems while conducting this research. Finding the samples was difficult, since people were either unaware or not interested in extending their help. Another problem was that people's opinion of the capital market vacillated quite largely with changes in the movement of the market. Although this research examined investor grievances before and after liberalization, it has not touched upon several areas, such as the effect of online trading and the role of SEBI. The study could also be extended to the mutual fund industry and banks.

REFERENCES

Bal R K and Mishra B B (1990), Role of Mutual Funds in Developing Indian Capital Market, Indian Journal of Commerce, Vol. XLIII, p. 165.
Bhole L M (1992), Proposals for Financial Sector Reforms in India: An Appraisal (Perspectives), Vikalpa, Vol. 17, No. 3 (Jul-Sep), pp. 3-9.
Chandra Prasanna (1990), Indian Capital Market: Pathways of Development, ASCI Journal of Management, Vol. 20, No. 2-3 (Sept-Dec), pp. 129-137.
Churchill G A Jr (1979), A Paradigm for Developing Better Measures of Marketing Constructs, Journal of Marketing Research, Vol. 16, pp. 64-73.
Francis C K (1991a), Towards a Healthy Capital Market, Yojana, Vol. 35, Mar. 1-15, pp. 11-13.
Francis C K (1991b), SEBI - The Need of the Hour, SEDME, Vol. 18(3), pp. 37-41.
Gupta L C (1992), Stock Exchange Trading in India: Agenda for Reform, Society for Capital Market Research and Development, Delhi, p. 123.
Hanke S H and Walters A, eds. (1991), Capital Markets and Development, ICS Press, San Francisco.
International Securities Consulting (2000), Moving from Account Period Settlement to Rolling Settlement, Technical Report, World Bank.


Mohanty Deepak (1994), Stock of Financial Assets in India: An Estimate, 1961-1990, The Journal of Income and Wealth, Vol. 16, No. 2, July, pp. 1-14.
Narasimham Committee (1991), Report of the Committee on the Financial System, p. 29.
Pandya V H (1992), Securities and Exchange Board of India: Its Role, Powers, Functions and Activities, Chartered Secretary, Vol. 22, No. 9 (Sept), p. 783.
Raghunathan V (1994), Stock Exchanges and Investments, Tata McGraw-Hill Publishing Company Ltd, New Delhi.
Shah A and Thomas S (2000), David and Goliath: Displacing a Primary Market, Journal of Global Financial Markets, Vol. 1(1), pp. 14-21.
Sharma J L (1983), Efficient Capital Markets & Random Character of Stock Prices Behaviour in a Developing Economy, Indian Journal of Economics, Vol. 63, No. 251 (Oct-Dec), p. 395.
Tarapore S S (1986), Financial Sector Reforms: Retrospect and Prospect, RBI Bulletin, pp. 299-306.
Thomas S (2001), The Anatomy of a Stock Market Crisis: India's Equity Market in March 2001, Technical Report, IGIDR.
Thomas S and Shah A (1999), Risk and the Indian Economy, in K S Parikh, ed., India Development Report 1999-2000, Oxford University Press, Chapter 16, pp. 231-242.
Varma J R and Venkiteswaran N (1990), Guidelines on Share Valuation: How Fair is Fair Value?, Vikalpa, Vol. 15, No. 4 (Oct-Dec), pp. 3-10.
Zahir M A (1992), Factors Affecting Equity Prices in India, Chartered Accountant, Vol. XL, No. 9, pp. 743-748.

APPENDIX
Non-receipt: the untimely receipt of share certificates, dividends and other essential documents.
Delay: the delay in listing of securities on the stock exchange.
Odd lot: shares held in odd numbers, such as nineteen or fifty-seven.
Transfer: the difficulties in transferring ownership.
Premium: the premium charged on new issues.
Insider trading: the obtaining of undue benefits by company insiders.
Brokerage: the brokerage charged per transaction.
Broker problems: difficulties with brokers, such as availability and the provision of correct information.
Education: the provision of knowledge about the equity and the market.

Forecasting Methods and Forecast Errors: An Appraisal*


Sarita Supkar1 & P. Mishra2

Abstract Forecasting has recently gained importance for decision-making in economics and management as well as in other disciplines. The present note summarizes and critically appraises the literature on forecasting methods across different disciplines. The views of different authors on the relative advantages and disadvantages of different methods are highlighted. Since forecasts are judged on the basis of forecast errors, an attempt has also been made to highlight the different methods used to estimate forecast errors.

1. INTRODUCTION

Since the early twentieth century, the use of forecasting methods in different fields has taken centre stage all over the world in decision making. The application of forecasting methods is not limited to private organisations but extends to the government sector and to the economy as a whole. Researchers have studied the various types of forecasting methods used in different disciplines and the errors involved in the process of forecasting, and have compared forecasting methods on the basis of forecast accuracy. The undertone in the classification is the important areas of operation vis-à-vis forecasting techniques, the nature and behaviour of variables, and the endeavour of the forecaster to study the data pattern in a quest for the forecasting method of best fit. In this paper, we critically appraise and summarize the views of different researchers on forecasting techniques relating to population forecasting, financial forecasting, economic forecasting and the use of forecasting techniques in other areas.

* Received May 15, 2006; Revised July 26, 2006. The present paper is based on the Ph.D. thesis of the first author, submitted to Utkal University in June 2005 under the supervision of the second author. The authors are thankful to the anonymous referee for his valuable comments and suggestions on an earlier version of the paper.
1. Lecturer (Senior Scale), R.D. Women's College (Autonomous), Bhubaneswar; e-mail: saritasupkar@rediffmail.com
2. Professor, Xavier Institute of Management, Bhubaneswar; e-mail: pmishra@ximb.ac.in


2. FORECASTING IN DIFFERENT DOMAINS

2.1 Population Forecasting

Researchers have described two main philosophical approaches to population forecasting. At one extreme, a forecaster may hold that a particular theory of population growth dictates a specific growth pattern; at the other extreme, a forecaster may rely entirely on the historical growth pattern to project future growth, ignoring the insights of theory. From 1891 through the 1920s, a number of population forecasters used extrapolative methods to forecast population. Instead of applying the growth rate of the immediately preceding period, these forecasters searched for the growth curve established by the entire historical trend. Extrapolation signifies complete dependence of forecasts on existing trends, without any additional input from the forecaster's understanding or intuitive judgment of the underlying processes.

In contrast, Raymond Pearl (1925) introduced the practice of deriving the type of curve from theoretical assumptions about the nature of population growth. He advocated the logistic law of population growth, which calls for an initial acceleration in population increase, followed by a symmetrical deceleration of the increase, ultimately resulting in a stable population level. Pearl noticed a problem in the use of the logistic curve as a method of population forecasting: the arrival of the inflection point. If the turning point appears late, the forecasts will be on the higher side; if it comes early, they will be on the lower side. In other words, the date at which accelerating growth becomes decelerating is crucial for establishing the magnitude of population growth.

Whelpton and Thomson (1928) advocated the Component method of population forecasting: they separated the formerly undivided growth rate into its obvious components of birth rates, death rates and migration rates, used separate and explicit projections of these rates, and then combined them to reconstruct the overall growth rate. Division of growth into components provided a framework for projecting specific trends but remained silent on how to forecast these components with any degree of accuracy. Dublin and Lotka (1930) introduced a new method, known as the Cohort approach, with an explicit breakdown of the population by age. This enabled the forecaster to follow each cohort through its life stages and apply appropriate birth and death rates to each generation. It has been observed by many researchers that the combination of the Component and Cohort methods gives projections that are less susceptible to the exaggeration of temporary fluctuations, because a more precise focus of change can be identified and adjusted.
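Pearl's logistic law can be illustrated with a short curve-fitting sketch. The census figures below are illustrative, not from the paper, and the fitted parameters (saturation level K, growth rate r, inflection year t0) show exactly why the inflection date drives the forecast level, as noted above.

```python
# Minimal sketch: fit a logistic growth curve and extrapolate it.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic law: accelerating growth, inflection at t0, saturation at K."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.array([1901, 1911, 1921, 1931, 1941, 1951], dtype=float)
pop = np.array([238.4, 252.1, 251.3, 279.0, 318.7, 361.1])  # millions (illustrative)

params, _ = curve_fit(logistic, years, pop, p0=[600.0, 0.03, 1960.0], maxfev=10000)
K, r, t0 = params
print(f"saturation K = {K:.1f}m, inflection year t0 = {t0:.0f}")
print("forecast 1971:", round(logistic(1971, *params), 1), "million")
```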


In the 1950s, Hajnal (1954) reviewed the population projections made in the late 1940s and early 1950s and found them to be disastrous. He argued that the factors identified by researchers as shaping the future growth of population were likely to be outweighed by unpredictable forces. This accounted for the failure of more complex techniques to yield more accurate results than simple techniques, which cast doubt on the value of forecasting. He stressed that new and more complex techniques were just as liable as past techniques to be fairly often upset by the unpredictability of history. Evidently, researchers up to the 1950s had ignored the impact of technological innovation on the growth of population. In the 1960s, Gordon and Helmer of the RAND Corporation (1964) studied the effect of future technological innovations on effective birth control and dramatic medical advances. They identified fertility rates as the problematic aspect of population forecasting, since mortality and migration changed very gradually. The alteration of fertility rates involved not only strictly technical development but also a changing social environment concerning contraception and abortion. They made a median prediction of population for the year 1970, considering the effective fertility control brought by the introduction of oral contraceptives.

It was observed that treating technological innovation as one of the important factors influencing population growth gave more realistic forecast results. However, Isserman (1977) studied the accuracy of population projections and observed that extrapolation gave forecasts at least as accurate as complex demographic and structural models. He suggested a hybrid approach for forecasting the population of areas comprising sub-areas growing at different rates: to increase the accuracy of forecasts, he advocated the use of different models, such as exponential, linear and double-log models, for the different sub-areas. In the early 1980s, Mandell (1982) studied the selection of a proper forecasting method for population estimation and suggested the following criteria for selecting among regression-based models:
i) lowest MAPE;
ii) a random pattern of residuals;
iii) lowest value of the stability measure (an F statistic based on Chow's test for structural change);
iv) largest adjusted R².
He stressed that the third criterion was strongly related to estimation accuracy.
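For reference, the MAPE criterion in (i) is conventionally defined as follows (a standard definition, not spelled out in the paper):

$$\mathrm{MAPE} = \frac{100}{n}\sum_{t=1}^{n}\left|\frac{y_t-\hat{y}_t}{y_t}\right|$$

where $y_t$ is the actual value and $\hat{y}_t$ the forecast over $n$ periods; the model with the lowest MAPE is preferred.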


Dennis A. Ahlburg and Kenneth Land discussed the application of stochastic models to assess the uncertainty of population forecasts. They suggested that stochastic models should be developed for vital rates, and that stochastic matrices should then be used to generate probability distributions for the future population. Though many factors were identified as affecting population growth and several forecasting techniques were discussed, immigration, as a factor with a major impact on population, was ignored by most researchers. Ronald D. Lee and Shripad Tuljapurkar (2000) dealt with the issue of whether the basic differences between population forecasting and other kinds of forecasting warrant special methods of its own. In retrospect, it appears that over the last fifty years the census and social security forecasters attached too much importance to the most recently observed levels of fertility and mortality. Demographers typically approach forecasting through disaggregation: their instinct to break the population down into skillfully chosen categories, each with its own corresponding rate, forms the basis of population forecasting. Certain kinds of disaggregation inevitably raise the projected total relative to more aggregated projections.

Most researchers agree that there is considerable uncertainty involved in population forecasts. The standard method for dealing with uncertainty in demographic forecasts is the use of high, medium and low scenarios. This approach is based on very strong and implausible assumptions about the correlation of forecast errors over time and between fertility and mortality rates. Stochastic population forecasts based on time series models of vital rates appear to offer some important advantages, although long forecast horizons in demography far exceed the intended use of these models. In some cases it is necessary to impose external constraints on the models to obtain plausible forecast behaviour. On this account, one should not rely on mechanical time series forecasts; in any case, they should be adjusted in the light of external information. A parsimonious time series model for mortality rates appears to perform well within sample when applied in various countries.

Ramachandran and Singh (2000) observed demographic transition to be a global phenomenon accompanied by growth in population. For India, demographic transition is both a challenge to ensure human development and an opportunity for the optimum utilization of human resources. To assess population growth, the Planning Commission of India has, since 1958, been constituting expert groups for population projections prior to the preparation of each five-year plan. There has been consistent refinement in the methodology used for population projection and in prediction accuracy as well. For the purpose of demographic transition, factors like the crude death rate (CDR), crude birth rate (CBR) and infant


mortality rate (IMR) have been considered. Interstate differences in the size of the population and the population growth rate emerged from the analysis. Subsequently, in the Indian context, several methods have been used in different plan periods. A major change in the methodology used to forecast population is observed in the 6th Plan period (1980-85): during this Plan, population projections for the period 1971 to 1996 were worked out, considering fertility and mortality as the vital factors contributing to population growth. A summary of the methods used for population forecasting is presented in the following table.
Summary of methods used for population forecasting
Sl. No.  Important methods adopted by researchers
1.       Growth curves / extrapolative methods
2.       Component approach
3.       Cohort approach
4.       Regression-based models
5.       Models with CDR/CBR/IMR

Methods like growth curves and extrapolative techniques were tried by researchers in the early part of the twentieth century. However, these methods lacked the treatment of vital factors of population growth such as birth rates, death rates, age distribution, technological innovation and infant mortality rates. A progressive change in the application of forecasting methods is observed with the gradual inclusion of these vital factors. In this context, the Cohort-Component approach is one of the popular forecasting techniques used in population estimation which takes care of these deficiencies; in the Indian context also, this method has been used for population forecasting during the different plan periods. Factors like technological innovation and the impact of policy changes on population growth are yet to receive their share of importance from researchers.
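The cohort-component logic summarized above can be sketched as a Leslie-matrix projection: each age cohort is advanced with its own survival rate, births are generated from age-specific fertility, and net migration is added. All rates and counts below are illustrative placeholders, not Plan-period data.

```python
# Minimal sketch of a cohort-component (Leslie-matrix) population projection.
import numpy as np

# Age groups 0-19, 20-39, 40+; one projection step = 20 years.
fertility = [0.0, 1.8, 0.0]   # births per person in each group per step
survival01 = 0.98             # share of 0-19 surviving into 20-39
survival12 = 0.95             # share of 20-39 surviving into 40+
survival22 = 0.60             # share of 40+ still alive next step

L = np.array([fertility,
              [survival01, 0.0,        0.0],
              [0.0,        survival12, survival22]])

pop = np.array([350.0, 320.0, 180.0])   # millions, by age group
migration = np.array([2.0, 3.0, 1.0])   # net migrants per step, by age group

for step in range(3):                   # project 3 steps (60 years) ahead
    pop = L @ pop + migration           # births + survivors + migration
    print(f"step {step + 1}: total = {pop.sum():.0f} million")
```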
2.2 Financial Forecasting


Financial forecasting has been an area of concern in the economy, particularly for decision makers in the financial market. Much of the early work in financial forecasting concerned developing business barometers, i.e., the use of forecasting methods in determining the earnings of firms in an economy. Forecasts of variables like earnings help the decision-making process of financial managers. Many researchers in this field have used different forecasting techniques; some of the studies conducted in the area of financial forecasting can be summarized as follows. In the early 1960s, Little (1962) conducted the first systematic analysis of the behaviour of the reported earnings of United Kingdom firms. Later, Little and Rayner (1966) made the same type of study and concluded that the annual earnings of U.K. firms follow a random walk. In other words, the changes in earnings were


largely unsystematic or simply a matter of chance. Departing from this method, Foster (1977) initiated a study on the use of forecasting models for quarterly earnings. He stressed the use of the Box-Jenkins autoregressive integrated moving average (ARIMA) technique to develop quarterly earnings-generating models. He advocated, as an alternative way to evaluate a predictive model, examining the relationship between earnings surprise and abnormal share price movements and then correlating the two. Earnings surprise is defined as the difference between actual earnings and expected earnings according to a specific predictive model, and abnormal share price movement as the difference between the actual share price movement and the expected movement according to a return-generating model. His conclusion emphasized that ARIMA models were better than seasonal and non-seasonal models in two ways: (1) they give more accurate predictions of future quarterly earnings, and (2) they show a higher correlation with abnormal share price movements. Dharan (1983) observed from his study of the quarterly earnings of firms and their generating process that the process is more complex than what could be represented by a single-firm ARIMA model; his conclusion was that a theory of the firm was needed to identify and estimate earnings models.
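A minimal sketch of the Box-Jenkins approach applied to quarterly earnings, of the kind advocated in these studies, using statsmodels on a synthetic series (not actual firm data).

```python
# Minimal sketch: ARIMA model of quarterly earnings and a 4-quarter forecast.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
quarters = pd.period_range("1990Q1", periods=60, freq="Q").to_timestamp()
earnings = pd.Series(50 + np.cumsum(rng.normal(1.0, 2.0, 60)), index=quarters)

model = ARIMA(earnings, order=(1, 1, 1)).fit()   # ARIMA(p=1, d=1, q=1)
forecast = model.forecast(steps=4)               # next four quarters
print(forecast.round(2))
```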

Bathke and Lorek (1984) based their research on forecasting non-seasonal quarterly earnings and stressed that univariate time-series models were better than other statistical models in forecasting quarterly earnings. They observed, however, that forecasts made by financial analysts or managers were better than forecasts by time-series models: according to them, even the best single form of ARIMA model would be inferior to an expert as a proxy for the capital market's expectation of future earnings. Syed S. (1994) illustrated the use of forecasts in business and planning. He identified different forecasting methods and argued that ignorance of the suitable forecasting method and improper application might lead to erroneous results; he analysed different forecasting methods and suggested rules for their proper application to forecasting the earnings of firms, which according to him would lead to accurate results. Satyanarayana and Savalkar (2003) analyzed short-term forecasts of corporate investment over the last three decades in India, with the twin objectives of examining how these short-term forecasts of corporate investment have performed over the last three decades and to what extent the objectives of the forecasting exercise have been fulfilled. Various approaches to forecasting based on data on sources of funds for corporate investment, as well as forecasting corporate investment with data from term-lending


institutions, were systematically explored, and the utility of behavioural and non-behavioural forecasting schemes was examined. The fact that emerged was that data on investment intentions were more useful in making short-term forecasts of corporate investment. Some interesting facts emerged from the annual studies on short-term forecasts of corporate investment: the top five industry groups in India claimed the lion's share of total projects (the bulk pertaining to engineering, chemical and infrastructure industries), in the range of 68-75% over the years 1973 to 2000-01. The studies also revealed that corporate investment was taking place in five or six large states, mostly confined to the western and southern regions of the country. Yadav (1994), in his work on monetary modeling in India, observed that macroeconomic modeling has come a long way in India. He dealt extensively with various monetary models in the area of macroeconomic forecasting. Over the years, macroeconomic models of the Indian economy have acquired technical sophistication as well as diversity while broadening their structural basis. Yadav's evaluation of monetary sector modeling in India reveals two distinct phases. The early models constructed during the 1960s and 1970s, which constitute the first phase, made pioneering contributions to economy-wide models with general objectives. The second phase,

which began in the early 1980s, has been marked by specificity of objectives. Having attained analytical sophistication during the first phase, the modeling effort in the second phase became more purposeful and test oriented. Yadav opined that the short-term forecasting models developed by Rao, Venkatachalam & Vasudevan and by Mathur, Nayak & Roy focused on developing a macroeconomic framework for forecasting macroeconomic aggregates as a useful input into policy formulation. These models seem to have gone beyond mere forecasting of monetary aggregates and have attempted to develop a methodology for forecasting the impact of the government budget on key macroeconomic aggregates. It is observed that the models vary in their objectives, formulations and applications. The fact that emerges from the above study is that, after three decades of modeling effort, a reasonable policy-oriented model is still conspicuous by its absence. Thornton (2004) remarked that, as part of the Fed's daily operating procedure, the Federal Reserve Bank of New York, the Board of Governors and the Treasury make a forecast of that day's Treasury balance at the Fed. These forecasts are an integral part of the Fed's daily operating procedure, and errors in them can generate variation in reserve supply and, consequently, in the federal funds rate. His paper evaluated the accuracy of these forecasts. The evidence


suggested that each agency's forecast contributed to the optimal, i.e., minimum-variance forecast, and that the trading desk of the Federal Reserve Bank of New York incorporated information from all three agencies' forecasts in conducting daily open market operations. Moreover, these forecasts encompassed the forecast of an economic model. Mishra (2004) observed that several forecasting methods are available for both short-term and long-term forecasting, and that the efficiency of forecasting methods is often evaluated by forecast errors. He compared three time series methods, the Moving Average, Exponential Smoothing adjusted for trend (Holt's method) and the Auto-Regressive Integrated Moving Average (ARIMA), for forecasting the share price of ICICI Bank with reference to forecasting error, and examined the relative efficiency of each forecasting model. The Mean Absolute Percentage Error (MAPE) was used to compare the efficiency of the different time series forecasting models. He concluded that there is no thumb rule for testing the effectiveness of any forecasting method: technical analysis should always be supplemented by judgmental analysis to make forecasts with smaller estimation errors, which may help the future decision-making process of the company.
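Mishra's comparison can be sketched as follows: fit a moving average, Holt's exponential smoothing and an ARIMA model on a training window and score each by MAPE on a holdout. The price series below is synthetic, not ICICI Bank data, and statsmodels is assumed.

```python
# Minimal sketch: compare three forecasting methods by MAPE.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import Holt
from statsmodels.tsa.arima.model import ARIMA

def mape(actual, forecast):
    return 100 * np.mean(np.abs((actual - forecast) / actual))

rng = np.random.default_rng(1)
prices = pd.Series(100 + np.cumsum(rng.normal(0.2, 1.5, 250)))
train, test = prices[:200], prices[200:]

candidates = {
    "moving average": pd.Series([train.tail(20).mean()] * len(test), index=test.index),
    "Holt":           Holt(train).fit().forecast(len(test)),
    "ARIMA(1,1,1)":   ARIMA(train, order=(1, 1, 1)).fit().forecast(len(test)),
}
for name, fcst in candidates.items():
    print(f"{name:15s} MAPE = {mape(test.values, np.asarray(fcst)):.2f}%")
```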

The following table summarizes the different forecasting techniques used by different researchers in the area of financial forecasting.
Forecasting methods used in the area of financial forecasting
Sl. No.  Important methods used by researchers
1.       Univariate time series models
2.       Exponential smoothing methods
3.       Autoregressive Integrated Moving Average (ARIMA) methods
4.       Monetary modeling

A synthesis of the financial forecasting literature suggests that most researchers have used time series extrapolative methods to forecast finance-related variables and have compared different methods to identify a proper forecasting technique. It is also observed that differences in the growth of macroeconomic aggregates across time periods with different characteristics may affect the forecasts unless they are addressed in the forecasting models concerned. Factors like policy changes and the impact of global financial reforms need to be given appropriate weight in the process of selecting effective methods to forecast finance-related variables.
2.3 Economic Forecasting

Forecasting economic variables is essential for policy making, which requires accurate and timely information. Policy making takes time, for institutional reasons and because of the time gap required for policy


decisions to take effect. For all these reasons, policy makers have to take decisions not on the basis of actual data but on a forecast of current and future events. Since policy formulation and implementation take time, and it takes further time for policy to affect the economy, policy settings have to be made in response to expected values rather than actual circumstances. All this confirms the need for and significance of economic forecasting in policy making. Gupta G.S. (1973) emphasized that forecasting plays an important role in decision making in the sense that the use of the best available technique can minimize forecast inaccuracy. However, he could not specifically identify a forecasting technique that could be described as the best; he stressed that the choice of method is often dictated by data availability or the urgency of forecasts. He classified various forecasting techniques in ascending order of sophistication: a) the historical analogy method, b) the trend method, c) the end-use method, d) the survey method, e) the regression method, f) the leading indicators method and g) the simultaneous equation method. He stressed that each forecasting technique has its own advantages and limitations; the simultaneous equation method is more popular in advanced countries, its main limitation in less developed countries being the unavailability of data. He also explained the importance of

forecast accuracy in decision-making and discussed the evaluation of forecast accuracy, for which he recommended four methods: a) the coefficient of determination test, b) the root mean-square error test, c) the percentage mean-absolute error test and d) the percentage absolute error test. His conclusion was that expert judgement plays a very important role in obtaining forecasts of any variable using any forecasting technique.
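For reference, the root mean-square error test in (b) is conventionally based on the following quantity (a standard definition, not given explicitly in the paper):

$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{t=1}^{n}\left(y_t-\hat{y}_t\right)^2}$$

where $y_t$ is the actual value and $\hat{y}_t$ the forecast over $n$ periods; the percentage error tests in (c) and (d) scale the absolute errors by the actual values, as in the MAPE formula given earlier.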


However, in the mid-1980s, Barker (1985) examined and compared the forecasts made by five organisations in the United Kingdom during 1979-80: Cambridge Econometrics (CE), the London Business School (LBS), the National Institute of Economic and Social Research (NI), the Cambridge Economic Policy Group (CEPG) and the Liverpool Research Group in Macroeconomics (LPOOL). He compared the 1979 forecasts of all the groups and examined their accuracy for macroeconomic variables like GDP, unemployment and consumer price inflation. He observed that the various organisations failed to predict the timing and depth of the recession correctly. He stressed the importance of the availability of accurate and timely data for forecast accuracy, and observed that the organisations which used annual data performed less accurately than those which used quarterly data. However, this conclusion holds only when a researcher uses either annual or quarterly data; it cannot be extended to forecasts of macroeconomic variables that are available only as annual data. Holden and Peel (1986) attempted to forecast growth and inflation over several years for the United Kingdom. They examined the forecasts of different forecasting organisations, such as the London Business School (LBS) and the National Institute of Economic & Social Research (NI), evaluated the performance of the various forecasting techniques used by these organisations on the basis of forecast accuracy, and concluded that forecasts produced by econometric methods were more accurate than forecasts from naive models. This was consistent with the evidence on forecast accuracy for the U.K. and also for the U.S.A. McNees (1986) compared the forecasts from conventional econometric models with those of the Bayesian Vector Autoregressive (BVAR) and Vector Autoregressive (VAR) models. He observed that VAR models include a large number of variables in each equation and hence suffer from multicollinearity, with the coefficients imprecisely determined; in the BVAR model, each variable is initially assumed to follow a random walk, with the objective of determining the impact of the other variables, so estimated BVAR models have fewer parameters than VAR models. He advocated that both the

models generate unconditional forecasts, as they do not require explicit assumptions about the future course of the economy. The variables he considered were nominal GNP, the money stock, real non-residential fixed investment and unemployment, and he found that the BVAR forecasts for these variables were better than those of the VAR models. But he rightly stressed that the two models should be used as complementary tools providing different kinds of information to forecasters. Gill and Kumar (1992) observed that several quantitative methods were available for forecasting, such as ARIMA and VAR models, which had brought time series models and econometric models closer together. They also observed that if the data series were non-stationary, the use of a VAR model might result in unstable econometric relationships, so the use of a Bayesian VAR model was more precise. Their study aimed at forecasting macroeconomic data such as real GDP, the consumer price index and 90-day bank accepted bill (BAB) rates. The forecasts were generated using ARIMA, multivariate VAR and Bayesian VAR models. A comparison of the forecasts of the univariate model and the multivariate time series models brought out the fact that the VAR and BVAR models performed better than the univariate ARIMA between 50% and 100% of the time. For short-term forecasts, they stressed the use of the BVAR model, as its forecasts were more accurate.
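A minimal sketch of fitting an unrestricted VAR of the kind compared in these studies, using statsmodels; the three macroeconomic series are synthetic stand-ins, not the data used by these authors.

```python
# Minimal sketch: fit a VAR on three (differenced) macro series and forecast.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
data = pd.DataFrame(rng.normal(size=(120, 3)).cumsum(axis=0),
                    columns=["gdp", "cpi", "interest_rate"]).diff().dropna()

model = VAR(data).fit(4)                                   # VAR with 4 lags
fcst = model.forecast(data.values[-model.k_ar:], steps=8)  # 8 periods ahead
print(pd.DataFrame(fcst, columns=data.columns).round(3))
```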


They emphasized that the forecasting performance of the VAR model could be improved by imposing Bayesian priors on its parameters. The conclusion that emerged from the overall forecasting results was that the univariate ARIMA model could not perform better than the multivariate VAR and BVAR time series models, which allow multivariate interaction among variables. In the 1990s, Funke (1992) also attempted to use time series forecasting techniques, to forecast the unemployment rate in Germany. The main issues dealt with by the researcher were: (a) an examination of alternative methods of short-term time-series forecasting; (b) an exploration of the forecasting performance of univariate models allowing for the possibility of structural change; and (c) the application of the forecasting methods to the monthly German unemployment rate. He attempted to use multiple impacts of different types to improve the forecast accuracy of the univariate Box-Jenkins model in the presence of non-homogeneous data. It was observed that the multiple-impact

ARIMA model outperformed the univariate ARIMA model in both a fitting and a predictive sense. Clements and Hendry (1995), however, stressed that there are many ways of making economic forecasts. They suggested four criteria for any model-based forecasting method: a) the regularities on which the models are based; b) whether those regularities are informative about the future; c) the encapsulation of the regularities in the selected forecasting model; and d) the exclusion of non-regularities. They enumerated a number of distinct forecasting methods, including guessing, extrapolation, leading indicators, surveys, time-series models (ARIMA and vector autoregressive) and econometric systems (which rely on the model containing the invariants of the economic structure), but they emphasised the role of leading indicators in forecasting macroeconomic variables. They advocated three possibilities for the reduction of forecasting error:

1. Parameterisation (to deal with multicollinearity)
2. Parsimony (to avoid over-fitting)
3. Intercept corrections (to exclude non-constant features)


Their empirical findings suggested that econometric analysis could help to improve macroeconomic forecasting procedures, and they advocated intercept corrections for increasing forecast accuracy in the face of structural breaks. Upadhyay (1992) observed that random variables in time-series data are usually assumed to be stationary and to follow a stochastic process, but that almost all time series data are non-stationary, i.e., characterised by some type of trend, which makes it difficult to build an ARMA model. He examined time-series data for non-stationarity and developed models for forecasting six economic time series, using two methods of forecasting:
1) An appropriate trend was fitted by the OLS technique and the residuals were estimated; an appropriate ARMA model was then developed on the residuals, and the trend and residual forecasts were produced separately and superimposed on each other to give the final forecast.
2) A model was developed directly using the Box-Jenkins (ARIMA) method.
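Upadhyay's first method can be sketched in a few lines: fit a deterministic trend by OLS, fit an ARMA model to the residuals, and superimpose the two forecasts. The series below is synthetic and statsmodels is assumed.

```python
# Minimal sketch: OLS trend plus ARMA-on-residuals forecasting.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
n = 120
t = np.arange(n)
y = pd.Series(5 + 0.3 * t + rng.normal(0, 2, n))    # trend + noise

trend_fit = sm.OLS(y, sm.add_constant(t)).fit()     # step 1: OLS trend
residuals = y - trend_fit.fittedvalues
arma_fit = ARIMA(residuals, order=(1, 0, 1)).fit()  # step 2: ARMA(1,1) on residuals

h = 12                                              # forecast horizon
future_t = sm.add_constant(np.arange(n, n + h))
forecast = trend_fit.predict(future_t) + arma_fit.forecast(h).values
print(np.round(forecast, 2))                        # superimposed final forecast
```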

He grouped the data into two groups: a) the TS group, containing series that move on a deterministic path with stationary fluctuations, and b) the DS group, containing series that show a stochastic trend with a cyclical component. He observed that all economic time series belonging to the TS class did better with the first method, while the second method gave better

savings. On the basis of the forecasts of savings of different sectors of India, he emphasized that the policy implication should be to curtail the size of the public sector so as to enhance the overall efficiency of the economy.

Sims & Zha (1998) observed that if dynamic multivariate models were to be used to guide decision-making, probability assessments of forecasts or policy projections should be provided. They developed methods to introduce prior information in reduced form and structural VAR models without introducing substantial new computational burdens. They concluded that Bayesian methods could be extended to larger models and to models with over-identifying restrictions, which according to them would increase the transparency and reproducibility of Bayesian methods and be more useful for forecasting and policy analysis.

Clements and Krolzig (1998) evaluated the forecast performance of two leading non-linear models that had been proposed for US GNP, i.e., the self-exciting threshold autoregressive (SETAR) model and the Markov-switching autoregressive (MS-AR) model. They observed that the construction of multi-period forecasts was difficult in comparison to linear models. They referred to the earlier study by Clements & Smith, which compared a number of alternative methods of obtaining multi-period forecasts, including the normal forecast error. On the basis of their comparative
analysis, they suggested that SETAR model forecasts of US GNP were superior to forecasts from linear AR models, particularly when the forecasts were made during a recession. Their findings, based on empirical studies, suggested that the MS-AR and SETAR models had done better than linear models in capturing features of business cycles.

Diebold (1998) attempted to study the past and present eras of macroeconomic forecasting and observed that structural economic forecasting was based on postulated systems of decision rules and had enjoyed a golden age in the 1950s and 1960s, following advances in Keynesian theory in the 1930s. The two then declined together in the 1970s and 1980s. The evolution of non-structural forecasting outweighed the importance of structural forecasting, and its use and popularity continued to increase at a rapid rate. Comparing the roles of structural and non-structural macroeconomic forecasting with logical reasoning, he found that the futures of structural and non-structural forecasting were intertwined. He stressed that the ongoing development of non-structural forecasting, together with recent developments in dynamic stochastic general equilibrium theory and associated structural estimation methods, boded well for the future of macroeconomic forecasting. He concluded that the hallmark of macroeconomic forecasting over the next 20 years would be a marriage of the best
of the non-structural and structural approaches, facilitated by advances in numerical and simulation techniques that would help researchers to solve, estimate and simulate forecasts from rich models.

Samanta (1999) observed that, over the years, non-linear model building had become an integral part of any forecasting exercise dealing with time-series data. To him, any model essentially tries to approximate the generating process of the time-series in its own way. Estimation of the model also requires making some simplifying assumptions about the behaviour of the series. Thus the appropriateness of a particular model in capturing the behaviour of a series, and the accuracy of its forecasts, depend heavily on the validity of those assumptions. He stressed that the forecast performance of any model could be judged by estimating forecast errors, where the lowest forecast error would indicate the best performance. He identified two methods for comparing the forecast performance of various forecasting models. The first method involved calculating probable error values for the variables in different time periods: the forecasting exercise might be repeated a number of times, including one extra observation in each repetition, so that forecasts might be generated for time points where actual data were already available. The author observed that this method helped in comparing the forecast performance of
various models but failed to quantify the extent of percentage errors in forecasts; it could only indicate the relative performance of the various models and rank them qualitatively. The second method was the calculation of the Root-Mean-Square-Percentage Error (RMSPE), where the lower the value of RMSPE, the better the forecast performance. He estimated four different univariate time-series models, i.e., ARMA, bilinear, RCA and SETAR. Empirical results showed the SETAR model to be effective for forecasting a few of the time-series. The overall performance of the models indicated that the bilinear model was the best for generating one-month-ahead forecasts, followed by SETAR and ARMA. The SETAR model was found to be more efficient in generating multi-step forecasts, which confirms the capability of SETAR models to capture the behaviour of a wide class of time-series. Thus it was concluded that SETAR could at least be considered a potential alternative for modeling and forecasting any time-series.
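The RMSPE criterion described above reduces to a one-line formula. The following sketch, with made-up actual and forecast values, is only an illustration of the computation, not code from the study:

    import numpy as np

    def rmspe(actual, forecast):
        """Root-Mean-Square-Percentage Error: lower values mean better forecasts."""
        pct_err = 100.0 * (actual - forecast) / actual   # percentage errors
        return np.sqrt(np.mean(pct_err ** 2))

    actual = np.array([102.0, 110.0, 95.0, 130.0])
    forecast = np.array([100.0, 115.0, 97.0, 124.0])
    print(rmspe(actual, forecast))   # compare this value across ARMA, SETAR, etc.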

Bhattacharya, Ria & Agarwal (1999) made an attempt to forecast some macroeconomic variables of the Indian economy for the year 1999-2000. They prepared forecasts for variables like the GDP growth rate, the growth rate of the Indian economy, the industrial growth rate, imports, the deficit on the trade account, money supply and interest rates. The methodology used by them was:

1. Computable general equilibrium models (large blocks of simultaneous equations) were used to generate short-term forecasts.
2. Macro-econometric models were used for medium or long term forecasting.

The technique of regression estimation was used in the macro-econometric model to create four interrelated blocks of equations: the production block, the monetary block, the fiscal block and the external block. These methods were used by the authors to forecast the selected macroeconomic variables on the basis of the time-period of the forecast.

Bhattacharya & Kar (1999) analysed the usefulness of macro-econometric modeling in forecasting in many ways, such as:

1. It provided an opportunity to test alternative theories about different aspects of the economy.
2. Policy simulations based on macro-econometric models could provide the net effect of policy stimuli.
3. Macro-econometric exercises could be used as a useful technique for forecasting macroeconomic variables.

They described that macro-modeling was based on the structural macro-modeling methodology associated with the Cowles Commission. The methodology adopted by them can be described in the following steps:

1. Construction of a theoretical model of the macro-economy on the basis of an appropriate framework, with a chosen degree of dis-aggregation.
2. Acquiring time series data for all variables for the period to be studied.
3. Estimation of the behavioural equations, for which OLS methods were usually used.
4. Solution of the whole model, including technical equations, identities and behavioural equations, using the Gauss-Seidel method to generate the values of the endogenous variables.
5. Validation of the model by examining the behaviour of errors in terms of statistical measures like the Root-Mean-Square Error (RMSE) and inequality statistics.
6. Use of the validated model to forecast the values of the variables.

They also discussed some theoretical aspects of a macro-econometric model for the Indian economy, which according to them would be useful for forecasting macroeconomic variables.

Bidarkota (2001) experimented with the inflation rates of the United States and found the rates to have shifted in their mean level and variability. He evaluated the performance of three useful models for studying such shifts. They were:

1. Markov switching models,
2. State-space models with heavy-tailed errors,
3. State-space models with compound error distributions.

He observed that all three models had similar performance when evaluated in terms of mean-squared or mean absolute forecast errors. He stressed that the latter two models were more parsimonious and could easily beat the more profligately parameterized Markov-switching models in terms of model selection criteria. He concluded that these models might serve as useful alternatives to the Markov-switching model for capturing shifts over time.

Harvey, Leybourne and Newbold (2001) made their study in the spirit of exploratory data analysis. Their main interest was focused on the forecasts made by a large panel. The forecasts underwent regular monthly revisions and the data set was rich and voluminous. On this line, the forecasts of GDP growth, inflation and unemployment in the UK made by a panel of forecasters were analysed. Annual outcomes were predicted and forecasts were revised monthly over a period of 24 months. Consensus forecasts could be calculated as a simple average of all panel members' forecasts at any point of time. They observed that the consensus forecasts evolved towards the actual outcomes with diminishing cross-sectional standard deviations. Finally, they attempted to assess the magnitude of eventual consensus forecast errors from the cross-
sectional standard deviations, i.e., from the degree of consensus among individual forecasters. The conclusion which emerged from the empirical investigation was that forecaster variability played a limited role in anticipating the reliability of the consensus forecasts. Thus, the methodology adopted by them is a combination of qualitative and quantitative forecasting methods.

Croushove and Stark (2001) made an attempt to describe the reasons for the construction of a real time data set. They described the importance of a real time data set for macro-economists, explained how the data were assembled, and showed the extent to which some data revisions were potentially large enough to matter for forecasting. The empirical exercise suggested that, when evaluated over very long periods, forecast error statistics were not sensitive to the distinction between real time data and the latest available data, even though forecasts for isolated periods could diverge.

Mishra (2005) observed that time series data often exhibit differential trends in different sub-periods, when examined either as a function of time or as a function of one or more determinants. In such cases, a large forecast error is generated if an attempt is made to forecast the variable using pooled data for the entire time period. Testing the structural stability of the functions in the different sub-periods and addressing it while
forecasting becomes a necessary condition in such situations. Structural stability is often examined with Chow's test, and if instability is observed in two or more periods then the latest period's data are used for forecasting. However, this method causes a loss of degrees of freedom for the researcher. Hence, the dummy variable is suggested by the author as an alternative method to address the differences in the sub-periods and to forecast the values of the variable without any loss of degrees of freedom.

In the Indian context, during the Eighth Plan, the Planning Commission used mathematical and quantitative models like the Leontief input-output model for the forecasting of economic variables. It became a powerful instrument in determining the economic interrelationships between different sectors of production. Input-output tables came to be used in the projection of long-term economic growth scenarios and also for working out sectoral output. Similarly, during the Tenth Five-Year Plan (2002-2007) an exhaustive exercise was carried out on the forecast of labour force participation. Projections of the labour force for this Plan were estimated on the basis of an age-specific and sex-specific study of labour force participation rates (LFPR).
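The dummy-variable device suggested by Mishra (2005) can be illustrated briefly. In the sketch below, a single known break point and linear trends in each sub-period are assumed purely for illustration; one pooled regression with intercept and slope dummies preserves the degrees of freedom that sample-splitting would sacrifice.

    import numpy as np

    t = np.arange(40.0)
    break_at = 20                       # assumed, e.g. flagged by a Chow test
    d = (t >= break_at).astype(float)   # 0 before the break, 1 after

    # Hypothetical series whose level and slope change after the break.
    rng = np.random.default_rng(1)
    y = 10 + 0.5 * t + d * (3.0 + 0.8 * (t - break_at)) + rng.normal(0, 0.5, 40)

    # Single regression over the pooled sample: intercept and slope dummies
    # capture the sub-period differences without splitting the sample.
    X = np.column_stack([np.ones_like(t), t, d, d * (t - break_at)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Forecast beyond the sample using the post-break regime.
    t_new = np.arange(40.0, 46.0)
    X_new = np.column_stack([np.ones_like(t_new), t_new,
                             np.ones_like(t_new), t_new - break_at])
    print(X_new @ beta)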

A summary of the forecasting techniques used for economic forecasting is presented in the following table.

Summary of forecasting techniques used for economic forecasting

Sl. No.   Important methods used by researchers
1         Simple trend method
2         Simple/multiple regression techniques
3         Exponential techniques
4         Leading indicator methods
5         Vector Autoregressive (VAR) method
6         Autoregressive Integrated Moving Average (ARIMA) method
7         Macroeconomic modelling
8         Input-output models

It is observed that a wide variety of forecasting techniques is used to forecast economic variables, ranging from the simple trend method to the more sophisticated ARIMA and econometric modeling techniques. It may be mentioned that economic forecasts are used for planning purposes; in such cases the input-output model, along with regression techniques, has been used to arrive at the forecasts. However, a progressive trend in the approach of researchers towards relatively more efficient methods is noticed. They have emphasized that economic variables are affected by multiple external factors such as governmental policies, turning points in business cycles, etc. To study the nature and behaviour of the data, its stationarity or non-stationarity, and to select befitting
forecasting techniques, these factors need to be addressed appropriately. A note of caution in this respect is that the choice of a model should not be based on the complexity of the model but on how realistically it captures the trend of the data. Very often a much simpler method gives better forecasts than the more sophisticated ones. Therefore, if the purpose is short-term forecasting, reliance has to be placed on the method with the least forecast error.
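By way of illustration of this point, the sketch below (hypothetical data throughout) holds out the last observations of a series and compares a naive last-value forecast with a slightly more sophisticated AR(1)-in-differences model; the method with the lower out-of-sample error would be the one to rely on.

    import numpy as np

    rng = np.random.default_rng(2)
    y = np.cumsum(rng.normal(0.2, 1.0, 120))   # hypothetical drifting series
    train, test = y[:108], y[108:]

    # Naive method: repeat the last observed value.
    naive_fc = np.full(len(test), train[-1])

    # AR(1) fitted to first differences, iterated forward.
    d = np.diff(train)
    phi = (d[:-1] @ d[1:]) / (d[:-1] @ d[:-1])
    level, step, ar_fc = train[-1], d[-1], []
    for _ in range(len(test)):
        step = phi * step
        level = level + step
        ar_fc.append(level)

    def mape(a, f):
        return 100 * np.mean(np.abs((a - f) / a))

    print('naive MAPE:', mape(test, naive_fc))
    print('AR(1) MAPE:', mape(test, np.array(ar_fc)))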
2.4 Uses of Forecasting Methods in other areas

Forecasting is also used in the following areas for decision making: a) sales and demand forecasting, b) business related forecasting, and c) other miscellaneous areas.

2.4.1 Sales and demand forecasting

Strategic corporate planning operates in an environment of uncertainty, and good demand/sales forecasting reduces some of these uncertainties. The information regarding what (products and services), to whom (market segments) and when (time pattern) is a necessary input for planning in all functional areas of a firm. Sales forecasting has long run as well as short run needs. A long run forecast is needed for organizational changes such as divisional decentralization, opening new territories, acquiring new companies, changing advertising agencies, adding new products, extending product lines and dropping old products.

One approach to forecasting company sales is to forecast the market potential and then multiply it by a forecast of the percentage of this potential that the company can capture. This percentage, known as the market share, is determined by the cumulative effect of the company's previous marketing strategies. This is known as the breakdown method of forecasting company sales. In this context, Hardie, Fader and Winneiwski (1998) observed that, though numerous researchers had proposed different models to forecast trial sales for new products, there was no systematic understanding of how the models worked. The major findings of their comprehensive investigation of eight leading models and three different parameter estimation methods were:

1. For consumer-packaged goods, simple models that allow relatively limited flexibility provide significantly better forecasts than more complex specifications.
2. Models that explicitly accommodate heterogeneity in purchasing rates across consumers tend to offer better forecasts than those that do not.
3. Maximum likelihood estimation appears to offer more accurate and stable forecasts than non-linear least squares.

Hassens (1998) examined the problems of forecasting ongoing factory orders and monitoring retail demand
with specific reference to high technology consumer durables. He advocated that different data sources and models could be used to increase the prediction accuracy of the forecasts. On the basis of his assessment of the relative efficiency of different forecasting models, he concluded that the extrapolative method with time series data could be the most befitting for this area. He used the extrapolative method to examine the order placement and retail demand process, focusing on identifying short versus long run movements in orders. He also used marketing mix data for an improved retail demand tracking method, and proposed the use of conjoint measurement data to simulate a product's utility over time, with the inclusion of this information in the demand model.

Similarly, Chen, Ryan and Simchi (2000) advocated that an important phenomenon often observed in supply chain management, known as the bullwhip effect, implies that demand variability increases as one moves up the supply chain, i.e., as one moves away from customer demand. They tried to quantify this effect for simple, two-stage supply chains consisting of a single retailer and a single manufacturer. They considered two types of demand processes, a correlated demand process and a demand process with a linear trend. They demonstrated that the use of an exponential smoothing forecast by the retailer can cause the bullwhip effect, and contrasted these results with the increase in
variability due to the use of a moving average forecast.

Steffens (2001) discerned that forecasting industry sales is a vital component of a company's planning and control activities. Sales for most mature durable product categories are dominated by replacement purchases. Previous sales models which explicitly incorporated a component of sales due to replacement assumed that there was an age distribution for replacements of existing units which remained constant over time. However, he stated that changes in factors such as product reliability/durability, price, repair costs, scrapping values, styling and economic conditions would result in changes in the mean replacement age of units. He developed a model for such time-varying replacement behaviour and empirically tested it on the Australian automotive industry. The study confirmed a substantial increase in the average aggregate replacement age for motor vehicles over the years. Much of this variation could be explained by real price increases and a linear temporal trend. Consequently, it was found that the time-varying model significantly outperformed previous models, both in fitting and in forecasting sales data.

The above studies indicate that estimates of sales potential are a prerequisite for a company's planning and future decisions. The most frequently used approaches for forecasting sales and demand are extrapolative methods and probabilistic models.
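The bullwhip effect described by Chen, Ryan and Simchi (2000) can be reproduced in a toy simulation. The sketch below is not their model: it simply assumes AR(1) retail demand, a moving-average forecast and a naive order-up-to rule (all numbers invented), and shows that the variance of the retailer's orders exceeds the variance of customer demand.

    import numpy as np

    rng = np.random.default_rng(3)
    n, p, L = 2000, 5, 2             # periods, moving-average window, lead time
    demand = np.full(n, 50.0)
    for i in range(1, n):            # AR(1) demand around a base level of 50
        demand[i] = 50 + 0.7 * (demand[i - 1] - 50) + rng.normal(0, 4)

    orders = []
    prev_target = demand[:p].mean() * L
    for i in range(p, n):
        mu = demand[i - p:i].mean()  # moving-average demand forecast
        target = mu * L              # simplistic order-up-to level
        orders.append(demand[i] + target - prev_target)
        prev_target = target

    print('Var(demand):', demand[p:].var())
    print('Var(orders):', np.var(orders))   # typically larger: the bullwhip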

Important Methods used for Sales and Demand Forecasting

Sl. No.   Methods adopted by researchers
1         Market share method
2         Regression/maximum likelihood estimation method
3         Extrapolative method
4         Exponential smoothing method
5         Time varying models
6         Probabilistic models
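Of the methods listed, exponential smoothing is perhaps the simplest to state: each forecast is a weighted average of the latest observation and the previous forecast. A minimal sketch with an assumed smoothing constant and hypothetical sales data:

    import numpy as np

    def ses(series, alpha=0.3):
        """Simple exponential smoothing; returns the one-step-ahead forecast."""
        fc = series[0]                    # initialise with the first observation
        for obs in series[1:]:
            fc = alpha * obs + (1 - alpha) * fc
        return fc

    sales = np.array([112, 118, 132, 129, 121, 135, 148, 136])  # assumed data
    print(ses(sales))                     # next-period sales/demand forecast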

Various components like market share, trial sales for new products, retail demand and supply-chain management are considered by researchers when forecasting sales/demand. As market boundaries become global and the competitive edge sharper, their impact needs consideration in twenty-first century sales and demand forecasting. Moreover, with the advent of fast-moving information technology, qualitative methods such as Delphi and expert opinion are gaining importance nowadays. The consensus in the area of demand/sales forecasting is that the integration of both quantitative and qualitative methods yields better forecasts.

2.4.2 Business related forecasting

Wu, Chao-Yen, and Jin (1993) made a study on the use of forecasting methods for industry and business. They stressed that forecasting involves the presentation of a statement concerning uncertain events, which helps in decision-making. They identified several methods designed for forecasting variables concerning the economy, industry and business. The methods that could be used without pre-analysing the data were linear models and models using quadratic, cubic, exponential, modified exponential, Gompertz and logistic forms of equations. However, the authors remained silent regarding the choice of the best model.

Lubecke and Thomas (1995) examined the performance of ten mathematical objective (composite) models in terms of accuracy and correction. These composite models were employed to generate one-month forecasts of the U.K. pound, the Deutsche mark, the French franc, the Japanese yen and the Swiss franc over the period 1986-89. The results indicated that two composite models, the constrained linear combination model and the constrained multiple objective programming model, performed well according to the correction criterion. In terms of accuracy, the focus forecasting and technical models performed better. However, they could not identify any forecasting method as the best under all circumstances.

Bloom (1995) prepared trend line projections of selected variables for the United States. The trends were prepared using simple methods. Where past trends were approximately linear, extrapolation was carried out assuming a constant increment per year. Where past trends were found to be exponentially increasing, extrapolation assumed a constant growth rate per year. He projected
the forecasts of the two major growing occupations in the United States over the period 1988 to 2000 and identified the two areas of high growth to be communication and computers, and health care.

Cherunilam (2001) observed that business decisions, particularly strategic ones, need a clear identification of the relevant variables and a detailed, in-depth analysis of them in the form of environmental analysis and forecasting. He identified the first important step in environmental forecasting as the identification of the environmental inputs to the firm. The next step is the collection of the needed information and the choice of an appropriate forecasting technique. One issue often debated is quantitative versus qualitative techniques, but the fact is that each has its own merits and limitations, and it is often pointed out that the difference in the predictions using each type of approach is often minimal. The forecasts which emerge as important for the business environment concern the economic environment, the social environment, the political environment, etc. Short-term economic forecasts are important for demand and sales forecasting and for marketing strategy formulation; he suggested the use of time series methods for these. Besides economic forecasts, there are a number of social factors which have a profound impact on business, like population growth/decline, age structure, occupational pattern, rural-urban distribution of the population, expenditure pattern, social attitudes, etc. It has been observed that social trends have significant implications for business strategy. Quantitative techniques like time series analysis and econometric methods, qualitative methods like the Delphi method, or a combination of both may be used for social forecasts. Political forecasts also have an important part in properly envisioning the future scenario of business. Changes in the relative power of political parties, political alliances and political ideologies are important factors influencing the business environment. Pre-election polls may help certain political forecasts.

Dua & Banerjee (2001) observed that with the recent increase in the globalisation of the economy, policy makers, businessmen and financial analysts are closely tracking the external sector. The key driver in the external sector is the level of exports, because it directly impacts domestic economic performance. Their study attempts to construct a leading index incorporating real exports, the rise of exports and the value of exports. In constructing the index, the authors used different components affecting exports, which corresponds to an explanatory method. The findings of the study indicate that the leading index for exports leads the quantum index, the unit value index and the total value index. The lead profile analysis showed that when the lead
profile of the leading index for exports is compared with the reference cycle of the growth rate of the unit value index, the latter performs better. A limitation of the study, as mentioned by the authors, is that a significant volume of India's exports constitutes barter trade. The composition of exports, basically in the form of primary products, also has an adverse effect on the predictive ability of the models.

Sen and Swain (2002) provided a realistic projection of the pension liabilities of the Central Government after the implementation of the Vth Central Pay Commission. Pension, though a small component compared to the salary bill, displays an increasing trend and is therefore apprehended to be of some concern for the future. Keeping the above factors in view, the study was taken up to provide a realistic picture of the future position of Government employment and pension liabilities. They used the methods adopted by the Planning Commission so that judicious decisions could be taken on manpower planning by the Central Government.
Methods used for business related forecasting

Sl. No.   Important methods used by researchers
1         Mathematical objective (composite) models
2         Extrapolative methods
3         Econometric techniques/explanatory methods

The methods used for business related forecasting mostly relate to mathematical/statistical models. It may be mentioned here that economic and socio-political policies and factors may affect the forecast in the present fast-changing business environment. Therefore, to increase the accuracy of forecasts, these factors need to be built into the models.

2.4.3 Other Miscellaneous areas

Forecasting techniques have also been used in other areas, namely energy, minerals, weather, tourism, etc. Researchers have used several techniques to forecast the variables relating to these areas. A detailed analysis of the techniques used in such areas is beyond the scope of this study. However, some recent studies in the area of energy have been conducted by the Planning Commission, Government of India (1995), Bartels and Fiebig (1996) and Banerjee (2004). The studies of Jorquera, Palma and Tapia (2002) and Taylor and Buizza (2004) in the area of weather forecasting are noteworthy. A study in the area of tourism by Davies (2003) is also worth mentioning.

3. Forecasting Accuracy

Economic forecasting is not an end in itself. It is rather an input, an aid to those who make economic decisions, whether in the private or the public sector. The value and usefulness of a forecast therefore depend fundamentally on the extent to which it contributes constructively to that end. It is natural to wish to evaluate this
contribution. In practice, however, this is not easy to do, because it is almost impossible to establish precisely what role a forecast has played in any given policy decision. In short, it is surprisingly difficult to specify precisely what constitutes a good or right forecast. Nevertheless, the task is important: policymakers need to know how dependable forecasts are and what degree of reliance it is appropriate to place upon them. Researchers have from time to time given weight to measuring forecast accuracy. In this section, an attempt has been made to highlight the different methods adopted by researchers for measuring forecast error.

Ray (1988) attempted to evaluate the forecast performance of three methods, namely (i) Box-Jenkins, (ii) bilinear and (iii) threshold autoregression, on the basis of ten Indian economic time-series (related to finance, prices and production in the Indian economy). Ray advocated that in an ever-changing and complicated world no mathematical model could take care of all changes, and that the best way to test the efficiency of a forecasting model is to see how accurately it can predict the future. The author tested the ten Indian time-series for linearity and found them to be non-linear. The mean-square errors formed the basis of the comparison of the performances of the three models. It was found that the performance of the bilinear model was better than that of the other two methods for all lead periods, and that the Box-Jenkins model had outperformed the threshold autoregression
model. The author concluded on the basis of these results that the performance of the bilinear model was the best, followed by Box-Jenkins and threshold autoregression, in that order.

Armstrong and Collopy (1992) evaluated measures for making comparisons of errors across time-series. They analysed 90 annual and 101 quarterly economic time-series. Error measures were judged on reliability, construct validity, sensitivity to small changes, protection against outliers and their relationship to decision-making. On the basis of the results, they recommended the geometric mean of the relative absolute error (GMRAE), which compares the absolute error of a given method to that of a random-walk forecast, for judging accuracy.

Collopy and Armstrong (1992) attempted to examine the reliability and feasibility of rule-based forecasting. For this, a rule base was developed to make extrapolation forecasts for economic and demographic time-series. The rule base combined the forecasts from four extrapolation methods: the random walk, regression, Brown's linear exponential smoothing and Holt's exponential smoothing. They evaluated the accuracy of these four methods against the rule-based method and found an improvement in the accuracy of the rule-based forecasts over equally weighted combined forecasts.

Makridakis (1993) expressed his concern regarding the most appropriate accuracy measures for evaluating forecasting
methods and for reporting error statistics. The best accuracy measures must not be unduly influenced by outliers. MAPE (Mean Absolute Percentage Error) is a relative measure that incorporates the best characteristics among the various accuracy criteria, and it can be used both for evaluating large-scale empirical studies and for presenting specific results.

Wang, George and Akabay, Charles (1994) advocated that regression-based models estimated by the ordinary least squares method could be good for forecasting if the following assumptions were true:

1) The residuals of one period were not correlated with the residuals of previous periods, i.e., no autocorrelation.
2) The residuals were normally distributed with zero mean and constant variance, i.e., no heteroscedasticity.
3) The independent variables were not correlated, i.e., no multicollinearity.

These assumptions relate to the classical linear regression model. The authors examined the effect of autocorrelation on forecasting efficiency. They identified the sources of autocorrelation as habit formation, institutional traditions, missing variables and an incorrectly specified model. They adopted the standard methods of identifying and correcting autocorrelation, like the visual test, the runs test and the D-W statistic. They preferred a generalised least squares model to an ordinary least squares model for reducing the effect of autocorrelation.
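The D-W (Durbin-Watson) statistic mentioned above is easily computed from regression residuals: values near 2 suggest no first-order autocorrelation, while values well below 2 suggest positive autocorrelation. A sketch with assumed data:

    import numpy as np

    def durbin_watson(resid):
        """D-W statistic: squared residual differences over squared residuals."""
        return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

    # Assumed regression: fit y on x by OLS, then test the residuals.
    rng = np.random.default_rng(4)
    x = np.arange(50.0)
    y = 2.0 + 0.5 * x + 0.3 * np.cumsum(rng.normal(0, 1, 50))  # autocorrelated errors
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    print(durbin_watson(resid))   # well below 2 here: positive autocorrelation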

Cassuto (1995) examined the presence of autocorrelation and heteroscedasticity in data and attempted to solve the two problems. In the study, he also addressed misspecification error. He emphasized that if the equations in a model were not correctly specified, then adjustments for autocorrelation and heteroscedasticity are not required. He identified two forms of specification error: (i) missing variables, and (ii) an incorrect model. He laid emphasis on the inclusion of all important variables in the model and on the selection of the correct structure of the model. For a correct structural model he identified three important aspects to be taken care of: 1) the lagged structure of the model; 2) the functional form of the model; and 3) the requirement that an OLS model must not have feedback effects between the independent and dependent variables. He advocated the use of the ARCH correction method for the simultaneous presence of autocorrelation and heteroscedasticity in the data.

Chase (1995) made an attempt to find the best measure of forecast accuracy. He described the most frequently used measure of forecast accuracy in the corporate world as the percent attainment of forecast, i.e., (Actual Value / Forecast) x 100, which resembles the MAPE. He suggested that forecast accuracy measurement should be a learning process, not a tool to evaluate performance; hence, the best way to improve forecast accuracy is by measuring the outcome. For this, he
defined a number of specific measures of accuracy, such as: a) mean error, b) mean absolute deviation, c) mean absolute percent error (MAPE), and d) weighted mean absolute percent error (WMAPE). The author observed that what these algebraic measures have in common is that they all relate to the difference between the actual value and the forecasted value; for increasing forecast accuracy, each of these measures should be close to zero.

Armstrong and Fildes (1995) referred to Clements & Hendry, who proposed the generalized forecast error second moment (GFESM) as an improvement on the mean square error (MSE) for comparing forecast performance across data series. This interpretation was based on the fact that rankings based on the GFESM remain unaltered if the series are linearly transformed. Prior empirical studies had observed that the mean square error was not an appropriate measure to serve as a basis for comparison, and this had undermined the claims made for the GFESM.

Roy & Dua (1995) stressed in their study that the expected error variance of a combined forecast is necessarily lower than that of the individual forecasts. A measure to combine forecasts so as to reduce the error variance was developed and applied to the data. The forecasts were developed by a panel of U.S. economists using different forecasting methods. An attempt was made to determine the benefits of combining forecasts. It was observed that the benefits of combining
vary with the number of forecasts combined and with the diversity in theories and techniques among the component forecasts.

Filardo (1999) examined the reliability of five popular recession prediction models. They were:

1. The composite index of leading indicators.
2. Neftci's model, a formal statistical model of the probability of recession.
3. The probit model, which allows the analyst to assess the importance of multiple indicators simultaneously.
4. The GDP forecasting model, a regression-based model.
5. The Stock-Watson model, which uses as a leading indicator an index that is the weighted average of industrial production, real personal income less transfer payments, real sales in manufacturing and trade, and total employee hours in non-agricultural establishments.

The forecasting performances of all the models were examined using two measures: a) timeliness, and b) accuracy. The author used two different kinds of data sets for all the models: (a) recently published data sets, and (b) a real time data set. He observed that all five prediction models could provide reliable information about future recessions. Some models missed spotting some past recessions, some sent more signals than others, some were more accurate at
certain forecast horizons and some were more robust to real time data than others.

Jordi Pons (2000) made an attempt to analyze the size and nature of the errors in GDP forecasts in the G7 countries from 1971 to 1995. The GDP forecasts under consideration were produced by the Organisation for Economic Co-operation and Development (OECD) and the IMF. The evaluation of the accuracy of the forecasts was based on the properties of the difference between the realization and the forecast. He stressed that a forecast could be considered accurate if it was unbiased and efficient. He also examined tests of accuracy and offered a non-parametric method of assessment.
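The algebraic measures surveyed in this section differ only in how the forecast errors are averaged. The sketch below computes four of them, together with the percent attainment described by Chase (1995), on made-up data:

    import numpy as np

    actual = np.array([120.0, 95.0, 140.0, 110.0])
    forecast = np.array([112.0, 100.0, 150.0, 105.0])
    err = actual - forecast

    me = err.mean()                                    # mean error (bias)
    mad = np.abs(err).mean()                           # mean absolute deviation
    mape = 100 * np.mean(np.abs(err / actual))         # MAPE
    wmape = 100 * np.abs(err).sum() / actual.sum()     # weighted MAPE
    attainment = 100 * actual / forecast               # percent attainment

    print(me, mad, mape, wmape, attainment.round(1))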
4. CONCLUSION

Forecasting is essentially an endeavour to predict efficiently, so as to make the inevitable error as small as possible. It is highly important for decision making in a variety of areas, such as demography, economics and business. Various forecasting techniques are available for forecasting the variables relating to these areas. The forecasting techniques used in different areas suggest that the reasons for forecasting are many-sided. However, researchers have emphasized that forecasting methods should be chosen on the basis of their suitability for dealing with the uniqueness of the problem in question. For example, in population forecasting there has been consistent refinement in the methodology used for projection, and in prediction accuracy as well. Researchers have looked into the
relevance of the demographic variables while forecasting. For the purpose of demographic transition, factors like the crude death rate (CDR), crude birth rate (CBR) and infant mortality rate (IMR) have been considered. In this context, the cohort-component approach is one of the popular forecasting techniques used in population estimation; in the Indian context, this method has been used for population forecasting during the different plan periods.

So far as financial forecasting methods are concerned, most researchers have used time series extrapolative methods along with autoregressive methods to forecast finance-related variables, and have compared different methods to identify a proper forecasting technique. Similarly, researchers have used several econometric as well as time series models for forecasting macroeconomic variables and have addressed several econometric problems while arriving at the correct forecasts. However, comparisons with respect to forecast errors, to identify a suitable forecasting technique, have not been made by all the researchers. For economic planning, the input-output model along with regression techniques (both single and simultaneous equations) has been used to arrive at forecasts in different developing countries, including India. The methods used for forecasting macroeconomic variables have also been used for demand/supply and other business related variables.

However, it may be pointed out that no unique method can be identified as the best
method of forecasting. Researchers have experimented and shown that the presently used sophisticated models like ARIMA do not reign over the domain of accurate time series forecasting; rather, relatively simpler methods like regression models (testing structural stability or instability) and exponential smoothing methods can produce better results in well diagnosed areas. This could be, for example, due to the severity of non-stationarity in time series data and the need to remedy it while using an ARIMA model. Similarly, simpler models like exponential models or time series decomposition often give better forecasts for time series data. Therefore one is inclined to conclude that there is no rule of thumb for the choice of the best forecasting method; rather, the researcher has to use an insightful trial and error approach to arrive at the minimum forecast error while selecting an appropriate model. It may be pointed out that to judge forecast accuracy, measures of forecast error such as RMSPE, MAPE, MAD, MSE, WMAPE, etc. have to be examined. Needless to mention, the lower the error component, the higher will be the accuracy. The researcher therefore has to use his judgment in selecting the appropriate tool for measuring the error. Very often more than one measure of error is used by researchers, and comparisons of the forecasts are made for decision making. Therefore, technical analysis should always be supplemented by judgmental analysis to make better forecasts with
respect to errors in estimation and thus help in the decision-making process.

REFERENCES
Armstrong, Scott J. and Collopy, Fred (1992), Error measures for generalizing about forecasting methods, International Journal of Forecasting, vol. 12, June 1992.
Armstrong, Scott J. and Fildes, Robert (1995), Correspondence on the selection of error measures for comparison among forecasting methods, Journal of Forecasting, vol. 14.
A Technical Note on the Sixth Plan of India, Planning Commission, Government of India, pp. 23-25.
A Technical Note to the Eighth Plan of India (1992-97), Chapter 2, Planning Commission, Government of India, p. 27.
A Technical Note to the Eighth Plan of India (1995), Planning Commission, Government of India.
Ascher, William (1978), Forecasting: An appraisal for policy makers and planners, Johns Hopkins University Press, London.
Banerjee, S. P. (2004), Mineral availability and environmental challenges - vision 2020 statement, The Indian Mining and Engineering Journal, vol. 43, no. 11, November 2004.
Barker (1985), Forecasting the 1980-82 recession, in Holden, K. and others (1990), Economic Forecasting: An Introduction, New York, CUP.
Bartels, R. and Fiebig, G. D. (1996), Metering and modeling residential end-use electricity load curves, Journal of Forecasting, vol. 15, issue 6.
Bathke & Lorek (1984), A time-series analysis of non-seasonal quarterly earnings data, Journal of Accounting Research, Spring 1984.

Bhattacharya, B. B., Ria, V.P.O. and Agarwal, M. M. (1999), Forecast 99, Business Today.
Bhattacharya, B. B. and Kar, S. (1999), A macro-econometric model of the Indian economy: Some theoretical aspects, Discussion paper.
Bidarkota, V. Prasad (2001), Alternative regime switching models for forecasting inflation, Journal of Forecasting, vol. 20.
Cassuto, Alexander E. (1995), Non-normal error patterns: How to handle them, The Journal of Business Forecasting: Methods & Systems, vol. 14.
Chase, Charles W. Jr. (1995), Measuring forecast accuracy, The Journal of Business Forecasting: Methods & Systems, vol. 14.
Chen, Frank, Ryan, Jennifer K. and Simchi, David (2000), The impact of exponential smoothing forecasts on the bullwhip effect, Naval Research Logistics.
Cherunilam, Francis (2001), Environmental analysis and forecasting, in Global Economy and Business Environment, Himalaya Publishing House, New Delhi.
Clements, Michael P. and Hendry, D. F. (1995), Macroeconomic forecasting and modeling, The Economic Journal, 105.
Clements and Krolzig (1998), A comparison of the forecast performance of Markov-switching and threshold autoregressive models of US GNP, Econometrics Journal, vol. 1.
Collopy, Fred and Armstrong, Scott J. (1992), Rule-based forecasting: Development and validation of an expert systems approach to combining time-series extrapolations, Management Science, vol. 38.
Croushove, Dean and Stark, John (2001), A real time data set for macroeconomists, Journal of Econometrics, 105.

Davies, Brian (2003), The role of quantitative and qualitative research in industrial studies of tourism, International Journal of Tourism Research.
Dharan, B. G. (1983), Identification and issues for a causal earnings model, Journal of Accounting Research, Spring 1983.
Diebold, Francis X. (1998), The past, present and future of macroeconomic forecasting, Journal of Economic Perspectives, vol. 12.
Dublin, L. I. and Lotka, A. J. (1930), The present outlook for population growth, American Sociological Society Publication 24, pp. 106-114.
Dua, Pami and Banerjee, Anirvan (2001), Leading index for India's exports, Occasional paper, Centre for Development Economics, New Delhi, May 2001.
Filardo, J. Andrew (1999), How reliable are recession prediction models?, Economic Review, Second Quarter 1999.
Forster (1977), Quarterly accounting data: Time series properties and predictive ability results, January 1977, pp. 1-21.
Funke, Michael (1992), Time series forecasting of the German unemployment rate, Journal of Forecasting, vol. 11.
Gill, D. B. S. and Kumar, K. (1992), An empirical study of modeling and forecasting macroeconomic time-series data, Indian Journal of Economics, vol. 80.
Gordon, T. J. and Helmer, Olaf (Rand Corporation) (1964), Report on a long range forecasting study, Paper no. P-2982.
Gupta, G. S. (1973), Note on forecasting techniques, Dissertation paper, IIM Ahmedabad.
Hajnal, J. (1954), Prospects for population forecasts, p. 311.

Holden & Peel (1986), Macroeconomic forecasting: Short term forecasts of inflation and growth 1975-84, in Current Issues in Macroeconomics, edited by David Greenway, Cambridge University Press.
Hardie, Bruce G. S., Fader, Peter S. and Winneiwski, Michael (1998), An empirical comparison of new product trial forecasting models, Journal of Forecasting, vol. 17, issue 3-4.
Hassens, M. Dominique (1998), Order forecasts, retail sales and the marketing mix for consumer durables, Journal of Forecasting, vol. 17, issue 3-4.
Isserman, Andrew M. (1977), Projection, forecast, and plan: On the future of population forecasting, Journal of the American Planning Association, vol. 50.
Jordi, Pons (2000), The accuracy of IMF and OECD forecasts for G7 countries, Journal of Forecasting, vol. 19.
Jorquera, H., Palma, Wilfredo and Tapia, José (2002), A ground-level ozone forecasting model for Santiago, Chile, Journal of Forecasting, vol. 21, issue 6.
Little (1962), Higgledy piggledy growth, Bulletin of the Oxford Institute of Economics and Statistics, Nov. 1962, pp. 389-412.
Makridakis, Spyros (1993), Accuracy measures: Theoretical and practical concerns, International Journal of Forecasting, vol. 9.
Mandell, Maryloce and Jaymun, Jeffery (1982), Measuring stability in regression models of population estimation, Demography, vol. 10.
Manes (1986), Accuracy of macroeconomic forecasts, in Economic Forecasting: An
introduction by Holden K and others, 1990 New York CUP. Mishra P., (2004), A comparison of forecast errors using technical analysis An experiment with ICICI share prices, -The Journal of Business Prospective, Vol.8 Mishra P. (2005), Forecasting Time Series Data with special reference to Structural Stability A comparison of three forecasting methods Vilakshan XIMB Journal of Management, Vol.2 Pearl Raymond, (1925), The biology of population growth, Forecasting: an appraisal for policy maker and planner by William Ascher, John Hopkins University Press, London. Paul Steffens R. (2001), An aggregate sales model for consumer durables incorporating a time varying mean replacement age Journal of forecasting, Vol-20 Projection of electricity generation by the Eighth Plan (1995) , Govt. of India, Planning Commission Ramachandran Prema and Singh Mohan (2000) Trends, Projections, Challenges and Opportunities, Working paper series Planning Commission, Government of India Ray D., (1988), Comparison of forecasts: An empirical investigation, Sankhya: The Indian journal of Statistics. vol 50 Report of working Group on Employment Planning and Policy for Tenth 5 yr Plan 2002-07, Para 3.5, page 27-28 Planning Commission, Government of India Report of the working group on employment planning and policy for the Tenth 5 Yr Plan 200207, Planning Commission, Govt. of India Ronald D Lee, Tuljaparkar Shripad, (2000), Population forecasting for fiscal planning- Issues

and innovations, Institute of Business & Economic Research, New Delhi.
Roy, B. and Dua, Pami (1995), Forecaster diversity and the benefits of combining forecasts, Management Science, vol. 41.
Samanta, G. P. (1999), On forecast performance of SETAR model: An empirical investigation, Journal of Quantitative Economics, vol. 15.
Satyanarayana, R. and Savalkar, S. V. (2003), Short term forecasts of corporate investments since the 1970s, Occasional Papers, Reserve Bank of India, vol. 2.
Sen, Pronab and Swain, Sibani (2002), Technical study on retirement and pension projections of the Central Government, Planning Commission, Government of India, Working Paper series, Paper no. 1/2002-PC.
Sethi, A. S. (1998), Forecasting savings in India in the post-liberalisation scenario: A note, Indian Journal of Quantitative Economics.
Side, Shahabuddin (1994), Forecasting: Is it a technique or a random walk?, International Journal of Management, vol. 11.
Sims, C. A. and Zha, Tao (1998), Bayesian methods for dynamic multivariate models, International Economic Review, vol. 30.

Taylor, W. James and Buizza, Roberto (2004), A comparison of temperature density forecasts from GARCH and atmospheric models, Journal of Forecasting, vol. 23, issue 5.
Thornton, Daniel L. (2004), Forecasting the Treasury's balance at the Fed, Journal of Forecasting, vol. 23.
Upadhyay, Ghanashyam (1992), Modeling non-stationary macroeconomic time-series, RBI Occasional Papers, vol. 1.
Wang, George C.S. and Akabay, Charles K. (1994), Autocorrelation: Problems and solutions in regression modeling, The Journal of Business Forecasting: Methods & Systems, vol. 13.
Whelpton, P. K. and Thompson, I. W. S. (1928), Population of the United States, 1925-75, Journal of the American Statistical Association, vol. 23.
Wu, Chao-Yen and Jin, Jonggian (1993), Forecasting methods for industry and business, Computers and Industrial Engineering, vol. 25.
Yadav, Narendra (1994), Monetary modeling in India, McMillan Publications, New Delhi.
